I have this native interface:
void CLASS_Version(char *Version);
I tried to import it with:
[DllImport("class.dll", EntryPoint = "CLASS")]
private static extern void CLASS_Version(ref string[] Version);
or
[DllImport("class.dll", EntryPoint = "CLASS")]
private static extern void CLASS_Version(ref char[] Version);
[DllImport("class.dll", EntryPoint = "CLASS")]
private static extern void CLASS_Version(out string[] Version);
[DllImport("class.dll", EntryPoint = "CLASS")]
private static extern void CLASS_Version(out char[] Version);
But I always get an "AccessViolation" error.
The only good run was made with
[DllImport("class.dll", EntryPoint = "CLASS")]
private static extern void CLASS_Version(ref char Version);
but this way I get only the first char of the string... how do I get the whole string?
char * is ambiguous, but it definitely isn't an array of strings. Most likely, it's a pointer to a string, so you'll use just a simple StringBuilder (no ref or out).
Also, make sure to use the proper marshalling attributes. .NET strings are always widechars, unlike your signature.
In general, a signature of a function isn't enough for proper interop with native code. You need to understand the meaning of the arguments and the return values, and you need to know the calling convention. You need to read the documentation, in other words :)
Basically, pointers are represented with the IntPtr type.
Plus, the entry point should be the string representing the exported function name.
try:
[DllImport("class.dll", EntryPoint = "CLASS_Version")]
private static extern void CLASS_Version(IntPtr Version);
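If you go the IntPtr route you also have to allocate and free the unmanaged buffer yourself. A minimal sketch, assuming the function fills a caller-allocated ANSI buffer and that 64 bytes is enough (the real size should come from the library's documentation):
IntPtr buffer = Marshal.AllocHGlobal(64);
try
{
    CLASS_Version(buffer);
    string version = Marshal.PtrToStringAnsi(buffer);
}
finally
{
    Marshal.FreeHGlobal(buffer); // always release the unmanaged buffer
}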
try this:
[DllImport("class.dll", EntryPoint = "CLASS")]
private static extern void CLASS_Version([MarshalAs(UnmanagedType.VBByRefStr)] ref string Version);
And when you are going to call your method:
string Version = new string(' ', 14); // first allocate the required space: 14 bytes? more?
CLASS_Version(ref Version);
Sadly the answer cannot be determined by type alone.
If it were that simple there would be parsers that could write the native wrapper for you.
The type you have to use depends on what the function is actually doing.
In your case the char * is not marked const, and it is being accepted as a parameter, which implies that it's intended to be a user-allocated area of memory available for the function to write to. As there is no size parameter, there is most likely a maximum size that the version string can be, which should be indicated in the documentation of the code.
Given that this is string handling, you also have to worry about the encoding. For the sake of simplicity I'm going to assume (and hope) your string is in ASCII/Windows-1252 and not UTF8, UTF7 or some other format.
Given these assumptions, you have several options, but I will be presenting just the preferred way of handling this based on the information provided. (It may be that your situation requires something different, but this is the best solution I can suggest based on assumptions inferred from the information in your question.)
[DllImport("class.dll", EntryPoint = "CLASS_Version", , CharSet = CharSet.Ansi)] // CharSet is important
private static extern void CLASS_Version(StringBuilder Version);
This is the 'correct' way to manage the situation - rely on the compiler to handle the marshalling for you. One small caveat however is that you must manually set the capacity of the StringBuilder before passing it to the method.
// be sure to allocate a correct size,
// there will be no error if it overflows because it's too small
StringBuilder buffer = new StringBuilder(size);
// automagically marshalled to and from unmanaged code by the compiler
CLASS_Version(buffer);
string version = buffer.ToString();
I'd like to take this opportunity to point out that CLASS_Version shouldn't be private. All your native methods should be made public and grouped together in one internal static class.
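For example (a sketch; the class name just follows the usual NativeMethods convention):
internal static class NativeMethods
{
    [DllImport("class.dll", CharSet = CharSet.Ansi)]
    public static extern void CLASS_Version(StringBuilder Version);
}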
Some additional resources about string marshalling that you may find handy:
https://limbioliong.wordpress.com/2011/11/01/using-the-stringbuilder-in-unmanaged-api-calls/
I am trying to call a function that converts SQL queries between dialects. I am using an open-source project's DLL for the conversion in a C# university project. The problem I am facing is that I am getting an access violation reading location error.
I've read some posts here on Stack Overflow which suggest that there might be a bad pointer somewhere, but I cannot find where. My pointers are not corrupt.
The function for the conversion is this:
int ConvertSql(void *parser, const char *input, int64_t size,
               const char **output, int64_t *out_size, int64_t *lines)
{
    if (parser == NULL)
        return -1;

    SqlParser *sql_parser = (SqlParser*)parser;

    // Run conversion
    sql_parser->Convert(input, size, output, out_size, lines);

    return 0;
}
I am calling the function in C#:
char* parentInput;
fixed (char* input = &inputStr.ToCharArray()[0])
{
    parentInput = input;
}

char** output = null;
Int64 out_size = 0;
Int64 lines = 0;
Int64 size = inputStr.Length;

Console.WriteLine(new IntPtr(&out_size) + " " + new IntPtr(&lines) + " " + new IntPtr(&parserObj) + " " + new IntPtr(output));

int result = ConvertSql(&parserObj, inputStr, size, output, &out_size, &lines);
I get my parser object from this code, which works without errors:
IntPtr parserObj = CreateParserObject();
The DllImport declarations for the functions look like this:
[DllImport(dllName: "PATHTODLLFOLDER\\sqlparser.dll", EntryPoint = "CreateParserObject", CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr CreateParserObject();
[DllImport(dllName: "PATHTODLLFOLDER\\sqlparser.dll", EntryPoint = "ConvertSql", CallingConvention = CallingConvention.Cdecl)]
public unsafe static extern int ConvertSql(void *parser, String input, Int64 size, char **output, Int64 *out_size, Int64 *lines);
In .NET, calling an unmanaged method through P/invoke (which is what happens when you call an extern method) involves various type conversions on the parameters, which is known as "marshalling" and is done automatically by a part of the runtime known as the "marshaller".
In general, it's a terrible idea to marshal pointers. Instead, use the CLR marshaller's ability to convert certain types to pointers for you by changing the signature of your P/invoked method:
// split on multiple lines for readability
[DllImport("PATHTODLLFOLDER\\sqlparser.dll", EntryPoint = "ConvertSql", CallingConvention = CallingConvention.Cdecl)]
public static extern int ConvertSql(
IntPtr parser,
[MarshalAs(UnmanagedType.LPStr)] string input,
long size,
out IntPtr output, // see explanation below
out long out_size,
out long lines);
A few things about the above. I took the liberty of using the C# type aliases (string, long), because that's more idiomatic C#, but it doesn't change the behavior. Also, since there are no pointers anymore, there's no need for unsafe.
First, I declared parser as an IntPtr because those get converted to void* automatically if needed, and that's what's already returned by CreateParserObject() anyway.
Second, out parameters can be converted to pointers to uninitialized objects, so by marking both out_size and lines as out, you fix that other problem.
The input is a string, which has a specific format in .NET. Since your function takes a const char*, you need to tell the marshaller how to convert the characters. That's where the MarshalAs attribute comes in: it's when the default conversion doesn't work for you. UnmanagedType.LPStr means char* in this case, so the string gets converted. The runtime will manage the memory for you.
But here we hit a big snag in the road: the output. When marshalling things, there's always questions about lifetime, or, specifically: who releases the memory? The fact that output is a char** implies that the parser allocates a block of memory and then returns it through that, which means the caller has to release it. From the get go, that reeks of bad C++ design because the caller doesn't know how the memory was allocated. Was it with malloc? new[]? Platform specific APIs like LocalAlloc? Is it a pointer to a block of static memory? Usually these APIs come with documentation telling precisely what to do with the pointer once you're done with it. Good C++ APIs return smart pointers, or ask that the caller pass a block of previously allocated memory that they can then play with.
But, this is the thing you're playing with, so here's how to make it work. First, you'd think that you could declare output as [MarshalAs(UnmanagedType.LPStr)] out string: the marshaller will copy the characters into a managed string and return it... but then the native string's (on the C++ side) memory will leak because the runtime doesn't know how the string was allocated, so it prefers to not do anything about it. Also, this assumes the string is null-terminated, which might not always be the case.
So, another option would be to instead declare output as out IntPtr. Then you can use Marshal.PtrToStringAnsi to convert your pointer into a string, and then release it... but you'll need to know how it was allocated first.
Putting it all together:
var parserObj = CreateParserObject();
var output = IntPtr.Zero;
try
{
    long lines;
    long out_size;
    int result = ConvertSql(parserObj, inputStr, inputStr.Length, out output, out out_size, out lines);
    var outputStr = Marshal.PtrToStringAnsi(output, (int)out_size);
    // do what you want with outputStr here
}
finally
{
    if (output != IntPtr.Zero)
    {
        // release output here
    }
}
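For illustration only: if the library turned out to export its own deallocator (the name FreeSqlString here is purely hypothetical; check the library's headers for the real one), the release could look like this:
[DllImport("PATHTODLLFOLDER\\sqlparser.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern void FreeSqlString(IntPtr str); // hypothetical export

// inside the finally block:
FreeSqlString(output);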
Oh, also, one final thought: whatever is returned by CreateParserObject() will probably have to be freed at one point. You're probably going to need another function for it, probably like:
[DllImport(/* ... */)]
public static extern void DestroyParserObject(IntPtr parserObject);
It might even already exist in your DLL.
Good luck!
I am trying to call a DLL written in Rust from a C# program. The DLL has two simple functions that take strings (in different ways) and print to the console.
Rust DLL code
#![crate_type = "lib"]

extern crate libc;

use libc::c_char;
use std::ffi::CStr;

#[no_mangle]
pub extern fn printc(s: *const c_char) {
    let c_str: &CStr = unsafe {
        assert!(!s.is_null());
        CStr::from_ptr(s)
    };
    println!("{:?}", c_str.to_bytes().len()); // prints "1" if unicode
    let r_str = std::str::from_utf8(c_str.to_bytes()).unwrap();
    println!("{:?}", r_str);
}

#[no_mangle]
pub extern fn print2(string: String) {
    println!("{:?}", string)
}
C# console program code
[DllImport("lib.dll", CharSet = CharSet.Unicode, CallingConvention = CallingConvention.Cdecl)]
static extern void print2(ref string str);
[DllImport("lib.dll", CallingConvention = CallingConvention.Cdecl)]
static extern void printc(string str);
static void Main(string[] args)
{
try
{
var graw = "yeyeye";
printc(graw);
print2(ref graw);
}
catch (Exception ex)
{
Console.WriteLine("calamity!, {0}", ex.Message);
}
Console.ReadLine();
}
For the print2 function, it keeps printing garbage on screen until it causes an AccessViolationException.
The second function, printc, does print the string, but only if CharSet.Unicode is not set. If it is set, it prints only the first char, hence the println!("{:?}", c_str.to_bytes().len()); prints 1.
I believe the CStr::from_ptr function does not support Unicode, which is why it returns only the first char of the string.
Any idea how to pass Unicode strings as parameters to Rust DLLs? Is it possible to make things simpler, like in the print2 function?
If you check the documentation on CharSet, you'll see that CharSet.Unicode tells .NET to marshal strings as UTF-16 (i.e. two bytes per code point). Thus, .NET is trying to pass printc what should be a *const u16, not a *const libc::c_char. When CStr goes to compute the length of the string, what it sees is the following:
b"y\0e\0y\0e\0y\0e\0"
That is, it sees one code unit, then a null byte, so it stops; hence why it says the length is "1".
Rust has no standard support for UTF-16 strings, but if you're working on Windows, there are some conversion methods: search the docs for OsStrExt and OsStringExt. Note that you must use the docs installed with the compiler; the ones online won't include them.
Sadly, there's nothing for dealing directly with null-terminated UTF-16 strings. You'll need to write some unsafe code to turn a *const u16 into a &[u16] that you can pass to OsStringExt::from_wide.
Now, Rust does use Unicode, but it uses UTF-8. Sadly, there is no direct way to get .NET to marshal a string as UTF-8. Using any other encoding would appear to lose information, so you either have to explicitly deal with UTF-16 on the Rust side, or explicitly deal with UTF-8 on the C# side.
It's much simpler to re-encode the string as UTF-8 in C#. You can exploit the fact that .NET will marshal an array as a raw pointer to the first element (just like C) and pass a null-terminated UTF-8 string.
First, a static method for taking a .NET string and producing a UTF-8 string stored in a byte array:
static byte[] NullTerminatedUTF8bytes(string str)
{
    // append the terminator, then encode as UTF-8
    return Encoding.UTF8.GetBytes(str + "\0");
}
Then declare the signature of the Rust function like this:
[DllImport(dllname, CallingConvention = CallingConvention.Cdecl)]
static extern void printc([In] byte[] str);
Finally, call it like this:
printc(NullTerminatedUTF8bytes(str));
For bonus points, you can rework printc to instead take a *const u8 and a u32, passing the re-encoded string plus its length; then you don't need the null terminator and can reconstruct the string using the std::slice::from_raw_parts function (but that's starting to go beyond the original question).
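On the C# side, the import matching that reworked signature might look like this (a sketch, assuming printc was changed as just described):
[DllImport("lib.dll", CallingConvention = CallingConvention.Cdecl)]
static extern void printc([In] byte[] str, uint len);

// no null terminator needed when the length is passed explicitly
byte[] utf8 = Encoding.UTF8.GetBytes(str);
printc(utf8, (uint)utf8.Length);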
As for print2, that one is just unworkable. .NET knows nothing about Rust's String type, and it is in no way compatible with .NET strings. More than that, String doesn't even have a guaranteed layout, so binding to it safely is more or less not possible.
All that is a very long-winded way of saying: don't use String, or any other non-FFI-safe type, in cross-language functions, ever. If your intention here was to pass an "owned" string into Rust... I don't know if it's even possible to do in concert with .NET.
Aside: "FFI-safe" in Rust essentially boils down to: is either a built-in fixed-size type (i.e. not usize/isize), or is a user-defined type with #[repr(C)] attached to it. Sadly, the "FFI-safe"-ness of a type isn't included in the documentation.
I have a couple of questions regarding the following:
[DllImport("libmp3lame.dll", CharSet = CharSet.Ansi)]
static extern IntPtr get_lame_version();
public static string GetLameVersion()
{
    IntPtr pVersion = get_lame_version();
    string version = Marshal.PtrToStringAnsi(pVersion);
    return version;
}
Where is the memory of the string pointed to by pVersion allocated?
Is this memory automatically freed when pVersion goes out of scope?
If yes, how does that happen?
If no, how do I free the memory?
The string returned by this function is statically allocated and you do not need to free that memory. This means that your current code is already exactly what you need.
This is an open-source project, and so a web search leads to the source code for the implementation of this function to confirm this.
As an aside, your p/invoke is incorrect, although it is benign. It should be:
[DllImport("libmp3lame.dll", CallingConvention=CallingConvention.Cdecl)]
static extern IntPtr get_lame_version();
There's no need to specify CharSet since the function has no text parameters. And in any case Ansi is the default so you still would not need to specify that. The calling convention is, in general, important and will need to be set for all your LAME imports. It doesn't actually matter for a function with no parameters, but specifying the calling convention is a good habit to get in to.
I have a C++ function exported as api like this:
#define WIN322_API __declspec(dllexport)
WIN322_API char* Test(LPSTR str);
WIN322_API char* Test(LPSTR str)
{
    return "hello";
}
The function is exported correctly by the .DEF file, because I can see it in the Dependency Walker tool.
Now I have a C# tester program:
[DllImport("c:\\win322.dll")]
public static extern string Test([MarshalAs(UnmanagedType.LPStr)] String str);
private void Form1_Load(object sender, EventArgs e)
{
    string _str = "0221";
    Test(_str); // runtime error here!
}
On calling the Test() method I get this error:
"A call to PInvoke function 'MyClient!MyClient.Form1::Test' has unbalanced the stack. This is likely because the managed PInvoke signature does not match the unmanaged target signature. Check that the calling convention and parameters of the PInvoke signature match the target unmanaged signature."
I tried many other data types and marshalings, but got nothing.
Please help me!
It is caused by a mismatch on the calling convention, the default for [DllImport] is Stdcall but the C compiler's default is Cdecl. Use the CallingConvention property in the declaration.
That's not the only problem though, this code will crash on Vista and Win7. Returning a string from a C function is quite troublesome, there's a memory management problem. It isn't clear who is responsible for freeing the string buffer. You are returning a literal now but that's going to stop being useful pretty soon. Next stop is using malloc() for the return string with the intent for the caller to call free(). That's not going to work, the pinvoke marshaller cannot call it since it doesn't know what heap the C code is using.
It will call Marshal.FreeCoTaskMem(). That's wrong, the string wasn't allocated by CoTaskMemAlloc(). That goes unnoticed on XP and earlier, other than the very hard to diagnose memory leak this causes. And goes kaboom on Vista and Win7, they have a much more stricter memory manager.
You need to rewrite the C function like this:
extern "C" __declspec(dllexport)
void __stdcall Test(const char* input, char* output, int outLen);
Now the caller supplies the buffer, through the output argument, there's no longer a guess who owns the memory. You use StringBuilder in the C# declaration.
[DllImport("foo.dll")]
private static extern void Test(string input, StringBuilder output, int outLen);
...
var sb = new StringBuilder(666);
Test("bar", sb, sb.Capacity);
string result = sb.ToString();
Be careful to use the outLen argument in your C code so that you can be sure not to overflow the buffer. That corrupts the garbage collected heap, crashing the app with a Fatal Execution Engine Error.
Change your macro definition to
#define WIN322_API __declspec(dllexport) __stdcall
As an alternative, use CallingConvention.Cdecl when importing.
Read, for example, here for more info on calling conventions.
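A sketch of that alternative, assuming the DLL really is cdecl, and returning IntPtr instead of string so the marshaller doesn't try to free the returned literal:
[DllImport("c:\\win322.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr Test([MarshalAs(UnmanagedType.LPStr)] string str);

// copy the returned characters without taking ownership of the native memory
string result = Marshal.PtrToStringAnsi(Test("0221"));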
Make sure that you have the correct calling convention. Your DLL may use Cdecl, while C# defaults to StdCall. It's better to always explicitly define the calling convention.
Especially when using Windows functions which exist in an ANSI and wide char version (those with an A or W prefix), explicitly specify the CharSet so the correct version is used.
When the function returns a value, explicitly marshal the return value. Otherwise, the compiler chooses a default, which may be wrong. I suspect this is (also) the problem here.
So change to this, for example:
[DllImport("c:\\win322.dll", CharSet = CharSet.Ansi, CallingConvention = CallingConvention.StdCall)]
[return: MarshalAs(UnmanagedType.LPStr)]
public static extern string Test([MarshalAs(UnmanagedType.LPStr)] String str);
Try returning LPSTR, not char*. You might also need to specify stdcall calling convention, which is default in .NET, but I'm not sure about your unmanaged project.
Use UnmanagedType.LPTStr for the input. Notice the additional T:
[MarshalAs(UnmanagedType.LPStr)] String str // your current code
[MarshalAs(UnmanagedType.LPTStr)] String str // try this code
Pass a StringBuilder from .Net as a parameter rather than returning a string (in this case it will be like an out parameter)
I'm working on a C# application that supports two communications interfaces, each supported by its own DLL. Each DLL contains the same function names, but their implementation varies slightly depending on the supported interface. As it is, users will typically have only one DLL installed on their machine, not both. The DLL for the old interface is imported like this:
[DllImport("myOldDll.dll",
CharSet = CharSet.Auto,
CallingConvention = CallingConvention.StdCall)]
public static extern int MyFunc1();
public static extern int MyFunc2();
public static extern int MyFunc3();
Would this be a valid way to attempt to bring in either DLL?
[DllImport("myOldDll.dll",
CharSet = CharSet.Auto,
CallingConvention = CallingConvention.StdCall)]
[DllImport("myNewDll.dll",
CharSet = CharSet.Auto,
CallingConvention = CallingConvention.StdCall)]
public static extern int MyFunc1();
public static extern int MyFunc2();
public static extern int MyFunc3();
Ideally, I suppose it would be nice to detect a missing DLL and load the second DLL if the attempt to load the first fails. Is there a graceful way to do that?
How about doing a P/Invoke to LoadLibrary?
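A sketch of that approach, assuming both DLLs export MyFunc1 with the stdcall convention; LoadLibrary, GetProcAddress, and Marshal.GetDelegateForFunctionPointer do the work:
[DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Ansi)]
static extern IntPtr LoadLibrary(string fileName);

[DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Ansi, ExactSpelling = true)]
static extern IntPtr GetProcAddress(IntPtr module, string procName);

[UnmanagedFunctionPointer(CallingConvention.StdCall)]
delegate int MyFunc1Delegate();

static int CallMyFunc1()
{
    // try the old DLL first, fall back to the new one
    IntPtr module = LoadLibrary("myOldDll.dll");
    if (module == IntPtr.Zero)
        module = LoadLibrary("myNewDll.dll");
    if (module == IntPtr.Zero)
        throw new DllNotFoundException("Neither myOldDll.dll nor myNewDll.dll could be loaded.");

    IntPtr proc = GetProcAddress(module, "MyFunc1");
    var myFunc1 = (MyFunc1Delegate)Marshal.GetDelegateForFunctionPointer(proc, typeof(MyFunc1Delegate));
    return myFunc1();
}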
In .NET 1.1 you would need to create a proxy unmanaged DLL (written in C or Delphi or ...) and call its methods, and that unmanaged DLL would do the rest. In .NET 2.0 and later you can use Assembly.LoadFile() and so on. Not as elegant as the plain declarations you attempted to use, and it requires quite a lot of coding, so I'd suggest the proxy way if possible.
Perhaps you should give the methods imported from either DLL different names, and then have a delegate in your program that you point to one or the other (whichever is appropriate), and only call the delegate.
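For instance (a sketch: each DLL's function is imported under a distinct managed name via EntryPoint, and the right one is chosen at startup; oldInterfaceInstalled is a hypothetical flag your detection code would set):
[DllImport("myOldDll.dll", EntryPoint = "MyFunc1", CallingConvention = CallingConvention.StdCall)]
public static extern int MyFunc1_Old();

[DllImport("myNewDll.dll", EntryPoint = "MyFunc1", CallingConvention = CallingConvention.StdCall)]
public static extern int MyFunc1_New();

public static Func<int> MyFunc1;

// during startup:
MyFunc1 = oldInterfaceInstalled ? (Func<int>)MyFunc1_Old : MyFunc1_New;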
It sounds like you would be best served re-architecting to a modular plugin style interface.
There are a billion and a half examples of this on the web, like this one. In a nutshell though, you use Assembly.LoadFrom() on each DLL in a directory, then cast back to your common base interface.
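A rough sketch of that pattern (the interface name and plugin directory here are illustrative, not from the question):
public interface ICommInterface
{
    int MyFunc1();
}

// at startup, scan a plugins directory for managed wrapper assemblies
foreach (string file in Directory.GetFiles("plugins", "*.dll"))
{
    Assembly asm = Assembly.LoadFrom(file);
    foreach (Type type in asm.GetTypes())
    {
        if (typeof(ICommInterface).IsAssignableFrom(type) && !type.IsAbstract)
        {
            var comm = (ICommInterface)Activator.CreateInstance(type);
            // call comm.MyFunc1() through the common interface
        }
    }
}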
I think I found a workable solution:
C# check that a file destination is valid
Thanks, everyone, for your input!