I've got an unmanaged C++ DLL that I need to call from a Windows Mobile C# app.
I've got a C# wrapper, and it works nicely on the desktop: I can call the DLL functions from a C# desktop program and pass strings around with no problem.
However, when I compile the lib and the wrapper for the mobile platform, I get an error on the DllImport lines saying that CharSet.Ansi is not recognized. The only options I'm allowed to use are CharSet.Auto and CharSet.Unicode.
The problem is that, regardless of this setting, the strings received by the C++ functions are wide-char strings, not the plain char* strings they expect.
We could use wcstombs() to translate all the strings at the beginning of each C++ function, but I'd rather not modify the lib to that extent...
Is there a way to fix the marshalling between C# and C that works with the .NET Compact Framework?
No, there isn't.
Microsoft documentation specifies that:
[...] the .NET Compact Framework only supports Unicode, and consequently only includes the CharSet.Unicode (and CharSet.Auto, which equals Unicode) value, and does not support any of the clauses of the Declare statement. This means that the ExactSpelling property is also not supported.
As a result, if your DLL function expects an ANSI string, you'll need to perform the conversion in the DLL, or convert the string to a byte array using the overloaded GetBytes method of the ASCIIEncoding class before calling the function, since the .NET Compact Framework will always pass a pointer to the Unicode string. [...]
The solution is:
Functions in the DLL
int MARSHALMOBILEDLL_API testString(const char* value);
const char* MARSHALMOBILEDLL_API testReturnString(const char* value);
Wrapper
[DllImport("marshalMobileDll.dll")]
public static extern int testString(byte[] value);
[DllImport("marshalMobileDll.dll")]
public static extern System.IntPtr testReturnString(byte[] value);
Calling Code
string s1 = "1234567";
int v = Wrapper.testString( Encoding.ASCII.GetBytes(s1));
string s2 = "abcdef";
IntPtr ps3 = Wrapper.testReturnString(Encoding.ASCII.GetBytes(s2));
string s3 = IntPtrToString(ps3);
private string IntPtrToString(IntPtr intPtr)
{
    // Read single bytes from the unmanaged buffer until the null
    // terminator is hit, building up a managed string as we go.
    string retVal = "";
    byte b = 0;
    int i = 0;
    while ((b = Marshal.ReadByte(intPtr, i++)) != 0)
    {
        retVal += Convert.ToChar(b);
    }
    return retVal;
}
Windows CE is heavily biased toward Unicode (most Win32 APIs don't even have ANSI equivalents). As such, the CF doesn't really do well with ANSI either and it needs a little "help" in getting it right.
You can tell the marshaler that you want to pass the data as single-byte, null terminated values by using the MarshalAs attribute (the MSDN docs clearly show it is supported in the CF), something along these lines:
[DllImport("mydll.dll", SetLastError = true)]
public static extern void Foo([MarshalAs(UnmanagedType.LPStr)]string myString);
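With that declaration in place, calling it from C# is straightforward. A minimal sketch (Foo and mydll.dll are just the placeholder names from the snippet above):
// The marshaler converts the managed string to a single-byte,
// null-terminated ANSI string before the native call.
Foo("hello");
// Because the import sets SetLastError = true, a Win32 error code
// set by the DLL (if any) can be read afterwards via the Marshal class.
int err = Marshal.GetLastWin32Error();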
I find this marshal compiler useful, even though it is a bit buggy.
Related
I am trying to call a function that converts SQL queries between dialects. I am using an open-source project's DLL for the conversion functions in a C# university project. The problem I am facing is that I am getting an "access violation reading location" error.
I've read some posts here on Stack Overflow suggesting that there might be a bad pointer somewhere, but I cannot find where; my pointers do not seem to be corrupt.
The function for the conversion is this:
int ConvertSql(void *parser, const char *input, int64_t size,
               const char **output, int64_t *out_size, int64_t *lines)
{
if(parser == NULL)
return -1;
SqlParser *sql_parser = (SqlParser*)parser;
// Run conversion
sql_parser->Convert(input, size, output, out_size, lines);
return 0;
}
I am calling the function in C#
char *parentInput;
fixed(char *input = &inputStr.ToCharArray()[0])
{
parentInput = input;
}
char** output = null;
Int64 out_size = 0;
Int64 lines = 0;
Int64 size = inputStr.Length;
Console.WriteLine(new IntPtr(&out_size)+" "+ new IntPtr(&lines)+" "+new IntPtr(&parserObj)+" "+new IntPtr(output));
int result = ConvertSql(&parserObj, inputStr, size, output, &out_size, &lines);
I get my parser object from this code, which works without errors:
IntPtr parserObj = CreateParserObject();
The DllImport declarations for the functions are as follows:
[DllImport(dllName: "PATHTODLLFOLDER\\sqlparser.dll", EntryPoint = "CreateParserObject", CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr CreateParserObject();
[DllImport(dllName: "PATHTODLLFOLDER\\sqlparser.dll", EntryPoint = "ConvertSql", CallingConvention = CallingConvention.Cdecl)]
public unsafe static extern int ConvertSql(void *parser, String input, Int64 size, char **output, Int64 *out_size, Int64 *lines);
In .NET, calling an unmanaged method through P/invoke (which is what happens when you call an extern method) involves various type conversions on the parameters, which is known as "marshalling" and is done automatically by a part of the runtime known as the "marshaller".
In general, it's a terrible idea to marshal pointers. Instead, use the CLR marshaller's ability to convert certain types to pointers for you by changing the signature of your P/invoked method:
// split on multiple lines for readability
[DllImport("PATHTODLLFOLDER\\sqlparser.dll", EntryPoint = "ConvertSql", CallingConvention = CallingConvention.Cdecl)]
public static extern int ConvertSql(
IntPtr parser,
[MarshalAs(UnmanagedType.LPStr)] string input,
long size,
out IntPtr output, // see explanation below
out long out_size,
out long lines);
A few things about the above: I took the liberty of using the C# type aliases (string, long) because that's more idiomatic C#, but it doesn't change the behavior. Also, since there are no pointers anymore, there's no need for unsafe.
First, I declared parser as an IntPtr because those get converted to void* automatically if needed and that's what's already returned by CreateParserObject() anyway.
Second, out parameters can be converted to pointers to uninitialized objects, so by marking both out_size and lines as out, you fix that other problem.
The input is a string, which has a specific format in .NET. Since your function takes a const char*, you need to tell the marshaller how to convert the characters. That's where the MarshalAs attribute comes in: it's used when the default conversion doesn't work for you. UnmanagedType.LPStr means char* in this case, so the string gets converted. The runtime will manage the memory for you.
But here we hit a big snag in the road: the output. When marshalling things, there's always questions about lifetime, or, specifically: who releases the memory? The fact that output is a char** implies that the parser allocates a block of memory and then returns it through that, which means the caller has to release it. From the get go, that reeks of bad C++ design because the caller doesn't know how the memory was allocated. Was it with malloc? new[]? Platform specific APIs like LocalAlloc? Is it a pointer to a block of static memory? Usually these APIs come with documentation telling precisely what to do with the pointer once you're done with it. Good C++ APIs return smart pointers, or ask that the caller pass a block of previously allocated memory that they can then play with.
But, this is the thing you're playing with, so here's how to make it work. First, you'd think that you could declare output as [MarshalAs(UnmanagedType.LPStr)] out string: the marshaller will copy the characters into a managed string and return it... but then the native string's (on the C++ side) memory will leak because the runtime doesn't know how the string was allocated, so it prefers to not do anything about it. Also, this assumes the string is null-terminated, which might not always be the case.
So, another option would be to instead declare output as out IntPtr. Then you can use Marshal.PtrToStringAnsi to convert your pointer into a string, and then release it... but you'll need to know how it was allocated first.
Putting it all together:
var parserObj = CreateParserObject();
var output = IntPtr.Zero;
try
{
long lines;
long out_size;
int result = ConvertSql(parserObj, inputStr, inputStr.Length, out output, out out_size, out lines);
var outputStr = Marshal.PtrToStringAnsi(output, (int)out_size);
// do what you want with outputStr here
}
finally
{
if (output != IntPtr.Zero)
{
// release output here
}
}
Oh, also, one final thought: whatever is returned by CreateParserObject() will probably have to be freed at some point. You're probably going to need another function for that, something like:
[DllImport(/* ... */)]
public static extern void DestroyParserObject(IntPtr parserObject);
It might even already exist in your DLL.
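If such a function does exist, you can tie the parser's lifetime to the conversion with a try/finally, along these lines (DestroyParserObject is the hypothetical name used above):
IntPtr parserObj = CreateParserObject();
try
{
    // ... call ConvertSql and consume the result here ...
}
finally
{
    if (parserObj != IntPtr.Zero)
    {
        DestroyParserObject(parserObj); // hypothetical cleanup export
    }
}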
Good luck!
I am trying to call a DLL written in Rust from a C# program. The DLL has two simple functions that take strings (in different ways) and print to the console.
Rust DLL code
#![crate_type = "lib"]
extern crate libc;
use libc::{c_char};
use std::ffi::CStr;
#[no_mangle]
pub extern fn printc(s: *const c_char){
let c_str : &CStr = unsafe {
assert!(!s.is_null());
CStr::from_ptr(s)
};
println!("{:?}", c_str.to_bytes().len()); //prints "1" if unicode
let r_str = std::str::from_utf8(c_str.to_bytes()).unwrap();
println!("{:?}", r_str);
}
#[no_mangle]
pub extern fn print2(string: String) {
println!("{:?}", string)
}
C# console program code
[DllImport("lib.dll", CharSet = CharSet.Unicode, CallingConvention = CallingConvention.Cdecl)]
static extern void print2(ref string str);
[DllImport("lib.dll", CallingConvention = CallingConvention.Cdecl)]
static extern void printc(string str);
static void Main(string[] args)
{
try
{
var graw = "yeyeye";
printc(graw);
print2(ref graw);
}
catch (Exception ex)
{
Console.WriteLine("calamity!, {0}", ex.Message);
}
Console.ReadLine();
}
For the print2 function, it keeps printing garbage on the screen until it causes an AccessViolationException.
The second function, printc, does print the string, but only if CharSet.Unicode is not set. If it is set, it only prints the first char; hence the println!("{:?}", c_str.to_bytes().len()); prints 1.
I believe the CStr::from_ptr function does not support Unicode, which is why it returns only the first char of the string.
Any idea how to pass Unicode strings as parameters to Rust DLLs? Is it possible to make things simpler, as in the print2 function?
If you check the documentation on CharSet, you'll see that CharSet.Unicode tells .NET to marshal strings as UTF-16 (i.e. two bytes per code point). Thus, .NET is trying to pass printc what should be a *const u16, not a *const libc::c_char. When CStr goes to compute the length of the string, what it sees is the following:
b"y\0e\0y\0e\0y\0e\0"
That is, it sees one non-zero byte, then a null byte, so it stops; hence it reports the length as 1.
Rust has no standard support for UTF-16 strings, but if you're working on Windows, there are some conversion methods: search the docs for OsStrExt and OsStringExt. Note that you must use the docs installed with the compiler; the ones online won't include them.
Sadly, there's nothing for dealing directly with null-terminated UTF-16 strings. You'll need to write some unsafe code to turn a *const u16 into a &[u16] that you can pass to OsStringExt::from_wide.
Now, Rust does use Unicode, but it uses UTF-8. Sadly, there is no direct way to get .NET to marshal a string as UTF-8. Using any other encoding would appear to lose information, so you either have to explicitly deal with UTF-16 on the Rust side, or explicitly deal with UTF-8 on the C# side.
It's much simpler to re-encode the string as UTF-8 in C#. You can exploit the fact that .NET will marshal an array as a raw pointer to the first element (just like C) and pass a null-terminated UTF-8 string.
First, a static method for taking a .NET string and producing a UTF-8 string stored in a byte array:
static byte[] NullTerminatedUTF8bytes(string str)
{
    // Append the null terminator, then encode the whole thing as UTF-8.
    return Encoding.UTF8.GetBytes(str + "\0");
}
Then declare the signature of the Rust function like this:
[DllImport(dllname, CallingConvention = CallingConvention.Cdecl)]
static extern void printc([In] byte[] str);
Finally, call it like this:
printc(NullTerminatedUTF8bytes(str));
For bonus points, you can rework printc to instead take a *const u8 and a u32, passing the re-encoded string plus its length; then you don't need the null terminator and can reconstruct the string using the std::slice::from_raw_parts function (but that's starting to go beyond the original question).
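On the C# side, that variant would simply mean passing the byte array and its length explicitly; a rough sketch, assuming printc is reworked as described (no null terminator needed, names mirror the earlier snippets):
[DllImport(dllname, CallingConvention = CallingConvention.Cdecl)]
static extern void printc([In] byte[] str, uint len);

byte[] utf8 = Encoding.UTF8.GetBytes(str); // no trailing '\0' required
printc(utf8, (uint)utf8.Length);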
As for print2, that one is just unworkable. .NET knows nothing about Rust's String type, and it is in no way compatible with .NET strings. More than that, String doesn't even have a guaranteed layout, so binding to it safely is more or less not possible.
All that is a very long-winded way of saying: don't use String, or any other non-FFI-safe type, in cross-language functions, ever. If your intention here was to pass an "owned" string into Rust... I don't know if it's even possible to do in concert with .NET.
Aside: "FFI-safe" in Rust essentially boils down to: the type is either a built-in fixed-size type (i.e. not usize/isize), or a user-defined type with #[repr(C)] attached to it. Sadly, the "FFI-safe"-ness of a type isn't included in the documentation.
I have the following struct defined in C++:
struct GraphicsAdapterDesc {
// ... Just some constructors / operators / destructor here
DEFINE_DEFAULT_CONSTRUCTOR(GraphicsAdapterDesc);
DEFINE_DEFAULT_DESTRUCTOR(GraphicsAdapterDesc);
ALLOW_COPY_ASSIGN_MOVE(GraphicsAdapterDesc);
std::wstring AdapterName;
int32_t AdapterNum;
std::wstring HardwareHash;
int64_t DedicatedVMEM;
int64_t DedicatedSMEM;
int64_t SharedSMEM;
int32_t NumOutputs;
};
In C#, I have a 'mirror' struct declared thusly:
[StructLayout(LayoutKind.Sequential)]
public struct GraphicsAdapterDesc {
string AdapterName;
int AdapterNum;
string HardwareHash;
long DedicatedVMEM;
long DedicatedSMEM;
long SharedSMEM;
int NumOutputs;
};
I've tried to be really careful about matching up the widths of the variables (although I'm a bit unsure about what to do with the strings exactly).
Anyway, I have the following exported C method:
extern "C" __declspec(dllexport) bool GetGraphicsAdapter(int32_t adapterIndex, GraphicsAdapterDesc& outAdapterDesc) {
outAdapterDesc = RENDER_COMPONENT.GetGraphicsAdapter(adapterIndex);
return true;
}
And, the following extern method in my C# app:
[DllImport(InteropUtils.RUNTIME_DLL, EntryPoint = "GetGraphicsAdapter", CallingConvention = CallingConvention.Cdecl)]
internal static extern bool _GetGraphicsAdapter(int adapterIndex, out GraphicsAdapterDesc adapterDesc);
However, this doesn't work right when I call it. I get a different result depending on whether I compile in x86 or x64 mode (both the C++ DLL and the C# app are built for the same architecture):
In x86 mode, the call returns, but the struct has 'nonsense' values in, and the strings are all null,
In x64 mode, the call throws a NullPointerException.
My expectation is that I'm doing something wrong marshalling the strings, and that I need to specify 'wide-mode' for the characters, but I don't know how (or if that's even the right option).
Thank you in advance.
C++ types are not compatible with C# unless they're wrapped in managed C++, and you're using std::wstring, which cannot be marshaled into .NET.
To interop successfully, you'll either need to use a wchar_t[] or a wchar_t* and tell C# how to marshal it; see the sketch below.
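For example, a minimal sketch of the C# mirror, assuming the C++ struct is reworked to use fixed-size wchar_t arrays instead of std::wstring (the 128-character buffer size is an assumption, not something from the original code):
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
public struct GraphicsAdapterDesc
{
    // Corresponds to a wchar_t AdapterName[128] field on the C++ side.
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 128)]
    public string AdapterName;
    public int AdapterNum;
    // Corresponds to a wchar_t HardwareHash[128] field on the C++ side.
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 128)]
    public string HardwareHash;
    public long DedicatedVMEM;
    public long DedicatedSMEM;
    public long SharedSMEM;
    public int NumOutputs;
}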
I don't know what your macros are doing, but this will only work if your C++ type is POD. C++11 has an expanded sense of POD, but I don't think you meet the expanded criteria anyway; otherwise you can't guarantee the layout. If you want to export your C++ classes to C#, I would suggest you use C++/CLI. Also, you have wstring members in your struct, which are definitely not POD. When you use DllImport, think C constructs only or you are going to have a bad time.
I am trying to call a method in a Delphi DLL with the following signature:
function SMap4Ovr(const OverFileName : ShortString ;
const Aclay : Integer ;
const Acarbon : Double ;
out errstr : ShortString): WordBool;
I am using the following import in C#:
[DllImport("SMap.dll", CallingConvention = CallingConvention.StdCall, CharSet = CharSet.Ansi)]
public static extern bool SMap4Ovr(
string OverFileName,
int Aclay,
double Acarbon,
out string errstr
);
But I am getting an AccessViolationException.
I seem to be able to call a couple of simpler methods in the DLL that have string parameters, but not ones that also take ints or doubles.
I have also tried CallingConvention = CallingConvention.Cdecl, but this gives me the same error.
When writing interop code it is critical that both sides of the interface match in every way. Here are the main issues that you must make agree on both sides:
Calling conventions.
Parameters lists.
Parameter types and semantics.
The first observation is that your calling conventions do not match. You have register on the Delphi side and stdcall on the C# side. The Delphi register convention is private to Delphi and so you should use stdcall.
Secondly, your string parameter types do not match. The Delphi shortstring is a data type that became legacy when Delphi 2 was released and should be considered a relic from the previous century. It was never a valid interop type, and there's nothing in the p/invoke framework that can be used to match it. Whilst you could attempt to do the marshalling by hand, this is a lot of work that is simply not needed when there are simple solutions available. You should try to forget all about shortstring.
You need to use a string type that both sides of the interface can work with. You could use null-terminated C strings, but a better and simpler choice is the COM BSTR which is WideString in Delphi.
So, the final result is as follows.
Delphi
function SMap4Ovr(
OverFileName: WideString;
Aclay: Integer;
Acarbon: Double;
out errstr: WideString
): WordBool; stdcall;
C#
[DllImport("SMap.dll")]
public static extern bool SMap4Ovr(
[MarshalAs(UnmanagedType.BStr)]
string OverFileName,
int Aclay,
double Acarbon,
[MarshalAs(UnmanagedType.BStr)]
out string errstr
);
I did not bother specifying the calling convention on the DllImport since the default is stdcall. If you prefer you can be explicit about this.
Be careful when using WideString that you don't attempt to use it as a return value. Because Delphi uses non-standard semantics for return values, you can only use simple types that fit into a register as return values.
The default calling convention in Delphi is register, not stdcall. The calling-convention details also show that Microsoft's fastcall is not the same as Borland's fastcall (register).
Also, the C# string type differs from Delphi's ShortString (which internally consists of a one-byte length followed by the string body).
I need to use an unmanaged COM DLL in a C# program. The DLL contains a function, say:
Open(char *name);
But when imported into C# (Project -> Add Reference), it is available as:
mydll.Open(ref byte name)
How can I pass a string to this function?
When I do:
byte[] name = new byte[32];
mydll.Open(ref name);
I get the compilation error "Cannot convert ref byte[] to ref byte".
If you mean for it to be a string, then in your IDL file you have to specify that this pointer represents a string. See this article for information on the [string] attribute:
http://msdn.microsoft.com/en-us/library/d9a4wd1h%28v=VS.80%29.aspx
If you want to be CLS-compliant (and interoperate with scripting languages), you might want to look into using BSTR instead of char* for passing strings. This way you'll get Unicode support too.
Unless you give COM the hint that this is a string, you will have problems whenever COM has to marshal the parameters (i.e. across apartment or process boundaries).
This article may also give you a good starting point on C++ / C# / COM goodies:
COM Interop Part 1: C# Client Tutorial
Maybe you can do this...
byte[] bytes = Encoding.ASCII.GetBytes(myString);
You might try decorating that "name" variable with:
[System.Runtime.InteropServices.MarshalAs(System.Runtime.InteropServices.UnmanagedType.LPStr)]
That's a single byte, and I think it is compatible with a single char. If not, the answer is likely going to be using MarshalAs to make that variable look like the right type.
Keep in mind you could lose data if the array is not properly terminated. Anyhow, I would try passing in a reference to the first element, name[0]:
mydll.Open(ref name[0]);
I'm not sure how the interop will marshal this but it's worth a try.
The import is not correct. You can import it manually:
[DllImport("<Your COM Dll>")]
private static extern <Return Type> <"Function Name">();
Then, in your main method, or in the method where you initialize your dll object, you need:
[DllImport("kernel32.dll", CharSet = CharSet.Auto)]
private static extern IntPtr LoadLibrary(string lpFileName);
public MyDll()
{
Environment.CurrentDirectory = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
string dllPath = Environment.CurrentDirectory + @"<Location of Dll you are importing from>";
LoadLibrary(dllPath);
}
For example, check out the following COM Dll:
GOIO_DLL_INTERFACE_DECL gtype_int32 GoIO_GetNthAvailableDeviceName(
char *pBuf,
gtype_int32 bufSize,
gtype_int32 vendorId,
gtype_int32 productId,
gtype_int32 N);
I imported this DLL as follows:
[DllImport("GoIO_DLL.dll")]
private static extern int GoIO_GetNthAvailableDeviceName(
byte[] pBuf,
int bufSize,
int vendorId,
int productId,
int N);
As you can see, the char pointer becomes a byte[], just like you tried. There is no need for the ref keyword.
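For illustration, a call through that import might look roughly like this (the buffer size and the vendor/product/N arguments are placeholders, not values taken from the GoIO documentation):
// Allocate a buffer for the DLL to fill, then convert the
// null-terminated ANSI result back into a managed string.
byte[] buffer = new byte[260]; // buffer size is an assumption
int rc = GoIO_GetNthAvailableDeviceName(buffer, buffer.Length, 0, 0, 0);
string deviceName = Encoding.ASCII.GetString(buffer);
int nul = deviceName.IndexOf('\0');
if (nul >= 0)
    deviceName = deviceName.Substring(0, nul);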