Passing String from Native C++ DLL to C# App

I have written a DLL in C++. One of the functions writes to a character array.
C++ Function
EXPORT int xmain(int argc, char argv[], char argv2[])
{
char pTypeName[4096];
...
//Other code runs; pTypeName ends up populated with "Portable Network Graphics"
//This code verifies that pTypeName is populated with what I think it is:
char szBuff[64];
sprintf(szBuff, pTypeName, 0);
MessageBoxA(NULL, szBuff, szBuff, MB_OK);
//The caption and title are "Portable Network Graphics"
...
//Here, I attempt to copy the value in pTypeName to parameter 3.
sprintf(argv2, szBuff, 0);
return ret;
}
C# Import
//I believe I have to use CharSet.Ansi because the C++ code uses char[].
[DllImport("FirstDll.dll", CharSet=CharSet.Ansi)]
public static extern int xmain(int argc, string argv, ref string zzz);
C# Function
private void button2_Click(object sender, EventArgs e)
{
string zzz = "";
int xxx = xmain(2, @"C:\hhh.bmp", ref zzz);
MessageBox.Show(zzz);
//The message box displays
//MessageBox.Show displays "IstuÈst¼ÓstÄstlÄstwÄstiÑstõÖstwÍst\
// aÖst[ÖstÃÏst¯ÄstÐstòÄstŽÐstÅstpÅstOleMainThreadWndClass"
}
I have attempted to pass a parameter from C# by reference and have the C++ DLL populate the parameter. Even though I have verified that the value is correct in the DLL, gibberish gets passed to the C# application.
What can I do to write the correct string value to the C# string?

Use a StringBuilder to pass a character array that native code can fill in (see Fixed-Length String Buffers).
Declare the function:
[DllImport("FirstDll.dll", CharSet=CharSet.Ansi)]
public static extern int xmain(int argc, string argv, StringBuilder argv2);
Use it:
// allocate a StringBuilder with enough space; if it is too small,
// the native code will corrupt memory
StringBuilder sb = new StringBuilder(4096);
xmain(2, @"C:\hhh.bmp", sb);
string argv2 = sb.ToString();

Give some additional information to the DllImport attribute. Look at the following example of my own:
[DllImport("tcpipNexIbnk.dll", EntryPoint = "SendData", CallingConvention = CallingConvention.Cdecl)]
public static extern int Send([MarshalAs(UnmanagedType.LPWStr)]string message);
Notice two things. First, the CallingConvention parameter:
CallingConvention = CallingConvention.Cdecl)
Use that as it is.
Second, just before the C# string type, you can try the different unmanaged types with the MarshalAs attribute, which will marshal your C# string parameter to the native string type your C++ program expects:
public static extern int Send([MarshalAs(UnmanagedType.LPWStr)]string message);
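For example, if the native export took a narrow char* instead of a wide string, LPStr would be the matching choice. A minimal sketch only: the SendAnsi name and the native signature in the comment are assumptions for illustration, not the actual tcpipNexIbnk.dll API.
// Assumed native signature: extern "C" int SendData(const char* message);
[DllImport("tcpipNexIbnk.dll", EntryPoint = "SendData", CallingConvention = CallingConvention.Cdecl)]
public static extern int SendAnsi([MarshalAs(UnmanagedType.LPStr)] string message);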
Hope it helps.

Related

How to pass a string or string equivalent between C# and C++ that is in a struct? [duplicate]

I thought the problem was inside my C++ function, but I tried this:
C++ Function in C++ dll:
bool __declspec( dllexport ) OpenA(std::string file)
{
return true;
}
C# code:
[DllImport("pk2.dll")]
public static extern bool OpenA(string path);
if (OpenA(@"E:\asdasd\"))
I get an exception that the memory is corrupt. Why?
If I remove the std::string parameter it works great, but with std::string it doesn't work.
std::string and C# string are not compatible with each other. As far as I know, the C# string corresponds to passing char* or wchar_t* in C++ as far as interop is concerned.
One of the reasons for this is that there can be many different implementations of std::string, and C# can't assume that you're using any particular one.
Try something like this:
bool __declspec( dllexport ) OpenA(const TCHAR* pFile)
{
std::string filename(pFile);
...
return true;
}
You should also specify the appropriate character set (unicode/ansi) in your DllImport attribute.
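For instance, the matching C# import for the wrapper above could look like the sketch below; CharSet.Ansi assumes an ANSI/MBCS build where TCHAR is char, which is what the std::string construction above requires (use CharSet.Unicode if the DLL is built for wide characters and converts accordingly).
// Sketch only: CharSet.Ansi matches a char-based TCHAR build.
[DllImport("pk2.dll", CharSet = CharSet.Ansi)]
public static extern bool OpenA(string path);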
As an aside, unrelated to your marshalling problem, one would normally pass a std::string as a const reference: const std::string& filename.
It's not possible to marshal a C++ std::string in the way you are attempting. What you really need to do here is write a wrapper function which uses a plain old const char* and converts to a std::string under the hood.
C++
extern "C" __declspec(dllexport) void OpenWrapper(const char* pName)
{
    std::string name = pName;
    OpenA(name);
}
C#
[DllImport("pk2.dll")]
public static extern void OpenWrapper( [In] string name);
I know this topic is a tad old but for future googlers, this should also work (without using char* in C++)
C#:
[DllImport("pk2.dll")]
public static extern bool OpenA([In, MarshalAs(UnmanagedType.LPStr)] string path);
C++:
bool __declspec( dllexport ) OpenA(std::string file);
std::wstring and System.String can be made compatible through the conversion below:
C++ :
bool func(std::wstring str, int number)
{
BSTR tmp_str = SysAllocStringLen(str.c_str(), str.size());
VARIANT_BOOL ret = VARIANT_FALSE;
// call c# COM DLL
ptr->my_com_function(tmp_str, number, &ret);
SysFreeString(tmp_str);
return (ret != VARIANT_FALSE) ? true : false;
}
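For reference, a rough sketch of what the C# COM side of that call might look like. The interface name here is an assumption; the relevant point is that a C# string maps to BSTR and a bool return is exposed to COM as a VARIANT_BOOL* retval, which is what the C++ code above calls.
using System.Runtime.InteropServices;

[ComVisible(true)]
public interface IMyComObject
{
    // Exposed to COM roughly as:
    // HRESULT my_com_function(BSTR str, int number, VARIANT_BOOL* ret);
    bool my_com_function(string str, int number);
}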

String to Char* Without memory leaks

I have tried many ways to convert a string into a char*, but I always got two errors when I imported my DLL into a C# project.
The main function of my C++ DLL is like this:
//Example
extern "C" __declspec (dllexport) void Conv(std::string str)
{
FFileList file_list;
char temp_path[1024];
sprintf(temp_path,"%s*",arg_path);
GetFindFileListWin(temp_path,".mrs",file_list);
}
So, I need to convert "str" to a char* because GetFindFileListWin is declared like this:
GetFindFileListWin(char* path,char* ext,FFileList& pList);
and pass it to arg_path.
I tried to do this:
char* arg_path = new char[str.length()+1];
strcpy(arg_path, str.c_str());
sprintf(temp_path,"%s*",arg_path);
delete[] arg_path;
but when I run Conv() in my C# program, it says "Windows has triggered a breakpoint in Program.exe.
This may be due to a corruption of the heap, which indicates a bug in Program.exe or any of the DLLs it has loaded." (The same happens if I use _strdup.)
So, I tried in other way:
std::vector<char> Chr(str.size() + 1);
std::copy(str.begin(), str.end(), Chr.begin());
char *arg_path = &Chr[0];
sprintf(temp_path,"%s*",arg_path);
And I got an "Attempted to read or write protected memory" message.
My C# program does this:
[DllImport("Mrs.dll")]
public static extern void Conv(string str);
public void Convert(TextBox Tx)
{
Conv(Tx.Text);
}
I hope someone can help me solve this error.
Thanks in advance.
I suspect the asterisk in the sprintf statements could be causing the problem. sprintf would expect a length parameter for the asterisk in the format string.
sprintf(temp_path,"%s*",arg_path);
^
std::string is a C++ object, so it will not work across the DLL boundary.
What you should do, as mentioned above, is something like this:
extern "C" __declspec (dllexport) void Conv(const char* str)
{
//do whatever
}
If your p/invoke signature is
[DllImport("Mrs.dll", CharSet = CharSet.Ansi, CallingConvention = CallingConvention.Cdecl)]
public static extern void Conv(string str);
then your C++ declaration will be simply
extern "C" __declspec(dllexport) void Conv(const char* str);
and you won't need any std::string at all.
.NET and p/invoke do not know anything about std::string.

C# dll call not passing char* to C++ MFC regular dll

I have a C++ MFC regular DLL I am calling with the following:
public static class Access3rdPartyDLL
{
public static string FilePath;
[DllImport("3rdparty.dll")]
// I have also tried LPWStr
public static extern long Download([MarshalAs(UnmanagedType.LPStr)]string sDownloadFile,
int iDeviceNum
...);
public static long DownloadToDevice()
{
long result;
string FilePath = "C:\\myfile.txt"
result = Download(FilePath, 1, ...);
// check if success or error
if(result > 0)
...
}
}
I get an error back from the DLL saying "File: 'C:\myfile.txt' not found." But it's there...
I have also tried using StringBuilder, but this also fails.
Could this be a problem with the DLL, or am I doing something wrong?
I found the current code here: SO: equivalent char* in C#
EDIT: I have done this in C++ before and this code works:
extern "C" __declspec(dllimport) HRESULT __stdcall Download(char* sDownloadFile, int ...
which I call with:
HRESULT result = Download(file_for_download, 1, .. // where file_for_download is a char*
The only thing wrong with the P/invoke is that you are using C# long which is 64 bits, but an HRESULT is only 32 bits.
You have matching calling conventions, default marshalling for managed string is char* on the unmanaged side.
Mismatching return value size would not explain why your C# code receives a string message File: 'C:\myfile.txt' not found so your main problem most likely lies in the code that you haven't shown us.
I don't see any reason why the following wouldn't work in this simple scenario:
[DllImport( "3rdparty.dll", CharSet = CharSet.Ansi )]
static extern int Download(string sDownloadFile, int iDeviceNum, ...)
int result = Download("C:\\myfile.txt", 1, ...);

Marshalling string from c# to c++

I am new to the Microsoft world.
I have had a lot of problems trying to pass a simple string from C# to a C++ DLL.
I have read a lot of posts and documentation about this, but the problem remains.
C++ code
extern "C" __declspec(dllexport) int Init( long l , char* url );
C# code
[DllImport("MCRenderer.dll", CharSet = CharSet.Ansi, SetLastError = true, ExactSpelling = false)]
public static extern int Init(long a, StringBuilder url);
Init(hndl.ToInt64(), str );
What happens is that the long value is passed correctly, while the string parameter is
0x00000000 <Bad Ptr>
Can you help me? I am really confused.
Thanks!!
AG
You should pass a string: url should be of type string and not StringBuilder.
It's because you're marshalling incorrectly: long and int in C++ are both (usually) int in C#.
From MSDN:
The only caveat is that the StringBuilder must be allocated enough space for the return value, or the text will overflow, causing an exception to be thrown by P/Invoke
Also from Marshaling between Managed and Unmanaged Code
Don't pass StringBuilder by reference (using out or ref). Otherwise, the CLR will expect the signature of this argument to be wchar_t ** instead of wchar_t *, and it won't be able to pin StringBuilder's internal buffer. Performance will be significantly degraded.
Use StringBuilder when the unmanaged code is using Unicode. Otherwise, the CLR will have to make a copy of the string and convert it between Unicode and ANSI, thus degrading performance. Usually you should marshal StringBuilder as LPARRAY of Unicode characters or as LPWSTR.
Always specify the capacity of StringBuilder in advance and make sure the capacity is big enough to hold the buffer. The best practice on the unmanaged code side is to accept the size of the string buffer as an argument to avoid buffer overruns. In COM, you can also use size_is in IDL to specify the size.
So in your example, you need to pass the native buffer size to the StringBuilder constructor, like this: StringBuilder str = new StringBuilder(SIZE_OF_NATIVE_STRING);
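Putting those points together, a corrected import and call for the question's Init might look like this. A sketch only: the 260-character capacity is an assumption, it presumes url is an output buffer the DLL writes into, and hndl is the IntPtr from the question.
[DllImport("MCRenderer.dll", CharSet = CharSet.Ansi, SetLastError = true)]
public static extern int Init(int l, StringBuilder url);  // C++ 'long' is 32-bit on Windows, so use int

StringBuilder url = new StringBuilder(260);   // pre-size to whatever the DLL actually needs
int result = Init(hndl.ToInt32(), url);
string value = url.ToString();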
Try using LPCSTR in the DLL function parameter:
extern "C" __declspec(dllexport) int Init( long l , LPCSTR url );
Here is a good example that I found on how to do this.
C++:
extern "C" __declspec(dllexport) void doSomething(LPCTSTR asString)
{
std::cout << "here" << (wchar_t)asString << endl;
system ("pause");
}
C#:
class Program
{
static void Main(string[] args)
{
String myString = "String in C#";
doSomething(myString);
}
private const String path = @"C:\testdll2.dll"; //Make sure that the DLL is in this location
[DllImport(path)]
private static extern void doSomething(string asString);
}

Calling DLL function with char* param from C#?

I have a C++ dll that I need to call from C#. One of the functions in the dll requires a char* for an input parameter, and another function uses a char* as an output parameter.
What is the proper way to call these from C#?
string should work if the parameter is read-only; if the method modifies the string, you should use StringBuilder instead.
Example from reference below:
[DllImport ("libc.so")]
private static extern void strncpy (StringBuilder dest,
string src, uint n);
private static void UseStrncpy ()
{
StringBuilder sb = new StringBuilder (256);
strncpy (sb, "this is the source string", sb.Capacity);
Console.WriteLine (sb.ToString());
}
If you don't know how p/invoke marshaling works, you could read http://www.mono-project.com/Interop_with_Native_Libraries
If you are only concerned with strings, read only the section: http://www.mono-project.com/Interop_with_Native_Libraries#Strings
Just using strings will work fine for input parameters, though you can control details about the string with the MarshalAs attribute. E.g.
[DllImport("somedll.dll", CharSet = CharSet.Unicode)]
static extern void Func([MarshalAs(UnmanagedType.LPWStr)] string wideString);
As for returning char* parameters, that's a little more complex since object ownership is involved. If you can change the C++ DLL, you can use CoTaskMemAlloc, with something like:
void OutputString(char*& output)
{
    const char* toCopy = "hello...";
    size_t bufferSize = strlen(toCopy) + 1;   // include the terminating null
    LPVOID mem = CoTaskMemAlloc(bufferSize);
    memcpy(mem, toCopy, bufferSize);
    output = static_cast<char*>(mem);
}
The C# side then just uses an 'out string' parameter, and the garbage collector can pick up the ownership of the string.
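A minimal sketch of that C# side (the DLL name is a placeholder): CharSet.Ansi matches the char* buffer, and the interop marshaler frees the CoTaskMemAlloc'd memory after copying it into the managed string.
[DllImport("somedll.dll", CharSet = CharSet.Ansi)]
static extern void OutputString(out string output);

OutputString(out string result);
Console.WriteLine(result);   // managed copy; the native buffer has already been freed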
Another way of doing it would be to use a StringBuilder, but then you need to know how big the string will be before you actually call the function.
Not sure this works, but have you tried
StringObject.ToCharArray();
Not sure about initialising the string from a char* though. Maybe just assign it to a string object; it's worth a try.
Have you tried StringBuilder? I found this in a Google search:
[DllImport("advapi32.dll")]
public static extern bool GetUserName(StringBuilder lpBuffer, ref int nSize);
If you post the call you're making we can help you assemble it.
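For example, a call to that import might look like this (a small sketch; the 256-character starting capacity is an assumption):
StringBuilder buffer = new StringBuilder(256);
int size = buffer.Capacity;
if (GetUserName(buffer, ref size))
    Console.WriteLine(buffer.ToString());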
If the DLL function is expecting an allocated buffer of char* (not a wide/multibyte buffer) then the following will work:
[DllImport("somedll.dll", CharSet = CharSet.Ansi)]
static extern void TheFunc(byte[] someBuffer, int someSize);
Here a buffer allocated in C# is passed to TheFunc, which fills it with a string of characters (of type char). Bytes aren't "interpreted" by C#; they are treated as 8-bit integers, which makes them perfect for holding 8-bit characters.
An example code snippet would therefore be:
byte[] mybuffer;
int bufSize;
bufSize = 2048;
mybuffer = new byte[bufSize];
TheFunc(mybuffer, bufSize);
string value = "";
for (int ix = 0; ix < bufSize && mybuffer[ix] != 0; ix++)
    value += (char) mybuffer[ix];
DoSomethingWithTheReturnedString(value);
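As an alternative to the character-by-character loop, the same conversion can be done with Encoding (a sketch using the same TheFunc import as above):
using System.Text;

byte[] buffer = new byte[2048];
TheFunc(buffer, buffer.Length);
// Decode up to the first NUL terminator; ASCII here is an assumption,
// use Encoding.Default for the system ANSI code page instead.
int len = Array.IndexOf(buffer, (byte)0);
if (len < 0) len = buffer.Length;
string value = Encoding.ASCII.GetString(buffer, 0, len);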
