I would like to know how to marshal a C# string to a native C++ char*. I have been trying, but nothing seems to work. Thanks in advance.
Remember that a C++ char is actually a byte, so you need to pass the string as a byte[] using something like:
string str; // Contains string to pass to C++ DLL
byte[] bytes = Encoding.UTF8.GetBytes(str);
MyFun(bytes); // Call the C++ function with the string
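If it helps, here is a minimal sketch of what the full P/Invoke declaration and call might look like. The DLL name, the calling convention, and the native signature of MyFun are assumptions for illustration; note that Encoding.GetBytes does not append a null terminator, so one is added explicitly.

using System.Runtime.InteropServices;
using System.Text;

class NativeStrings
{
    // Assumed native signature: void MyFun(const char* text);
    [DllImport("mylib.dll", CallingConvention = CallingConvention.Cdecl)]
    static extern void MyFun(byte[] text);

    static void CallMyFun(string str)
    {
        // Append '\0' so the native side receives a properly terminated C string
        byte[] bytes = Encoding.UTF8.GetBytes(str + "\0");
        MyFun(bytes);
    }
}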
See also Pass C# string to C++ and pass C++ result (string, char*.. whatever) to C# for a different perspective.
I am trying to separate an encryption function from our legacy code into a DLL which I can call from C#, but I am having trouble getting it to work: I keep getting access violations when calling the DLL.
I am not sure where the access violation happens, because Delphi has a hard time hitting my breakpoints when the DLL is attached to another process.
I got it to work yesterday using David Heffernan's answer here: Returning a string from delphi dll to C# caller in 64 bit
But my success was short-lived: I changed the string parameters to regular Delphi strings, saw that it didn't work, and changed them back to AnsiString (our encryption routine expects Ansi). Since I changed these parameter types, I have not been able to get it to work again.
Here is my Delphi Code:
procedure Encrypt(const Source: AnsiString; const Key: AnsiString; var OutPut: PAnsiChar; const OutputLength: Integer);
var
  EncryptedString, EncodedString: AnsiString;
begin
  EncryptedString := Crypt(Source, Key);
  EncodedString := Encode(EncryptedString);
  if Length(EncodedString) <= OutputLength then
    System.AnsiStrings.StrPCopy(OutPut, EncodedString);
end;
exports
Encrypt;
My C# caller:
[DllImport("AsmEncrypt.dll", CharSet = CharSet.Ansi)]
public static extern void Encrypt(string password, string key, StringBuilder output, int outputlength);
// using like this:
Encrypt(credentials.Password, myKey, str, str.Capacity);
My best guess right now is that I've goofed some of the arguments to the DLL, since it seems to crash before it reaches an OutputDebugStr() call I put on the first line of Encrypt().
All help will be greatly appreciated.
Change the Delphi function to
procedure Encrypt(Source, Key, OutPut: PAnsiChar; OutputLength: Integer); stdcall;
in order to make this code work.
You should probably also make the length argument IN/OUT so that the caller can resize the string builder object once the call returns. That would also let the callee signal errors to the caller; the lack of error reporting is another flaw in your current design.
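On the C# side, a matching declaration might look something like the sketch below. DllImport defaults to stdcall on Windows, so the explicit calling convention is only for clarity, and the ref length parameter reflects the IN/OUT suggestion above rather than the code in the question; the capacity of 512 is an arbitrary example.

[DllImport("AsmEncrypt.dll", CharSet = CharSet.Ansi, CallingConvention = CallingConvention.StdCall)]
public static extern void Encrypt(string password, string key, StringBuilder output, ref int outputLength);

// Hypothetical usage: pass the capacity in, read the actual length back
var output = new StringBuilder(512);
int length = output.Capacity;
Encrypt(credentials.Password, myKey, output, ref length);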
I must also say that using AnsiString as a byte array is a recipe for failure. It's high time you started doing encryption right. If you have text, encode it as a byte array with a specific encoding (usually UTF-8), then encrypt that byte array to another byte array.
From this docs page:
The AnsiString structure contains a 32-bit length indicator, a 32-bit reference count, a 16-bit data length indicating the number of bytes per character, and a 16-bit code page.
So an AnsiString isn't simply a pointer to an array of characters -- it's a pointer to a special structure which encodes a bunch of information.
However, .NET's P/Invoke machinery is going to pass a pointer to an array of characters. Delphi is going to try and interpret that as a pointer to its special AnsiString structure, and things aren't going to go well.
I think you're going to have a hard time using AnsiString in interop. You're better off choosing a string type which both .NET and Delphi know about. If you then need to convert that to AnsiString, do that in Delphi.
I have a C++ function that I need to invoke from my C# application.
The C++ prototype:
int ApplibUsbSimple_Login(UINT8 *buff)
I use C# to invoke it:
[DllImport("test.dll", EntryPoint = "login")]
public static extern int Login(????? buff)
I have searched for the answer on Google and Stack Overflow, but I could not find it.
How should I replace the ????? with the correct type?
As others already noted, if the UINT8 type used in your native function represents an 8-bit byte, you can map it to the byte type in C#.
Moreover, according to this MSDN doc page, if you take a look at the C-Style Arrays section, you can use this C# code for your byte array parameter:
[MarshalAs(UnmanagedType.LPArray)] byte[] buff
In addition, there are a few questions for you: How can the native C-interface function know the size of the input array? Is this array 0-terminated? Is there another parameter in that function that specifies the size of the array in bytes? Is the size of the array fixed and specified as part of the function documentation?
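Putting that together, a declaration along these lines might work as a sketch: the entry point is taken from the prototype above rather than the "login" in your snippet, and the buffer size below is a placeholder that depends on the answers to the questions just mentioned.

using System.Runtime.InteropServices;

class UsbNative
{
    // Maps the native: int ApplibUsbSimple_Login(UINT8 *buff)
    [DllImport("test.dll", EntryPoint = "ApplibUsbSimple_Login")]
    public static extern int Login([MarshalAs(UnmanagedType.LPArray)] byte[] buff);
}

// Usage sketch: the required size of buff is an assumption here
byte[] buff = new byte[64];
int result = UsbNative.Login(buff);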
I think you can use byte[]
See https://learn.microsoft.com/en-us/dotnet/articles/csharp/language-reference/keywords/byte
Sure, C# doesn't have the exact type uint8, but the equivalent is byte.
I am calling a C++ function from C# code by passing a string to it. The C++ function is responsible for filling the string content. Following is the code:
C# side:
var abc = new StringBuilder(4096); // need to change this
var result = NativeMethods.SignCrcFile(abc);
C++ side:
bool __cdecl SignCrcFile(char* abc)
{
    ...
    const char* tempStr = "Hello All"; // for example
    // copy it into the abc buffer
    strcpy(abc, tempStr);
    return true;
}
The problem is that the size of tempStr in the C++ function is dynamic (the above assignment to tempStr is just an example).
In such a case, it is not a good idea to hard-code the size of the abc string on the C# side to 4096 bytes.
One solution could be to create two functions on the C++ side: the first to return the size of the string, so the C# side can allocate a StringBuilder of that size, and the second to copy the entire string from C++ to C#. But this may not be the best way. Any suggestions?
The StringBuilder is not fixed at 4096 chars: 4096 is just its initial capacity, and the string you get from .ToString() will have whatever length was actually written, up to that maximum.
If you are afraid that will not be enough, you can raise it.
Otherwise, you can add an int* parameter to the function and, if the char* is null, have the native side write the required size to that int. That way you can call the function once without the StringBuilder to get the size, initialize a StringBuilder with that size, and call the function again with it.
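A rough sketch of that two-call pattern on the C# side, assuming the native signature is changed to something like bool SignCrcFile(char* abc, int* size); the extra parameter, the DLL name, and the helper name are assumptions, not code from the question.

[DllImport("native.dll", CallingConvention = CallingConvention.Cdecl)]
[return: MarshalAs(UnmanagedType.I1)]   // C++ bool is a single byte
static extern bool SignCrcFile(StringBuilder abc, ref int size);

static string GetSignedString()
{
    // First call: pass null so the native side only reports the required size
    int size = 0;
    SignCrcFile(null, ref size);

    // Second call: allocate exactly that much and fetch the string
    var abc = new StringBuilder(size);
    SignCrcFile(abc, ref size);
    return abc.ToString();
}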
I want to read some constants in C#; these constants are defined in a C++ .h file. Can anyone tell me how to do this in C#? The C++ .h file looks like:
#define MYSTRING1 "6.9.24 (32 bit)"
#define MYSTRING2 "6.8.24 (32 bit)"
How can I read these values in C#?
Here is a really simple function that you can run on each line of your .h file to extract the string value.
string GetConstVal(string line)
{
    // Split into at most three parts: "#define", the macro name, and the rest of the line (the value)
    string[] lineParts = line.Split(new[] { ' ' }, 3);
    if (lineParts.Length == 3 && lineParts[0] == "#define")
    {
        return lineParts[2];
    }
    return null;
}
So any time it returns null, you don't have a constant. Keep in mind that it only gets the value of the constant, not the name, but you can easily modify the code to return that as well via an out parameter, etc.
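For example, you could run it over each line of the header file; the path below is just a placeholder.

// Hypothetical path to the C++ header
foreach (string line in System.IO.File.ReadLines(@"C:\path\to\constants.h"))
{
    string val = GetConstVal(line);
    if (val != null)
    {
        // Note: the value still includes the surrounding quotes, e.g. "6.9.24 (32 bit)"
        Console.WriteLine(val);
    }
}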
If you want to handle other data types, like integers, you will have to get a bit more clever, since C++ macros aren't type-safe.
You have two options:
1. Create C++ wrapper code which wraps these macros, export it from a lib or DLL, and use it from C#.
2. Read/parse the .h file from your C# code and get the values at run-time.
I have a byte array in my C# code that I need to pass into a LuaInterface instance. I can use pack() in Lua, pass the resulting string to C# and convert it with System.Text.Encoding.UTF8.GetBytes(), but going the other way doesn't seem to work.
Is there a simple solution? I'm hoping I can avoid assigning the byte array to a global value.
Edit:
I tried a few new things this morning. I tried using LuaInterface.GetFunction(), and everything works until it hits lua_pushstring() in LuaDLL.cpp. At this point the C# string is converted to a char* via Marshal::StringToHGlobalAnsi().ToPointer(). It looks like this function expects a null-terminated string, and my string's first byte is 0, so I get an empty string in my Lua code.
I finally traced it down to the call to ::lua_pushstring() in lapi.c, which calls strlen() on the char* passed in. Since my first byte of data was 0, it returned 0. There is an alternative call, lua_pushlstring, that accepts the size of the string as an argument. Changing the code to call this function fixed the issue.
Try encoding your byte array with System.Text.ASCIIEncoding.ASCII.GetString to get a string that can be passed to Lua.
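As a sketch of that suggestion (GetMyBytes and the final Lua call are placeholders; note that ASCII decoding replaces bytes above 127, so this only round-trips cleanly for 7-bit data):

byte[] data = GetMyBytes();   // hypothetical source of the byte array
string luaArg = System.Text.Encoding.ASCII.GetString(data);
// pass luaArg to the LuaInterface call that expects a string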