I've successfully used Robert's UnmanagedExportLibrary.zip to call a .NET 2/3.5 assembly from Delphi 2007.
However, when I recompile the C# assembly to target .NET 4 using VS2010, the call crashes with a stack overflow exception in ntdll.dll (ntdll calling ntdll) after loading mscorlib/mscoreei.
Has anybody else got this to work when targeting .NET 4? Robert's documentation seems to imply that it should.
Great work by Robert by the way - very useful.
Thanks
Myles.
Arrays are trickier because you need to take more care over where the array is allocated and destroyed. The cleanest approach is to allocate at the caller and pass the array to the callee for it to fill out. That approach would look like this in your context:
public struct Sample
{
    [MarshalAs(UnmanagedType.BStr)]
    public string Name;
}

[DllExport]
public static int func(
    [Out, MarshalAs(UnmanagedType.LPArray, SizeParamIndex = 1)]
    Sample[] samples,
    ref int len
)
{
    // len holds the length of the array on input
    // len is assigned the number of items that have been assigned values
    // use the return value to indicate success or failure
    for (int i = 0; i < len; i++)
        samples[i].Name = "foo: " + i.ToString();
    return 0;
}
You need to specify that the array needs to be marshalled in the out direction. If you wanted values marshalled both ways then you would use In, Out instead of Out. You also need to use MarshalAs with UnmanagedType.LPArray to indicate how to marshal the array. And you do need to specify the size param so that the marshaller knows how many items to marshal back to the unmanaged code.
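For example, a minimal sketch of the two-way variant (func2 is a hypothetical name; only the attributes change, the rest follows the same pattern):

[DllExport]
public static int func2(
    // In, Out makes the marshaller copy the caller's data in before the call
    // and copy any changes back out afterwards.
    [In, Out, MarshalAs(UnmanagedType.LPArray, SizeParamIndex = 1)]
    Sample[] samples,
    ref int len
)
{
    for (int i = 0; i < len; i++)
        samples[i].Name = "echo: " + samples[i].Name;
    return 0;
}

The Delphi declaration that follows corresponds to the original one-way func above.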
And then on the Delphi side you declare the function like this:
type
  TSample = record
    Name: WideString;
  end;
  PSample = ^TSample;

function func(samples: PSample; var len: Integer): Integer; stdcall;
  external dllname;
Call it like this:
var
  samples: array of TSample;
  i, len: Integer;
....
  len := 10;
  SetLength(samples, len);
  if func(PSample(samples), len) = 0 then
    for i := 0 to len - 1 do
      Writeln(samples[i].Name);
Update
As AlexS discovered (see the comments below), passing the size param index by reference is only supported on .NET 4. On earlier versions you need to pass the size param index by value.
The reason I chose to pass it by reference here is to allow for the following protocol:
1. The caller passes in a value indicating how large the array is.
2. The callee passes out a value indicating how many elements have been populated.
This works well on .NET 4, but on earlier versions you would need to use an extra parameter for step 2.
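On those earlier versions a sketch of the workaround might look like this (filled is a hypothetical extra output parameter used for step 2; the array length is now passed by value):

[DllExport]
public static int func(
    [Out, MarshalAs(UnmanagedType.LPArray, SizeParamIndex = 1)]
    Sample[] samples,
    int len,          // size of the caller-allocated array, passed by value
    out int filled)   // hypothetical extra parameter for step 2
{
    for (int i = 0; i < len; i++)
        samples[i].Name = "foo: " + i.ToString();
    filled = len;     // report how many elements were populated
    return 0;
}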
Reference
https://stackoverflow.com/a/22507948/4339857
Related
I am trying to separate an encryption function from our legacy code into a DLL which I can call from C#, but I am having issues getting it to work and I keep getting access violations when calling the DLL.
I am not sure where the AV happens because Delphi has a hard time hitting my breakpoints when the DLL is attached to another process.
I got it to work yesterday using David Heffernan's answer here: Returning a string from delphi dll to C# caller in 64 bit
But my success was short-lived: I changed the string parameters to regular strings (Delphi), saw it didn't work, and changed them back to AnsiString (our encryption routine expects ANSI). Since I changed these param types I have not been able to get it to work again.
Here is my Delphi Code:
procedure Encrypt(const Source: AnsiString; const Key: AnsiString; var OutPut: PAnsiChar; const OutputLength: Integer);
var
  EncryptedString, EncodedString: AnsiString;
begin
  EncryptedString := Crypt(Source, Key);
  EncodedString := Encode(EncryptedString);
  if Length(EncodedString) <= OutputLength then
    System.AnsiStrings.StrPCopy(Output, EncodedString);
end;

exports
  Encrypt;
My C# caller:
[DllImport("AsmEncrypt.dll", CharSet = CharSet.Ansi)]
public static extern void Encrypt(string password, string key, StringBuilder output, int outputlength);
// using like this:
Encrypt(credentials.Password, myKey, str, str.Capacity);
My best guess right now is that I've goofed some of the arguments to the DLL, since it seems to crash before it reaches an OutputDebugStr() I had put on the first line of Encrypt().
All help will be greatly appreciated
Change the Delphi function to
procedure Encrypt(Source, Key, OutPut: PAnsiChar; OutputLength: Integer); stdcall;
in order to make this code work.
You should probably also make the length argument IN/OUT so that the caller can resize the string builder object once the call returns. That would also give the callee a way to signal errors to the caller; the lack of error reporting is another flaw in your current design.
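As a rough sketch of how the C# side might then look (this assumes the Delphi export is changed so that the length parameter is var, i.e. in/out; it is not the current signature):

// Delphi side, for reference (assumed):
//   procedure Encrypt(Source, Key, Output: PAnsiChar; var OutputLength: Integer); stdcall;
[DllImport("AsmEncrypt.dll", CallingConvention = CallingConvention.StdCall, CharSet = CharSet.Ansi)]
public static extern void Encrypt(string password, string key,
                                  StringBuilder output, ref int outputLength);

// The caller passes the buffer size in; the callee writes the produced length back out.
var output = new StringBuilder(1024);
int len = output.Capacity;
Encrypt(credentials.Password, myKey, output, ref len);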
I must also say that using AnsiString as a byte array is a recipe for failure. It's high time you started doing encryption right. If you have text, encode it as a byte array with a specific encoding (usually this means UTF-8). Then encrypt that byte array to another byte array.
From this docs page:
The AnsiString structure contains a 32-bit length indicator, a 32-bit reference count, a 16-bit data length indicating the number of bytes per character, and a 16-bit code page.
So an AnsiString isn't simply a pointer to an array of characters -- it's a pointer to a special structure which encodes a bunch of information.
However, .NET's P/Invoke machinery is going to pass a pointer to an array of characters. Delphi is going to try and interpret that as a pointer to its special AnsiString structure, and things aren't going to go well.
I think you're going to have a hard time using AnsiString in interop. You're better off choosing a string type which both .NET and Delphi know about. If you then need to convert that to AnsiString, do that in Delphi.
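For instance, a minimal sketch of that approach, with PAnsiChar on the Delphi side and explicit ANSI marshalling on the C# side (the exact signature is illustrative, not the code from the question):

// Matches a Delphi export declared with PAnsiChar parameters and stdcall.
[DllImport("AsmEncrypt.dll", CallingConvention = CallingConvention.StdCall)]
public static extern void Encrypt(
    [MarshalAs(UnmanagedType.LPStr)] string source,   // PAnsiChar
    [MarshalAs(UnmanagedType.LPStr)] string key,      // PAnsiChar
    [MarshalAs(UnmanagedType.LPStr)] StringBuilder output,
    int outputLength);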
The function I'm calling is part of an obscure API that cannot be PInvoked because the library's only entry point is a structure of pointers to the functions within. That being said, I have the structure and C++ implementation sample code.
tl;dr I'm not PInvoking this function.
Here is what the implementation looks like in C++
// Definition
extern "C" typedef short (*GETBIT)(short, char*);

// Base function for using the API
void anApi::GetData(short index, void* pData)
{
    switch (internal logic...)
    {
    case bitData:
        get_bit(index, (char *)(pData));
    case (other data types)...
    }
}

// Higher level API implementation
UCHAR u1Val = 0;
GetData(1234, &u1Val);
if (u1Val == 1)
{
    doSomething();
}
Presumably I should get back either 0 or 1 (even with the actual data type being what it is). I should mention that the API has error checking, which I left out for simplicity's sake, and my code does not cause any errors. There is also an initialization function which succeeds, so I know I'm calling the functions correctly.
Here is how I've implemented the same function in C#
// Definition
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
delegate short get_bit_delegate(short index, out byte data);
get_bit_delegate get_bit;

// Base function
public string GetDataBit(short index)
{
    string ReturnData = "test";
    // Using the .NET type byte instead of char...
    byte myByte = 0;
    short lRetCode = get_bit(index, out myByte);
    // get a good look at what we returned
    ReturnData = Convert.ToString(myByte, 2).PadLeft(8, '0');
    return ReturnData;
    // error checking snipped for brevity
}

// Implementation
textBox1.Text = GetDataBit(1234);
I've tried different data types including IntPtr and then Marshaling, but I keep getting the same result:
11111111 ( ÿ )
The way they use the result in the sample code, I would think that the return value should be '1' or '0'. In fact, in my test case I'm expecting to see 0.
0 - chr(48) - 00110000
1 - chr(49) - 00110001
It all boils down to the contract of the function that you are trying to call, which you have not told us much about: Is it supposed to return a character, a byte, a bit, a boolean?
You should have expected chr(48) and chr(49) if the function you are calling was supposed to return a character. But obviously, you are not receiving one. Pay no attention to the fact that char is used internally: char is used in old-style C code in place of byte, bit and boolean. So, you have to have some kind of spec to follow, instead of trying to do a literal reading of the internals of some other guy's code.
In the C family of languages, the if() statement checks whether something is zero (FALSE) or non-zero (TRUE). As a result, when something is supposed to evaluate to TRUE, it is not necessary that it be equal to 1, it can be any non-zero value. And actually, -1 is not uncommon.
A bit pattern of all 1s is a non-zero value, so if it is examined as a boolean, it corresponds to TRUE. (And in two's complement a bit pattern of all 1s is actually -1.) So, it is quite possible that the function you are calling is supposed to return a boolean value, and you are in fact receiving a TRUE.
(Of course, in light of this, code like if( u1Val == 1 ) is kind of weird, it ought to have been if( u1Val != 0 ), so there may be something fishy going on.)
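If so, a small sketch of the corresponding check on the C# side (reusing the delegate from the question; bitSet is just an illustrative name):

short lRetCode = get_bit(index, out byte myByte);

// Treat the byte the way the C sample treats it: any non-zero value means the bit is set.
bool bitSet = (myByte != 0);
string returnData = bitSet ? "1" : "0";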
Edit 3 describes the narrowed-down problem after debugging
Edit 4 contains the solution - it's all about type difference between C and C#
Today I came across a curious problem. In C I have the following struct:
typedef struct s_z88i2
{
    long Node;
    long DOF;
    long TypeFlag;
    double CValue;
} s_z88i2;
Furthermore I have a function (this is a simplified version):
DLLIMPORT int pass_i2(s_z88i2 inp, int total)
{
    long nkn = 0, ifg = 0, iflag1 = 0;
    double wert = 0;
    int val;

    // Testplace 1

    nkn = inp.Node;
    ifg = inp.DOF;
    iflag1 = inp.TypeFlag;
    wert = inp.CValue;

    // Testplace 2

    return 0;
}
The assigned values are used nowhere - I'm aware of that.
When I reach // Testplace 1 the following statement is executed:
char tmpstr[256];
sprintf(tmpstr,"RB, Node#: %li, DOF#: %li, Type#: %li, Value: %f", inp.Node, inp.DOF, inp.TypeFlag, inp.CValue);
tmpstr is then passed to a message box. It shows, as one would expect, the values of the struct I passed to the function in a nice and orderly way. Moving on through the function, the values inside the struct get assigned to some variables. On reaching Testplace 2 the following is executed:
sprintf(tmpstr,"RB, Node#: %li, DOF#: %li, Type#: %li, Value: %f",nkn, ifg, iflag1, wert);
Again, tmpstr is passed to a message box. However, this doesn't show what one would expect. The values for Node and Type are still correct. For DOF and Value the displayed values are 0, which leads me to the conclusion that something is going terribly wrong while assigning the values. I somehow, sometimes, managed to get a way too long number for the value, which was just as incorrect as 0, but I have not been able to reproduce that mistake during my last tests.
Possible values for inp are e.g. {2, 1, 1, -451.387}, so the first 1 and -451.387 are forgotten.
Does anyone know what I'm doing wrong or how to fix this?
Many thanks in advance!
Edit:
Changed %i to %li but the result did not change. Thanks to unwind!
I'm developing this DLL with Dev-Cpp using MinGW (unfortunately) because I wasn't able to convince Visual Studio 2012 Pro to compile this properly, although the documentation of the original source says it is plain ANSI C. This bugs me a bit because I cannot debug this DLL properly with Dev-Cpp. Hence the message boxes.
Edit 2:
As Neil Townsend suggested, I switched to passing a reference. But this also did not cure the problem. When I access the values in my struct directly everything is fine. When I assign them to variables some get lost.
A short note on how I'm calling the function: the DLL is to be accessed from C#, so I'm meddling with P/Invoke (as I get it).
[DllImport("z88rDLL", CallingConvention = CallingConvention.Cdecl)]
public static extern int pass_i2(ref s_z88i2 inp, int total);
is my definition in C#. I have lots of other functions imported and they all work fine. This is the first function where I encounter these problems. I call the function via:
s_z88i2 tmpi2 = FilesZ88.z88i2F.ConstraintsList[i];
int res = SimulationsCom.pass_i2(ref tmpi2, FilesZ88.z88i2F.ConstraintsList.Count);
First I set the struct, then I call the function.
Why oh why does VS have to be picky when it comes to compiling ANSI C? It certainly would make things easier.
Edit 3:
I can narrow the problem down to sprintf, I think. Having convinced VS to build my DLL, I was able to step through it. It appears that the values are assigned very nicely indeed to the variables they belong in. If, however, I want to print these variables via sprintf they turn out rather empty (0). Curiously, the value is always 0 and not something else. I'm still interested in why sprintf behaves that way, but I consider my initial problem solved/panic defeated. So thanks everyone!
Edit 4:
As supercat points out below, I had a rethink about type-compatibility between C and C#. I was aware that an int in C# evaluates as a long in C. But after double-checking I found that in C my variables are really FR_INT4 (which I kept out of the original question for reasons of clarity => bad idea). Internally FR_INT4 is defined as: #define FR_INT4 long long, so as a super-long-long. A quick test showed that passing a long from C# gives the best compatibility. So the sprintf-issue can maybe be simplified to the question: "What is the format-identifier of a long long?".
It is %lli, which is quite simple, actually. So I can announce (drumroll) that my problem really is solved!
sprintf(tmpstr,"RB, Node#: %lli, DOF#: %lli, Typ#: %lli, Wert: %f\n", inp.Node, inp.DOF, inp.TypeFlag, inp.CValue);
returns every value I want. Thank you very much everyone!
Formatting a value of type long with the format specifier %i is not valid. You should use %li.
In C, it is a better approach to pass a reference or pointer to the struct rather than the struct. So:
DLLIMPORT int pass_i2(s_z88i2 *inp, int total)
{
    long nkn = 0, ifg = 0, iflag1 = 0;
    double wert = 0;
    int val;

    // Testplace 1

    nkn = inp->Node;
    ifg = inp->DOF;
    iflag1 = inp->TypeFlag;
    wert = inp->CValue;

    // Testplace 2

    return 0;
}
You will need to correct the sprintf lines accordingly: inp.X becomes inp->X. To use this function, either:
// Option A - create it in a declaration, fill it, and send a pointer to that
struct s_z88i2 thing;
// fill out thing
// eg. thing.Node = 2;
pass_i2(&thing, TOTAL);
or:
// Option B - create a pointer; create the memory for the struct, fill it, and send the pointer
struct s_z88i2 *thing;
thing = malloc(sizeof(struct s_z88i2));
// fill out thing
// eg thing->Node = 2;
pass_i2(thing, TOTAL);
This way pass_i2 will operate on the struct you send it, and any changes it makes will be there on return from pass_i2.
To clarify this as answered:
My struct actually is:
typedef struct s_z88i2
{
    long long Node;
    long long DOF;
    long long TypeFlag;
    double CValue;
} s_z88i2;
which requires a long to be passed from C# (and not an int as I previously thought). Through debugging I found out that the assignment of values behaves as it should; the problem was within sprintf. If I use %lli as the format identifier even this problem is solved.
sprintf(tmpstr,"RB, Node#: %lli, DOF#: %lli, Typ#: %lli, Wert: %f\n", inp.Node, inp.DOF, inp.TypeFlag, inp.CValue);
is the statement I need to use. So thanks again to everyone who contributed!
I have an application that needs to take in several million char*s as input parameters (typically strings of fewer than 512 characters, in Unicode), and convert and store them as .NET strings.
It is turning out to be a real bottleneck in the performance of my application. I'm wondering if there's some design pattern or idea that could make it more efficient.
There is a key part that makes me feel like it can be improved: there are a LOT of duplicates. Say 1 million objects are coming in; there might only be around 50 unique char* patterns.
For the record, here is the algorithm I'm using to convert char* to string (this algorithm is in C++, but the rest of the project is in C#):
String ^StringTools::MbCharToStr ( const char *Source )
{
    String ^str;

    if( (Source == NULL) || (Source[0] == '\0') )
    {
        str = gcnew String("");
    }
    else
    {
        // Find the number of UTF-16 characters needed to hold the
        // converted UTF-8 string, and allocate a buffer for them.
        const size_t max_strsize = 2048;

        int wstr_size = MultiByteToWideChar (CP_UTF8, 0L, Source, -1, NULL, 0);
        if (wstr_size < max_strsize)
        {
            // Save the malloc/free overhead if it's a reasonable size.
            // Plus, KJN was having fits with exceptions within exception logging due
            // to a corrupted heap.
            wchar_t wstr[max_strsize];

            (void) MultiByteToWideChar (CP_UTF8, 0L, Source, -1, wstr, (int) wstr_size);
            str = gcnew String (wstr);
        }
        else
        {
            wchar_t *wstr = (wchar_t *)calloc (wstr_size, sizeof(wchar_t));
            if (wstr == NULL)
                throw gcnew PCSException (__FILE__, __LINE__, PCS_INSUF_MEMORY, MSG_SEVERE);

            // Convert the UTF-8 string into the UTF-16 buffer, construct the
            // result String from the UTF-16 buffer, and then free the buffer.
            (void) MultiByteToWideChar (CP_UTF8, 0L, Source, -1, wstr, (int) wstr_size);
            str = gcnew String ( wstr );
            free (wstr);
        }
    }

    return str;
}
You could use each character from the input string to feed a trie structure. At the leaves, have a single .NET string object. Then, when a char* comes in that you've seen previously, you can quickly find the existing .NET version without allocating any memory.
Pseudo-code:
start with an empty trie,
process a char* by searching the trie until you can go no further
add nodes until your entire char* has been encoded as nodes
at the leaf, attach an actual .NET string
The answer to this other SO question should get you started: How to create a trie in c#
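A minimal sketch of that idea in C#, assuming the raw bytes of each incoming char* have already been copied into a byte[] (class and member names are illustrative):

using System.Collections.Generic;
using System.Text;

// One node per byte of the UTF-8 input; the interned .NET string lives at the leaf.
sealed class TrieNode
{
    public readonly Dictionary<byte, TrieNode> Children = new Dictionary<byte, TrieNode>();
    public string Value;   // non-null once this byte sequence has been converted
}

sealed class StringCache
{
    private readonly TrieNode _root = new TrieNode();

    public string GetOrAdd(byte[] utf8)
    {
        TrieNode node = _root;
        foreach (byte b in utf8)
        {
            TrieNode next;
            if (!node.Children.TryGetValue(b, out next))
            {
                next = new TrieNode();
                node.Children[b] = next;
            }
            node = next;
        }
        // Convert only the first time this byte sequence is seen; reuse the string afterwards.
        return node.Value ?? (node.Value = Encoding.UTF8.GetString(utf8));
    }
}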
There is a key part that makes me feel like it can be improved: There are a LOT of duplicates. Say 1 million objects are coming in, there might only be like 50 unique char* patterns.
If this is the case, you may want to consider storing the "found" patterns within a map (such as a std::map<const char*, gcroot<String^>>, though you'll need a comparer for the const char* key), and use that to return the previously converted value.
There is an overhead to storing the map, doing the comparison, etc. However, this may be mitigated by the dramatically reduced memory usage (you can reuse the managed string instances), as well as saving the memory allocations (calloc/free). Also, using malloc instead of calloc would likely be a (very small) improvement, as you don't need to zero out the memory prior to calling MultiByteToWideChar.
I think the first optimization you could make here would be to have your first call to MultiByteToWideChar start with a buffer instead of a null pointer. Because you specified CP_UTF8, MultiByteToWideChar must walk over the whole string to determine the expected length. If there is some length which is longer than the vast majority of your strings, you might consider optimistically allocating a buffer of that size on the stack, and falling back to dynamic allocation only if that fails. That is, move the first branch of your if/else block outside of the if/else.
You might also save some time by calculating the length of the source string once and passing it in explicitly -- that way MultiByteToWideChar doesn't have to do a strlen every time you call it.
That said, it sounds like if the rest of your project is C#, you should use the .NET BCL class libraries designed to do this rather than having a side by side assembly in C++/CLI for the sole purpose of converting strings. That's what System.Text.Encoding is for.
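For example, a rough sketch of a pure C# conversion path, given an IntPtr to a NUL-terminated UTF-8 string (Utf8PtrToString is a made-up helper name, not a BCL API; on newer runtimes Marshal.PtrToStringUTF8 does much the same thing in one call):

using System;
using System.Runtime.InteropServices;
using System.Text;

static string Utf8PtrToString(IntPtr source)
{
    if (source == IntPtr.Zero)
        return string.Empty;

    // Find the terminating NUL.
    int length = 0;
    while (Marshal.ReadByte(source, length) != 0)
        length++;

    if (length == 0)
        return string.Empty;

    // Copy the bytes out and let the BCL do the UTF-8 to UTF-16 conversion.
    byte[] buffer = new byte[length];
    Marshal.Copy(source, buffer, 0, length);
    return Encoding.UTF8.GetString(buffer);
}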
I doubt any kind of caching data structure you could use here is going to make any significant difference.
Oh, and don't ignore the result of MultiByteToWideChar -- not only should you never cast anything to void, you've got undefined behavior in the event MultiByteToWideChar fails.
I would probably use a cache based on a ternary tree structure, or similar, and look up the input string to see if it's already converted before even converting a single character to .NET representation.
It works fine if I don't return anything, or if I return an integer. But if I try to return a PChar, i.e.
result := PChar('') or result := PChar('Hello')
The web app just freezes up and I watch its memory count gradually get higher and higher in task manager.
The odd thing is that the DLL works fine on the VStudio debug server, or through a C# app. The only thing I can think of that would make a difference is that the IIS server is running on 64-bit Windows.
It doesn't appear to be a compatibility issue though, because I can successfully write to text files and do other things from the DLL... I just can NOT return a PChar string.
Tried using PWideChar, tried returning 'something\0', tried everything I could think of. No luck unfortunately.
[DllImport("TheLib.dll", CallingConvention = CallingConvention.StdCall, CharSet = CharSet.Ansi)]
private static extern string SomeFunction();
string result = SomeFunction();
Delphi:
library TheLib;

function SomeFunction(): PChar; export; stdcall;
begin
  Result := PChar('');
end;

exports
  SomeFunction;
Dampsquid's analysis is correct so I will not repeat it. However, I prefer a different solution that I feel is more elegant. My preferred solution for such a problem is to use a Delphi WideString, which is a BSTR.
On the Delphi side you write it like this:
function SomeFunction: WideString; stdcall;
begin
  Result := 'Hello';
end;
And on the C# side you do it like this:
[DllImport(@"TheLib.dll")]
[return: MarshalAs(UnmanagedType.BStr)]
private static extern string SomeFunction();
And that's it. Because both parties use the same COM allocator for the memory allocation, it all just works.
Update 1
@NoPyGod interestingly points out that this code fails with a runtime error. Having looked into this, I feel it to be a problem at the Delphi end. For example, if we leave the C# code as it is and use the following, then the errors are resolved:
function SomeFunction: PChar; stdcall;
begin
  Result := SysAllocString(WideString('Hello'));
end;
It would seem that Delphi return values of type WideString are not handled as they should be. Out parameters and var parameters are handled as would be expected. I don't know why return values fail in this way.
Update 2
It turns out that the Delphi ABI for WideString return values is not compatible with Microsoft tools. You should not use WideString as a return type, instead return it via an out parameter. For more details see Why can a WideString not be used as a function return value for interop?
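For completeness, a sketch of that out-parameter pattern as seen from C# (the Delphi export is assumed to be changed to match; this is not the code from the question):

// Delphi side, for reference (assumed):
//   procedure SomeFunction(out value: WideString); stdcall;
[DllImport("TheLib.dll")]
private static extern void SomeFunction([MarshalAs(UnmanagedType.BStr)] out string value);

// usage
string result;
SomeFunction(out result);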
You cannot return a string like that; the string is local to the function and will be freed as soon as the function returns, leaving the returned PChar pointing to an invalid location.
You need to pass in a pointer to a buffer to be filled in by the DLL, dynamically create the string and free it back in the C# code, or create a static buffer in your DLL and return that.
By far the safest way is to pass a pointer into the function, i.e.
function SomeFunction(Buffer: PChar; MaxLength: PInteger): WordBool; stdcall;
begin
  // fill in the buffer and set MaxLength^ to the length of the data
end;
You should set MaxLength to the size of the buffer before calling your DLL so that the DLL can check there is enough space for the data to be returned.
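From C#, a call into that style of export might be sketched like this (names and buffer size are illustrative; VariantBool is used because Delphi's WordBool is a 2-byte boolean, and CharSet must match what PChar means on the Delphi side: Ansi for Delphi 2007 and earlier, Unicode for Delphi 2009 and later):

[DllImport("TheLib.dll", CallingConvention = CallingConvention.StdCall, CharSet = CharSet.Ansi)]
[return: MarshalAs(UnmanagedType.VariantBool)]
private static extern bool SomeFunction(StringBuilder buffer, ref int maxLength);

// usage
var buffer = new StringBuilder(256);
int maxLength = buffer.Capacity;   // tell the DLL how much space is available
if (SomeFunction(buffer, ref maxLength))
    Console.WriteLine(buffer.ToString());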
Try enabling 32-bit applications in the application pool's advanced settings.