Reading a stream from native lib to C#

I have the following native C++ function:
// Decode binary format from file 'filename' into stream 'output'
bool read_private_format(const char * filename, std::ostringstream & output);
Following previous SO posts on StringBuilder and delegates, I have created an intermediate C function to expose to the C# layer:
extern "C" {
typedef char *(*StringBuilderCallback)(int len);

__attribute__ ((visibility ("default")))
bool c_read_private_format(const char * filename, StringBuilderCallback ensureCapacity, char *out, int len) {
    std::ostringstream oss;
    if( read_private_format(filename, oss) ) {
        const std::string str = oss.str();
        if( str.size() > len )
            out = ensureCapacity(str.size());
        strcpy(out, str.c_str());
        return true;
    }
    return false;
}
}
while on the C# side:
private static System.Text.StringBuilder buffer = new System.Text.StringBuilder(1024); // shared buffer re-used across calls

private delegate System.Text.StringBuilder StringBuilderEnsureCapacity(int capacity);

[System.Runtime.InteropServices.DllImport(NativeLibraryName, EntryPoint="c_read_private_format")]
private static extern bool c_read_private_format(string filename, System.IntPtr aCallback, System.Text.StringBuilder data, int size);

private static System.Text.StringBuilder callback(int capacity)
{
    buffer.EnsureCapacity( capacity );
    return buffer;
}

public static string readIntoString(string filename) {
    StringBuilderEnsureCapacity del = new StringBuilderEnsureCapacity(callback);
    System.IntPtr ptr = System.Runtime.InteropServices.Marshal.GetFunctionPointerForDelegate(del);
    if( c_read_private_format( filename, ptr, buffer, buffer.Capacity ) ) {
        string str = buffer.ToString();
        return str;
    }
    return null;
}
For some reason this is not working as expected: when I print the address of the char* returned by callback, the C layer acts as if it still holds the pointer from before the call to EnsureCapacity (I can verify this by making a second call, in which case the char* seen in the C layer is different).
My question is:
How can I efficiently retrieve a UTF-8 string from C in the .NET SDK (5.0.202)?
I do not know in advance how long the string will be. Technically I could overestimate the StringBuilder capacity so that I can re-use it across my files, but it feels as if there should be a better approach to passing a growing stream to the C layer.

There is no point in trying to optimize the posted code, since the P/Invoke layer as written runs into the most important guidance in Microsoft's interop best practices:
❌ AVOID StringBuilder parameters. StringBuilder marshaling always creates a native buffer copy. As such, it can be extremely inefficient.
https://learn.microsoft.com/en-us/dotnet/standard/native-interop/best-practices#string-parameters

Related

Trying to get strings from managed code back to C++

(sorry if this has been asked before, but most of the examples that I've seen are passing data from managed -> native, not the other way around).
Short question: How do I fetch a string from the managed world from within native C++ code?
Long question + background:
I'm working with some legacy C++ code that formerly had the capability to get and set name/value string pairs (to greatly simplify the design). I wanted to move that name/value pair mechanism up into C# managed code with the rest of our application, so I put in function callbacks in the C++ world that call up into the managed code for getting and setting. The C++ function pointer types are as follows:
typedef int (GetConfigParamCallback)(const char* paramName, char* value);
typedef GetConfigParamCallback* LPGetConfigParamCallback;
typedef int (SetConfigParamCallback)(const char* paramName, const char* value);
typedef SetConfigParamCallback* LPSetConfigParamCallback;
As you can see, the tricky one is the get, where I want to provide memory to the caller to fill up. This will be the managed code moving forward.
These callbacks are represented by delegates in the C# world like so:
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
delegate int Native_GetConfigParamCallBackMethodDelegate(
    string paramName, StringBuilder paramValue);

[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
delegate int Native_SetConfigParamCallBackMethodDelegate(
    string paramName, string paramValue);
And then my GetConfig wrapper function in the managed code that acquires the values looks like this (and puts the correct value into paramValue as seen by my debugging):
static int GetConfigParamCallBackWrapper(
    string paramName,
    System.Text.StringBuilder paramValue)
{
    string valueTemp = // Fetch the string value here
    if (valueTemp == null)
    {
        return 0;
    }
    paramValue.Append(valueTemp);
    return 1;
}
So when the managed C# starts up, it sets these callback functions in the native world. Then, I have the native code run a series of what amounts to unit test methods that are getting and setting these strings. Now, on desktop this works fine, but when I try to run this on iOS with a Xamarin built app, the string comes back to the native world as garbage, AFAICT.
I’ve tried doing manual marshaling with IntPtr as well, and also no luck.
You can marshal the string as a BSTR:
C++:
typedef int (GetConfigParamCallback)(const char* paramName, BSTR* value);
typedef GetConfigParamCallback* LPGetConfigParamCallback;
// Usage:
LPGetConfigParamCallback pCallback = // ...
CComBSTR value;
int result = (*pCallback)("...", &value);
C#:
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
delegate int Native_GetConfigParamCallBackMethodDelegate(
    string paramName,
    [MarshalAs(UnmanagedType.BStr)] ref string paramValue
);

static int GetConfigParamCallBackWrapper(
    string paramName,
    ref string paramValue
)
{
    string valueTemp = // Fetch the string value here
    if (valueTemp == null)
    {
        return 0;
    }
    paramValue = valueTemp;
    return 1;
}
So I was able to find a slightly less pretty, but valid solution, thanks to this excellent blog post: http://randomencoding.tumblr.com/post/48564128118/returning-a-string-from-a-c-callback-to-c.
I used an IntPtr and did manual Marshaling:
static int GetConfigParamCallBackWrapper(
    string paramName,
    IntPtr paramValue)
{
    string valueTemp = // Acquire valueTemp here.
    IntPtr sPtr = Marshal.StringToHGlobalAnsi(valueTemp);
    try
    {
        // Create a byte array to receive the bytes of the unmanaged string
        var sBytes = new byte[valueTemp.Length + 1];
        // Copy the bytes of the unmanaged string into the byte array
        Marshal.Copy(sPtr, sBytes, 0, valueTemp.Length);
        // Copy the bytes from the byte array into the buffer passed into this callback
        Marshal.Copy(sBytes, 0, paramValue, sBytes.Length);
    }
    finally
    {
        // Free the unmanaged string
        Marshal.FreeHGlobal(sPtr);
    }
    return 1;
}

How to return bytes to C#/.net from C?

Here's my C function:
DLL_PUBLIC void alve_ripemd320__finish(void* instance, uint32_t* out_hash)
{
    ...
    for (uint32_t i=0, i_end=10; i<i_end; i++)
    {
        out_hash[i] = h[i];
    }
}
and here is how I'm calling it from C#:
[DllImport(PlatformConstants.DllName)]
static extern void alve_ripemd320__finish (IntPtr instance_space, ref byte[] hash);
...
public byte[] Finish()
{
    byte[] result = new byte[40];
    alve_ripemd320__finish (c_instance, ref result);
    return result;
}
and that produces an ugly SEGFAULT, which goes away if I comment out the C code above that writes to out_hash.... My question is: is this the correct way of passing a buffer of bytes using P/Invoke?
Your C API is writing unsigned integers. I would typically expect this to be mapped as:
[DllImport(PlatformConstants.DllName, CallingConvention=CallingConvention.Cdecl)]
static extern void alve_ripemd320__finish(IntPtr instance_space, uint[] hash);

public uint[] Finish()
{
    uint[] result = new uint[10];
    alve_ripemd320__finish(c_instance, result);
    return result;
}
There are three main changes here:
I switched the calling convention to Cdecl. This is the standard for the C++ compiler (unless you're explicitly switching to stdcall in DLL_PUBLIC).
I changed the element type to match your C API, which uses 32-bit unsigned integers instead of bytes. You should be able to switch back to byte[] if you choose, however.
You shouldn't need to pass by ref. That would typically be the equivalent of a C API accepting uint32_t** out_hash, not uint32_t* out_hash; a plain array maps to uint32_t* directly.

Calling C++ DLL with a callback function that contains a char* from C#

I have a C++ DLL (SimpleDLL.dll) with an exposed function (DllFunctionPoibnterGetName) that takes a function pointer (getNameFP). The function pointer takes a char * as a parameter (char * name).
// C++
DllExport void DllFunctionPoibnterGetName( void (*getNameFP) (char * name, unsigned short * length ) ) {
    char name[1024];
    unsigned short length = 0;
    getNameFP( name, &length );
    printf( "length=[%d] name=[%s]\n", length, name );
}
I have a C# application that would like to use this C++ DLL.
// C#
public unsafe delegate void GetName( System.Char* name, System.UInt16* length);

unsafe class Program
{
    [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
    public delegate void delegateGetName(System.Char* name, System.UInt16* length);

    [DllImport("SimpleDLL.dll", CharSet = CharSet.Ansi )]
    public static extern void DllFunctionPoibnterGetName([MarshalAs(UnmanagedType.FunctionPtr)] delegateGetName getName);

    static void Main(string[] args)
    {
        DllFunctionPoibnterGetName(GetName);
    }

    static void GetName(System.Char* name, System.UInt16* length)
    {
        // name = "one two three";
        *length = 10;
    }
}
Currently I can set the length without any problems, but I can't seem to find a way to set the name correctly.
My question is:
How do I correctly set char * name to a value?
You don't need to use unsafe code. You can do it like this:
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate void delegateGetName(IntPtr name, out ushort length);
....
static void GetName(IntPtr name, out ushort length)
{
    byte[] buffer = Encoding.Default.GetBytes("one two three");
    length = (ushort)buffer.Length;
    Marshal.Copy(buffer, 0, name, buffer.Length);
}
This interface design is just asking for a buffer overrun, though. How are you supposed to know how big the unmanaged buffer is? It would make more sense for the length parameter to be passed by ref: on input it would tell you how big the buffer is, and on output it would record how many bytes you copied into the buffer.
Cast the char* as a char[]. That should do the trick.
Casting the char will not do. The char * data is 'unmanaged' native data, while C# uses 'managed' .NET data.
You need to write a wrapper for your call and use marshalling to convert the data from 'unmanaged' to 'managed'.

IntPtr does not contain native value

I have a native method that has to deliver a byte array to a .NET wrapper. The native method looks like:
__declspec(dllexport) int WaitForData(unsigned char* pBuffer)
{
    return GetData(pBuffer);
}
GetData allocates a memory region using malloc and copies some data (a byte stream) into it. This byte stream was received via a socket connection. The return value is the length of pBuffer.
This method has to be called from .NET. The import declaration looks as follows:
[DllImport("CommunicationProxy.dll")]
public static extern int WaitForData(IntPtr buffer);
[EDIT]
The P/Invoke Interop Assistant that dasblinkenlight advised translates the prototype to the following import signature:
public static extern int WaitForData(System.IntPtr pBuffer)
The result is the same: ptr is 0 after calling the method.
[/EDIT]
After the method was called, the result is extracted:
IntPtr ptr = new IntPtr();
int length = Wrapper.WaitForData(ref ptr);
byte[] buffer = new byte[length];
for(int i = 0; i < length; i++)
{
    buffer[i] = System.Runtime.InteropServices.Marshal.ReadByte(ptr, i);
}
Wrapper.FreeMemory(ptr);
Wrapper.FreeMemory(ptr);
The problem is that the managed variable ptr doesn't contain the value that the native variable pBuffer contains. ptr is always 0 when Wrapper.WaitForData returns, although pBuffer pointed to an allocated memory area.
Is there a mistake in the prototype? How does a pointer to a byte array need to be marshalled?
You need to pass a reference to a pointer, or a 'double pointer', like this:
__declspec(dllexport) int WaitForData(unsigned char** pBuffer)
and then assign through the double pointer (the pointer itself is passed by value):
*pBuffer = 'something'
The other option is to return the pointer, in which case you'll have to handle the int/length some other way.
By the way, that's why your automatically generated prototype looks the way it does (it has no out or ref modifiers).

Calling DLL function with char* param from C#?

I have a C++ dll that I need to call from C#. One of the functions in the dll requires a char* for an input parameter, and another function uses a char* as an output parameter.
What is the proper way to call these from C#?
A string should work if the parameter is read-only; if the method modifies the string, you should use StringBuilder instead.
Example from reference below:
[DllImport ("libc.so")]
private static extern void strncpy (StringBuilder dest,
                                    string src, uint n);

private static void UseStrncpy ()
{
    StringBuilder sb = new StringBuilder (256);
    strncpy (sb, "this is the source string", (uint) sb.Capacity);
    Console.WriteLine (sb.ToString());
}
If you don't know how P/Invoke marshaling works, you can read http://www.mono-project.com/Interop_with_Native_Libraries
If you are only concerned with strings, read just the section http://www.mono-project.com/Interop_with_Native_Libraries#Strings
Just using strings will work fine for input parameters, though you can control the details of the marshaling with the MarshalAs attribute. E.g.
[DllImport("somedll.dll", CharSet = CharSet.Unicode)]
static extern void Func([MarshalAs(UnmanagedType.LPWStr)] string wideString);
As for returning char* parameters, that's a little more complex since object ownership is involved. If you can change the C++ DLL you can use CoTaskMemAlloc, with something like:
void OutputString(char*& output)
{
    const char* toCopy = "hello...";
    size_t bufferSize = strlen(toCopy) + 1;   // include the null terminator
    LPVOID mem = CoTaskMemAlloc(bufferSize);
    memcpy(mem, toCopy, bufferSize);
    output = static_cast<char*>(mem);
}
The C# side then just uses an 'out string' parameter, and the garbage collector can pick up the ownership of the string.
Another way of doing it would be to use a StringBuilder, but then you need to know how big the string will be before you actually call the function.
Not sure this works, but have you tried
StringObject.ToCharArray();
Not sure about initialising the string from char * though. Maybe just assign it to a string object; it's worth a try.
Have you tried StringBuilder? I found this in a Google search:
[DllImport("advapi32.dll")]
public static extern bool GetUserName(StringBuilder lpBuffer, ref int nSize);
If you post the call you're making we can help you assemble it.
If the DLL function is expecting an allocated buffer of char* (not a wide/multibyte buffer) then the following will work:
[DllImport("somedll.dll", CharSet = CharSet.Ansi)]
static extern void TheFunc(byte[] someBuffer, int someSize);
Here a buffer allocated in c# is passed to TheFunc which fills it with a string of characters (of type char). Bytes aren't "interpreted" by C# they are treated like 8 bit integers, so are perfect for holding 8 bit characters.
An example code snippet would therefore be:
byte[] mybuffer;
int bufSize;

bufSize = 2048;
mybuffer = new byte[bufSize];
TheFunc(mybuffer, bufSize);

string value = "";
for (int ix = 0; (ix < bufSize) && (mybuffer[ix] != 0); ix++)
    value += (char) mybuffer[ix];

DoSomethingWithTheReturnedString(value);