Calling a DLL with a void* from C#

I've read most of the hints but I can't get it to work.
I have a native C dll with this prototype:
int utl_Conv_HexString(U8 u8_Mode, void* DataIn, void* DataOut, int *piInOutLen, int maxOutLen);
This DLL converts several string formats into byte arrays.
The DLL is used in a system with unmanaged code (written in C).
Now I would like to use this DLL in a C# / WPF environment.
I already use other DLLs from C#, but they all have prototypes without void*.
Examples from C:
// ByteArr to Telegram
u8_Dst[0] = 0xAA;
u8_Dst[1] = 0xBB;
u8_Dst[2] = 0xCC;
u32_InOutLen = 3;
s32_res = utl_Conv_HexString(UTL_CONV_BYTEARR_TO_TELEGRAM, u8_Dst, ac8_Src, &u32_InOutLen, sizeof(ac8_Src));
or
strcpy(ac8_Src, "0xAA,0xBB,0xCC");
memset(u8_Dst, 0, sizeof(u8_Dst));
s32_res = utl_Conv_HexString(UTL_CONV_TELEGRAM_TO_BYTEARR, ac8_Src, u8_Dst, &u32_InOutLen, sizeof(u8_Dst));
My problem is that I cannot figure out how this can be used from C#.

You should use it like this:
public enum U8
{
UTL_CONV_BYTEARR_TO_TELEGRAM = 1, // TODO
UTL_CONV_TELEGRAM_TO_BYTEARR = 2,
}
(you will have to fill in your actual constant values here...)
[DllImport("SomeDll.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern int utl_Conv_HexString(U8 u8Mode, byte[] dataIn, byte[] dataOut, ref int piInOutLen, int maxOutLen);
(note that the CallingConvention could be StdCall... you'll have to check your code)
and then:
byte[] src = Encoding.UTF8.GetBytes("0xAA, 0xBB, 0xCC");
byte[] dest = new byte[64];
int lenSrc = src.Length;
int res = utl_Conv_HexString(U8.UTL_CONV_TELEGRAM_TO_BYTEARR, src, dest, ref lenSrc, dest.Length);
The void* is normally translated to a byte[].
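For the opposite direction (byte array to telegram string) the call would look roughly like this. This is only a sketch based on the C examples above: the enum value, the 64-byte output size and the use of ASCII for the produced text are assumptions, and the native code may expect or produce a trailing zero byte.
byte[] bytes = { 0xAA, 0xBB, 0xCC };
byte[] telegram = new byte[64];      // pre-allocated output buffer (size is an assumption)
int len = bytes.Length;              // in: number of input bytes, out: length written by the DLL

int res = utl_Conv_HexString(U8.UTL_CONV_BYTEARR_TO_TELEGRAM, bytes, telegram, ref len, telegram.Length);

// assuming piInOutLen now holds the number of characters produced
string text = Encoding.ASCII.GetString(telegram, 0, len);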

Related

C# Marshalling unsigned char* array from C++ DLL

I am trying to marshal data from a C++ DLL function that has an unsigned char* parameter to C#.
I know this question has been asked before, but I have tried most of the answers and I'm still not getting far. The best I got was an array of bytes, but it contained all zeros.
So here's the C++ function
int Hid_Read(int iIndex, unsigned char* pucaBuffer, int iSize);
Here's the C# DLLImport
[DllImport(#"The.dll", CallingConvention = CallingConvention.Cdecl)]
internal static extern int hid_Read(int iIndex, [Out] byte[] pucaBuffer, int iSize);
And the C# call:
byte[] HID_ReceiveBuffer = new byte[128];
int intResult = hid_Read(HIDdeviceIndex, HID_ReceiveBuffer, 128);
for (int inteach = 0; inteach < 128; inteach++)
{
Debug.WriteLine("Data = " + HID_ReceiveBuffer[inteach].ToString());
}
This exits successfully but leaves an array of zeros in HID_ReceiveBuffer.
Any help would be gratefully appreciated
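For what it's worth, a common alternative to an [Out] byte[] parameter is to hand the native side an unmanaged buffer and copy it back manually. This is only a sketch of that pattern, reusing the hid_Read declaration from the question (it does not by itself explain the all-zero result):
[DllImport(@"The.dll", CallingConvention = CallingConvention.Cdecl)]
internal static extern int hid_Read(int iIndex, IntPtr pucaBuffer, int iSize);

IntPtr nativeBuf = Marshal.AllocHGlobal(128);           // unmanaged 128-byte buffer
try
{
    int intResult = hid_Read(HIDdeviceIndex, nativeBuf, 128);
    byte[] HID_ReceiveBuffer = new byte[128];
    Marshal.Copy(nativeBuf, HID_ReceiveBuffer, 0, 128); // copy the native data into managed memory
}
finally
{
    Marshal.FreeHGlobal(nativeBuf);                     // always release the unmanaged buffer
}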

Pass C# Byte[] to C++ API

I have to pass a byte array containing a MAC address to a C++ method. Since I don't have much experience working with C/C++ APIs, I don't know how to do this. I've tried to pass the array itself, but got an invalid-parameter code as a response from the API. I've also tried to create an IntPtr, but to no avail.
I know the problem is that C++ can't handle managed data types such as arrays, so I think I have to create an unmanaged array somehow.
Here is the definition of the C++ Method:
ll_status_t LL_Connect(
ll_intf_t intf,
uint8_t address[6]);
The array in C# is defined as follows:
Byte[] addr = new Byte[6];
Of course, the array is not empty.
For example:
C++
extern "C"
{
__declspec(dllexport) void GetData(uint8_t* data, uint32_t length)
{
for (size_t i = 0; i < length; ++i)
data[i] = i;
}
}
C#
[DllImport("LibName.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern void GetData([In, Out] [MarshalAs(UnmanagedType.LPArray)] byte[] data, uint length);
And use in C#
byte[] data = new byte[4];
GetData(data, (uint)data.Length);
If you have a fixed-length array, for example:
C++
extern "C"
{
__declspec(dllexport) void GetData(uint8_t data[6])
{
for (size_t i = 0; i < 6; ++i)
data[i] = i;
}
}
C#
[DllImport("LibName.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern void GetData([In, Out] [MarshalAs(UnmanagedType.LPArray, SizeConst = 6)] byte[] data);
And use in C#
byte[] data = new byte[6];
GetData(data);
For your case:
[DllImport("LibName.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern int LL_Connect(byte intf, [In, Out] [MarshalAs(UnmanagedType.LPArray, SizeConst = 6)] byte[] address);
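A call could then look like this (a sketch; the MAC address bytes and the ll_intf_t value are placeholders, and the meaning of the returned status code depends on your library's header):
byte[] addr = { 0x00, 0x1A, 0x2B, 0x3C, 0x4D, 0x5E }; // the MAC address to connect to
byte intf = 0;                                        // placeholder ll_intf_t value
int status = LL_Connect(intf, addr);                  // interpret the status per your library's header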

Unable to retrieve wchar_t* from C++ to C#

I have been trying to call an API from a DLL like below:
[DllImport(#"TELCompress.dll", EntryPoint = "TELMonDecode", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Unicode)]
public static extern int TELMonDecode(ref bool a, ref bool b, byte[] ab, System.IntPtr pDestBuf, int j, int byteCount);
Call from C# code
int returnval = TELMonDecode(ref a, ref b, bytes, destPnt, k, bytesRec);
C++ code in the DLL
__declspec(dllexport) int TELMonDecode(bool *bUnicode, bool *bCompress, BYTE *pSourceBuf, wchar_t* pDestBuf, int pDestBufSize,int byteCount)
{
...
CString decodedMsg = _T("<Empty>");
int erc = DecodeByteStream(bUnicode, bCompress, pSourceBuf, &decodedMsg);
::MessageBox(NULL,L"Decoding byte done",L"Caption",0);
pDestBuf = decodedMsg.GetBuffer();
::MessageBox(NULL,pDestBuf,L"Caption in TELMonDecode",0);
...
}
I have referred to many links here but I am still unable to figure out what I am doing wrong.
Please guide.
Thanks for the comments, they were helpful.
Now the code works as shown below.
C# code
[DllImport(#"TELCompress.dll", EntryPoint = "TELMonDecode", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Unicode)]
public static extern int TELMonDecode(ref bool a, ref bool b, byte[] ab, ref String pDestBuf, int j, int byteCount);
... //Some code here
//Call to the C++ function
int returnval = TELMonDecode(ref a, ref b, bytes, ref receiveStr, k, bytesRec);
C++ code in the TELCompress.dll
__declspec(dllexport) int TELMonDecode(bool *bUnicode, bool *bCompress, BYTE *pSourceBuf, BSTR* pDestBuf, int pDestBufSize,int byteCount)
{
CString decodedMsg = _T("<Empty>");
//Code to copy data in decodedMsg
CComBSTR tempBstrString(decodedMsg.GetBuffer()); //test
tempBstrString.CopyTo(pDestBuf);
.... //Some more code
return 0;
}
And it works; the string is now seen in the C# code, where it was previously empty.
Thanks a lot for all valuable feedback and comments.
-Megha
Use BSTR* instead of wchar_t* and you should be able to use ref string on the C# side.

C# call C++ DLL passing pointer-to-pointer argument

Could you guys please help me solve the following issue?
I have a C++ DLL which will be called by a C# application.
One of the functions I need is as follows:
struct DataStruct
{
unsigned char* data;
int len;
};
DLLAPI int API_ReadFile(const wchar_t* filename, DataStruct** outData);
I wrote the following code in C#:
class CS_DataStruct
{
public byte[] data;
public int len;
}
[DllImport("ReadFile.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Unicode)]
private static extern int API_ReadFile([MarshalAs(UnmanagedType.LPWStr)]string filename, ref CS_DataStruct data);
Unfortunately, the above code is not working. I guess that is because the C++ function takes a pointer-to-pointer to DataStruct, while I just passed a reference to CS_DataStruct.
How can I pass a pointer-to-pointer to the C++ function? If it is not possible, is there any workaround? (The C++ API is fixed, so changing it to take a single pointer is not possible.)
Edit:
Memory for DataStruct will be allocated by the C++ function. Before the call, I have no idea how large the data array should be.
(Thanks for the comments below)
I used the following test implementation:
int API_ReadFile(const wchar_t* filename, DataStruct** outData)
{
*outData = new DataStruct();
(*outData)->data = (unsigned char*)_strdup("hello");
(*outData)->len = 5;
return 0;
}
void API_Free(DataStruct** pp)
{
free((*pp)->data);
delete *pp;
*pp = NULL;
}
The C# code to access those functions is as follows:
[StructLayout(LayoutKind.Sequential)]
struct DataStruct
{
public IntPtr data;
public int len;
};
[DllImport("ReadFile.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Unicode)]
unsafe private static extern int API_ReadFile([MarshalAs(UnmanagedType.LPWStr)]string filename, DataStruct** outData);
[DllImport("ReadFile.dll", CallingConvention = CallingConvention.Cdecl)]
unsafe private static extern void API_Free(DataStruct** handle);
unsafe static int ReadFile(string filename, out byte[] buffer)
{
DataStruct* outData;
int result = API_ReadFile(filename, &outData);
buffer = new byte[outData->len];
Marshal.Copy((IntPtr)outData->data, buffer, 0, outData->len);
API_Free(&outData);
return result;
}
static void Main(string[] args)
{
byte[] buffer;
ReadFile("test.txt", out buffer);
foreach (byte ch in buffer)
{
Console.Write("{0} ", ch);
}
Console.Write("\n");
}
The data is now transferred to the buffer safely, and there should be no memory leaks. I hope it helps.
It isn't necessary to use unsafe to pass an array pointer out of a DLL. Here is an example (see the results parameter); the key is to use the ref keyword. It also shows how to pass several other types of data.
As defined in C++/C:
#ifdef __cplusplus
extern "C" {
#endif
#ifdef BUILDING_DLL
#define DLLCALL __declspec(dllexport)
#else
#define DLLCALL __declspec(dllimport)
#endif
static const int DataLength = 10;
static const int StrLen = 16;
static const int MaxResults = 30;
enum Status { on = 0, off = 1 };
struct Result {
char name[StrLen]; //!< Up to StrLen-1 char null-terminated name
float location;
Status status;
};
/**
* Analyze Data
* @param data [in] array of doubles
* @param dataLength [in] number of doubles in data
* @param weight [in]
* @param status [in] enum with data status
* @param results [out] array of MaxResults (pre-allocated) Result structs.
* Up to MaxResults results will be returned.
* @param nResults [out] the actual number of results being returned.
*/
void DLLCALL __stdcall analyzeData(
const double *data, int dataLength, float weight, Status status, Result **results, int *nResults);
#ifdef __cplusplus
}
#endif
As used in C#:
private const int DataLength = 10;
private const int StrLen = 16;
private const int MaxResults = 30;
public enum Status { on = 0, off = 1 };
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Ansi)]
public struct Result
{
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = StrLen)] public string name; //!< Up to StrLen-1 char null-terminated name
public float location;
public Status status;
}
[DllImport("dllname.dll", CallingConvention = CallingConvention.StdCall, EntryPoint = "analyzeData#32")] // "#32" is only used in the 32-bit version.
public static extern void analyzeData(
double[] data,
int dataLength,
float weight,
Status status,
[MarshalAs(UnmanagedType.LPArray, SizeConst = MaxResults)] ref Result[] results,
out int nResults
);
Without the extern "C" part, the C++ compiler would mangle the export name in a compiler dependent way. I noticed that the EntryPoint / Exported function name matches the function name exactly in a 64-bit DLL, but has an appended '#32' (the number may vary) when compiled into a 32-bit DLL. Run dumpbin /exports dllname.dll to find the exported name for sure. In some cases you may also need to use the DLLImport parameter ExactSpelling = true. Note that this function is declared __stdcall. If it were not specified, it would be __cdecl and you'd need CallingConvention.Cdecl.
Here is how it might be used in C#:
Status status = Status.on;
double[] data = { -0.034, -0.05, -0.039, -0.034, -0.057, -0.084, -0.105, -0.146, -0.174, -0.167};
Result[] results = new Result[MaxResults];
int nResults = -1; // just to see that it changes (input value is ignored)
analyzeData(data, DataLength, 1.0f, status, ref results, out nResults);
If you do call native code, make sure your structs are aligned in memory. The CLR does not guarantee alignment unless you force it.
Try
[StructLayout(LayoutKind.Explicit)]
struct DataStruct
{
[FieldOffset(0)] public IntPtr data; // unsigned char* on the native side
[FieldOffset(8)] public int len;     // offset 8 assumes 64-bit pointers; use 4 in a 32-bit build
};
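As an aside (not part of the original answer), if you only need the managed layout to follow the native declaration order, LayoutKind.Sequential is usually enough, and the Pack field lets you mirror a #pragma pack setting on the C++ side:
[StructLayout(LayoutKind.Sequential, Pack = 8)] // Pack should match the native packing (8 is the MSVC default)
struct DataStructSequential
{
    public IntPtr data; // unsigned char* on the native side
    public int len;
}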
More info:
http://www.developerfusion.com/article/84519/mastering-structs-in-c/

C# callback receiving UTF8 string

I have a C# function, a callback, called from a Win32 DLL written in C++. The caller gives me a UTF-8 string, but I can't receive it properly; all the Hungarian special characters come out wrong.
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int func_writeLog(string s);
When I changed the parameter type to IntPtr and wrote the code below, it works properly, but I find this a very slow solution:
byte[] bb = new byte[1000];
int i = 0;
while (true)
{
byte b = Marshal.ReadByte(pstr, i);
bb[i] = b;
if (b == 0) break;
i++;
}
System.Text.UTF8Encoding encodin = new System.Text.UTF8Encoding();
var sd = encodin.GetString(bb, 0, i);
I tried adding attributes to the string parameter, like:
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int func_writeLog([In, MarshalAs(UnmanagedType.LPTStr)] string s);
but none of them worked. Any advice, please? Thanks in advance!
There's no decent way to do this quickly in pure managed code; it always requires copying the string, and that's awkward because you don't know the required buffer size. You'll want to p/invoke a Windows function to do it for you; MultiByteToWideChar() is the workhorse converter function. Use it like this:
using System.Text;
using System.Runtime.InteropServices;
...
public static string Utf8PtrToString(IntPtr utf8) {
int len = MultiByteToWideChar(65001, 0, utf8, -1, null, 0);
if (len == 0) throw new System.ComponentModel.Win32Exception();
var buf = new StringBuilder(len);
len = MultiByteToWideChar(65001, 0, utf8, -1, buf, len);
return buf.ToString();
}
[DllImport("kernel32.dll", CharSet = CharSet.Auto, SetLastError = true)]
private static extern int MultiByteToWideChar(int codepage, int flags, IntPtr utf8, int utf8len, StringBuilder buffer, int buflen);
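To wire this into the callback from the question, the delegate would take an IntPtr and convert it on entry (a sketch built from the declarations above; WriteLog is just an example name):
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int func_writeLog(IntPtr s);

public static int WriteLog(IntPtr pstr)
{
    string s = Utf8PtrToString(pstr); // converts the native UTF-8 pointer via MultiByteToWideChar
    Console.WriteLine(s);
    return 0;
}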
