Pass pointer to DWORD from C# to C++ DLL - C#

I'm trying to call an unmanaged function from a DLL written in C++ from a C# console application. I have managed to do so for a couple of simple method calls; however, one of the functions takes a void* parameter and I'm not sure what to pass to get it to work.
C++ method signature
BOOL SetData(INT iDevice, BYTE iCmd, VOID* pData, DWORD nBytes)
C# method signature
[DllImport("NvUSB.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Unicode)]
public static unsafe extern bool SetData(int iDevice, byte iCmd, void* pData, uint nBytes);
Working C++ Call
DWORD pDatatoHi[] = { 'R','E','B','O','O','T' };
SetData(0, 0, pDatatoHi, sizeof(pDatatoHi))
Not working C#
uint* cmd = stackalloc uint[6];
cmd[0] = 'R';
cmd[1] = 'E';
cmd[2] = 'B';
cmd[3] = 'O';
cmd[4] = 'O';
cmd[5] = 'T';
SetData(address, 0, cmd, 6);
The method I am trying to call from the unmanaged DLL should reboot a USB device. When the function is called as per the C++ example above, the device reboots correctly. When I call the C# version above, the code executes and returns true, but the device does not reboot. Could this be due to the way I am passing the pData parameter?

When you call SetData(address, 0, cmd, 6), the number of bytes is 24, not 6. 6 is the number of items, not the size of the array in bytes, which is what you're passing in the C++ example you've given (sizeof(pDatatoHi)).

Related

Issue with native C++ dll in C#

I have a native C++ DLL with a function that finds the number of cameras connected to the computer and returns their serial numbers. I am trying to use the native C++ DLL in a C# application, but I keep getting an Access Violation error (Attempted to read or write protected memory).
The function in question is
uint32_t GetSerialNumList(char** theBufList, int theBufSize, int theListLength);
The way I am using PInvoke is as follows:
[DllImport(CameraDll, EntryPoint = "GetSerialNumList", CallingConvention = CallingConvention.Cdecl)]
private static extern uint GetSerialNumList(out byte[] pBuf, int BufSize, int ListLength);
If I create native C++ application to use the dll and use the function as follows:
char* theSerialNumb;
theSerialNumb = (char *) malloc(sizeof(char)* 8);
status = TRI_GetSerialNumList(&theSerialNumb, 8, 1);
It works fine; however, if I use it as follows in C#, it gives me the above-mentioned error:
byte[] BufList;
BufList = new byte[8];
rv = GetSerialNumList(out BufList, 8, 1);
The parameter you're passing in C# is a pointer to a byte array. What you're passing in C++ is a pointer to a pointer to a byte array. Also, in the C++ example you're passing data to the function, but in the C# example you're passing it as an out instead of a ref.
Although I'm not sure this would work, I would try creating a struct containing a byte array and passing the struct to the external function.
To answer some of the above comments: these functions typically modify memory passed to them rather than trying to allocate additional memory, due to the different ways programs create heaps.
The first thing I'd check is the C# import signature being used. There's the P/Invoke Interop Assistant tool, available for free.
Loading your function signature into the tool, translates it to:
public partial class NativeMethods {
    /// Return Type: unsigned int
    ///theBufList: char**
    ///theBufSize: int
    ///theListLength: int
    [System.Runtime.InteropServices.DllImportAttribute("<Unknown>", EntryPoint="GetSerialNumList")]
    public static extern uint GetSerialNumList(ref System.IntPtr theBufList, int theBufSize, int theListLength);
}
The second thing is that since you are allocating memory for the buffer in the C++/native version, perhaps you need to pass a pre-allocated buffer when using C# as well.
Hope this helps.
Okay, I took pointers from Russell and kvr and did some digging around; the following is the scheme I came up with.
Original native function call:
uint32_t GetSerialNumList(char** theBufList, int theBufSize, int theListLength);
The way I am using PInvoke is as follows:
[DllImport(CameraDll, EntryPoint = "GetSerialNumList", CallingConvention = CallingConvention.Cdecl)]
private static extern int GetSerialNumList(ref IntPtr pBuf, int BufSize, int ListLength);
byte[] BufIn = new byte[8 * ListLength];
IntPtr pBuf = Marshal.AllocHGlobal(8 * ListLength);
try
{
    Console.WriteLine("Calling GetSerialNumList");
    rv = GetSerialNumList(ref pBuf, 8, ListLength);
    Marshal.Copy(pBuf, BufIn, 0, 8 * ListLength);
}
finally
{
    Marshal.FreeHGlobal(pBuf); // don't leak the unmanaged buffer
}
I feel this is somewhat long, but it gives me the desired result.

Marshaling C# struct for C++ dll function

I have a c++ dll function that takes as a parameter a pointer to this struct:
struct tLBCSHREP_PARAMS
{
BYTE Ps;
char* Shift;
char* Cashier;
char* CashRegNr;
};
where BYTE is an 8-bit integer.
I am calling this C++ function in C# code. I have created a C# equivalent for that C++ struct:
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Ansi)]
public struct tLBCSHREP_PARAMS
{
public byte Ps;
public IntPtr Shift;
public IntPtr Cashier;
public IntPtr CashRegNr;
};
I am creating an instance of this struct and then a pointer:
tLBCSHREP_PARAMS rapkas = new tLBCSHREP_PARAMS();
rapkas.Ps = (byte)ps;
rapkas.Shift = Marshal.StringToHGlobalAnsi(shift);
rapkas.Cashier = Marshal.StringToHGlobalAnsi(cashier);
rapkas.CashRegNr = Marshal.StringToHGlobalAnsi(cashregnr);
IntPtr ptrLBTSRLN = Marshal.AllocHGlobal(Marshal.SizeOf(rapkas));
Marshal.StructureToPtr(rapkas, ptrLBTSRLN, false);
After passing the pointer to the DLL function I get error 0xC0000001 (STATUS_UNSUCCESSFUL; note that an access violation would be 0xC0000005), so the struct is probably not being created in the right way.
I have tried many times adding different attributes to the struct, adding [MarshalAs(UnmanagedType.U1)] before public byte Ps;, and many more. Nothing worked ;(

IntPtr does not contain native value

I have a native method that has to deliver a byte array to a .NET wrapper. The native method looks like:
__declspec(dllexport) int WaitForData(unsigned char* pBuffer)
{
return GetData(pBuffer);
}
GetData allocates a memory region using malloc and copies some data (a byte stream) into it. This byte stream was received via a socket connection. The return value is the length of pBuffer.
This method has to be called from .NET. The import declaration looks as follows:
[DllImport("CommunicationProxy.dll")]
public static extern int WaitForData(IntPtr buffer);
[EDIT]
The P/Invoke Interop Assistant that dasblinkenlight advised translates the prototype to the following import signature:
public static extern int WaitForData(System.IntPtr pBuffer)
The result is the same: ptr is 0 after calling the method.
[/EDIT]
After the method was called, the result is extracted:
IntPtr ptr = new IntPtr();
int length = Wrapper.WaitForData(ref ptr);
byte[] buffer = new byte[length];
for (int i = 0; i < length; i++)
{
buffer[i] = System.Runtime.InteropServices.Marshal.ReadByte(ptr, i);
}
Wrapper.FreeMemory(ptr);
The problem is that the managed variable ptr doesn't contain the value that the native variable pBuffer contains. ptr is always 0 when Wrapper.WaitForData returns, although pBuffer pointed to an allocated memory area.
Is there a mistake in the prototype? How does a pointer to a byte array need to be marshalled?
You need to pass a reference to a pointer, or 'double pointer', like this:
__declspec(dllexport) int WaitForData(unsigned char** pBuffer)
and then change the value of the pointer (because the pointer itself is passed by value):
*pBuffer = something;
The other option is to return the pointer (then you'll have to handle the int/length some other way).
By the way, that's why your automatically generated prototype looks the way it does (it doesn't have out or ref modifiers).

How does one create structures for C# originally written in C++

I am working on an embedded ARM platform and wish to have control over some of the GPIO pins. I've experience in C and am a newcomer to C#. It seems that controlling low-level hardware is difficult from managed-code applications. I have Windows CE6.0 and .NET Compact Framework 2 running on my hardware.
I've found an example, written in C++ that would allow me access to GPIO port pins, however, I am struggling to implement the example in C#.
The following snippet shows how DeviceIoControl is used to control port pins:
const struct pio_desc hw_pio[] =
{
{"LED1", AT91C_PIN_PA(13), 0, PIO_DEFAULT, PIO_OUTPUT},
{"LED2", AT91C_PIN_PA(14), 0, PIO_DEFAULT, PIO_OUTPUT},
};
T_GPIOIOCTL_STATE * pSetState;
T_GPIOIOCTL_STATE * pGetState;
// Configure PIOs
bSuccessDevIOC = DeviceIoControl(hGPIO, IOCTL_GPIO_CONFIGURE, (LPBYTE*)hw_pio, sizeof(hw_pio), NULL, 0, NULL, NULL);
Other definitions:
struct pio_desc
{
const char *pin_name; /* Pin Name */
unsigned int pin_num; /* Pin number */
unsigned int dft_value; /* Default value for outputs */
unsigned char attribute;
enum pio_type type;
};
/* I/O type */
enum pio_type
{
PIO_PERIPH_A,
PIO_PERIPH_B,
PIO_INPUT,
PIO_OUTPUT,
PIO_UNDEFINED
};
/* I/O attributes */
#define PIO_DEFAULT (0 << 0)
#define PIO_PULLUP (1 << 0)
#define PIO_DEGLITCH (1 << 1)
#define PIO_OPENDRAIN (1 << 2)
The following is the C# definition of DeviceIOControl. The problem parameter is lpInBuffer.
[DllImport("coredll.dll", EntryPoint = "DeviceIoControl", SetLastError = true)]
internal static extern bool DeviceIoControlCE(int hDevice,
int dwIoControlCode,
byte[] lpInBuffer,
int nInBufferSize,
byte[] lpOutBuffer,
int nOutBufferSize,
ref int lpBytesReturned,
IntPtr lpOverlapped);
The questions:
How does one create these equivalent structures in C#?
How does one pass these as a byte array (byte[] lpInBuffer) to the DeviceIOControl function?
Any assistance appreciated!
Use some interop decoration. For instance, a structure like the following:
typedef struct
{
char Data[MAXCHARS]; // assuming #define MAXCHARS 15
int Values[MAXCHARS];
} StSomeData;
would look like the following in C#:
[StructLayout(LayoutKind.Sequential)]
private struct StSomeData
{
[System.Runtime.InteropServices.MarshalAs(System.Runtime.InteropServices.UnmanagedType.ByValTStr, SizeConst = 15)]
public string Data;
[System.Runtime.InteropServices.MarshalAs(System.Runtime.InteropServices.UnmanagedType.ByValArray, SizeConst = 15)]
public int[] Values;
}
And use it like: StSomeData[] array = new StSomeData[3];
Note that you can use IntPtr when dealing with pointers.
For instance your call to:
DeviceIoControl(hGPIO, IOCTL_GPIO_CONFIGURE, (LPBYTE*)hw_pio
, sizeof(hw_pio), NULL, 0, NULL, NULL);
may look something like the following:
IntPtr ipByte = Marshal.AllocHGlobal(Marshal.SizeOf(StLPByte));
Marshal.StructureToPtr(StLPByte, ipByte, false);
IntPtr ipConfig = Marshal.AllocHGlobal(Marshal.SizeOf(StIOCTL_GPIO_CONFIGURE));
Marshal.StructureToPtr(StIOCTL_GPIO_CONFIGURE, ipConfig, false);
IntPtr iphGPIO = Marshal.AllocHGlobal(Marshal.SizeOf(SthGPIO));
Marshal.StructureToPtr(SthGPIO, iphGPIO, false);
bool bSuccessDevIOC = DeviceIoControl(iphGPIO
    , ipConfig
    , ipByte
    , Marshal.SizeOf(typeof(StLPByte))
    , IntPtr.Zero
    , IntPtr.Zero
    , IntPtr.Zero
    , IntPtr.Zero);
Also, you can look into the unsafe keyword and try putting your code within unsafe blocks; this may be a dirty solution, since that code won't be managed code.
A byte[] is converted to LPBYTE automatically.
A char* in C# is equivalent to an unsigned short* in C++ (a C# char is 16 bits).
A byte* in C# is an unsigned char* in C++.
C# enums behave similarly enough to C++ enums that you can just write them over.
One important thing:
[StructLayout(LayoutKind.Sequential, Pack=1)]

calling win32 dll api from C# application

I have created a Win32 DLL which sends and receives SSL data packets from our server. I am calling a DLL function using the P/Invoke mechanism from my C# app, which does all the necessary tasks.
When I call the Connect(char* lpPostData) function with a static char postData[] array as the posting request, it works fine. If I use char* lpPostData sent as a parameter from my C# app for the posting request, it doesn't work. Is it something to do with the conversion of a C# string to char*? If so, how do I do it? And how do I debug the Win32 DLL?
Calling the exported function from C# app:
[DllImport("testdllwm6.dll", EntryPoint = "Connect")]
public static extern int pConnect(string postdata);
string postdata="<SyncML><SyncHdr><VerDTD>1.2</VerDTD><VerProto>SyncML/1.2</VerProto><SessionID>33622878</SessionID><MsgID>1</MsgID><Target><LocURI>http://sync.com</LocURI></Target><Source><LocURI>IMEI::358997011403172</LocURI><LocName>syncfdg</LocName></Source><Meta><MaxMsgSize
xmlns=\"syncml:metinf\">10000</MaxMsgSize></Meta></SyncHdr><SyncBody><Alert><CmdID>1</CmdID><Data>201</Data><Item><Target><LocURI>contacts</LocURI></Target><Source><LocURI>./contacts</LocURI></Source><Meta><Anchor
xmlns=\"syncml:metinf\"><Last>000000T000000Z</Last><Next>20091125T122400Z</Next></Anchor></Meta></Item></Alert><Final></Final></SyncBody></SyncML>";
int j = pConnect(postdata);
Declaration is:
__declspec(dllexport) int Connect(char* lpPostData);
The function is defined as:
__declspec(dllexport) int Connect(char* lpPostData) {
    LPCTSTR lpszAgent = _T("CeHttp");
    DWORD dwError;
    DWORD sizeInResult, sizeOutResult, sizeToWrite, sizeWritten, dwRead;
    HINTERNET hInternet = NULL;
    HINTERNET hConnect = NULL;
    HINTERNET hRequest = NULL;
    LPDWORD pSizeInResult = &sizeInResult;
    LPDWORD pSizeOutResult = &sizeOutResult;
    LPDWORD pSizeToWrite = &sizeToWrite;
    LPDWORD pSizeWritten = &sizeWritten;
    int read = 0;
char postData[637]
="<SyncML><SyncHdr><VerDTD>1.2</VerDTD><VerProto>SyncML/1.2</VerProto><SessionID>66622878</SessionID><MsgID>1</MsgID><Target><LocURI>http://sync.com</LocURI></Target><Source><LocURI>IMEI::358997011403172</LocURI><LocName>new123</LocName></Source><Meta><MaxMsgSize
xmlns=\"syncml:metinf\">10000</MaxMsgSize></Meta></SyncHdr><SyncBody><Alert><CmdID>1</CmdID><Data>201</Data><Item><Target><LocURI>contacts</LocURI></Target><Source><LocURI>./contacts</LocURI></Source><Meta><Anchor
xmlns=\"syncml:metinf\"><Last>000000T000000Z</Last><Next>20091125T122400Z</Next></Anchor></Meta></Item></Alert><Final></Final></SyncBody></SyncML>";
LPCWSTR lpszHeaders =_T("Content-Type: application/vnd.sync+xml");
BOOL bResult;
if(!HttpSendRequest(hRequest,lpszHeaders,wcslen(lpszHeaders),
lpPostData,strlen(lpPostData)))
{
dwError = GetLastError();
printf(" not HttpSendRequest");
return read;
}
return read;
The failure point is very obvious. Windows CE is Unicode. The string in C# is a wide-character array; the char[] in C is multibyte. You're mixing the two, and that is bad, bad, bad.
I mean you're mixing them in the same call, sending wide headers and multibyte postData to HttpSendRequest? That certainly can't be right.
Change the Connect function to look like this:
int Connect(TCHAR* lpPostData)
try it again, and come back with the results.
Of course this also means you need to change the strlen call as well.
As a side note, I don't understand why you would call into C++ for this call anyway. You could do it right from your C# app.
It seems the DLL is an MFC extension DLL; maybe it can only be called by an MFC application. I am not sure.
Is it something to do with the conversion of a C# string to char*?
The default CharSet used by the .NET interop marshaler is Ansi. If you want to use Unicode (LPCWSTR) parameters, you can try:
[DllImport("testdllwm6.dll", EntryPoint = "Connect", CharSet=CharSet.Unicode)]
public static extern int pConnect(string postdata);
BTW, you can refer to .Net 2.0 Interoperability Recipes: A Problem-Solution Approach
for more information.
You can add the MarshalAs attribute to the string parameter so the .NET runtime knows explicitly how to marshal it; the native function takes a char*, so you want the string marshaled as ANSI:
[DllImport("testdllwm6.dll", EntryPoint="Connect")]
public static extern int Connect([MarshalAs(UnmanagedType.LPStr)] string lpPostData);
Use System.Text.StringBuilder and pass that to your function.
