I am using Mono/C# on Linux and have the following C# code:
[DllImport("libaiousb")]
extern static ResultCode QueryDeviceInfo(uint deviceIndex,
ref uint PID, ref uint nameSize, StringBuilder name,
ref uint DIOBytes, ref uint counters);
It calls a Linux shared library function defined as follows:
unsigned long QueryDeviceInfo(
unsigned long DeviceIndex
, unsigned long *pPID
, unsigned long *pNameSize
, char *pName
, unsigned long *pDIOBytes
, unsigned long *pCounters
)
I set the parameters to known values before calling the Linux function. I also put a printf at the beginning of the Linux function, and all the parameters print the values I expect, so the parameters seem to be passed from C# to the library correctly. The return value is also good.
However, all of the parameters passed by reference come back as garbage.
I modified the Linux function so it simply modifies the values and returns. Here's that code:
unsigned long QueryDeviceInfo(
unsigned long DeviceIndex
, unsigned long *pPID
, unsigned long *pNameSize
, char *pName
, unsigned long *pDIOBytes
, unsigned long *pCounters
) {
printf ("PID = %d, DIOBYtes = %d, Counters = %d, Name= %s", *pPID, *pDIOBytes, *pCounters, pName);
*pPID = 9;
*pDIOBytes = 8;
*pCounters = 7;
*pNameSize = 6;
return AIOUSB_SUCCESS;
}
All the ref parameters still come back as garbage.
Any ideas?
libaiousb.c
unsigned long QueryDeviceInfo(
unsigned long deviceIndex
, unsigned long *pPID
, unsigned long *pNameSize
, char *pName
, unsigned long *pDIOBytes
, unsigned long *pCounters
)
{
*pPID = 9;
*pDIOBytes = 8;
*pCounters = 7;
*pNameSize = 6;
return 0;
}
libaiousb.so
gcc -shared -fPIC -o libaiousb.so libaiousb.c
Test.cs
using System;
using System.Runtime.InteropServices;
using System.Text;
class Test
{
[DllImport("libaiousb")]
static extern uint QueryDeviceInfo(uint deviceIndex,
ref uint pid, ref uint nameSize, StringBuilder name,
ref uint dioBytes, ref uint counters);
static void Main()
{
uint deviceIndex = 100;
uint pid = 101;
uint nameSize = 102;
StringBuilder name = new StringBuilder("Hello World");
uint dioBytes = 103;
uint counters = 104;
uint result = QueryDeviceInfo(deviceIndex,
ref pid, ref nameSize, name,
ref dioBytes, ref counters);
Console.WriteLine(deviceIndex);
Console.WriteLine(pid);
Console.WriteLine(nameSize);
Console.WriteLine(dioBytes);
Console.WriteLine(counters);
Console.WriteLine(result);
}
}
Test.exe
gmcs Test.cs
Run:
$ mono Test.exe
100
9
6
8
7
0
Somewhat unrelated, but something to keep in mind is that the sizes of C and C++ types are not set in stone. Specifically, sizeof(unsigned long) varies between 32 bits on 32-bit platforms (ILP32 systems) and 64 bits on 64-bit platforms (LP64 systems).
Then there's Win64, which is a P64 platform, so sizeof(unsigned long) == 4 (32 bits).
The short of it is that your P/Invoke signature:
[DllImport("libaiousb")]
static extern uint QueryDeviceInfo(uint deviceIndex,
ref uint pid, ref uint nameSize, StringBuilder name,
ref uint dioBytes, ref uint counters);
is broken -- it will only work correctly on 32-bit platforms (because C# uint is always 32 bits, while unsigned long will be 64 bits on LP64 platforms), and will FAIL (rather horribly) on 64-bit platforms.
There are three fixes:
IFF you will always be on Unixy platforms (e.g. ILP32 and LP64 platforms only, not P64 Win64), you can use UIntPtr for unsigned long. This will cause it to be 32 bits on ILP32 platforms and 64 bits on LP64 platforms -- the desired behavior. A sketch of this approach is shown after this list of fixes.
Alternatively, you can provide multiple sets of P/Invoke signatures in your C# code, and perform a runtime check to determine which ABI you're running on to determine which set of signatures to use. Your runtime check could use IntPtr.Size and Environment.OSVersion.Platform to see if you're on Windows (P64) or Unix (ILP32 when IntPtr.Size == 4, LP64 when IntPtr.Size == 8).
Otherwise, you need to provide an ABI-neutral C binding to P/Invoke to, which would export functions using e.g. uint64_t (C# ulong) instead of exporting unsigned long. This would allow you to use a single ABI from C# (64-bits everywhere), but requires that you provide a wrapping C library that sits between your C# code and the actual C library you care about. Mono.Posix.dll and MonoPosixHelper follow this route to bind ANSI C and POSIX functions.
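As a minimal sketch of the first option -- assuming the library is only ever used on ILP32/LP64 Unix platforms and keeping the question's QueryDeviceInfo signature -- the UIntPtr-based binding might look like this:
// Sketch of option 1: UIntPtr tracks the native word size on ILP32/LP64,
// so it lines up with `unsigned long` there (this does NOT hold on Win64).
[DllImport("libaiousb")]
static extern UIntPtr QueryDeviceInfo(UIntPtr deviceIndex,
    ref UIntPtr pid, ref UIntPtr nameSize, StringBuilder name,
    ref UIntPtr dioBytes, ref UIntPtr counters);
Callers then convert at the boundary after the call returns (for example uint pid32 = pid.ToUInt32();), since UIntPtr is 32 bits on ILP32 and 64 bits on LP64, matching unsigned long on those platforms.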
Related
I am trying to create a simple DLL in C++ and call it from C# using PInvoke. I want to pass in a byte array, do some manipulation on it, and send back another byte array. I figured I would pass in the frame and its size, then create an unmanaged unsigned char* inside the DLL. I would pass that back through an out IntPtr and return the size. Later I would free that memory with another function. Everything is working except that I cannot get the out IntPtr to work; I always just get 0 back. I created my C++ DLL in Qt. Here is the code I have so far.
#pragma once
#ifdef TOOLS_EXPORTS
#define TOOLS_API __declspec(dllexport)
#else
#define TOOLS_API __declspec(dllimport)
#endif
class Tools
{
public:
int TOOLS_API TestFunction(const unsigned char *inData, int inSize, unsigned char *outData);
};
#include "Tools.h"
int Tools::TestFunction(const unsigned char *inData, int inSize, unsigned char* outData)
{
outData = (unsigned char*)malloc(sizeof(unsigned char) * inSize);
memcpy(outData, inData, sizeof(unsigned char) * inSize);
return inSize;
}
[DllImport("Tools.dll", EntryPoint = "?TestFunction#Tools##QAEHPBEHPAE#Z", CallingConvention = CallingConvention.StdCall)]
public static extern int TestFunction(byte[] inData, int inSize, out IntPtr outData);
IntPtr outData;
int test = TestFunction(data, data.Length, out outData);
You have the final parameter as:
unsigned char* outData
This is a pointer, passed by value. Any modifications to the address are not seen by the caller, because the address is passed by value.
You need to return the address of the memory you allocate to the caller. So you need:
unsigned char** outData
Then in the implementation of the function in your C++ code you replace references to outData with *outData.
By definition sizeof(unsigned char) == 1 and it would be idiomatic to replace sizeof(unsigned char) * inSize with inSize.
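For reference, a sketch of how the C# side might then consume the result -- this assumes the C++ parameter has been changed to unsigned char** as described above, and that the DLL also exports a matching free function (called FreeBuffer here purely for illustration; the question only mentions such a function without naming it):
byte[] data = new byte[] { 1, 2, 3 };          // example input; stands in for the question's data array
IntPtr outData;
int outSize = TestFunction(data, data.Length, out outData);
byte[] managed = new byte[outSize];
Marshal.Copy(outData, managed, 0, outSize);    // copy the native buffer into a managed array
FreeBuffer(outData);                           // hypothetical exported function that frees the malloc'd buffer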
After searching, I heard that UInt32 was the C# equivalent of C++ DWORD.
I tested results by performing the arithmetic
*(DWORD*)(1 + 0x2C) //C++
(UInt32)(1 + 0x2C) //C#
They produce completely different results. Can someone please tell me the correct match for DWORD in C#?
Your example is using the DWORD as a pointer, which is most likely an invalid pointer. I'm assuming you meant DWORD by itself.
DWORD is defined as unsigned long, which ends up being a 32-bit unsigned integer.
uint (System.UInt32) should be a match.
#include <stdio.h>
// I'm on macOS right now, so I'm defining DWORD
// the way that Win32 defines it.
typedef unsigned long DWORD;
int main() {
DWORD d = (DWORD)(1 + 0x2C);
int i = (int)d;
printf("value: %d\n", i);
return 0;
}
Output: 45
public class Program
{
public static void Main()
{
uint d = (uint)(1 + 0x2C);
System.Console.WriteLine("Value: {0}", d);
}
}
Output: 45
DWORD definition from Microsoft:
typedef unsigned long DWORD, *PDWORD, *LPDWORD;
https://msdn.microsoft.com/en-us/library/cc230318.aspx
UINT32 definition from Microsoft:
typedef unsigned int UINT32;
https://msdn.microsoft.com/en-us/library/cc230386.aspx
Now you can see the difference: one is unsigned long and the other is unsigned int.
Your two snippets do completely different things. In your C++ code, you are, for some strange reason, converting the value (1 + 0x2C) (a strange way to write 45) to a DWORD*, and then dereferencing it, as if that address is actually a valid memory location. With the C#, you are simply converting between integer types.
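To make the contrast concrete, here is a rough C# sketch of both operations; the pointer read uses an address that is just as invalid as in the C++ snippet, so it is illustrative only and would fail at runtime:
// What the C++ snippet does: treat 0x2D as an address and read 32 bits from it.
// The address is not valid memory, so this crashes, just like the C++ version.
uint valueAtAddress = (uint)Marshal.ReadInt32((IntPtr)(1 + 0x2C));
// What the C# snippet does: simply convert the number 45 between integer types.
uint justTheNumber = (uint)(1 + 0x2C); // 45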
I have a C-DLL + header file and try to p/invoke a function from C#. I also have some example C++ code of how to use the function. Here is the function definition:
int GetData(unsigned char* buffer, long bufferSize);
The more interesting part is the example code and how the function can be called:
if(dataSize == 16)
{
unsigned short* usData = new unsigned short[m_numX * m_numY * 3 / 2];
GetData( (unsigned char*)usData, m_numX * m_numY * sizeof(unsigned short) );
}
else if (dataSize == 32)
{
unsigned long* ulData = new unsigned long[m_numX * m_numY];
GetData( (unsigned char*)ulData, m_numX * m_numY * sizeof(unsigned long) );
}
So, depending on the dataSize variable, the actual data array can be an array of unsigned short or of unsigned long. However, it is always passed as an unsigned char pointer.
For the sake of simplicity I just tried to get at least one of the variants to work. Here's the code I tried for dataSize == 16:
[DllImport("External.dll", EntryPoint = "GetData", CharSet = CharSet.Unicode, CallingConvention = CallingConvention.Cdecl)]
public static extern int GetData(ref ushort[] pBuffer, long lBufferSize);
long bufferSize1 = numX * numY * 3 / 2;
long bufferSize2 = numX * numY * sizeof(ushort);
ushort[] data = new ushort[bufferSize1];
GetData(ref data, bufferSize2)
If I run the above code, the application quits with an 'Access Violation' exception. That usually means that the unmanaged code tried to write past the buffer limits or that the p/invoke declaration has an error. I tried huge buffers (which would be able to hold any kind of data I'm expecting), so my guess is that my mistake is in the declaration.
I also tried declaring the buffer as byte[] (since the example casts it to unsigned char*) and as ulong[] in the p/invoke declaration, and the same for the actual buffer I pass by reference. The error remains the same.
How can I make this work?
A couple of mistakes:
The array must not be passed as ref. That is because ref ushort[] matches unsigned short**.
C++ long does not match C# long on Windows. The former is 32 bits, the latter 64 bits.
You need to import like this:
[DllImport("External.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern int GetData(
[Out] ushort[] pBuffer,
int lBufferSize
);
It would be perfectly reasonable, for convenience, to use an overload for the 32 bit data variant:
[DllImport("External.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern int GetData(
[Out] uint[] pBuffer,
int lBufferSize
);
Likewise, an overload for an array of byte would also be valid should you need it.
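As a rough usage sketch of the 16-bit variant with these imports (the numX and numY values and the size calculations mirror the question's example; nothing here is specific to the real External.dll API):
int numX = 640, numY = 480;                    // example dimensions; the real values come from the device
int elementCount = numX * numY * 3 / 2;        // number of ushort elements, as in the question
ushort[] data = new ushort[elementCount];
int byteCount = numX * numY * sizeof(ushort);  // the size argument is in bytes, matching the C++ example
int result = GetData(data, byteCount);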
I have a C/C++ DLL SDK with a function like this:
INT CmdGetAllLog( BYTE *bStream, UINT16 *nCount, const UINT8 nblk )
In my C# project I declared it like this:
[DllImport("C:\\PrBioApi.dll", EntryPoint = "CmdGetAllLog")]
private static extern bool CmdGetAllLog(IntPtr bStream, ref UInt16 nCount, byte nblk);
and I use it with:
int nMallocSize = Marshal.SizeOf(new LOG_RECORD()) * stuSystem.wLogCnt + 4096;
byte[] pRecord = new byte[nMallocSize];
IntPtr p = Marshal.AllocHGlobal(Marshal.SizeOf(nMallocSize));
Marshal.Copy(pRecord, 0, p, pRecord.Length);
bGetSucc = CmdGetAllLog(p, ref nGet, nBlk++);
Marshal.FreeHGlobal(p);
But it did not work.
Can anyone help me? Thanks.
Your code that copies between the managed array and the unmanaged pointer is in the wrong place. It would need to be after the call to the unmanaged function.
But you may as well let the p/invoke marshaller do the work for you:
[DllImport(@"C:\PrBioApi.dll")]
private static extern bool CmdGetAllLog(
byte[] bStream,
ref ushort nCount,
byte nblk
);
int nMallocSize = ...;
byte[] pRecord = new byte[nMallocSize];
bool bGetSucc = CmdGetAllLog(pRecord, ref nGet, nBlk++);
Because a byte array is blittable, the marshaller will just pin your array during the call and hand it off to the native code.
I'm assuming that the other two parameters are passed correctly. Since you did not specify any more details of the interface, they could well be wrong. I'd guess that nGet is used to tell the function how big the buffer is, and to return how much was copied to it by the function. I cannot see where you specify nGet in the question. I'm trusting that you got that bit right.
Some other comments:
You may need to specify a calling convention in the DllImport attribute. Is the native code cdecl perhaps? (A sketch is shown after these comments.)
The return value is INT in the native code but you've mapped it to bool. That probably is fine if the protocol is that non-zero return means success. But if the return value indicates more than that then you'd clearly need to use int. Personally I'd be inclined to use int and stay true to the native.
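For example, if the native function does turn out to be cdecl and you prefer to stay true to the native INT return, the import might look like this (a sketch only -- check the SDK's headers for the real calling convention):
// Sketch: assumes the SDK exports CmdGetAllLog with the cdecl convention
// and keeps the native INT return instead of mapping it to bool.
[DllImport(@"C:\PrBioApi.dll", CallingConvention = CallingConvention.Cdecl)]
private static extern int CmdGetAllLog(
    byte[] bStream,
    ref ushort nCount,
    byte nblk
);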
I want to use a function in a C++ DLL in my C# application using DLLImport:
BOOL IsEmpty( DWORD KeyID, BOOL *pFlag )
I tried many combinations but in vain, like:
public extern static bool IsEmpty(int KeyID, ref bool pFlag);
The method returns false (that means an error).
Any idea how to do that?
To quote "Willy" (with amendments):
Beware the booleans!
Win32 defines different versions of booleans.
1) BOOL, used by most Win32 APIs, is a signed int (4 bytes)
2) BOOLEAN is a single byte, only used by a few Win32 APIs!
3) C/C++ has its built-in 'bool', which is a single byte
...and to add what @tenfour pointed out:
4) the even more bizarre VARIANT_BOOL
typedef short VARIANT_BOOL;
#define VARIANT_TRUE ((VARIANT_BOOL)-1)
#define VARIANT_FALSE ((VARIANT_BOOL)0)
The signed or unsigned nature shouldn't matter for BOOL, as the only "false" pattern is 0. So try treating it as a 4-byte quantity... however you interface with a DWORD may be satisfactory (I've not dealt with Windows 64-bit conventions).
BOOL in Win32 is a typedef for int, so you should just change bool to Int32; the definition becomes int IsEmpty(uint KeyID, ref int pFlag).
Because in C++ BOOL is defined as int, you should use:
[return: System.Runtime.InteropServices.MarshalAsAttribute(System.Runtime.InteropServices.UnmanagedType.Bool)]
public static extern bool IsEmpty(uint KeyID, ref int pFlag);
Thank you for your help!
Finally, this works for me:
public extern static int IsEmpty(
int KeyID,
int[] pFlag);
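A minimal usage sketch of that final declaration (the DllImport line and the "MyNative.dll" name are placeholders, since the question never names the DLL; the single-element int[] plays the role of the native BOOL* argument):
// "MyNative.dll" is a placeholder; substitute the real library name.
[DllImport("MyNative.dll")]
public extern static int IsEmpty(int KeyID, int[] pFlag);

int[] flag = new int[1];
int result = IsEmpty(42, flag);      // the KeyID value 42 is arbitrary here
bool isEmpty = flag[0] != 0;         // native BOOL: 0 = FALSE, anything else = TRUE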