I need to send data from C++ to C#.
On the C++ side, under Linux, I am using the ZMQ library, version 4.1.4.
The C# side uses the clrzmq4 library, based on version 4.1.5.
Here is the part where I send the message in C++:
char tempStr[] = "ABCD";
zmq_msg_t zmsg;
zmq_msg_init_size(&zmsg, 4);
memcpy(zmq_msg_data(&zmsg), tempStr, 4);
int rc = zmq_send(reqSocket, &zmsg, sizeof(zmsg), 0);
zmq_msg_close(&zmsg);
C# code to retrieve message:
ZFrame request = responder.ReceiveFrame();
byte[] reqBytes = new byte[100];
int n = request.Read(reqBytes, 0, 100);
The problem is that the byte array includes all 64 bytes of the zmq_msg_t structure; the actual data starts at offset 16.
Question: how do I properly extract the data in this case? Hard-coding the offset is simply ugly, because one day zmq_msg_t may change on the sender side and the data will end up somewhere else. Another option is to avoid zmq_msg_t altogether when the two sides are not on the same platform/framework. In the clrzmq4 framework I can see there are delegates for the zmq_msg_t types, but I am not sure how to use them, or whether they are intended for public use.
You're mixing the send types on the C++ side. If you are using zmq_msg_t, you don't need the variant with a size; that one is for sending a plain buffer.
If you are using the buffer function
int zmq_send (void *socket, void *buf, size_t len, int flags)
then you should call:
zmq_send(reqSocket, tempStr, 4, 0);
However, if you are using the zmq_msg_t variant
int zmq_msg_send (zmq_msg_t *msg, void *socket, int flags)
then you should call:
zmq_msg_send (&zmsg, reqSocket, 0);
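Putting the two fixes together, a minimal corrected sketch of the sender from the question (reqSocket as before):

char tempStr[] = "ABCD";

zmq_msg_t zmsg;
zmq_msg_init_size(&zmsg, 4);               // allocate a 4-byte message
memcpy(zmq_msg_data(&zmsg), tempStr, 4);   // copy the payload in

// send the message itself, not the bytes of the zmq_msg_t struct
int rc = zmq_msg_send(&zmsg, reqSocket, 0);
if (rc == -1)
    zmq_msg_close(&zmsg);  // on failure the message must still be released

The C# side then receives a frame containing exactly the 4 payload bytes, so no offset gymnastics are needed.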
I am trying to prepare a simple GUI-based AES-CMAC calculator. For this I have decided to create a C DLL out of the OpenSSL libraries. [I don't want to use .NET for calculating the AES-CMAC.] I have tested this DLL with a test application written in C++ (console), and the values generated match the test vectors. But when I try to call this function from C#, I get wrong values. Here I am using byte[] instead of unsigned char*.
My code snippet for the C function is:
double calc_AES_CMAC(unsigned char* message, unsigned char* key, unsigned char* cmac_16)
{
    size_t mactlen;

    CMAC_CTX *ctx = CMAC_CTX_new();
    CMAC_Init(ctx, key, 16, EVP_aes_128_cbc(), NULL);

    CMAC_Update(ctx, message, sizeof(message));
    CMAC_Final(ctx, cmac_16, &mactlen);

    CMAC_CTX_free(ctx);
    return 0;
}
And my calling C# code is as follows.
First, the function import:
[DllImport("C:\\Users\\Sudhanwa\\Documents\\Visual Studio 2010\\Projects\\Ccsharpdll\\Debug\\Ccsharpdll.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern double calc_AES_CMAC(byte[] message, byte[] key, byte[] output);
Second, the button click event:
byte [] null_arr = new byte[16];
// K: 2b7e1516 28aed2a6 abf71588 09cf4f3c
byte[] key = { 0x2b,0x7e,0x15,0x16,
0x28,0xae,0xd2,0xa6,
0xab,0xf7,0x15,0x88,
0x09,0xcf,0x4f,0x3c };
// M: 6bc1bee2 2e409f96 e93d7e11 7393172a Mlen: 128
byte[] message= { 0x6b,0xc1,0xbe,0xe2,
0x2e,0x40,0x9f,0x96,
0xe9,0x3d,0x7e,0x11,
0x73,0x93,0x17,0x2a };
byte[] cmac = new byte[16];
double c = calc_AES_CMAC(message, key, cmac);
string ans = ByteArrayToString(cmac);
MessageBox.Show(ans);
With this code I get a 16-byte hex output, but it does not match the correct result.
You need to indicate to the marshaller that you expect data to be returned (and how much data) in the output parameter:
public static extern double calc_AES_CMAC(byte[] message, byte[] key,
[In, Out, MarshalAs(UnmanagedType.LPArray, SizeConst=16)] byte[] output);
Otherwise a copy of the current content of the array will be passed to the C++ function but any modifications will not be copied back to the C# caller.
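Separately from the marshalling, note that sizeof(message) in the native code evaluates to the size of a pointer (4 or 8 bytes), not the length of the message buffer, so only the first few bytes get authenticated. A sketch of a corrected native function that takes the length explicitly (the extra message_len parameter is an assumption; the C# declaration would gain a matching parameter):

double calc_AES_CMAC(unsigned char* message, size_t message_len,
                     unsigned char* key, unsigned char* cmac_16)
{
    size_t mactlen;

    CMAC_CTX *ctx = CMAC_CTX_new();
    CMAC_Init(ctx, key, 16, EVP_aes_128_cbc(), NULL);

    // hash the actual message length, not sizeof(unsigned char*)
    CMAC_Update(ctx, message, message_len);
    CMAC_Final(ctx, cmac_16, &mactlen);

    CMAC_CTX_free(ctx);
    return 0;
}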
I have some C++/CLI code that creates a simple CImg image and draws a circle on it. I want to pass it to C#, but I'm not sure how. I thought of using a byte array, but I can't get the length of the array, which is needed for any conversion from byte* to byte[] or for passing into an unmanaged memory stream. I've tried using strlen, but that just returns 0.
Here is my C++ code:
unsigned char* calculateFrame::ReadImage() {
    CImg<unsigned char> testImage(1920, 1080, 1, 3, 0);
    const unsigned char white[3] = { 255, 255, 255 };

    testImage.draw_circle(256, 256, 200, white, 1.0f, ~0U);
    testImage.draw_point(500, 500, white, 255);

    unsigned char* charArray = (unsigned char*)testImage;
    return charArray;
}
C# code:
Bitmap testBmp;

using (var test = new FrameCalculator.calculateFrame())
{
    Console.WriteLine(test.calculateOneFrame(3));
    unsafe
    {
        byte* imageArray = test.ReadImage();
        using (var ms = new UnmanagedMemoryStream(imageArray, /* length of byte* (unknown) */))
        {
            testBmp = new Bitmap(ms);
        }
    }
}
If you have any tricks to get around unsafe code without sacrificing performance, that would be nice, but I'm not opposed to using unsafe code if it's necessary.
I ended up deciding that in the future I would need a frame buffer, which meant writing the frames to disk so that they weren't lost in a restart.
Anyway, my solution was to write the image to disk as a .BMP and access it using Image.FromFile in C#. This isn't a great approach in most cases, because it adds a lot of overhead, but it made sense in my program.
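For anyone who does want to hand the raw buffer across: strlen returns 0 because the pixel data begins with zero bytes; a CImg buffer has no terminator, so the length has to come from the image dimensions (width * height * depth * spectrum). A sketch of a variant of ReadImage that also reports the length (the out parameter is an assumption; note too that the Bitmap(Stream) constructor expects an encoded image such as BMP or PNG, not raw pixels, which is another reason the write-to-disk route works):

unsigned char* calculateFrame::ReadImage(size_t* lengthOut) {
    CImg<unsigned char> testImage(1920, 1080, 1, 3, 0);
    const unsigned char white[3] = { 255, 255, 255 };
    testImage.draw_circle(256, 256, 200, white, 1.0f, ~0U);

    // size() is width * height * depth * spectrum, i.e. the byte count
    // for unsigned char pixels: 1920 * 1080 * 1 * 3 here
    *lengthOut = (size_t)testImage.size();

    // caution: testImage owns this buffer and frees it when it goes out
    // of scope, so a real implementation must copy the data first
    unsigned char* charArray = (unsigned char*)testImage;
    return charArray;
}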
I am using DllImport to receive data from an external DLL. I receive the data using the following structure in C#:
public struct EventBuffer
{
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = CSTA_MAX_HEAP)]
    public byte[] data;
};
So far I have been able to work with everything I have received from this DLL. However, one of the structures I receive has a pointer inside (C++):
typedef struct ConnectionList_t {
    _Int count;
    //Connection_t FAR *connection;
    Connection_t FAR * POINTER_32 connection;
} ConnectionList_t;
When I parse the byte[] in C#, I get an address instead of the array itself. So I have tried to access that memory address with the code below, but the result is not what I expected. I know the data is valid, as my C++ test program receives the right values.
Int32 pointerToAddress = BitConverter.ToInt32(buffer, 4);
IntPtr intPtr = new IntPtr(pointerToAddress);
byte[] luckyYou = new byte[2048];
Marshal.Copy(intPtr, luckyYou, 0, lengthOfMarshalledStructure);
Am I missing something to access a memory address received from C++?
I am fairly new to using P/Invoke calls and am wondering if someone can guide me on how to retrieve the raw pixel data (unsigned char*) from an HBITMAP.
This is my scenario:
I am loading a .NET Bitmap object on the C# side and sending its IntPtr to my unmanaged C++ method. Once I receive the HBITMAP pointer on the C++ side, I would like to access the Bitmap's pixel data. I already made a method that accepts an unsigned char* representing the raw pixel data from C#, but I found that extracting the byte[] on the C# side is fairly slow. This is why I want to send in the Bitmap pointer instead of converting the Bitmap into a byte[] and sending that to my C++ method.
C# code for getting the Bitmap IntPtr:
Bitmap srcBitmap = new Bitmap(m_testImage);
IntPtr hbitmap = srcBitmap.GetHbitmap();
C# code for importing the C++ method:
[SuppressUnmanagedCodeSecurityAttribute()]
[DllImport("MyDll.dll", CharSet = CharSet.Unicode, CallingConvention = CallingConvention.Cdecl)]
public static extern int ResizeImage(IntPtr srcImg);
C++ method that will receive the HBITMAP handle:
int Resize::ResizeImage(unsigned char* srcImg) {
    //access srcImg's raw pixel data (preferably in unsigned char* format)
    //do work with that
    return status;
}
Questions:
1) Since I am sending in an IntPtr, can my C++ method parameter be an unsigned char*?
2) If not, how can I access the bitmap's raw data from C++?
The GetHbitmap method does not retrieve pixel data. It yields a GDI bitmap handle, of type HBITMAP. Your unmanaged code would receive that as a parameter of type HBITMAP. You can obtain the pixel data from that using GDI calls. But it is not, in itself, the raw pixels.
In fact, I'm pretty sure you are attacking this problem the wrong way. You are probably heading this way because GetPixel and SetPixel are slow. That is quite true; indeed, their GDI equivalents are too. What you need to do is use LockBits. This will allow you to operate on the entire pixel data in C# in an efficient way. A good description of the subject can be found here: https://web.archive.org/web/20141229164101/http://bobpowell.net/lockingbits.aspx. Note that, for efficiency, this is one type of C# code where unsafe code and pointers are often the best solution.
If, for whatever reason, you still wish to operate on the pixel data using C++ code, then you can still use LockBits as the simplest way to get a pointer to the pixel data. It's certainly much easier than the unmanaged GDI equivalents.
First, an HBITMAP shouldn't be an unsigned char*. If you are passing an HBITMAP to C++, then the parameter should be an HBITMAP:
int Resize::ResizeImage(HBITMAP hBmp)
Next, to convert from an HBITMAP to pixels:
std::vector<unsigned char> ToPixels(HBITMAP BitmapHandle, int &width, int &height)
{
    BITMAP Bmp = {0};
    BITMAPINFO Info = {0};
    std::vector<unsigned char> Pixels = std::vector<unsigned char>();

    HDC DC = CreateCompatibleDC(NULL);
    std::memset(&Info, 0, sizeof(BITMAPINFO)); //not necessary really..
    HBITMAP OldBitmap = (HBITMAP)SelectObject(DC, BitmapHandle);
    GetObject(BitmapHandle, sizeof(Bmp), &Bmp);

    Info.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
    Info.bmiHeader.biWidth = width = Bmp.bmWidth;
    Info.bmiHeader.biHeight = height = Bmp.bmHeight;
    Info.bmiHeader.biPlanes = 1;
    Info.bmiHeader.biBitCount = Bmp.bmBitsPixel;
    Info.bmiHeader.biCompression = BI_RGB;
    Info.bmiHeader.biSizeImage = ((width * Bmp.bmBitsPixel + 31) / 32) * 4 * height;

    Pixels.resize(Info.bmiHeader.biSizeImage);
    GetDIBits(DC, BitmapHandle, 0, height, &Pixels[0], &Info, DIB_RGB_COLORS);

    SelectObject(DC, OldBitmap);
    height = std::abs(height);
    DeleteDC(DC);
    return Pixels;
}
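A minimal usage sketch inside the resize entry point (the actual resizing work is left out):

int Resize::ResizeImage(HBITMAP hBmp)
{
    int width = 0, height = 0;

    // pull the DIB pixel data out of the GDI bitmap handle
    std::vector<unsigned char> pixels = ToPixels(hBmp, width, height);

    // each row in pixels is padded to a 4-byte boundary;
    // the resize logic would operate on this buffer
    return 0; // status
}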
Apparently sending in the pointer from Scan0 is equivalent to what I was searching for. I am able to manipulate the data as expected by sending in the IntPtr retrieved from the BitmapData.Scan0 property.
Bitmap srcBitmap = new Bitmap(m_testImage);
Rectangle rect = new Rectangle(0, 0, srcBitmap.Width, srcBitmap.Height);
BitmapData bmpData = srcBitmap.LockBits(rect, ImageLockMode.ReadWrite, srcBitmap.PixelFormat);
//Get ptr to pixel data of image
IntPtr ptr = bmpData.Scan0;
//Call c++ method
int status = myDll.ResizeImage(ptr);
srcBitmap.UnlockBits(bmpData);
To further help clarify: the only code I changed from my initial post was the first block; all the rest remained the same. (The C++ method still accepts an unsigned char* as a param.)
The question title is basically what I'd like to ask:
[MarshalAs(UnmanagedType.LPStr)] - how does this convert utf-8 strings to char* ?
I use the above line when I attempt to communicate between C# and C++ DLLs;
more specifically, between:
somefunction(char *string) [C++ DLL]
somefunction([MarshalAs(UnmanagedType.LPStr)] string text) [C#]
When I send my UTF-8 text (scintilla.Text) through C# and into my C++ DLL, my VS 2010 debugger shows that:
the C# string was successfully converted to char*
the resulting char* properly reflects the corresponding UTF-8 characters (including the bit in Korean) in the watch window.
As the debugger screenshot showed, initialScriptText[0] returns the single byte (char) 'B', and the contents of char* initialScriptText are displayed properly (including the Korean) in the VS watch window.
Going through the char pointer, it seems that English is saved as one byte per character, while Korean is saved as two bytes per character (the Korean word in the screenshot is 3 letters, hence saved in 6 bytes).
This seems to show that each 'letter' isn't saved in an equal-sized container, but varies depending on the language (a possible hint at the type?).
I'm trying to achieve the same result in pure C++: reading in UTF-8 files and saving the result as char*.
Here's an example of my attempt to read a UTF-8 file and convert it to char* in C++ (code omitted). My observations:
there is a visual loss when converting from wchar_t* to char*
since the result, s8, displays the string properly, I know I've successfully converted the UTF-8 file content from wchar_t* to char*
since 'result' retains the bytes taken directly from the file, but I'm getting a different result from what I had through C# (with the same file), I've concluded that the C# marshal puts the file contents through some other procedure to further mutate the text into char*
(the screenshot also shows my terrible failure at using wcstombs)
note: I'm using the utf8 header from (http://utfcpp.sourceforge.net/)
Please correct me on any mistakes in my code/observations.
I'd like to be able to mimic the result I'm getting through the C# marshal, and after going through all this I've realised I'm completely stuck. Any ideas?
[MarshalAs(UnmanagedType.LPStr)] - how does this convert utf-8 strings to char* ?
It doesn't. There is no such thing as a "utf-8 string" in managed code; strings are always encoded in UTF-16. The marshaling from and to an LPStr is done with the default system code page, which makes it fairly remarkable that you see Korean glyphs in the debugger, unless you use code page 949.
If interop with UTF-8 is a hard requirement, then you need to use a byte[] in the P/Invoke declaration and convert back and forth yourself with System.Text.Encoding.UTF8. Use its GetString() method to convert the byte[] to a string, and its GetBytes() method to convert a string to a byte[]. Avoid all this if possible by using wchar_t[] in the native code.
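For the wchar_t route, a minimal sketch of what the native export might look like (the name and body are placeholders); the C# side would then declare the function with CharSet.Unicode and pass strings directly:

#include <wchar.h>

// Takes UTF-16 text, which matches C#'s internal string encoding,
// so no code-page conversion happens during marshaling.
extern "C" __declspec(dllexport) void somefunction_w(const wchar_t* text)
{
    size_t len = wcslen(text);  // counts UTF-16 code units, not characters
    // ... work with the text ...
    (void)len;
}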
While the other answers are correct, there has been a major development in .NET 4.7: there is now an option that does exactly what UTF-8 needs, UnmanagedType.LPUTF8Str. I tried it and it works like a Swiss chronometer, doing exactly what it sounds like.
In fact, I even used MarshalAs(UnmanagedType.LPUTF8Str) on one parameter and MarshalAs(UnmanagedType.LPStr) on another. That also works. Here is my method (it takes string parameters and returns a string via a parameter):
[DllImport("mylib.dll", ExactSpelling = true, CallingConvention = CallingConvention.StdCall)]
public static extern void ProcessContent([MarshalAs(UnmanagedType.LPUTF8Str)]string content,
[MarshalAs(UnmanagedType.LPUTF8Str), Out]StringBuilder outputBuffer,[MarshalAs(UnmanagedType.LPStr)]string settings);
Thanks, Microsoft! Another nuisance is gone.
ICustomMarshaler can be used if you are on a .NET Framework version earlier than 4.7.
class UTF8StringCodec : ICustomMarshaler
{
    public static ICustomMarshaler GetInstance(string cookie) => new UTF8StringCodec();

    public void CleanUpManagedData(object ManagedObj)
    {
        // nop
    }

    public void CleanUpNativeData(IntPtr pNativeData)
    {
        Marshal.FreeCoTaskMem(pNativeData);
    }

    public int GetNativeDataSize()
    {
        throw new NotImplementedException();
    }

    public IntPtr MarshalManagedToNative(object ManagedObj)
    {
        var text = $"{ManagedObj}";
        var bytes = Encoding.UTF8.GetBytes(text);
        var ptr = Marshal.AllocCoTaskMem(bytes.Length + 1);
        Marshal.Copy(bytes, 0, ptr, bytes.Length);
        Marshal.WriteByte(ptr, bytes.Length, 0);
        return ptr;
    }

    public object MarshalNativeToManaged(IntPtr pNativeData)
    {
        if (pNativeData == IntPtr.Zero)
        {
            return null;
        }

        var bytes = new MemoryStream();
        var ofs = 0;
        while (true)
        {
            var byt = Marshal.ReadByte(pNativeData, ofs);
            if (byt == 0)
            {
                break;
            }
            bytes.WriteByte(byt);
            ofs++;
        }
        return Encoding.UTF8.GetString(bytes.ToArray());
    }
}
P/Invoke declaration:
[DllImport("native.dll", CallingConvention = CallingConvention.Cdecl)]
private extern static int NativeFunc(
[MarshalAs(UnmanagedType.CustomMarshaler, MarshalTypeRef = typeof(UTF8StringCodec))] string path
);
Usage inside a callback:
[StructLayout(LayoutKind.Sequential)]
struct Options
{
    [MarshalAs(UnmanagedType.FunctionPtr)]
    public CallbackFunc callback;
}

[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int CallbackFunc(
    [MarshalAs(UnmanagedType.CustomMarshaler, MarshalTypeRef = typeof(UTF8StringCodec))] string path
);
If you need to marshal a UTF-8 string, do it manually.
Define the function with IntPtr instead of string:
somefunction(IntPtr text)
Then convert the text to a zero-terminated UTF-8 array of bytes and write it to the IntPtr:
byte[] retArray = Encoding.UTF8.GetBytes(text);
byte[] retArrayZ = new byte[retArray.Length + 1];  // trailing byte stays 0 as the terminator
Array.Copy(retArray, retArrayZ, retArray.Length);

IntPtr retPtr = Marshal.AllocHGlobal(retArrayZ.Length);
Marshal.Copy(retArrayZ, 0, retPtr, retArrayZ.Length);
somefunction(retPtr);
Marshal.FreeHGlobal(retPtr);  // release the unmanaged buffer after the call
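On the native side this arrives as an ordinary NUL-terminated, UTF-8-encoded char*. A minimal sketch of the receiving function (the body is an assumption):

#include <cstring>
#include <cstdio>

// Receives the zero-terminated UTF-8 bytes written by the C# caller.
extern "C" void somefunction(const char* text)
{
    // the bytes are UTF-8; any decoding beyond the byte length is up to the native code
    std::printf("received %zu bytes\n", std::strlen(text));
}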