I am using DllImport to receive data from an external DLL. I receive the data using the following structure in C#:
public struct EventBuffer
{
[MarshalAs(UnmanagedType.ByValArray, SizeConst = CSTA_MAX_HEAP)]
public byte[] data;
};
So far I have been able to work with everything I have received from this DLL. However, one of the structures I receive has a pointer inside it (C++):
typedef struct ConnectionList_t {
_Int count;
//Connection_t FAR *connection;
Connection_t FAR * POINTER_32 connection;
} ConnectionList_t;
When I parse the byte[] in C#, I get an address instead of the array itself, so I have tried to access that memory address with the code below. But the result is not what I expected; I know the data is valid, as my C++ test program receives the right values:
Int32 pointerToAddress = BitConverter.ToInt32(buffer, 4);
IntPtr intPtr = new IntPtr(pointerToAddress);
byte[] luckyYou = new byte[2048];
Marshal.Copy(intPtr, luckyYou, 0, lengthOfMarshalledStructure);
Am I missing something to access a memory address received from C++?
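For reference, a minimal sketch of one way to walk the embedded pointer, assuming a 32-bit process, that count occupies the first four bytes of the marshalled struct and the POINTER_32 value the next four; connectionSize is a hypothetical stand-in for the size of one marshalled Connection_t:
int count = BitConverter.ToInt32(buffer, 0);   // _Int count
int address = BitConverter.ToInt32(buffer, 4); // Connection_t FAR * POINTER_32 connection
if (address != 0 && count > 0)
{
    // Copy count * connectionSize bytes from the native address into managed memory
    int connectionSize = 8; // hypothetical; use the real Marshal.SizeOf of Connection_t
    byte[] connections = new byte[count * connectionSize];
    Marshal.Copy(new IntPtr(address), connections, 0, connections.Length);
}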
I am trying to prepare a simple GUI-based AES-CMAC calculator. For this I have decided to create a C DLL out of the OpenSSL libraries. (I don't want to use .NET for calculating the AES-CMAC.) I have tested this DLL with a test application written in C++ (console), and the values generated match the test vectors. But when I try to call this function from C#, I get wrong values. Here I am using byte[] instead of unsigned char*.
My code snippet for the C function is:
double calc_AES_CMAC(unsigned char* message ,unsigned char* key,unsigned char* cmac_16)
{
size_t mactlen;
CMAC_CTX *ctx = CMAC_CTX_new();
CMAC_Init(ctx, key, 16, EVP_aes_128_cbc(), NULL);
CMAC_Update(ctx, message, sizeof(message));
CMAC_Final(ctx, cmac_16, &mactlen);
CMAC_CTX_free(ctx);
return 0;
}
And my calling C# code is as follows. First, the function import:
[DllImport("C:\\Users\\Sudhanwa\\Documents\\Visual Studio 2010\\Projects\\Ccsharpdll\\Debug\\Ccsharpdll.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern double calc_AES_CMAC(byte[] message, byte[] key, byte[] output);
Second, the button click event:
byte[] null_arr = new byte[16];
// K: 2b7e1516 28aed2a6 abf71588 09cf4f3c
byte[] key = { 0x2b,0x7e,0x15,0x16,
0x28,0xae,0xd2,0xa6,
0xab,0xf7,0x15,0x88,
0x09,0xcf,0x4f,0x3c };
// M: 6bc1bee2 2e409f96 e93d7e11 7393172a Mlen: 128
byte[] message= { 0x6b,0xc1,0xbe,0xe2,
0x2e,0x40,0x9f,0x96,
0xe9,0x3d,0x7e,0x11,
0x73,0x93,0x17,0x2a };
byte[] cmac = new byte[16];
double c = calc_AES_CMAC(message, key, cmac);
string ans = ByteArrayToString(cmac);
MessageBox.Show(ans);
In this code, I get a 16-byte hex output, but it does not match the correct result.
You need to indicate to the marshaller that you expect that data is returned (and how much data) in the output parameter:
public static extern double calc_AES_CMAC(byte[] message, byte[] key,
[In, Out, MarshalAs(UnmanagedType.LPArray, SizeConst=16)] byte[] output);
Otherwise a copy of the current content of the array will be passed to the C++ function but any modifications will not be copied back to the C# caller.
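Put together, the full import would look roughly like this (the DLL name is shortened here for readability; the calling convention is taken from the question):
[DllImport("Ccsharpdll.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern double calc_AES_CMAC(
    byte[] message, byte[] key,
    [In, Out, MarshalAs(UnmanagedType.LPArray, SizeConst = 16)] byte[] output);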
I have some C++/CLI code that creates a simple CImg image and draws a circle on it. I want to pass it to C#, but I'm not sure how. I thought of using a byte array to pass it to C#, but I can't get the length of the array, which is needed for any conversion from byte* to byte[] or for passing into the unmanaged memory stream. I've tried using strlen, but that just returns 0.
Here is my C++ code:
unsigned char* calculateFrame::ReadImage() {
CImg<unsigned char> testImage(1920, 1080, 1, 3, 0);
const unsigned char white[3] = { 255,255,255 };
testImage.draw_circle(256, 256, 200, white, 1.0f, ~0U);
testImage.draw_point(500, 500, white, 255);
unsigned char* charArray = (unsigned char*)testImage;
return charArray;
}
C# code:
Bitmap testBmp;
using(var test = new FrameCalculator.calculateFrame())
{
Console.WriteLine(test.calculateOneFrame(3));
unsafe
{
byte* imageArray = test.ReadImage();
using(var ms = new UnmanagedMemoryStream(imageArray , /* length of byte* (unknown) */))
{
testBmp = new Bitmap(ms);
}
}
}
If you have any tricks to get around unsafe code without sacrificing performance, that would be nice, but I'm not opposed to using unsafe code if it's necessary.
I ended up deciding that in the future, I would need a frame buffer, which necessitated writing the frames to the disk, so that they weren't lost in a restart.
Anyways, my solution was to write the image to disk as a .BMP and access it using Image.FromFile in C#. This isn't a great way to do this in most cases, because it adds a lot of overhead, but it made sense in my program.
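For anyone who would rather keep it in memory: the length doesn't need to be discovered at all, because a CImg<unsigned char> buffer holds exactly width * height * depth * spectrum bytes. A minimal sketch under that assumption, using the question's 1920x1080, 3-plane image and assuming the pointer returned by ReadImage stays valid for the duration of the copy:
const int width = 1920, height = 1080, planes = 3;
int length = width * height * planes; // depth is 1 here, so w * h * spectrum bytes
byte[] managed = new byte[length];
unsafe
{
    byte* imageArray = test.ReadImage();
    // Copy the native buffer into a managed array of the computed size
    Marshal.Copy((IntPtr)imageArray, managed, 0, length);
}
Note that CImg stores its colour planes separately (all R, then all G, then all B), so the bytes would still need interleaving before they could back a Bitmap.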
I need to send data from C++ to C#.
On C++ side, under Linux, I am using ZMQ library version 4.1.4.
C# side is using clrzmq4 library based on 4.1.5 version.
So the part where I send message in C++:
char tempStr[] = "ABCD";
zmq_msg_t zmsg;
zmq_msg_init_size(&zmsg, 4);
memcpy(zmq_msg_data(&zmsg), tempStr, 4);
int rc = zmq_send(reqSocket, &zmsg, sizeof(zmsg), 0);
zmq_msg_close(&zmsg);
C# code to retrieve message:
ZFrame request = responder.ReceiveFrame();
byte[] reqBytes = new byte[100];
int n = request.Read(reqBytes, 0, 100);
The problem is that the byte array includes all 64 bytes of zmq_msg_t; the actual data starts at offset 16.
Question: how do I properly extract the data in this case? Extracting by hard-coding the offset in my code is simply ugly, because one day zmq_msg_t may change on the sender side and the data will be located somewhere else. The other option is to avoid zmq_msg_t altogether when the two sides are not using the same platform/framework. In the clrzmq4 framework I can see there are delegates for zmq_msg_t types, but I am not sure how to use them, or whether they are intended for public use.
You're mixing the send types on the C++ side: if you're using zmq_msg_t you don't need the variant that takes a size; that one is for sending a raw buffer.
If using the buffer function
int zmq_send (void *socket, void *buf, size_t len, int flags)
Then you should do
zmq_send(reqSocket, tempStr, 4, 0);
However if using zmq_msg_t variant
int zmq_msg_send (zmq_msg_t *msg, void *socket, int flags)
Then you should do:
zmq_msg_send (&zmsg, reqSocket, 0);
I've been using the FFmpeg.AutoGen wrapper (https://github.com/Ruslan-B/FFmpeg.AutoGen) to decode my H264 video for some time with great success, and now have to add AAC audio decoding (previously I was using G711 and NAudio for this).
I have the AAC stream decoding using avcodec_decode_audio4; however, the output frame is in the floating-point format FLT and I need it in S16. For this I have found unmanaged examples using swr_convert, and FFmpeg.AutoGen does have this function P/Invoked as:
[DllImport(SWRESAMPLE_LIBRARY, EntryPoint="swr_convert", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern int swr_convert(SwrContext* s, byte** @out, int out_count, byte** @in, int in_count);
My trouble is that I can't find a successful way of converting/fixing/casting my managed byte[] into a byte** to pass as the destination buffer.
Has anyone done this before?
My non-working code...
packet.ResetBuffer(m_avFrame->linesize[0]*2);
fixed (byte* pData = packet.Payload)
{
byte** src = &m_avFrame->data_0;
//byte** dst = *pData;
IntPtr d = new IntPtr(pData);
FFmpegInvoke.swr_convert(m_pConvertContext, (byte**)d.ToPointer(), packet.Length, src, (int)m_avFrame->linesize[0]);
}
Thanks for any help.
Cheers
Dave
The function you are trying to call is documented here: http://www.ffmpeg.org/doxygen/2.0/swresample_8c.html#a81af226d8969df314222218c56396f6a
The out_arg parameter is declared like this:
uint8_t* out_arg[SWR_CH_MAX]
That is an array of byte arrays, of length SWR_CH_MAX. Your translation renders it as byte** and so forces you to use unsafe code. Personally I would avoid that and declare the parameter like this:
[MarshalAs(UnmanagedType.LPArray)]
IntPtr[] out_arg
Declare the array like this:
IntPtr[] out_arg = new IntPtr[channelCount];
I am guessing that the CH in SWR_CH_MAX is short-hand for channel.
Then you need to allocate memory for the output buffer. I'm not sure how you want to do that. You could allocate one byte array per channel and pin those arrays to get hold of a pointer to pass down to the native code. That would be my preferred approach because upon return you'd have your channels in nice managed arrays. Another way would be a call to Marshal.AllocHGlobal.
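A minimal sketch of that pinning approach; channelCount and bytesPerChannel are hypothetical and would come from your actual stream:
int channelCount = 2, bytesPerChannel = 4096; // hypothetical sizes
var buffers = new byte[channelCount][];
var handles = new GCHandle[channelCount];
var out_arg = new IntPtr[channelCount];
try
{
    for (int i = 0; i < channelCount; i++)
    {
        buffers[i] = new byte[bytesPerChannel];
        // Pin each managed array so the GC cannot move it while native code writes into it
        handles[i] = GCHandle.Alloc(buffers[i], GCHandleType.Pinned);
        out_arg[i] = handles[i].AddrOfPinnedObject();
    }
    // ... pass out_arg to swr_convert here ...
}
finally
{
    foreach (var handle in handles)
        if (handle.IsAllocated) handle.Free();
}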
The input buffer would need to be handled in the same way.
I would not use the automated P/Invoke translation that you are currently using. It seems hell-bent on forcing you to use pointers and unsafe code. Not massively helpful. I'd translate it by hand.
I'm sorry not to give more specific details but it's a little hard because your question did not contain any information about the types used in your code samples. I hope the general advice is useful.
Thanks to David Heffernan's answer I've managed to get the following working, and I'm posting it as an answer because examples of managed use of FFmpeg are very rare.
fixed (byte* pData = packet.Payload)
{
IntPtr[] in_buffs = new IntPtr[2];
in_buffs[0] = new IntPtr(m_avFrame->data_0);
in_buffs[1] = new IntPtr(m_avFrame->data_1);
IntPtr[] out_buffs = new IntPtr[1];
out_buffs[0] = new IntPtr(pData);
FFmpegInvoke.swr_convert(m_pConvertContext, out_buffs, m_avFrame->nb_samples, in_buffs, m_avFrame->nb_samples);
}
And in the complete context of decoding a buffer of AAC audio...
protected override void DecodePacket(MediaPacket packet)
{
int frameFinished = 0;
AVPacket avPacket = new AVPacket();
FFmpegInvoke.av_init_packet(ref avPacket);
byte[] payload = packet.Payload;
fixed (byte* pData = payload)
{
avPacket.data = pData;
avPacket.size = packet.Length;
if (packet.KeyFrame)
{
avPacket.flags |= FFmpegInvoke.AV_PKT_FLAG_KEY;
}
int in_len = packet.Length;
int count = FFmpegInvoke.avcodec_decode_audio4(CodecContext, m_avFrame, out frameFinished, &avPacket);
if (count != packet.Length)
{
}
if (count < 0)
{
throw new Exception("Can't decode frame!");
}
}
FFmpegInvoke.av_free_packet(ref avPacket);
if (frameFinished > 0)
{
if (!mConversionContextInitialised)
{
InitialiseConversionContext();
}
packet.ResetBuffer(m_avFrame->nb_samples*4); // need to find a better way of getting the out buff size
fixed (byte* pData = packet.Payload)
{
IntPtr[] in_buffs = new IntPtr[2];
in_buffs[0] = new IntPtr(m_avFrame->data_0);
in_buffs[1] = new IntPtr(m_avFrame->data_1);
IntPtr[] out_buffs = new IntPtr[1];
out_buffs[0] = new IntPtr(pData);
FFmpegInvoke.swr_convert(m_pConvertContext, out_buffs, m_avFrame->nb_samples, in_buffs, m_avFrame->nb_samples);
}
packet.Type = PacketType.Decoded;
if (mFlushRequest)
{
//mRenderQueue.Clear();
packet.Flush = true;
mFlushRequest = false;
}
mFirstFrame = true;
}
}
The question title is basically what I'd like to ask:
[MarshalAs(UnmanagedType.LPStr)] - how does this convert utf-8 strings to char* ?
I use the above line when I attempt to communicate between C# and C++ DLLs;
more specifically, between:
somefunction(char *string) [C++ DLL]
somefunction([MarshalAs(UnmanagedType.LPStr)] string text) [C#]
When I send my UTF-8 text (scintilla.Text) through C# and into my C++ DLL,
I'm shown in my VS 10 debugger that:
the C# string was successfully converted to char*
the resulting char* properly reflects the corresponding UTF-8 chars (including the bit in Korean) in the watch window.
Here's a screenshot (with more details):
As you can see, initialScriptText[0] returns the single byte (char) 'B', and the contents of char* initialScriptText are displayed properly (including the Korean) in the VS watch window.
Going through the char pointer, it seems that English is saved as one byte per char, while Korean seems to be saved as two bytes per char. (the Korean word in the screenshot is 3 letters, hence saved in 6 bytes)
This seems to show that each 'letter' isn't saved in equal size containers, but differs depending on language. (possible hint on type?)
I'm trying to achieve the same result in pure C++: reading in UTF-8 files and saving the result as char*.
Here's an example of my attempt to read a UTF-8 file and convert it to char* in C++:
Observations:
there is visual loss when converting from wchar_t* to char*
since the result, s8, displays the string properly, I know I've successfully converted the UTF-8 file content from wchar_t* to char*
since 'result' retains the bytes taken directly from the file, but differs from what I got through C# (using the same file), I've concluded that the C# marshaller puts the file contents through some other procedure to further mutate the text to char*
(the screenshot also shows my terrible failure in using wcstombs)
Note: I'm using the utf8 header from http://utfcpp.sourceforge.net/
Please correct me on any mistakes in my code/observations.
I'd like to be able to mimic the result I'm getting through the C# marshaller, and after going through all this I've realised that I'm completely stuck. Any ideas?
[MarshalAs(UnmanagedType.LPStr)] - how does this convert utf-8 strings to char* ?
It doesn't. There is no such thing as a "utf-8 string" in managed code; strings are always encoded in UTF-16. The marshaling from and to an LPStr is done with the default system code page, which makes it fairly remarkable that you see Korean glyphs in the debugger, unless you use code page 949.
If interop with utf-8 is a hard requirement then you need to use a byte[] in the pinvoke declaration. And convert back and forth yourself with System.Text.Encoding.UTF8. Use its GetString() method to convert the byte[] to a string, its GetBytes() method to convert a string to byte[]. Avoid all this if possible by using wchar_t[] in the native code.
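A minimal sketch of that byte[] approach; the native somefunction(char*) is from the question above, while the DLL name and calling convention are placeholders:
[DllImport("mydll.dll", CallingConvention = CallingConvention.Cdecl)]
static extern void somefunction(byte[] text);

static void CallWithUtf8(string s)
{
    // Encode to UTF-8 and append the NUL terminator the native char* expects
    byte[] utf8 = Encoding.UTF8.GetBytes(s + "\0");
    somefunction(utf8);
}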
While the other answers are correct, there has been a major development in .NET 4.7. There is now an option that does exactly what UTF-8 needs: UnmanagedType.LPUTF8Str. I tried it and it works like a Swiss chronometer, doing exactly what it sounds like.
In fact, I even used MarshalAs(UnmanagedType.LPUTF8Str) in one parameter and MarshalAs(UnmanagedType.LPStr) in another. Also works. Here is my method (takes in string parameters and returns a string via a parameter):
[DllImport("mylib.dll", ExactSpelling = true, CallingConvention = CallingConvention.StdCall)]
public static extern void ProcessContent([MarshalAs(UnmanagedType.LPUTF8Str)]string content,
[MarshalAs(UnmanagedType.LPUTF8Str), Out]StringBuilder outputBuffer,[MarshalAs(UnmanagedType.LPStr)]string settings);
Thanks, Microsoft! Another nuisance is gone.
ICustomMarshaler can be used if you are on a .NET Framework version earlier than 4.7.
class UTF8StringCodec : ICustomMarshaler
{
public static ICustomMarshaler GetInstance(string cookie) => new UTF8StringCodec();
public void CleanUpManagedData(object ManagedObj)
{
// nop
}
public void CleanUpNativeData(IntPtr pNativeData)
{
Marshal.FreeCoTaskMem(pNativeData);
}
public int GetNativeDataSize()
{
throw new NotImplementedException();
}
public IntPtr MarshalManagedToNative(object ManagedObj)
{
// Encode to UTF-8 and copy into CoTaskMem, appending a NUL terminator
var text = $"{ManagedObj}";
var bytes = Encoding.UTF8.GetBytes(text);
var ptr = Marshal.AllocCoTaskMem(bytes.Length + 1);
Marshal.Copy(bytes, 0, ptr, bytes.Length);
Marshal.WriteByte(ptr, bytes.Length, 0);
return ptr;
}
public object MarshalNativeToManaged(IntPtr pNativeData)
{
if (pNativeData == IntPtr.Zero)
{
return null;
}
var bytes = new MemoryStream();
var ofs = 0;
// Read bytes until the NUL terminator, then decode as UTF-8
while (true)
{
var byt = Marshal.ReadByte(pNativeData, ofs);
if (byt == 0)
{
break;
}
bytes.WriteByte(byt);
ofs++;
}
return Encoding.UTF8.GetString(bytes.ToArray());
}
}
P/Invoke declaration:
[DllImport("native.dll", CallingConvention = CallingConvention.Cdecl)]
private extern static int NativeFunc(
[MarshalAs(UnmanagedType.CustomMarshaler, MarshalTypeRef = typeof(UTF8StringCodec))] string path
);
Usage inside callback:
[StructLayout(LayoutKind.Sequential)]
struct Options
{
[MarshalAs(UnmanagedType.FunctionPtr)]
public CallbackFunc callback;
}
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int CallbackFunc(
[MarshalAs(UnmanagedType.CustomMarshaler, MarshalTypeRef = typeof(UTF8StringCodec))] string path
);
If you need to marshal a UTF-8 string, do it manually.
Define the function with IntPtr instead of string:
somefunction(IntPtr text)
Then convert the text to a zero-terminated UTF-8 byte array and write it to the IntPtr:
byte[] retArray = Encoding.UTF8.GetBytes(text);
byte[] retArrayZ = new byte[retArray.Length + 1]; // the extra byte stays 0 as the NUL terminator
Array.Copy(retArray, retArrayZ, retArray.Length);
IntPtr retPtr = Marshal.AllocHGlobal(retArrayZ.Length);
Marshal.Copy(retArrayZ, 0, retPtr, retArrayZ.Length);
somefunction(retPtr);
Marshal.FreeHGlobal(retPtr); // release the native buffer once the call returns