PInvoke and EntryPointNotFoundException - C#

I can't understand what is wrong with the P/Invoke below, which results in an EntryPointNotFoundException:
A function in C with a structure declaration:
extern "C" __declspec(dllimport) LONG __stdcall
NET_DVR_Login_V30(char *sDVRIP, WORD wDVRPort, char *sUserName,
char *sPassword, LPNET_DVR_DEVICEINFO_V30 lpDeviceInfo);
typedef struct
{
BYTE sSerialNumber[48];
BYTE byAlarmInPortNum;
BYTE byAlarmOutPortNum;
BYTE byDiskNum;
BYTE byDVRType;
BYTE byChanNum;
BYTE byStartChan;
BYTE byAudioChanNum;
BYTE byIPChanNum;
BYTE byZeroChanNum;
BYTE byMainProto;
BYTE bySubProto;
BYTE bySupport;
BYTE byRes1[20];
} NET_DVR_DEVICEINFO_V30, *LPNET_DVR_DEVICEINFO_V30;
The import in C#, the structure declaration, and the P/Invoke call:
[DllImport("SDK.dll", SetLastError = true,
CallingConvention = CallingConvention.StdCall)]
public extern static int NET_DVR_Login_V30(
[MarshalAs(UnmanagedType.LPStr)] string sDVRIP,
ushort wDVRPort,
[MarshalAs(UnmanagedType.LPStr)] string sUserName,
[MarshalAs(UnmanagedType.LPStr)] string sPassword,
ref NET_DVR_DEVICEINFO_V30 lpDeviceInfo);
[StructLayout(LayoutKind.Sequential,
CharSet = CharSet.Ansi)]
public struct NET_DVR_DEVICEINFO_V30
{
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = 48)]
public string sSerialNumber;
public byte byAlarmInPortNum;
public byte byAlarmOutPortNum;
public byte byDiskNum;
public byte byDVRType;
public byte byChanNum;
public byte byStartChan;
public byte byAudioChanNum;
public byte byIPChanNum;
public byte byZeroChanNum;
public byte byMainProto;
public byte bySubProto;
public byte bySupport;
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = 20)]
public string byRes1;
}
NET_DVR_DEVICEINFO_V30 deviceInfo = new NET_DVR_DEVICEINFO_V30();
int result = Functions.NET_DVR_Login_V30(ip, port, user,
password, ref deviceInfo);
I inspected the function name via dumpbin and it is not mangled, so I wonder why an EntryPointNotFoundException occurs. If anything were wrong with the parameters, for example, I would expect a PInvokeStackImbalance error instead.
Any ideas what could be wrong with this pinvoke?

There is a tool called Dependency Walker (depends.exe) that will help debug this issue by displaying the import/export table of your SDK.DLL - I'd take a look at that. One other thing that might be happening (this seems suspect to me) is that, since you're using char*, .NET is adding an "A" on the end of your function name. That could be balderdash though.

Clearly there is a name mismatch. You therefore need to make sure that both sides of the interface use the same name:
When exporting the function from the DLL as stdcall it will be decorated. You can avoid this decoration by using a .def file.
When importing using P/Invoke you need to suppress the addition of a W or A suffix. Do so by setting the ExactSpelling field of the DllImportAttribute to true.
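As an illustration, a minimal sketch of the managed side with the suffix probing suppressed (the attribute values are assumptions based on the declarations in the question, not taken from the actual SDK headers); on the native side, listing the function under EXPORTS in a .def file exports it without the stdcall decoration:
// Sketch only: ExactSpelling = true stops the runtime from probing for
// NET_DVR_Login_V30A / NET_DVR_Login_V30W and forces the exact export name.
[DllImport("SDK.dll", EntryPoint = "NET_DVR_Login_V30", ExactSpelling = true,
    SetLastError = true, CallingConvention = CallingConvention.StdCall)]
public static extern int NET_DVR_Login_V30(
    [MarshalAs(UnmanagedType.LPStr)] string sDVRIP,
    ushort wDVRPort,
    [MarshalAs(UnmanagedType.LPStr)] string sUserName,
    [MarshalAs(UnmanagedType.LPStr)] string sPassword,
    ref NET_DVR_DEVICEINFO_V30 lpDeviceInfo);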

Related

How to correctly marshal unsigned char* from c dll to c#

I am trying to prepare a simple GUI-based AES-CMAC calculator. For this I have decided to create a C DLL out of the OpenSSL libraries. (I don't want to use .NET for calculating AES-CMAC.) I have tested this DLL with a test application created in C++ (console) and the values generated match the test vectors. But when I try to call this function from C#, I get wrong values. Here I am using byte[] instead of unsigned char*.
My code snippet for the C function is:
double calc_AES_CMAC(unsigned char* message ,unsigned char* key,unsigned char* cmac_16)
{
size_t mactlen;
CMAC_CTX *ctx = CMAC_CTX_new();
CMAC_Init(ctx, key, 16, EVP_aes_128_cbc(), NULL);
CMAC_Update(ctx, message, sizeof(message));
CMAC_Final(ctx, cmac_16, &mactlen);
CMAC_CTX_free(ctx);
return 0;
}
And my calling C# code is as follows.
First, the function import:
[DllImport("C:\\Users\\Sudhanwa\\Documents\\Visual Studio 2010\\Projects\\Ccsharpdll\\Debug\\Ccsharpdll.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern double calc_AES_CMAC(byte[] message, byte[] key, byte[] output);
Second, the button click event:
byte [] null_arr = new byte[16];
// K: 2b7e1516 28aed2a6 abf71588 09cf4f3c
byte[] key = { 0x2b,0x7e,0x15,0x16,
0x28,0xae,0xd2,0xa6,
0xab,0xf7,0x15,0x88,
0x09,0xcf,0x4f,0x3c };
// M: 6bc1bee2 2e409f96 e93d7e11 7393172a Mlen: 128
byte[] message= { 0x6b,0xc1,0xbe,0xe2,
0x2e,0x40,0x9f,0x96,
0xe9,0x3d,0x7e,0x11,
0x73,0x93,0x17,0x2a };
byte [] cmac = new byte [16];
double c = calc_AES_CMAC(message, key, cmac);
string ans = ByteArrayToString(cmac);
MessageBox.Show(ans);
In this code, I get a 16-byte hex output, but it does not match the correct result.
You need to indicate to the marshaller that you expect data to be returned (and how much) in the output parameter:
public static extern double calc_AES_CMAC(byte[] message, byte[] key,
[In, Out, MarshalAs(UnmanagedType.LPArray, SizeConst=16)] byte[] output);
Otherwise a copy of the current content of the array will be passed to the C++ function but any modifications will not be copied back to the C# caller.

C# Pinvoke Delphi ShortString System.AccessViolationException

My client just sent me a Delphi DLL to be consumed by my ASP.NET app, and below is the DLL's signature:
function GerarChave(pChave: ShortString; pData: ShortString; pAcao: ShortString): PAnsiChar; stdcall;
How should I call it? I've tried everything like
[DllImport("CEIINT.dll", CallingConvention = CallingConvention.StdCall, EntryPoint = "GerarChave")]
public static extern string GerarChave([MarshalAs(UnmanagedType.BStr)]string pChave, [MarshalAs(UnmanagedType.BStr)]string pData, [MarshalAs(UnmanagedType.BStr)]string pAcao);
string chave = "ABC123";
string data = "19/09/2019";
string acao = "0";
GerarChave(chave, data, acao);
but I always get a System.AccessViolationException error which says:
System.AccessViolationException... Attempted to read or write protected memory. This is often an indication that other memory is corrupt
Could anybody help me please? Thanks in advance!
Like David Heffernan said, you should try to get back to your client to request a DLL with more interoperable types.
If you have no other choice you could try to do the conversions manually, first by changing the signature to a byte array:
[DllImport("CEIINT.dll", CallingConvention = CallingConvention.StdCall, EntryPoint = "GerarChave")]
public static extern string GerarChave(
[MarshalAs(UnmanagedType.ByValArray, SizeConst=256)] byte[] pChave,
[MarshalAs(UnmanagedType.ByValArray, SizeConst=256)] byte[] pData,
[MarshalAs(UnmanagedType.ByValArray, SizeConst=256)] byte[] pAcao);
Then define the following method to convert a string to a Delphi ShortString:
public byte[] GetDelphiShortString(string str)
{
var bytes = new byte[256];
bytes[0] = (byte)Encoding.Default.GetBytes(str, 0, str.Length, bytes, 1);
return bytes;
}
Finally, you should be able to call the Delphi function via:
GerarChave(GetDelphiShortString(chave), GetDelphiShortString(data), GetDelphiShortString(acao));
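One additional point to be aware of (my own assumption, not part of the original answer): declaring the return type as string tells the marshaller to free the returned PAnsiChar with CoTaskMemFree, which the Delphi DLL almost certainly did not allocate that way and which can itself trigger access violations. A more defensive sketch keeps the answer's parameter declarations but returns IntPtr and reads the string manually:
[DllImport("CEIINT.dll", CallingConvention = CallingConvention.StdCall, EntryPoint = "GerarChave")]
private static extern IntPtr GerarChaveRaw(
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = 256)] byte[] pChave,
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = 256)] byte[] pData,
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = 256)] byte[] pAcao);

// Read the returned PAnsiChar without letting the marshaller free memory it does not own.
IntPtr ptr = GerarChaveRaw(GetDelphiShortString(chave),
                           GetDelphiShortString(data),
                           GetDelphiShortString(acao));
string resultado = Marshal.PtrToStringAnsi(ptr);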

MarshalAs(UnmanagedType.LPStr) - how does this convert utf-8 strings to char*

The question title is basically what I'd like to ask:
[MarshalAs(UnmanagedType.LPStr)] - how does this convert utf-8 strings to char* ?
I use the above line when I attempt to communicate between C# and a C++ DLL;
more specifically, between:
somefunction(char *string) [C++ DLL]
somefunction([MarshalAs(UnmanagedType.LPStr)] string text) [C#]
When I send my UTF-8 text (scintilla.Text) through C# and into my C++ DLL,
I'm shown in my VS 2010 debugger that:
the C# string was successfully converted to char*
the resulting char* properly reflects the corresponding UTF-8 chars (including the bit in Korean) in the watch window.
Here's a screenshot (with more details):
As you can see, initialScriptText[0] returns the single byte (char) 'B', and the contents of char* initialScriptText are displayed properly (including the Korean) in the VS watch window.
Going through the char pointer, it seems that English is saved as one byte per char, while Korean seems to be saved as two bytes per char. (The Korean word in the screenshot is 3 letters, hence saved in 6 bytes.)
This seems to show that each 'letter' isn't saved in an equal-sized container, but differs depending on language. (A possible hint on the type?)
I'm trying to achieve the same result in pure C++: reading in UTF-8 files and saving the result as char*.
Here's an example of my attempt to read a UTF-8 file and convert to char* in C++:
observations:
loss in visual when converting from wchar_t* to char*
since the result, s8, displays the string properly, I know I've successfully converted the UTF-8 file content from wchar_t* to char*
since 'result' retains the bytes taken directly from the file, yet differs from what I got through C# (using the same file), I've concluded that the C# marshaler has put the file contents through some other procedure to further mutate the text to char*.
(the screenshot also shows my terrible failure in using wcstombs)
note: I'm using the utf8 header from (http://utfcpp.sourceforge.net/)
Please correct me on any mistakes in my code/observations.
I'd like to be able to mimic the result I'm getting through the c# marshal and I've realised after going through all this that I'm completely stuck. Any ideas?
[MarshalAs(UnmanagedType.LPStr)] - how does this convert utf-8 strings to char* ?
It doesn't. There is no such thing as a "utf-8 string" in managed code; strings are always encoded in UTF-16. The marshaling from and to an LPStr is done with the default system code page, which makes it fairly remarkable that you see Korean glyphs in the debugger, unless you use code page 949.
If interop with utf-8 is a hard requirement then you need to use a byte[] in the pinvoke declaration. And convert back and forth yourself with System.Text.Encoding.UTF8. Use its GetString() method to convert the byte[] to a string, its GetBytes() method to convert a string to byte[]. Avoid all this if possible by using wchar_t[] in the native code.
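As a rough sketch of the byte[] approach mentioned above (the DLL name and the native signature here are placeholders for illustration):
// Hypothetical native signature: void somefunction(const char* text)
[DllImport("some.dll", CallingConvention = CallingConvention.Cdecl)]
static extern void somefunction(byte[] text);

static void CallWithUtf8(string text)
{
    // Encode the UTF-16 string as UTF-8 and append the terminating NUL byte.
    byte[] utf8 = Encoding.UTF8.GetBytes(text + "\0");
    somefunction(utf8);
}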
While the other answers are correct, there has been a major development in .NET 4.7. Now there is an option that does exactly what UTF-8 needs: UnmanagedType.LPUTF8Str. I tried it and it works like a Swiss chronometer, doing exactly what it sounds like.
In fact, I even used MarshalAs(UnmanagedType.LPUTF8Str) in one parameter and MarshalAs(UnmanagedType.LPStr) in another. Also works. Here is my method (takes in string parameters and returns a string via a parameter):
[DllImport("mylib.dll", ExactSpelling = true, CallingConvention = CallingConvention.StdCall)]
public static extern void ProcessContent(
    [MarshalAs(UnmanagedType.LPUTF8Str)] string content,
    [MarshalAs(UnmanagedType.LPUTF8Str), Out] StringBuilder outputBuffer,
    [MarshalAs(UnmanagedType.LPStr)] string settings);
Thanks, Microsoft! Another nuisance is gone.
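A usage sketch of the declaration above (the buffer capacity and argument values are illustrative only, not from the original answer): the StringBuilder must be created with enough capacity before the call, because the native code writes the result into that buffer.
var outputBuffer = new StringBuilder(4096);   // capacity is an assumption
ProcessContent("content with UTF-8 text: 한글", outputBuffer, "settings");
string result = outputBuffer.ToString();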
ICustomMarshaler can be used if you are on a .NET Framework version earlier than 4.7.
class UTF8StringCodec : ICustomMarshaler
{
public static ICustomMarshaler GetInstance(string cookie) => new UTF8StringCodec();
public void CleanUpManagedData(object ManagedObj)
{
// nop
}
public void CleanUpNativeData(IntPtr pNativeData)
{
Marshal.FreeCoTaskMem(pNativeData);
}
public int GetNativeDataSize()
{
throw new NotImplementedException();
}
public IntPtr MarshalManagedToNative(object ManagedObj)
{
var text = $"{ManagedObj}";
var bytes = Encoding.UTF8.GetBytes(text);
var ptr = Marshal.AllocCoTaskMem(bytes.Length + 1);
Marshal.Copy(bytes, 0, ptr, bytes.Length);
Marshal.WriteByte(ptr, bytes.Length, 0);
return ptr;
}
public object MarshalNativeToManaged(IntPtr pNativeData)
{
if (pNativeData == IntPtr.Zero)
{
return null;
}
var bytes = new MemoryStream();
var ofs = 0;
while (true)
{
var byt = Marshal.ReadByte(pNativeData, ofs);
if (byt == 0)
{
break;
}
bytes.WriteByte(byt);
ofs++;
}
return Encoding.UTF8.GetString(bytes.ToArray());
}
}
P/Invoke declaration:
[DllImport("native.dll", CallingConvention = CallingConvention.Cdecl)]
private extern static int NativeFunc(
[MarshalAs(UnmanagedType.CustomMarshaler, MarshalTypeRef = typeof(UTF8StringCodec))] string path
);
Usage inside callback:
[StructLayout(LayoutKind.Sequential)]
struct Options
{
[MarshalAs(UnmanagedType.FunctionPtr)]
public CallbackFunc callback;
}
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int CallbackFunc(
[MarshalAs(UnmanagedType.CustomMarshaler, MarshalTypeRef = typeof(UTF8StringCodec))] string path
);
If you need to marshal a UTF-8 string, do it manually.
Define the function with IntPtr instead of string:
somefunction(IntPtr text)
Then convert the text to a zero-terminated UTF-8 byte array and copy it to unmanaged memory:
byte[] retArray = Encoding.UTF8.GetBytes(text);
byte[] retArrayZ = new byte[retArray.Length + 1];
Array.Copy(retArray, retArrayZ, retArray.Length);
IntPtr retPtr = Marshal.AllocHGlobal(retArrayZ.Length);
Marshal.Copy(retArrayZ, 0, retPtr, retArrayZ.Length);
somefunction(retPtr);
Marshal.FreeHGlobal(retPtr);

ReadProcessMemory into a string

I know everything about the process and what address I want to read, but I don't know how to use the ReadProcessMemory function. Do I need to add some usings or something?
I made this in C++, but how can I do it in C#?
char* ReadMemoryText(DWORD address,int size)
{
char ret[size];
DWORD processId;
HWND hwnd = FindWindow("WindowX",NULL);
if(hwnd!=NULL)
{
GetWindowThreadProcessId(hwnd,&processId);
HANDLE phandle = OpenProcess(PROCESS_VM_READ, 0, processId);
if(!phandle)
{
cout<<GetLastError()<<endl;
cout <<"Could not get handle!\n";
cin.get();
}
ReadProcessMemory(phandle, (LPVOID)address, &ret,size,0);
char * rt = ret;
for(int i=0;i<size && ret[i]!=0;++i)
cout << ret[i];
return rt;
}
return NULL;
}
Here is an example in C# that reads a char array from memory. In this case it's the local player's name string from Assault Cube.
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool ReadProcessMemory(
IntPtr hProcess, IntPtr lpBaseAddress, byte[] lpBuffer, Int32 nSize, out IntPtr lpNumberOfBytesRead);
var nameAddr = ghapi.FindDMAAddy(hProc, (IntPtr)(modBase2 + 0x10f4f4), new int[] { 0x225 });
byte[] name = new byte[16];
ghapi.ReadProcessMemory(hProc, nameAddr, name, 16, out _);
Console.WriteLine(Encoding.Default.GetString(name));
We use P/Invoke to get access to ReadProcessMemory, exported from kernel32.dll.
We use FindDMAAddy to get the address of the name variable. The char array is a fixed size of 16 bytes.
We call ReadProcessMemory with the source address and destination buffer, size 16, and for the last argument we just use "out _" because we don't care about the bytesRead value.
Then we need to convert that char array to a string type with the proper encoding, for which we use Encoding.Default.GetString().
Then write that line to the console.
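The snippet above assumes you already have a process handle (hProc) and a module base from the answerer's own helper library (ghapi). A minimal sketch of obtaining the handle yourself, mirroring the C++ code in the question (the window class name and access rights are assumptions):
[DllImport("user32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
static extern IntPtr FindWindow(string lpClassName, string lpWindowName);

[DllImport("user32.dll", SetLastError = true)]
static extern uint GetWindowThreadProcessId(IntPtr hWnd, out uint lpdwProcessId);

[DllImport("kernel32.dll", SetLastError = true)]
static extern IntPtr OpenProcess(uint dwDesiredAccess, bool bInheritHandle, uint dwProcessId);

const uint PROCESS_VM_READ = 0x0010;

// Find the target window, resolve its owning process id, then open the
// process with read access -- the same steps as the C++ snippet above.
IntPtr hwnd = FindWindow("WindowX", null);
GetWindowThreadProcessId(hwnd, out uint pid);
IntPtr hProc = OpenProcess(PROCESS_VM_READ, false, pid);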

How to pass a string containing extended ASCII characters to an unmanaged C++ DLL

I have a C# application that talks to a USB peripheral through a C DLL.
The C DLL implements a function :
long WriteText(char* data, long length);
If calling this from C/C++ I can send it regular ASCII text but also some extended characters such as '£' (0x9C hex).
However, I have wrapped this up in a C# class
[DllImport("c:\\USBPD.DLL", EntryPoint = "WriteText")]
public static extern int WriteText(string data, int length);
However, when I send it a string with a "£" I get a 'u^' in its place. The rest of the string is fine. I have played around with the encoding types but still seem to be having problems.
Thanks,
Anand
Have you tried:
[DllImport("c:\\USBPD.DLL")]
private static extern int WriteText([In] byte[] text, int length);
public static int WriteText(string text)
{
Encoding enc = Encoding.GetEncoding(437); // 437 is the original IBM PC code page
byte[] bytes = enc.GetBytes(text);
return WriteText(bytes, bytes.Length);
}
Note that EntryPoint is not necessary if the name of the method is the same.
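A quick usage sketch of the wrapper above (the text is illustrative): in code page 437 the '£' character encodes to the single byte 0x9C, which matches what the question sees from native C/C++ callers.
// '£' becomes 0x9C under code page 437, so the DLL receives the same byte
// it would get from a native caller.
int written = WriteText("Price: £10");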
