Speed up C# native call pinvoke - c#

mod note: I do not believe this to be a duplicate, because I have already tried several of the suggested solutions, as described below.
Is there a way I can speed this up? I've already followed the Microsoft guides on this; here's what I've done:
Added SuppressUnmanagedCodeSecurity
Placed this in a file named UnsafeNativeMethods.cs
Defined specific types for the method stub
Here is the method:
[DllImport("kernel32.dll")]
[SuppressUnmanagedCodeSecurity]
public static extern bool DeviceIoControl(
    IntPtr hDevice,
    uint IoControlCode,
    [In] MemoryManager.MemOperation InBuffer,
    int nInBufferSize,
    [Out] byte[] OutBuffer,
    uint nOutBufferSize,
    ref int pBytesReturned,
    IntPtr Overlapped
);
Here are the contents of MemOperation (the in-buffer that has to be marshalled, I'm guessing):
public struct MemOperation
{
    public int Pid;
    public int UserPid;
    public int Size;
    public int protection_mode;
    public int allocation_type;
    public IntPtr Addr;
    public IntPtr WriteBuffer;
    [MarshalAs(UnmanagedType.LPWStr)]
    public string module_selection;
}
Here is an example of usage:
public UnsafeNativeMethods.MEMORY_BASIC_INFORMATION QueryVirtualMemory(IntPtr address) {
    var memOperation = new MemOperation();
    byte[] buffer = new byte[48]; // sizeof(MEMORY_BASIC_INFORMATION) on x64: 8 + 8 + 4 + 2 (+2 pad) + 8 + 4 + 4 + 4 (+4 pad)
    memOperation.Pid = this.Pid;
    memOperation.Addr = address;
    int bytes = 0;
    bool res = UnsafeNativeMethods.DeviceIoControl(this.Handle, CtlCode(0x00000022, this.IOCTL_QUERY, 2, 0), memOperation, Marshal.SizeOf(memOperation), buffer, (uint)buffer.Length, ref bytes, IntPtr.Zero);
    return GetStructure<UnsafeNativeMethods.MEMORY_BASIC_INFORMATION>(buffer);
}
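The CtlCode helper used above isn't shown. Assuming it mirrors the Win32 CTL_CODE macro, a minimal C# version might look like this; note the argument order (deviceType, function, method, access) follows the macro, and the author's helper may order them differently:

```csharp
using System;

public static class Ioctl
{
    // Sketch of the Win32 CTL_CODE macro:
    // CTL_CODE(DeviceType, Function, Method, Access) =
    //   (DeviceType << 16) | (Access << 14) | (Function << 2) | Method
    public static uint CtlCode(uint deviceType, uint function, uint method, uint access)
        => (deviceType << 16) | (access << 14) | (function << 2) | method;
}
```

For example, `Ioctl.CtlCode(0x22, 0x800, 0, 0)` yields 0x00222000, a METHOD_BUFFERED code for FILE_DEVICE_UNKNOWN.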
In the profiler, my hot path is the P/Invoke. My application runs incredibly fast, as fast as I think it can in C#. However, almost a third of all execution time is spent on P/Invoke because of how many memory operations the application performs. I would like to cut this time down in any way possible, including unsafe ways.
I have seen that you can instead write a DeviceIoControl wrapper and import it from a C++ DLL; however, this did not lead to any changes for me, and it appeared to function exactly the same. Here's the source for that:
devicecontrol.cpp
#include <iostream>
#include "DeviceControl.h"
bool __cdecl DeviceIoRequestWrapper(HANDLE hDevice, DWORD dwIoControlCode, LPVOID lpInBuffer, DWORD nInBufferSize, LPVOID lpOutBuffer, DWORD nOutBufferSize, LPDWORD lpBytesReturned, LPOVERLAPPED lpOverlapped)
{
    return DeviceIoControl(hDevice, dwIoControlCode, lpInBuffer, nInBufferSize, lpOutBuffer, nOutBufferSize, lpBytesReturned, lpOverlapped);
}
devicecontrol.h
#pragma once
#include <Windows.h>
extern "C" {
__declspec(dllexport) bool __cdecl DeviceIoRequestWrapper(
    HANDLE hDevice,
    DWORD dwIoControlCode,
    LPVOID lpInBuffer,
    DWORD nInBufferSize,
    LPVOID lpOutBuffer,
    DWORD nOutBufferSize,
    LPDWORD lpBytesReturned,
    LPOVERLAPPED lpOverlapped);
}
I am on .net 6.0 and the latest version of C#.
Proposed change to out buffer:
public unsafe UnsafeNativeMethods.MEMORY_BASIC_INFORMATION QueryVirtualMemory(IntPtr address) {
    var memOperation = new MemOperation();
    byte* buffer = stackalloc byte[48]; // not zero-initialized; fine, since the driver overwrites it
    memOperation.Pid = this.Pid;
    memOperation.Addr = address;
    int bytes = 0;
    // Note: the DeviceIoControl P/Invoke's OutBuffer parameter must also be
    // changed from byte[] to IntPtr for this call to compile.
    bool res = UnsafeNativeMethods.DeviceIoControl(this.Handle, CtlCode(0x00000022, this.IOCTL_QUERY, 2, 0), memOperation, Marshal.SizeOf(memOperation), (IntPtr)buffer, 48, ref bytes, IntPtr.Zero);
    return GetStructure<UnsafeNativeMethods.MEMORY_BASIC_INFORMATION>(buffer);
}
Where GetStructure is now:
public static unsafe T GetStructure<T>(byte* bytes) where T : unmanaged {
    T structure = *(T*)bytes;
    return structure;
}

Here's what helped the most, and I'll accept this as the answer to my own thread in a few hours.
In addition to the steps I mentioned above, I have now done the following:
Converted MemOperation to be fully blittable. The string "module_selection" was only ever used once per program start, so I have instead added a fully blittable version of MemOperation. Thank you @Flydog57.
Changed the buffer to be quickly allocated (not zeroed out) with stackalloc, passing a pointer rather than allocating a byte[] and having it marshalled to a void*/char* (which is what I think happens when you pass a byte[] object). Thank you @Charlieface.
Running the profiler in Visual Studio, this appears to have eliminated the hot path. The total CPU % now correctly lies within kernel32.dll (the DeviceIoControl call to the device). Whether this leads to tangible performance benefits I don't know, as I don't have an easy way to set up a full benchmark for this.
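A sketch of what the fully blittable variant might look like: every field is a primitive or pointer, so the marshaller can pass the struct without a per-call copy. The field layout follows the original struct; replacing the string with an IntPtr set once at startup is an assumption about how the driver consumes it:

```csharp
using System;
using System.Runtime.InteropServices;

// Hypothetical blittable version of MemOperation. The LPWStr string is
// replaced by a raw pointer to a wide string allocated once at startup
// (e.g. via Marshal.StringToHGlobalUni) and reused for every call.
[StructLayout(LayoutKind.Sequential)]
public struct MemOperationBlittable
{
    public int Pid;
    public int UserPid;
    public int Size;
    public int protection_mode;
    public int allocation_type;
    public IntPtr Addr;
    public IntPtr WriteBuffer;
    public IntPtr module_selection; // pointer to a pre-allocated wide string
}
```

Because the struct is blittable, the interop layer no longer has to allocate and convert a managed string on every DeviceIoControl call.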

Related

Writing a byte array to an address

Deleted my old post and decided to reupload with more direct questions.
I'm trying to write bytes to a memory address for my C# menu. Writing a single byte or int isn't an issue; I have an issue when trying to write multiple bytes.
The code below writes a value to increase the player speed to 555, which works perfectly fine.
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool WriteProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress, byte[] lpBuffer, int nSize, out IntPtr lpNumberOfBytesWritten);
byte[] memory = BitConverter.GetBytes((short)555); // 555 does not fit in a single byte, so write it as a 16-bit value
WriteProcessMemory(Game.hProc, Player.PlayerSpeedOffSet, memory, memory.Length, out _);
The part I'm having issues with is when I try and write multiple bytes to an address.
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool WriteProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress, byte[] lpBuffer, int nSize, out IntPtr lpNumberOfBytesWritten);
byte[] memory = { 90, 90, 90 };
WriteProcessMemory(Game.hProc, Player.UnlimitedAmmoOffSet, memory, memory.Length, out _);
"90 90 90" being unlimted ammo and to disable it, i'd write to the same address with "89 50 04".
This worked just fine with memory.dll so the handle and offset is just fine, it's just something to do with my writing to memory.
Look at the definition of the function WriteProcessMemory:
BOOL WriteProcessMemory(
    HANDLE hProcess,
    LPVOID lpBaseAddress,
    LPCVOID lpBuffer,
    SIZE_T nSize,
    SIZE_T *lpNumberOfBytesWritten
);
Note the type of the nSize parameter: it's SIZE_T. How wide is this type? You can look it up here: https://learn.microsoft.com/en-us/windows/win32/winprog/windows-data-types
typedef ULONG_PTR SIZE_T;
Okay, then what is the width of ULONG_PTR? The same documentation tells:
#if defined(_WIN64)
typedef unsigned __int64 ULONG_PTR;
#else
typedef unsigned long ULONG_PTR;
#endif
So, it's either 32 bits or 64 bits wide, depending on whether the process calling this function is a 32-bit or 64-bit process. (In Windows, unsigned long is 32 bits wide.)
Now look at your P/Invoke definition:
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool WriteProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress, byte[] lpBuffer, int nSize, out IntPtr lpNumberOfBytesWritten);
You have defined nSize as a 32-bit integer. If your program is compiled to run as a 32-bit process, you get lucky and the 32-bit integer matches the 32-bit SIZE_T.
But if your program is running as a 64-bit process, your 32-bit integer does not match the 64-bit SIZE_T.
Using IntPtr instead of int for the nSize parameter should fix your issue.
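Concretely, a corrected declaration might look like the sketch below; the pointer-sized parameters are the point, the rest is carried over unchanged from the question:

```csharp
using System;
using System.Runtime.InteropServices;

public static class NativeMethods
{
    // nSize is declared as IntPtr so it matches SIZE_T in both 32-bit
    // and 64-bit processes; lpNumberOfBytesWritten is likewise SIZE_T*.
    [DllImport("kernel32.dll", SetLastError = true)]
    public static extern bool WriteProcessMemory(
        IntPtr hProcess,
        IntPtr lpBaseAddress,
        byte[] lpBuffer,
        IntPtr nSize,
        out IntPtr lpNumberOfBytesWritten);
}
```

The call site then passes the length as a pointer-sized value: `WriteProcessMemory(hProc, addr, memory, (IntPtr)memory.Length, out _);`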

Access violation on passing byte[] from c# to plain c dll

I'm trying to use the SOIL library for my Unity3d project. I slightly modified the code to compile it to a DLL.
I have a C-function with signature:
__declspec(dllexport)
unsigned int SOIL_load_OGL_texture_from_memory
(
    const unsigned char *const buffer,
    int buffer_length,
    int force_channels,
    unsigned int reuse_texture_ID,
    unsigned int flags
);
I declared it in my c# script:
[DllImport("SOIL", CallingConvention = CallingConvention.Cdecl)]
private static extern uint SOIL_load_OGL_texture_from_memory(
    System.IntPtr buffer,
    int buffer_length,
    int force_channels,
    uint reuse_texture_ID,
    uint flags);
Try to call:
GCHandle pinnedArray = GCHandle.Alloc(bytes, GCHandleType.Pinned);
System.IntPtr pointer = pinnedArray.AddrOfPinnedObject();
uint id = SOIL_load_OGL_texture_from_memory(pointer, bytes.Length, 3, 0, 0);
pinnedArray.Free();
And I get an Access Violation. So I'm trying to pass an IntPtr as const unsigned char *. Maybe I need to use something different from GCHandle?
Edit: It crashes the whole of Unity 3D in play mode: Access Violation at MSVCR120.dll.
Try Marshal.Copy(Byte[], Int32, IntPtr, Int32) and pass the resulting pointer to your C call.
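A minimal sketch of that suggestion: copy the managed byte[] into unmanaged memory and hand the pointer to the native call. The SOIL call itself is elided; the copy-back at the end is only there to demonstrate the round trip:

```csharp
using System;
using System.Runtime.InteropServices;

byte[] bytes = { 1, 2, 3, 4 };
IntPtr unmanaged = Marshal.AllocHGlobal(bytes.Length);
try
{
    // Copy the managed array into the unmanaged block...
    Marshal.Copy(bytes, 0, unmanaged, bytes.Length);
    // ...then pass `unmanaged` to SOIL_load_OGL_texture_from_memory here.

    // Demonstration only: read the data back to show the copy succeeded.
    byte[] roundTrip = new byte[bytes.Length];
    Marshal.Copy(unmanaged, roundTrip, 0, bytes.Length);
}
finally
{
    Marshal.FreeHGlobal(unmanaged); // always release AllocHGlobal memory
}
```

Unlike a pinned GCHandle, this approach copies the data, so the native code never touches GC-managed memory.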
It was a stupid mistake, absolutely unrelated to passing the byte array. The problem was that I started Unity without -force-opengl (it defaults to Direct3D, I think) and SOIL works with OpenGL only.

Can using UnmanagedType.LPTStr instead of .ByValTStr result in memory corruption? Why?

We have a tree view in a Windows Forms app that shows files with the appropriate file icon, using the following code. My problem is that the call to GetIcon() appears to corrupt my memory, as I start getting various program crashes that I can't catch with a debugger after this call.
The program works when I change UnmanagedType.LPTStr to UnmanagedType.ByValTStr. Is this a true fix or just masking the problem?
This code appeared to be working in our last product release and I can't see anything that has changed. Using .NET 4.0. I only see the issue in Release mode.
[DllImport("Shell32.dll")]
private static extern int SHGetFileInfo(string pszPath, uint dwFileAttributes, out SHFILEINFO psfi, uint cbfileInfo, SHGFI uFlags);
[StructLayout(LayoutKind.Sequential)]
private struct SHFILEINFO
{
    public SHFILEINFO(bool b)
    {
        hIcon = IntPtr.Zero;
        iIcon = 0;
        dwAttributes = 0;
        szDisplayName = "";
        szTypeName = "";
    }
    public IntPtr hIcon;
    public int iIcon;
    public uint dwAttributes;
    [MarshalAs(UnmanagedType.LPTStr, SizeConst = 260)] // works if .ByValTStr is used instead
    public string szDisplayName;
    [MarshalAs(UnmanagedType.LPTStr, SizeConst = 80)] // works if .ByValTStr is used instead
    public string szTypeName;
};
public static Icon GetIcon(string strPath, bool bSmall)
{
    SHFILEINFO info = new SHFILEINFO(true);
    int cbFileInfo = Marshal.SizeOf(info);
    SHGFI flags;
    if (bSmall)
        flags = SHGFI.Icon | SHGFI.SmallIcon | SHGFI.UseFileAttributes;
    else
        flags = SHGFI.Icon | SHGFI.LargeIcon | SHGFI.UseFileAttributes;
    SHGetFileInfo(strPath, 256, out info, (uint)cbFileInfo, flags);
    return Icon.FromHandle(info.hIcon);
}
Well, it's not a proper LPTStr in the struct, so you can't marshal it as one and expect it to work:
typedef struct _SHFILEINFO {
    HICON hIcon;
    int iIcon;
    DWORD dwAttributes;
    TCHAR szDisplayName[MAX_PATH];
    TCHAR szTypeName[80];
} SHFILEINFO;
You use LPTStr when you've allocated a special block of memory just to hold the string (usually via Marshal.AllocHGlobal or similar) and have copied your string over to that unmanaged memory area.
You use ByValTStr when you are literally passing the actual string by value, inline in the struct, not by reference to another area in memory.
The struct wants the inline value, not a pointer.
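For completeness, a sketch of the struct with the inline buffers declared as ByValTStr. The field sizes mirror the native SHFILEINFO shown above; CharSet.Auto is assumed so that TCHAR resolves to wide characters on NT-based Windows:

```csharp
using System;
using System.Runtime.InteropServices;

// SHFILEINFO with the fixed-size TCHAR arrays marshalled by value,
// matching the native layout: the characters live inside the struct.
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Auto)]
public struct SHFILEINFO
{
    public IntPtr hIcon;
    public int iIcon;
    public uint dwAttributes;
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 260)] // MAX_PATH
    public string szDisplayName;
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 80)]
    public string szTypeName;
}
```

With ByValTStr, SHGetFileInfo writes into the struct's own inline buffers; with LPTStr it would instead scribble characters over whatever the pointer-sized field happened to contain, which is exactly the kind of corruption described above.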
I realize this is an old question, but this helped me solve a crash that seemed to suddenly start popping up more often than not. It appears that I started running into these issues after a .NET 4.5.2 update was rolled out via Windows Update. LPTStr worked before the update and ByValTStr worked after.

How to initialise an unsafe pointer in C# and convert it to a byte[]?

I put a post up yesterday, How does one create structures for C# originally written in C++.
Thank you for your responses.
I'm trying, without much success, to use DeviceIoControl on an ARM platform running WinCE 6.0 and .NET Compact Framework 2.0. All I am trying to achieve is control of a port pin, and it's proving to be a nightmare.
The following is the PInvoke declaration:
[DllImport("coredll.dll", EntryPoint = "DeviceIoControl", SetLastError = true)]
internal static extern bool DeviceIoControlCE(int hDevice,
    int dwIoControlCode,
    byte[] lpInBuffer,
    int nInBufferSize,
    byte[] lpOutBuffer,
    int nOutBufferSize,
    ref int lpBytesReturned,
    IntPtr lpOverlapped);
The PInvoke declaration suggests a byte[] may be passed to it simply. Surely it's an easy matter to write the values to each member of a structure, convert it to an array of bytes and pass it to the dll.
I have the following:
[StructLayout(LayoutKind.Sequential)]
public struct pio_desc
{
    unsafe byte* pin_name;     // Length???
    public uint pin_number;    // 4 bytes
    public uint default_value; // 4 bytes
    public byte attribute;     // 1 byte
    public uint pio_type;      // 4 bytes
}
and
pio_desc PA13 = new pio_desc();
So surely now it's a matter of doing something like:
PA13.pin_number = AT91_PIN_PA13; // Length 4 bytes
PA13.default_value = 0;          // Length 4 bytes
PA13.attribute = PIO_DEFAULT;    // Length 1 byte
PA13.pio_type = PIO_OUTPUT;      // Length 4 bytes
and to convert (pin_number for example) to a byte[]:
byte[] temp = BitConverter.GetBytes(PA13.pin_number); //uints are 4 bytes wide
byteArray[++NumberOfChars] = temp[0];
byteArray[++NumberOfChars] = temp[1];
byteArray[++NumberOfChars] = temp[2];
byteArray[++NumberOfChars] = temp[3]; //Will need to check on Endianess
Questions:
In the structure PA13, how do I initialise the unsafe pointer pin_name? The author of the driver notes that this is not used, presumably by his driver. Will Windows need this to be some value?
PA13.pin_name = ??????
Then, how do I convert this pointer to bytes that fit into my byte[] array to be passed to DeviceIoControl?
I've become quite disappointed and frustrated at how difficult it is to change the voltage level of a port pin; I've been struggling with this problem for days now. Because I come from a hardware background, I think it's going to be easier (and less elegant) for me to implement IO control on another controller and pass control data to it via a COM port.
Thanks again for any (simple) assistance.
You will need to do a few different things here. First, replace this member:
unsafe byte* pin_name; //Length???
with:
[MarshalAs(UnmanagedType.LPStr)] public string pin_name;
Then replace the in/out buffers in the P/Invoke declaration from byte[] to IntPtr. Then you can use this code to convert the data:
pio_desc PA13;
// Set the members of PA13...
IntPtr ptr = IntPtr.Zero;
try {
    var size = Marshal.SizeOf(PA13);
    ptr = Marshal.AllocHGlobal(size);
    Marshal.StructureToPtr(PA13, ptr, false);
    // Your P/Invoke call goes here.
    // size will be the "nInBufferSize" argument
    // ptr will be the "lpInBuffer" argument
} finally {
    if (ptr != IntPtr.Zero) {
        Marshal.DestroyStructure(ptr, typeof(pio_desc));
        Marshal.FreeHGlobal(ptr);
    }
}
You can make this a lot easier by lying about the [DllImport] declaration. Just declare the lpInBuffer argument as the structure type, the pinvoke marshaller will convert it to a pointer anyway. Thus:
[DllImport("coredll.dll", EntryPoint = "DeviceIoControl", SetLastError = true)]
internal static extern bool SetOutputPin(IntPtr hDevice,
    int dwIoControlCode,
    ref pio_desc lpInBuffer,
    int nInBufferSize,
    IntPtr lpOutBuffer,
    int nOutBufferSize,
    out int lpBytesReturned,
    IntPtr lpOverlapped);
Using IntPtr for lpOutBuffer because the driver probably doesn't return anything. Pass IntPtr.Zero. Same idea with the structure. If the field isn't used then simply declare it as an IntPtr:
[StructLayout(LayoutKind.Sequential)]
public struct pio_desc
{
    public IntPtr pin_name;    // Leave at IntPtr.Zero
    public uint pin_number;    // 4 bytes
    public uint default_value; // 4 bytes
    public byte attribute;     // 1 byte
    public uint pio_type;      // 4 bytes
}
Be careful about the Packing property; it makes a difference here because of the byte-sized field. You may need 1, but that's just a guess without knowing anything about the driver. If you have working C code, test the value of sizeof(pio_desc) and compare it with Marshal.SizeOf(). Pass Marshal.SizeOf(typeof(pio_desc)) as the nInBufferSize argument. If you had posted the C declarations, this would have been easier to answer accurately.
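The sizeof comparison suggested above can be sketched from the C# side as well. The struct names here are illustrative; the point is that Pack = 1 removes the padding the marshaller otherwise inserts after the byte field:

```csharp
using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)] // default packing: byte is padded
public struct PioDescDefault
{
    public IntPtr pin_name;
    public uint pin_number;
    public uint default_value;
    public byte attribute;
    public uint pio_type;
}

[StructLayout(LayoutKind.Sequential, Pack = 1)] // no padding at all
public struct PioDescPacked
{
    public IntPtr pin_name;
    public uint pin_number;
    public uint default_value;
    public byte attribute;
    public uint pio_type;
}

// Compare these against sizeof(pio_desc) from working C code to pick
// the packing that matches the driver:
// Console.WriteLine(Marshal.SizeOf(typeof(PioDescDefault)));
// Console.WriteLine(Marshal.SizeOf(typeof(PioDescPacked)));
```

Whichever of the two sizes equals the native sizeof(pio_desc) tells you which packing the driver expects.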
Declare lpInBuffer and lpOutBuffer as IntPtr. Initialize them using Marshal.AllocHGlobal (don't forget to release them with Marshal.FreeHGlobal at the end). Fill these buffers and read them back using the various Marshal.Copy overloads.

How does one create structures for C# originally written in C++

I am working on an embedded ARM platform and wish to have control over some of the GPIO pins. I have experience in C and am a newcomer to C#. It seems that controlling low-level hardware is difficult from managed-code applications. I have Windows CE 6.0 and .NET Compact Framework 2 running on my hardware.
I've found an example, written in C++ that would allow me access to GPIO port pins, however, I am struggling to implement the example in C#.
The following snippet shows how DeviceIoControl is used to control port pins:
const struct pio_desc hw_pio[] =
{
    {"LED1", AT91C_PIN_PA(13), 0, PIO_DEFAULT, PIO_OUTPUT},
    {"LED2", AT91C_PIN_PA(14), 0, PIO_DEFAULT, PIO_OUTPUT},
};
T_GPIOIOCTL_STATE * pSetState;
T_GPIOIOCTL_STATE * pGetState;
// Configure PIOs
bSuccessDevIOC = DeviceIoControl(hGPIO, IOCTL_GPIO_CONFIGURE, (LPBYTE*)hw_pio, sizeof(hw_pio), NULL, 0, NULL, NULL);
Other definitions:
struct pio_desc
{
    const char *pin_name;    /* Pin Name */
    unsigned int pin_num;    /* Pin number */
    unsigned int dft_value;  /* Default value for outputs */
    unsigned char attribute;
    enum pio_type type;
};
/* I/O type */
enum pio_type
{
    PIO_PERIPH_A,
    PIO_PERIPH_B,
    PIO_INPUT,
    PIO_OUTPUT,
    PIO_UNDEFINED
};
/* I/O attributes */
#define PIO_DEFAULT (0 << 0)
#define PIO_PULLUP (1 << 0)
#define PIO_DEGLITCH (1 << 1)
#define PIO_OPENDRAIN (1 << 2)
The following is the C# definition of DeviceIOControl. The problem parameter is lpInBuffer.
[DllImport("coredll.dll", EntryPoint = "DeviceIoControl", SetLastError = true)]
internal static extern bool DeviceIoControlCE(int hDevice,
    int dwIoControlCode,
    byte[] lpInBuffer,
    int nInBufferSize,
    byte[] lpOutBuffer,
    int nOutBufferSize,
    ref int lpBytesReturned,
    IntPtr lpOverlapped);
The questions:
How does one create these equivalent structures in C#?
How does one pass these as a byte array (byte[] lpInBuffer) to the DeviceIOControl function?
Any assistance appreciated!
Use some interop decoration. For instance, a structure like the following:
typedef struct
{
    char Data[MAXCHARS]; // assuming a #define MAXCHARS 15
    int Values[MAXCHARS];
} StSomeData;
would look like the following in C#:
[StructLayout(LayoutKind.Sequential)]
private struct StSomeData
{
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 15)]
    public string Data;
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = 15)]
    public int[] Values;
}
And use it like: StSomeData[] array = new StSomeData[3];
Note that you can use IntPtr when dealing with pointers.
For instance your call to:
DeviceIoControl(hGPIO, IOCTL_GPIO_CONFIGURE, (LPBYTE*)hw_pio
, sizeof(hw_pio), NULL, 0, NULL, NULL);
may look something like following:
// Allocate unmanaged memory for the struct and marshal it across;
// StructureToPtr requires an already-allocated destination.
IntPtr ipInBuffer = Marshal.AllocHGlobal(Marshal.SizeOf(typeof(StLPByte)));
Marshal.StructureToPtr(StLPByte, ipInBuffer, false);
bool bSuccessDevIOC = DeviceIoControl(hGPIO
    , IOCTL_GPIO_CONFIGURE
    , ipInBuffer
    , Marshal.SizeOf(typeof(StLPByte))
    , IntPtr.Zero
    , 0
    , IntPtr.Zero
    , IntPtr.Zero);
Marshal.FreeHGlobal(ipInBuffer);
Also, you can look into the usage of the unsafe keyword and try putting your code within unsafe blocks; this may be a dirty solution, since that code won't be managed code.
A byte[] is converted to LPBYTE automatically.
A char* in C# is equivalent to an unsigned short* in C++ (C# chars are two-byte UTF-16 code units).
A byte* in C# is an unsigned char* in C++.
C# enums behave similarly enough to C++ enums; you can just port them as written.
One important thing:
[StructLayout(LayoutKind.Sequential, Pack = 1)]