I have this code snippet and it runs through and works, but after leaving procedureInAClass() and continuing with the next code it crashes with "Bad_module_error". I can't see the error.
public void procedureInAClass(){ //this code is in a Class, it works but after leaving whole procedure it crashes
char** comment=(char**)Marshal.AllocHGlobal(sizeof(char*)); //comment is actually a class member
string aval="some chars in a string";
SToCP(aval, comment); //value of String to *comment
CPToS(comment); //**comment to string
}
//this part is in a static class
public static void SToCP(string s, char** c)//writes string s in *c
{
*c = SToCP(s);
}
public static char* SToCP(string s)
{
char* ret= (char*)Marshal.AllocHGlobal( sizeof(char) * (s.Length +1));
int i;
byte se = sizeof(char);
for (i = 0; i < s.Length; i++)
*(ret + se * i) = s[i];
*(ret + s.Length * se) = '\0';
return ret;
}
public static String CPToS(char** c)
{
return CPToS(*c); //passing the char* pointer held by char** c
}
public static String CPToS(char* c)
{
string ret = "";
byte s = sizeof(char);//char is two bytes long
int i = 0;
while (*(c + s * i) != '\0')//zero terminated string
ret += *(c + s * i++);
return ret;
}
Your problem is that you are multiplying your index i by sizeof(char) and then adding that to a char*. Adding 1 to a char* already advances the pointer by the size of a char (2 bytes), so you were writing to every other character slot and then continuing past the end of your allocation, overwriting memory after your string.
Try this instead:
public static char* SToCP(string s) {
char* ret = (char*)Marshal.AllocHGlobal(sizeof(char) * (s.Length + 1));
char* p = ret;
for (int i = 0; i < s.Length; i++)
*(p++) = s[i];
*(p++) = '\0';
return ret;
}
Your CPToS method makes the same mistake, so once you fix SToCP so it no longer skips every other character, you will need to fix CPToS as well, otherwise it will read back the wrong string.
Here's a fixed version of CPToS to match:
public static String CPToS(char* c) {
string ret = "";
int i = 0;
while (*(c + i) != '\0')//zero terminated string
ret += *(c + i++);
return ret;
}
It was SToCP corrupting memory past its allocation that caused the crash, though.
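As an aside, the framework already provides helpers for exactly this round trip. Here is a minimal sketch (class and method names are mine) using Marshal.StringToHGlobalUni and Marshal.PtrToStringUni, which avoids the manual pointer arithmetic entirely:

```csharp
using System;
using System.Runtime.InteropServices;

class StringMarshalDemo
{
    // Round-trips a string through unmanaged memory using the built-in
    // UTF-16 helpers instead of hand-written pointer arithmetic.
    public static string RoundTrip(string s)
    {
        IntPtr unmanaged = Marshal.StringToHGlobalUni(s); // allocates and copies, NUL-terminated
        try
        {
            return Marshal.PtrToStringUni(unmanaged);     // reads back up to the NUL
        }
        finally
        {
            Marshal.FreeHGlobal(unmanaged);               // always release the unmanaged block
        }
    }

    static void Main()
    {
        Console.WriteLine(RoundTrip("some chars in a string"));
    }
}
```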
Related
I want to call a function of a C library from .NET 6 C#. The function expects a pointer to a structure. Inside the structure is a variable-length array. I don't know how to marshal this array correctly.
The following code is test code to demonstrate the problem.
This is the header of the C-library:
typedef struct SubTest
{
char *name;
int num1;
} SubTest;
typedef struct Test
{
char *name;
int num1;
float num2;
SubTest *testarray;
int testarraylen;
} Test;
void PrintTestStruct(Test *teststruct);
This is the implementation of PrintTestStruct:
void PrintTestStruct(Test *teststruct)
{
printf("Name: %s \n", teststruct->name);
printf("Num1: %d \n", teststruct->num1);
printf("Num2: %f \n", teststruct->num2);
printf("Array content: \n");
for(int i=0; i < teststruct->testarraylen; i++)
{
printf("Array Name %d: %s \n", i,teststruct->testarray[i].name);
printf("Array Number %d: %d \n", i,teststruct->testarray[i].num1);
}
}
This is the definition in C#:
[StructLayout(LayoutKind.Sequential)]
public struct Test
{
public string name;
public int num1;
public float num2;
public IntPtr testarray;
public int testarraylen;
}
[StructLayout(LayoutKind.Sequential)]
public struct SubTest
{
public string name;
public int num1;
}
[DllImport("cshared")]
private static extern void PrintTestStruct(ref Test teststruct);
This is what I have tried:
public static void Main(string[] args)
{
var data = new Test();
data.name = "Hello from C#";
data.num1 = 5;
data.num2 = 3.2f;
data.testarraylen = 2;
var field1 = new SubTest();
field1.name = "Testarray 1";
field1.num1 = 1;
var field2 = new SubTest();
field2.name = "Testarray 2";
field2.num1 = 2;
SubTest[] subarray = {field1, field2};
IntPtr mem = Marshal.AllocCoTaskMem(Marshal.SizeOf(typeof(SubTest)) * data.testarraylen);
for (int ix = 0; ix < 2; ix++)
{
Marshal.StructureToPtr<SubTest>(subarray[ix], mem, false);
mem += Marshal.SizeOf(subarray[ix]);
}
data.testarray = mem;
PrintTestStruct(ref data);
}
Unfortunately the result is garbage; the data of the array is not printed correctly. I followed all the suggestions I found on Stack Overflow, but could not get any better results.
Question:
Is there a way to fix this?
As I have access to the source code of the C library, is there a better way to transmit these kinds of variable-length arrays between C# and C? Can I change the C library in some way to make this easier?
The problem is that you advance mem inside the loop but never reset it, so after the loop data.testarray points past the end of the allocated block:
mem += Marshal.SizeOf(subarray[ix]);
So the solution is to add the offset to mem at each copy instead of changing mem in the loop:
var sizeOfSubTest = Marshal.SizeOf(typeof(SubTest));
IntPtr mem = Marshal.AllocCoTaskMem(sizeOfSubTest * data.testarraylen);
for (int ix = 0; ix < 2; ix++)
{
Marshal.StructureToPtr(subarray[ix], mem + sizeOfSubTest * ix, false);
}
Can I change the C library in some way to make this easier?
IMO the approach you already have, combined with the fix above, is the easiest.
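Putting the fix together, here is a self-contained sketch of the allocation and copy. The native call is commented out since it needs the C library; RoundTripArray and the managed read-back are mine, added only to verify the layout:

```csharp
using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
public struct SubTest
{
    public string name;
    public int num1;
}

class ArrayMarshalDemo
{
    public static SubTest[] RoundTripArray(SubTest[] subarray)
    {
        int sizeOfSubTest = Marshal.SizeOf(typeof(SubTest));
        IntPtr mem = Marshal.AllocCoTaskMem(sizeOfSubTest * subarray.Length);
        try
        {
            // Copy each element to its own offset; mem itself is never advanced,
            // so afterwards it still points at the start of the array.
            for (int ix = 0; ix < subarray.Length; ix++)
                Marshal.StructureToPtr(subarray[ix], mem + sizeOfSubTest * ix, false);

            // Here you would set data.testarray = mem and call PrintTestStruct(ref data);
            // the native library is not available in this sketch.

            // Reading the elements back proves the layout round-trips:
            var back = new SubTest[subarray.Length];
            for (int ix = 0; ix < subarray.Length; ix++)
                back[ix] = Marshal.PtrToStructure<SubTest>(mem + sizeOfSubTest * ix);
            return back;
        }
        finally
        {
            // Free the marshaled strings, then the block itself; in real code,
            // do this only after the native call has returned.
            for (int ix = 0; ix < subarray.Length; ix++)
                Marshal.DestroyStructure<SubTest>(mem + sizeOfSubTest * ix);
            Marshal.FreeCoTaskMem(mem);
        }
    }

    static void Main()
    {
        var result = RoundTripArray(new[]
        {
            new SubTest { name = "Testarray 1", num1 = 1 },
            new SubTest { name = "Testarray 2", num1 = 2 },
        });
        Console.WriteLine(result[1].name + " " + result[1].num1);
    }
}
```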
I just started using DLLs, but I haven't had this problem before, so it might not be DLL-related. I have a KMP string-matching algorithm implemented in C++ and I am calling it from C# through a DLL.
This is my export:
extern "C" __declspec (dllexport) const char* naive(const char* text, const char* str);
extern "C" __declspec (dllexport) const char* KMP(const char* text, const char* str);
My import:
[DllImport(@"dll_path", CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr KMP([MarshalAs(UnmanagedType.LPStr)] string text, [MarshalAs(UnmanagedType.LPStr)] string str);
Calling from c#
string output = Marshal.PtrToStringAnsi(KMP(richTextBox1.Text, richTextBox2.Text));
And the c++ function:
const char* KMP(const char* text, const char* str)
{
int tL = strlen(text);
int sL = strlen(str);
/* Algorithm */
}
The exception is thrown right after the function is called, so I figured it's not the code implementation. The weird thing is that it's only thrown when there is a '\n' newline in the second parameter (str), no matter where exactly. If there are no newlines it runs normally. The thing that confuses me the most is why it is just the second argument; both are identically declared and used. I have also implemented the naive algorithm, same story.
All the answers I found dealt with a negative number being given as an array size or with an undeclared variable, but nothing on pointers. I doubt it's anything similar anyway, because when my search string (the 2nd parameter, str) doesn't contain a newline the code executes normally.
Any ideas?
Thank you in advance.
EDIT (body of function):
const char* KMP(const char* text, const char* str)
{
int tL = strlen(text);
int sL = strlen(str);
string match = "";
if (sL == 0 || tL == 0)
throw "both text and string must be larger than 0";
else if (sL > tL)
throw "the text must be longer than the string";
int tI = 0;
int col = 0, row = 0;
while (tI <= tL - sL)
{
int i = 0;
int tmpCol = -1;
int next = 1;
for (; i <= sL && text[i + tI] != '\0'; i++)
{
if (text[i + tI] == '\n')
{
row++;
tmpCol++;
}
if (text[i + tI] == str[0] && next == 1 && i > 0)
next = i;
if (text[i + tI] != str[i])
break;
}
if (i == sL)
{
match += to_string(row) + ',' + to_string(col) + ';';
}
tI += next;
col = tmpCol > -1 ? tmpCol : col + next;
}
char* c = new char[match.length() - 1];
c[match.length() - 1] = '\0';
for (int i = 0; i < match.length() - 1; i++)
c[i] = match[i];
return c;
}
Just change your code to handle the no-matches case, because the runtime cannot allocate 0 - 1 = 0xFFFFFFFF bytes. I have also changed your buffer allocation and copy loop to avoid the overwrite (as pointed out by @HenkHoltermann):
...
if (match.length() == 0)
return "No matches";
// Allocate for all chars + \0 except the last semicolon
char* c = new char[match.length()];
c[match.length() - 1] = '\0';
// Copy all chars except the last semicolon
for (int i = 0; i < match.length() - 1; i++)
c[i] = match[i];
return c;
Note that it still does not copy the last semicolon, so if you need it you will have to add one more symbol to the buffer.
P.S.: Also I see a few issues with your code:
You use C++ exceptions. While the CLR will catch them as SEH exceptions (because VC++ implements C++ exceptions on top of SEH), it is still not a good idea overall - see Throwing C++ exceptions across DLL boundaries.
You use a signed int for the length (int tL = strlen(text);) while strlen returns an unsigned size_t. It may not be an actual problem here, but it is not the right way either.
I have a function in a small application that I'm writing to break a recycled one-time pad cypher. Having used VB.NET for most of my career I thought it would be interesting to implement the app in C#. However, I have encountered a problem due to my present unfamiliarity with C#.
The function takes in two strings (of binary digits), converts these strings to char arrays, and then performs an XOR on them and places the result in a third char array.
This is fine until I try to convert the third char array to a string. Instead of the string looking like "11001101" etc, I get the following result: " \0\0 \0 " i.e. the "1"s are being represented by spaces and the "0"s by "\0".
My code is as follows:
public string calcXor(string a, string b)
{
char[] charAArray = a.ToCharArray();
char[] charBArray = b.ToCharArray();
int len = 0;
// Set length to be the length of the shorter string
if (a.Length > b.Length)
len = b.Length - 1;
else
len = a.Length - 1;
char[] result = new char[len];
for (int i = 0; i < len; i++)
{
result[i] = (char)(charAArray[i] ^ charBArray[i]);
}
return new string(result);
}
Your problem is in the line
result[i] = (char)(charAArray[i] ^ charBArray[i]);
that should be
// (Char) 1 is not '1'!
result[i] = (char)((charAArray[i] ^ charBArray[i]) + '0');
A more compact solution is to use a StringBuilder instead of arrays:
public string calcXor(String a, String b) {
int len = (a.Length < b.Length) ? a.Length : b.Length;
StringBuilder Sb = new StringBuilder();
for (int i = 0; i < len; ++i)
// Sb.Append(CharToBinary(a[i] ^ b[i])); // <- If you want 0's and 1's
Sb.Append(a[i] ^ b[i]); // <- Just int, not in binary format as in your solution
return Sb.ToString();
}
public static String CharToBinary(int value, Boolean useUnicode = false) {
int size = useUnicode ? 16 : 8;
StringBuilder Sb = new StringBuilder(size);
Sb.Length = size;
for (int i = size - 1; i >= 0; --i) {
Sb[i] = value % 2 == 0 ? '0' : '1';
value /= 2;
}
return Sb.ToString();
}
Note that the compact solution just computes the xor values (e.g. 65) and puts them in a row (e.g. 65728...); if you want a representation in 0's and 1's, use the CharToBinary formatting shown above.
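To see why the + '0' matters: XOR-ing the digit characters yields the numeric values 0 or 1, not the characters '0' or '1'. A tiny sketch (class and method names are mine):

```csharp
using System;

class XorCharDemo
{
    // XOR the digit characters, then shift back into the printable digit range.
    public static char XorBit(char a, char b) => (char)((a ^ b) + '0');

    static void Main()
    {
        // '1' is 49 and '0' is 48, so 49 ^ 48 == 1: the unprintable SOH character, not '1'.
        Console.WriteLine((int)(char)('1' ^ '0')); // 1
        Console.WriteLine(XorBit('1', '0'));       // 1
    }
}
```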
Have a look at the ASCII Table. 0 is the Null character \0. You could try ToString()
Have you tried using binary / byte[]? It seems like the fastest way to me.
public string calcXor(string a, string b)
{
//String to binary
byte[] ab = ConvertToBinary(a);
byte[] bb = ConvertToBinary(b);
//XOR byte by byte (byte arrays cannot be XOR'ed with ^ directly)
int len = Math.Min(ab.Length, bb.Length);
byte[] cb = new byte[len];
for (int i = 0; i < len; i++)
cb[i] = (byte)(ab[i] ^ bb[i]);
//byte[].ToString() would only return the type name, so format the bytes instead
return BitConverter.ToString(cb);
}
public static byte[] ConvertToBinary(string str)
{
System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();
return encoding.GetBytes(str);
}
I just wanted to add that the solution I eventually chose is as follows:
//Parameter binary is a bit string
public void someroutine(String binary)
{
var data = GetBytesFromBinaryString(binary);
var text = Encoding.ASCII.GetString(data);
}
public Byte[] GetBytesFromBinaryString(String binary)
{
var list = new List<Byte>();
for (int i = 0; i < binary.Length; i += 8)
{
String t = binary.Substring(i, 8);
list.Add(Convert.ToByte(t, 2));
}
return list.ToArray();
}
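To make the chosen solution concrete, here is a self-contained usage sketch of that helper (the class name and the 16-bit sample input are mine, for illustration; the input length is assumed to be a multiple of 8):

```csharp
using System;
using System.Collections.Generic;
using System.Text;

class BitStringDemo
{
    // Packs each 8-character group of '0'/'1' digits into one byte.
    public static byte[] GetBytesFromBinaryString(string binary)
    {
        var list = new List<byte>();
        for (int i = 0; i < binary.Length; i += 8)
            list.Add(Convert.ToByte(binary.Substring(i, 8), 2));
        return list.ToArray();
    }

    static void Main()
    {
        // "01001000" = 72 = 'H', "01101001" = 105 = 'i'
        var data = GetBytesFromBinaryString("0100100001101001");
        Console.WriteLine(Encoding.ASCII.GetString(data)); // Hi
    }
}
```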
I have a char** in a C struct which is allocated in the C code as a Nx128 matrix. In C# I have an array of strings and I want to copy this array to the char double pointer, without reallocating anything. I tried this:
public void StringArrayToPtr(IntPtr ptr, string[] array)
{
for (int i = 0; i < array.Length; i++)
{
char[] chars = (array[i] + '\0').ToCharArray();
Marshal.Copy(chars, 0, IntPtr.Add(ptr, 128*i), chars.Length);
}
}
But this doesn't work. Does somebody know how to perform such a copy?
UPDATE:
Here is how my char** names is allocated in the C code for 3 items: names = (char **) MallocArray2D (3, 128, sizeof ( char ));
Here is the details of the MallocArray2D method:
void ** MallocArray2D (
int n1,
int n2,
int size_elem )
{
void ** p2;
void * p1;
size_t i;
p1 = (void *) malloc (size_elem * n1 * n2);
p2 = (void **) malloc (n1 * sizeof ( void * ));
for ( i = 0 ; i < n1 ; i++ )
{
p2[i] = (char *) p1 + size_elem * n2 * i;
}
return p2;
}
This MallocArray2D function is called inside MallocImage, which is exposed to my C# code.
Here is the interesting part of the MallocImage function in the C code:
int MallocImage (
IMAGE * image,
int nxyz,
int nvar )
{
//... allocating other objects
image->names = (char **) MPDSMallocArray2D ( nvar, 128, sizeof ( char ));
}
Now my C# exposed MallocImage method:
[DllImport(DLL_PATH, CallingConvention = CallingConvention.Cdecl)]
public static extern int MallocImage([In, Out]Image image, int nXYZ, int nVar);
// My Image class
[StructLayout(LayoutKind.Sequential)]
class Image {
private IntPtr names;
public string[] Names {
set {ArrayOfStringToPtr(names, value);}
}
// Constructor
public Image(int nVar, int nXYZ) {
MallocImage(this, nXYZ, nVar);
}
}
// Somewhere else in my code
image.Names = new string[] {"porosity", "duplicity", "facies"};
A System.Char is a 16-bit Unicode character [MSDN]. You are probably working with ASCII strings.
The pointer arithmetic seems wrong here: since you are working with a pointer that points to an array of three pointers, you need to get the address of each string using Marshal.ReadIntPtr(ptr, i * IntPtr.Size). The resulting code will be:
public static void StringArrayToPtr(IntPtr ptr, string[] array)
{
for (int i = 0; i < array.Length; i++)
{
byte[] chars = System.Text.Encoding.ASCII.GetBytes(array[i] + '\0');
Marshal.Copy(chars, 0, Marshal.ReadIntPtr(ptr, i * IntPtr.Size), chars.Length);
}
}
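To check that logic without the native library, the C side's Nx128 layout can be simulated with Marshal.AllocHGlobal (an assumption for testing only; in the real program the memory comes from MallocArray2D):

```csharp
using System;
using System.Runtime.InteropServices;
using System.Text;

class StringArrayDemo
{
    public static void StringArrayToPtr(IntPtr ptr, string[] array)
    {
        for (int i = 0; i < array.Length; i++)
        {
            byte[] chars = Encoding.ASCII.GetBytes(array[i] + '\0');
            // Follow the i-th inner pointer, then copy the ASCII bytes there.
            Marshal.Copy(chars, 0, Marshal.ReadIntPtr(ptr, i * IntPtr.Size), chars.Length);
        }
    }

    // Builds the same layout as MallocArray2D: one data block of n rows of
    // rowLen bytes, plus a block of n pointers into it.
    public static string[] RoundTrip(string[] values, int rowLen)
    {
        int n = values.Length;
        IntPtr data = Marshal.AllocHGlobal(n * rowLen);
        IntPtr rows = Marshal.AllocHGlobal(n * IntPtr.Size);
        try
        {
            for (int i = 0; i < n; i++)
                Marshal.WriteIntPtr(rows, i * IntPtr.Size, data + i * rowLen);

            StringArrayToPtr(rows, values);

            var back = new string[n];
            for (int i = 0; i < n; i++)
                back[i] = Marshal.PtrToStringAnsi(Marshal.ReadIntPtr(rows, i * IntPtr.Size));
            return back;
        }
        finally
        {
            Marshal.FreeHGlobal(rows);
            Marshal.FreeHGlobal(data);
        }
    }

    static void Main()
    {
        foreach (var s in RoundTrip(new[] { "porosity", "duplicity", "facies" }, 128))
            Console.WriteLine(s);
    }
}
```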
How would you convert a paragraph to hex notation, and then back again into its original string form?
(C#)
A side note: would putting the string into hex format shrink it the most, without getting into hardcore compression algorithms?
What exactly do you mean by "hex notation"? That usually refers to encoding binary data, not text. You'd need to encode the text somehow (e.g. using UTF-8) and then encode the binary data as text by converting each byte to a pair of characters.
using System;
using System.Text;
public class Hex
{
static void Main()
{
string original = "The quick brown fox jumps over the lazy dog.";
byte[] binary = Encoding.UTF8.GetBytes(original);
string hex = BytesToHex(binary);
Console.WriteLine("Hex: {0}", hex);
byte[] backToBinary = HexToBytes(hex);
string restored = Encoding.UTF8.GetString(backToBinary);
Console.WriteLine("Restored: {0}", restored);
}
private static readonly char[] HexChars = "0123456789ABCDEF".ToCharArray();
public static string BytesToHex(byte[] data)
{
StringBuilder builder = new StringBuilder(data.Length*2);
foreach(byte b in data)
{
builder.Append(HexChars[b >> 4]);
builder.Append(HexChars[b & 0xf]);
}
return builder.ToString();
}
public static byte[] HexToBytes(string text)
{
if ((text.Length & 1) != 0)
{
throw new ArgumentException("Invalid hex: odd length");
}
byte[] ret = new byte[text.Length/2];
for (int i=0; i < text.Length; i += 2)
{
ret[i/2] = (byte)(ParseNybble(text[i]) << 4 | ParseNybble(text[i+1]));
}
return ret;
}
private static int ParseNybble(char c)
{
if (c >= '0' && c <= '9')
{
return c-'0';
}
if (c >= 'A' && c <= 'F')
{
return c-'A'+10;
}
if (c >= 'a' && c <= 'f')
{
return c-'a'+10;
}
throw new ArgumentOutOfRangeException("Invalid hex digit: " + c);
}
}
No, doing this would not shrink it at all. Quite the reverse - you'd end up with a lot more text! However, you could compress the binary form. In terms of representing arbitrary binary data as text, Base64 is more efficient than plain hex. Use Convert.ToBase64String and Convert.FromBase64String for the conversions.
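A minimal sketch of that Base64 round trip (class and method names are mine):

```csharp
using System;
using System.Text;

class Base64Demo
{
    // Encode the UTF-8 bytes of a string as Base64 text.
    public static string Encode(string s) => Convert.ToBase64String(Encoding.UTF8.GetBytes(s));

    // Decode Base64 text back into the original string.
    public static string Decode(string b64) => Encoding.UTF8.GetString(Convert.FromBase64String(b64));

    static void Main()
    {
        string original = "The quick brown fox jumps over the lazy dog.";
        string encoded = Encode(original);
        Console.WriteLine(encoded);
        Console.WriteLine(Decode(encoded)); // identical to the original
    }
}
```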
public string ConvertToHex(string asciiString)
{
string hex = "";
foreach (char c in asciiString)
{
hex += String.Format("{0:x2}", (int)c); // each character code as two hex digits
}
return hex;
}
While I can't help much on the C# implementation, I would highly recommend LZW as a simple-to-implement data compression algorithm for you to use.
Perhaps the answer can be more quickly reached if we ask: what are you really trying to do? Converting an ordinary string to a string of a hex representation seems like the wrong approach to anything, unless you are making a hexadecimal/encoding tutorial for the web.
static byte[] HexToBinary(string s) {
byte[] b = new byte[s.Length / 2];
for (int i = 0; i < b.Length; i++)
b[i] = Convert.ToByte(s.Substring(i * 2, 2), 16);
return b;
}
static string BinaryToHex(byte[] b) {
StringBuilder sb = new StringBuilder(b.Length * 2);
for (int i = 0; i < b.Length; i++)
sb.Append(Convert.ToString(256 + b[i], 16).Substring(1, 2));
return sb.ToString();
}