Writing a byte array from a string using an unknown DLL - C#

I have the following code:
int BufSize = 60000000;
int BufSizeM1M = BufSize - 1000000;
byte[] ByteBuf = new byte[BufSizeM1M];
byte[] ByteBufVer = new byte[BufSizeM1M];
using (WinFileIO WFIO = new WinFileIO(ByteBuf))
{
    WFIO.OpenForWriting(path);
    Byte[] BytesInFiles = GetBytes(content);
    WFIO.WriteBlocks(BytesInFiles.Length);
}
EDIT:
This is the original code I was working with. Trying to modify it myself keeps failing, so I was hoping you guys might have a look:
int BufSize = 60000000;
int BufSizeM1M = BufSize - 1000000;
byte[] ByteBuf = new byte[BufSizeM1M];
byte[] ByteBufVer = new byte[BufSizeM1M];
int[] BytesInFiles = new int[3];
using (WinFileIO WFIO = new WinFileIO(ByteBuf))
{
    WFIO.OpenForWriting(path);
    WFIO.WriteBlocks(BytesInFiles[FileLoop]);
}
FileLoop is an int between 0 and 3 (the code was run in a loop); this was used for testing write speed.
How would one change it to write the actual content of a string?
The WFIO DLL was provided to me without instructions and I cannot seem to get it to work.
The code above is the best I could manage, but it writes a file filled with spaces instead of the actual string in the content variable. Help, please.

You seem to be passing only a length (number of bytes) to this component, so it probably doesn't know what to write. Your ByteBuf array is initialized to an empty byte array, so you are probably writing out BytesInFiles.Length zeros. You put the converted content into BytesInFiles, but you never use that buffer for writing - you only use its length.
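Since the DLL is undocumented, one guess worth trying (the assumption: WriteBlocks(n) writes the first n bytes of the buffer passed to the constructor) is to copy the string's bytes into ByteBuf before writing:
Byte[] BytesInFiles = GetBytes(content);
// Copy the converted string into the buffer the DLL writes from.
Buffer.BlockCopy(BytesInFiles, 0, ByteBuf, 0, BytesInFiles.Length);
using (WinFileIO WFIO = new WinFileIO(ByteBuf))
{
    WFIO.OpenForWriting(path);
    WFIO.WriteBlocks(BytesInFiles.Length);
}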

I think you might be missing a step here. Once you've done:
Byte[] BytesInFiles = GetBytes(content);
Won't you need to do something with BytesInFiles? Currently it seems as though you are writing chunks of ByteBuf, which will have been initialized to contain all zeros when you created it.
Edit: Would something like this help?
Byte[] BytesInFiles = GetBytes(content);
using (WinFileIO WFIO = new WinFileIO(BytesInFiles))
{
    WFIO.OpenForWriting(path);
    WFIO.WriteBlocks(BytesInFiles.Length);
}

Related

How to Convert Primitive[] to byte[]

For serialization of a primitive array, I'm wondering how to convert a Primitive[] to its corresponding byte[] (i.e. an int[128] to a byte[512], or a ushort[] to a byte[], ...).
The destination can be a MemoryStream, a network message, a file, anything.
The goal is performance (serialization and deserialization time): being able to write a byte[] to a stream in one shot instead of looping through all values or allocating with some converter.
Some solutions already explored:
Regular Loop to write/read
// array = any int[];
myStreamWriter.WriteInt32(array.Length);
for (int i = 0; i < array.Length; ++i)
    myStreamWriter.WriteInt32(array[i]);
This solution works for serialization and deserialization, and is about 100 times faster than using standard System.Runtime.Serialization combined with a BinaryFormatter to serialize/deserialize a single int, or a couple of them.
But this solution becomes slower once the array contains more than 200-300 values (for Int32).
Cast?
It seems C# can't directly cast an int[] to a byte[], or a bool[] to a byte[].
BitConverter.GetBytes()
This solution works, but it allocates a new byte[] on each iteration of the loop through my int[]. Performance is, of course, horrible.
Marshal.Copy
Yup, this solution works too, but it has the same problem as the BitConverter one.
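For reference, this is roughly what the Marshal.Copy route can look like (a sketch using GCHandle pinning; it still allocates and copies the destination, which is exactly the complaint):
using System.Runtime.InteropServices;

int[] source = { 1, 2, 3, 4 };
byte[] dest = new byte[source.Length * sizeof(int)];
// Pin the managed array so its address is stable, then copy the raw bytes out.
GCHandle handle = GCHandle.Alloc(source, GCHandleType.Pinned);
try
{
    Marshal.Copy(handle.AddrOfPinnedObject(), dest, 0, dest.Length);
}
finally
{
    handle.Free();
}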
C++ hack
Because a direct cast is not allowed in C#, I tried a C++ hack after seeing in memory that the array length is stored 4 bytes before the array data starts:
ARRAYCAST_API void Cast(int* input, unsigned char** output)
{
    // get the address of the input (this is a pointer to the data)
    int* count = input;
    // the size of the buffer is located just before the data (4 bytes before, as this is an int)
    count--;
    // multiply the number of elements by 4, as an int is 4 bytes
    *count = *count * 4;
    // set the address of the byte array
    *output = (unsigned char*)(input);
}
and the C# that calls it:
byte[] arrayB = null;
int[] arrayI = new int[128];
for (int i = 0; i < 128; ++i)
    arrayI[i] = i;
// delegate call
fptr(arrayI, out arrayB);
I successfully retrieve my int[128] in C++, switch the array length, and assign the right address to my 'output' var, but C# only retrieves a byte[1] in return. It seems I can't hack a managed variable that easily.
So I'm really starting to think that all these casts I want to achieve (int[] -> byte[], bool[] -> byte[], double[] -> byte[], ...) are just impossible in C# without allocating/copying...
What am I missing?
How about using Buffer.BlockCopy?
// serialize
var intArray = new[] { 1, 2, 3, 4, 5, 6, 7, 8 };
var byteArray = new byte[intArray.Length * 4];
Buffer.BlockCopy(intArray, 0, byteArray, 0, byteArray.Length);
// deserialize and test
var intArray2 = new int[byteArray.Length / 4];
Buffer.BlockCopy(byteArray, 0, intArray2, 0, byteArray.Length);
Console.WriteLine(intArray.SequenceEqual(intArray2)); // true
Note that BlockCopy is still allocating/copying behind the scenes. I'm fairly sure that this is unavoidable in managed code, and BlockCopy is probably about as good as it gets for this.

What is the best way to prep data for serial transmission?

I am working on a C# program which will communicate with a VFD using the Mitsubishi communication protocol.
I am preparing several methods to create an array of bytes to be sent out.
Right now, I have typed up more of a brute-force method of preparing and sending the bytes.
public void A(Int16 Instruction, byte WAIT, Int32 Data)
{
    byte[] A_Bytes = new byte[13];
    A_Bytes[0] = C_ENQ;
    A_Bytes[1] = 0x00;
    A_Bytes[2] = 0x00;
    A_Bytes[3] = BitConverter.GetBytes(Instruction)[0];
    A_Bytes[4] = BitConverter.GetBytes(Instruction)[1];
    A_Bytes[5] = WAIT;
    A_Bytes[6] = BitConverter.GetBytes(Data)[0];
    A_Bytes[7] = BitConverter.GetBytes(Data)[1];
    A_Bytes[8] = BitConverter.GetBytes(Data)[2];
    A_Bytes[9] = BitConverter.GetBytes(Data)[3];

    Int16 SUM = 0;
    for (int i = 0; i < 10; i++)
    {
        SUM += A_Bytes[i];
    }

    A_Bytes[10] = BitConverter.GetBytes(SUM)[0];
    A_Bytes[11] = BitConverter.GetBytes(SUM)[1];
    A_Bytes[12] = C_CR;

    itsPort.Write(A_Bytes, 0, 13);
}
However, something seems very inefficient about this, especially the fact that I call GetBytes() so often.
Is this a good method, or is there a vastly shorter/faster one?
MAJOR UPDATE:
It turns out the Mitsubishi structure is a little wonky in how it does all this.
Instead of working with bytes, it works with ASCII chars. So while ENQ is still 0x05, an instruction code of E1, for instance, is actually 0x45 and 0x31.
This might actually make things easier.
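If the framing really is ASCII, the conversion is just a string encode; a small sketch:
// "E1" encoded as ASCII gives exactly the two bytes 0x45, 0x31 mentioned above.
byte[] instruction = System.Text.Encoding.ASCII.GetBytes("E1");
Console.WriteLine(BitConverter.ToString(instruction)); // 45-31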
Even without changing your algorithm, this can be made a bit more efficient and a bit more C#-like. If concatenating the two arrays bothers you, that part is of course optional.
var instructionBytes = BitConverter.GetBytes(instruction);
var dataBytes = BitConverter.GetBytes(data);
var contentBytes = new byte[] {
    C_ENQ, 0x00, 0x00, instructionBytes[0], instructionBytes[1], wait,
    dataBytes[0], dataBytes[1], dataBytes[2], dataBytes[3]
};
short sum = 0;
foreach (var byteValue in contentBytes)
{
    sum += byteValue;
}
var sumBytes = BitConverter.GetBytes(sum);
// Concat comes from System.Linq; ToArray() is needed because the port's Write wants a byte[].
var messageBytes = contentBytes.Concat(new byte[] { sumBytes[0], sumBytes[1], C_CR }).ToArray();
itsPort.Write(messageBytes, 0, messageBytes.Length);
What I would suggest, though, if you find yourself writing a lot of code like this, is to consider wrapping it up in a Message class. This code would form the basis of your constructor. You could then vary behavior (make things longer, shorter, etc.) with inheritance (or composition) and deal with the message as an object rather than as a byte array.
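A sketch of that Message idea (class and member names are illustrative; C_ENQ is 0x05 per the update above, and C_CR is assumed to be the usual 0x0D):
using System;
using System.Collections.Generic;
using System.Linq;

class VfdMessage
{
    private const byte C_ENQ = 0x05; // from the protocol notes above
    private const byte C_CR = 0x0D;  // assumption: standard carriage return

    private readonly List<byte> _bytes = new List<byte>();

    public VfdMessage(short instruction, byte wait, int data)
    {
        _bytes.Add(C_ENQ);
        _bytes.Add(0x00);
        _bytes.Add(0x00);
        _bytes.AddRange(BitConverter.GetBytes(instruction));
        _bytes.Add(wait);
        _bytes.AddRange(BitConverter.GetBytes(data));
    }

    public byte[] ToBytes()
    {
        // Sum the content bytes, then append checksum and terminator.
        short sum = 0;
        foreach (var b in _bytes) sum += b;
        var sumBytes = BitConverter.GetBytes(sum);
        return _bytes.Concat(new[] { sumBytes[0], sumBytes[1], C_CR }).ToArray();
    }
}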
Incidentally, you may see marginal gains from using BinaryWriter rather than BitConverter (maybe?), but it's more hassle to use. (byte)(sum >> 8) is another option as well, which I think is actually the fastest and probably makes the most sense in your use case.
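For completeness, splitting the 16-bit checksum with shifts looks like this (low byte first, matching what BitConverter.GetBytes returns on a little-endian machine):
short sum = 0x1234;
byte low = (byte)sum;          // 0x34
byte high = (byte)(sum >> 8);  // 0x12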

byte[] to byte[int] - C#

Well, this problem has been causing me trouble for quite some time now :-(
In short, I have this code:
byte[] code;
byte[] moreCodes;
int x = moreCodes.Length;
code = moreCodes[x]; // Error !!
I have also tried this method:
for(int i = 0; i < moreCodes.Length; i++)
{
code = moreCodes[i]; // Error !!
}
So my question is: how do I apply/copy several bytes of code to an empty byte container?
The byte[] code is currently empty; I want it to contain the full contents of moreCodes.
An alternative idea I had was to use the for loop to apply moreCodes to itself, like this:
for(int i = 0; i < moreCodes.Length; i++)
{
moreCodes = moreCodes[i] ; // Error !!
}
Any ideas on how to fix this would be greatly appreciated. I feel like this is a silly issue that I should be able to solve, but it's definitely one of those I just can't get my head around.
Thanks for reading.
Right now your code won't compile because you are mixing byte arrays with single bytes. Arrays hold bytes, but it makes no sense to try to make an array equal to just one byte.
Also, moreCodes is never given a value, so even moreCodes.Length is an error.
That's because you don't have "an empty byte container"; you don't have any container at all.
Try
List<byte> moreCodes = new List<byte>();
and then you can add to it
moreCodes.Add(0xAA);
and when all your data is added, turn it into an array:
code = moreCodes.ToArray();
Or, if you know the desired length in advance, you can use an array:
byte[] moreCodes = new byte[72]; // here [72] specifies the size
for (int i = 0; i < moreCodes.Length; ++i)
    moreCodes[i] = (byte)i; // here [i] accesses one byte within the array
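And if the goal is simply to make code a full copy of an existing moreCodes array, allocating a destination of the same size and copying is enough; a minimal sketch:
byte[] moreCodes = { 0x01, 0x02, 0x03 };
// Allocate the destination first, then copy every byte across.
byte[] code = new byte[moreCodes.Length];
Array.Copy(moreCodes, code, moreCodes.Length);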

Search ReadAllBytes for specific values

I am writing a program that reads '.exe' files and stores their hex values in an array of bytes for comparison against an array containing a series of values (like a very simple virus scanner).
byte[] buffer = File.ReadAllBytes(currentDirectoryContents[j]);
I have then used BitConverter to create a single string of these values:
string hex = BitConverter.ToString(buffer);
The next step is to search this string for a series of values (definitions) and return positive for a match. This is where I am running into problems. My definitions are hex values, created and saved in Notepad as definitions.xyz:
string[] definitions = File.ReadAllLines(@"C:\definitions.xyz");
I had been trying to read them into a string array and compare the definition elements of the array with the string hex:
bool[] test = new bool[currentDirectoryContents.Length];
test[j] = hex.Contains(definitions[i]);
This IS a section from a piece of homework, which is why I am not posting my entire code for the program. I had not used C# before last Friday so am most likely making silly mistakes at this point.
Any advice much appreciated :)
It is pretty unclear exactly what format you use for the definitions. Base64 is a good encoding for a byte[]; you can rapidly convert back and forth with Convert.ToBase64String() and Convert.FromBase64String(). But your question suggests the bytes are encoded in hex. Let's assume it looks like "01020304" for a new byte[] { 1, 2, 3, 4 }. Then this helper function converts such a string back to a byte[]:
static byte[] Hex2Bytes(string hex) {
    if (hex.Length % 2 != 0) throw new ArgumentException();
    var retval = new byte[hex.Length / 2];
    for (int ix = 0; ix < hex.Length; ix += 2) {
        retval[ix / 2] = byte.Parse(hex.Substring(ix, 2), System.Globalization.NumberStyles.HexNumber);
    }
    return retval;
}
You can now do a fast pattern search with an algorithm like Boyer-Moore.
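A naive scan shows the idea; Boyer-Moore improves on it by skipping ahead on mismatches instead of re-checking every position. A sketch (the helper name is illustrative):
// Returns the first index of needle within haystack, or -1 if absent.
static int IndexOf(byte[] haystack, byte[] needle)
{
    for (int i = 0; i <= haystack.Length - needle.Length; i++)
    {
        int j = 0;
        while (j < needle.Length && haystack[i + j] == needle[j])
            j++;
        if (j == needle.Length)
            return i;
    }
    return -1;
}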
I expect you understand that this is a very inefficient way to do it. But that aside, you should just do something like this:
bool[] test = new bool[currentDirectoryContents.Length];
for (int i = 0; i < test.Length; i++) {
    byte[] buffer = File.ReadAllBytes(currentDirectoryContents[i]);
    string hex = BitConverter.ToString(buffer);
    test[i] = ContainsAny(hex, definitions);
}

bool ContainsAny(string s, string[] values) {
    foreach (string value in values) {
        if (s.Contains(value)) {
            return true;
        }
    }
    return false;
}
If you can use LINQ, you can do it like this:
var test = currentDirectoryContents.Select(
    file => definitions.Any(
        definition => BitConverter.ToString(File.ReadAllBytes(file)).Contains(definition)
    )
).ToArray();
Also, make sure that your definitions file is formatted in a way that matches the output of BitConverter.ToString(): upper-case, with dashes separating each encoded byte:
12-AB-F0-34
54-AC-FF-01-02
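One way to guarantee the two formats match is to generate each definition line with the same call; for instance:
// Prints "12-AB-F0-34", exactly the format hex.Contains() is matched against.
byte[] signature = { 0x12, 0xAB, 0xF0, 0x34 };
Console.WriteLine(BitConverter.ToString(signature));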

Better/faster way to fill a big array in C#

I have three *.dat files (346KB, 725KB, and 1762KB) that are filled with a JSON string of "big" int arrays.
Each time my object is created (which happens several times) I take those three files and use JsonConvert.DeserializeObject to deserialize the arrays into the object.
I thought about using binary files instead of a JSON string, or could I even save these arrays directly? I don't need to use these files; it's just the location where the data is currently saved. I would gladly switch to anything faster.
What are the different ways to speed up the initialization of these objects?
The fastest way is to manually serialize the data.
An easy way to do this is by creating a FileStream and then wrapping it in a BinaryWriter/BinaryReader.
You have access to functions to write the basic data structures (numbers, string, char, byte[] and char[]).
An easy way to write an int[] (unnecessary if it's of fixed size) is by prepending the length of the array as an int/long (depending on the size; unsigned doesn't really give any advantage, since arrays use signed data types for their length storage), and then writing all the ints.
Two ways to write all the ints would be:
1. Simply loop over the entire array.
2. Convert it into a byte[] and write it using BinaryWriter.Write(byte[]).
This is how you can implement both:
// Writing
BinaryWriter writer = new BinaryWriter(new FileStream(...));
int[] intArr = new int[1000];
writer.Write(intArr.Length);
for (int i = 0; i < intArr.Length; i++)
    writer.Write(intArr[i]);

// Reading
BinaryReader reader = new BinaryReader(new FileStream(...));
int[] intArr = new int[reader.ReadInt32()];
for (int i = 0; i < intArr.Length; i++)
    intArr[i] = reader.ReadInt32();

// Writing, method 2
BinaryWriter writer = new BinaryWriter(new FileStream(...));
int[] intArr = new int[1000];
byte[] byteArr = new byte[intArr.Length * sizeof(int)];
Buffer.BlockCopy(intArr, 0, byteArr, 0, intArr.Length * sizeof(int));
writer.Write(intArr.Length);
writer.Write(byteArr);

// Reading, method 2
BinaryReader reader = new BinaryReader(new FileStream(...));
int[] intArr = new int[reader.ReadInt32()];
byte[] byteArr = reader.ReadBytes(intArr.Length * sizeof(int));
Buffer.BlockCopy(byteArr, 0, intArr, 0, byteArr.Length);
I decided to put this all to the test with an array of 10,000 integers, running the test 10,000 times.
Method 1 consumed an average of 888,200 ns on my system (about 0.89 ms).
Method 2 consumed an average of only 568,600 ns on my system (about 0.57 ms).
Both times include the work the garbage collector has to do.
Obviously method 2 is faster than method 1, though possibly less readable.
One reason method 1 can still be better than method 2 is that method 2 requires roughly twice as much free RAM as the data you're going to write (the original int[] plus the byte[] converted from it), which matters when dealing with limited RAM or extremely large files (think 512MB+). In that case you can make a hybrid solution, writing away, for example, 128MB at a time.
Note that method 1 also needs this extra space, but because it is split into one operation per item of the int[], the memory can be released a lot earlier.
Something like this will write 128MB of an int[] at a time:
const int WRITECOUNT = 32 * 1024 * 1024; // ints per chunk; 32M ints = 128MB of bytes
int[] intArr = new int[140 * 1024 * 1024]; // 140M ints = 560MB
for (int i = 0; i < intArr.Length; i++)
    intArr[i] = i;
byte[] byteArr = new byte[WRITECOUNT * sizeof(int)]; // 128MB staging buffer
int dataDone = 0;
using (Stream fileStream = new FileStream("data.dat", FileMode.Create))
using (BinaryWriter writer = new BinaryWriter(fileStream))
{
    while (dataDone < intArr.Length)
    {
        int dataToWrite = intArr.Length - dataDone;
        if (dataToWrite > WRITECOUNT) dataToWrite = WRITECOUNT;
        // BlockCopy offsets and counts are in bytes, so scale by sizeof(int).
        Buffer.BlockCopy(intArr, dataDone * sizeof(int), byteArr, 0, dataToWrite * sizeof(int));
        // Only write the bytes actually filled; the final chunk may be smaller.
        writer.Write(byteArr, 0, dataToWrite * sizeof(int));
        dataDone += dataToWrite;
    }
}
Note that this is just for writing; reading works a bit differently too :P.
I hope this gives you some more insight into dealing with very large data files :).
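For completeness, a matching chunked read might look like this (a sketch; since the writer above stores no length prefix, the element count is derived from the file size):
const int READCOUNT = 32 * 1024 * 1024; // ints per chunk, same as WRITECOUNT
using (Stream fileStream = new FileStream("data.dat", FileMode.Open))
using (BinaryReader reader = new BinaryReader(fileStream))
{
    int[] intArr = new int[(int)(fileStream.Length / sizeof(int))];
    int dataDone = 0;
    while (dataDone < intArr.Length)
    {
        int dataToRead = intArr.Length - dataDone;
        if (dataToRead > READCOUNT) dataToRead = READCOUNT;
        byte[] byteArr = reader.ReadBytes(dataToRead * sizeof(int));
        // BlockCopy offsets and counts are in bytes on both sides.
        Buffer.BlockCopy(byteArr, 0, intArr, dataDone * sizeof(int), byteArr.Length);
        dataDone += dataToRead;
    }
}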
If you've just got a bunch of integers, then using JSON will indeed be pretty inefficient in terms of parsing. You can use BinaryReader and BinaryWriter to write binary files efficiently... but it's not clear to me why you need to read the file every time you create an object anyway. Why can't each new object keep a reference to the original array, which has been read once? Or if they need to mutate the data, you could keep one "canonical source" and just copy that array in memory each time you create an object.
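That caching idea could look something like this (a sketch with illustrative names and file path; Lazy<T> makes the one-time read thread-safe):
using System;
using System.IO;
using Newtonsoft.Json;

static class ArrayCache
{
    // Read and deserialized exactly once, on first access.
    private static readonly Lazy<int[]> Canonical = new Lazy<int[]>(
        () => JsonConvert.DeserializeObject<int[]>(File.ReadAllText("data1.dat")));

    // Read-only consumers can share Canonical.Value directly;
    // anything that mutates the data gets its own copy.
    public static int[] GetMutableCopy() => (int[])Canonical.Value.Clone();
}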
The fastest way to create a byte array from an array of integers is to use Buffer.BlockCopy:
byte[] result = new byte[a.Length * sizeof(int)];
Buffer.BlockCopy(a, 0, result, 0, result.Length);
// write result to FileStream or wherever
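For the size prefix the deserialization below relies on, the write side could prepend the element count first (a sketch; the FileStream is assumed, and a is the int[] being serialized):
using (var fs = new FileStream("data.dat", FileMode.Create))
{
    // Prefix with the element count so the reader knows how big an int[] to allocate.
    fs.Write(BitConverter.GetBytes(a.Length), 0, sizeof(int));
    fs.Write(result, 0, result.Length);
}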
If you store the size of the array at the front like that, you can use it again to deserialize. Make sure everything fits into memory; looking at your file sizes, it should.
var buffer = File.ReadAllBytes(@"...");
int size = BitConverter.ToInt32(buffer, 0);
var result = new int[size];
// Skip the 4-byte size prefix; BlockCopy offsets and counts are in bytes.
Buffer.BlockCopy(buffer, 4, result, 0, size * sizeof(int));
Binary is not human-readable, but it is definitely faster than JSON.
