I have an object[] whose elements are arrays of a type that is not known at compile time, but turns out to be int[], double[], etc.
I want to save these arrays to disk, and I don't really need to process their contents online, so I am looking for a way to convert the object[] to a byte[] that I can then write to disk.
How can I achieve this?
You may use binary serialization and deserialization for Serializable types.
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

BinaryFormatter binary = new BinaryFormatter();
using (FileStream fs = File.Create(file))
{
    binary.Serialize(fs, objectArray);
}
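For completeness, reading the file back is the mirror image; a minimal sketch, assuming the array was written as above:
using (FileStream fs = File.OpenRead(file))
{
    object[] restored = (object[])binary.Deserialize(fs);
}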
Edit: If all the elements of the array are simple types, you can use BitConverter.
object[] arr = { 10.20, 1, 1.2f, 1.4, 10L, 12 };
using (MemoryStream ms = new MemoryStream())
{
    foreach (dynamic t in arr)
    {
        // dynamic dispatch selects the matching BitConverter.GetBytes overload
        byte[] bytes = BitConverter.GetBytes(t);
        ms.Write(bytes, 0, bytes.Length);
    }
    byte[] result = ms.ToArray(); // the concatenated bytes of all elements
}
You could do it the old-fashioned way.
static void Main()
{
    object[] arrayToConvert = new object[] { 1.0, 10.0, 3.0, 4.0, 1.0, 12313.2342 };
    if (arrayToConvert.Length > 0)
    {
        byte[] dataAsBytes;
        unsafe
        {
            if (arrayToConvert[0] is int)
            {
                dataAsBytes = new byte[sizeof(int) * arrayToConvert.Length];
                fixed (byte* dataP = &dataAsBytes[0])
                    // CLR arrays are always aligned
                    for (int i = 0; i < arrayToConvert.Length; ++i)
                        *((int*)dataP + i) = (int)arrayToConvert[i];
            }
            else if (arrayToConvert[0] is double)
            {
                dataAsBytes = new byte[sizeof(double) * arrayToConvert.Length];
                fixed (byte* dataP = &dataAsBytes[0])
                {
                    // CLR arrays are always aligned
                    for (int i = 0; i < arrayToConvert.Length; ++i)
                    {
                        double current = (double)arrayToConvert[i];
                        *((long*)dataP + i) = *(long*)&current; // reinterpret the double's bits
                    }
                }
            }
            else
            {
                throw new ArgumentException("Wrong array type.");
            }
        }
        Console.WriteLine(BitConverter.ToString(dataAsBytes)); // print the bytes, not just the type name
    }
}
However, I would recommend that you revisit your design. You should probably be using generics, rather than object arrays.
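For instance, a minimal generic sketch of that design (ToBytes is an illustrative name here; note that Buffer.BlockCopy only accepts arrays of primitive types, so T must be one at runtime):
static byte[] ToBytes<T>(T[] source) where T : struct
{
    // size of one element; Marshal.SizeOf is fine for primitive types
    int elementSize = System.Runtime.InteropServices.Marshal.SizeOf(typeof(T));
    byte[] result = new byte[elementSize * source.Length];
    Buffer.BlockCopy(source, 0, result, 0, result.Length);
    return result;
}
Called as ToBytes(new[] { 1.0, 10.0, 3.0 }), this avoids both boxing and per-element switching on the runtime type.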
From here:
ArrayList list = ...;
byte[] obj = (byte[])list.ToArray(typeof(byte));
or if your list holds a complex type:
list.CopyTo(obj);
Related
I'm working on a C# Windows service that handles Firebird database requests. My problem occurs at random moments (sometimes after 5 minutes, sometimes after just 4 calls to the database), when I try to deserialize an object in the client application. It happens only at a specific position, though (it stops at the 18th byte of a 54-byte array). The rest of the time the function returns a proper result.
I'm using this function to serialize a single object:
public byte[] ObjectToByteArray(Object obj)
{
    if (obj == null)
        return null;
    MemoryStream fs = new MemoryStream();
    BinaryFormatter formatter = new BinaryFormatter();
    formatter.Serialize(fs, obj);
    fs.Seek(0, SeekOrigin.Begin);
    byte[] rval = fs.ToArray();
    fs.Close();
    return rval;
}
I am not serializing any custom classes, only strings and numeric types (the Firebird API returns them as objects, though).
I use this to deserialize:
public object ByteArrayToObject(Byte[] Buffer)
{
    BinaryFormatter formatter = new BinaryFormatter();
    MemoryStream stream = new MemoryStream(Buffer);
    stream.Position = 0;
    object rval = formatter.Deserialize(stream); // <-- this line drives me nuts
    stream.Close();
    return rval;
}
and this is the main function in the client application. Sorry for the ugly code:
public List<object[]> ByteToList(byte[] data, int[] pomocnicza)
{
    // the pomocnicza array holds the size in bytes of each (original) column of the list
    int size_row = 0;
    foreach (int i in pomocnicza)
    { size_row += i; }
    List<object[]> result = new List<object[]>();
    int iterator = 0;
    for (int i = 0; i < data.Length / size_row; i++)
    {
        object[] zxc = new object[3];
        int l = pomocnicza.Length / 4;
        for (int j = 0; j < l; j++)
        {
            byte[] tmp = new byte[pomocnicza[j * 4]];
            System.Array.Copy(data, iterator, tmp, 0, pomocnicza[j * 4]);
            object ffs = ByteArrayToObject(tmp);
            zxc[j] = ffs;
            iterator += pomocnicza[j * 4];
        }
        result.Add(zxc);
    }
    return result;
}
What is baffling me is that it works in most cases, yet inevitably ends up throwing an error. The fact that it happens at random makes pinpointing it harder. Please help.
EDIT:
This is how I read the input:
public List<object[]> RetrieveSelectData(FbConnection dbConn, string SQLCommand)
{
    using (var command = dbConn.CreateCommand())
    {
        command.CommandText = SQLCommand;
        using (var reader = command.ExecuteReader())
        {
            var rows = new List<object[]>();
            while (reader.Read())
            {
                var columns = new object[reader.FieldCount];
                reader.GetValues(columns);
                rows.Add(columns);
            }
            return rows;
        }
    }
}
and then serialize with this function:
public byte[] ListToByte(List<object[]> lista, out int[] rozmiary)
{
    int size = 0;
    rozmiary = new int[lista[0].Length];
    for (int i = 0; i < lista[0].Length; i++)
    {
        byte[] test = this.ObjectToByteArray(lista[0][i]);
        size += test.Length;
        rozmiary[i] = test.Length;
    }
    size *= lista.Count;
    byte[] result = new byte[size];
    int index = 0;
    for (int i = 0; i < lista.Count; i++)
    {
        for (int j = 0; j < lista[i].Length; j++)
        {
            byte[] tmp = this.ObjectToByteArray(lista[i][j]);
            tmp.CopyTo(result, index);
            index += tmp.Length;
        }
    }
    return result;
}
If you are using the deserializing methods above on data you pulled from a client stream or another stream, skip the intermediate byte array and pass the stream directly to the formatter, like below:
NetworkStream clientStream = client.GetStream();
Object src = (Object)formatter.Deserialize(clientStream);
I have found the bug. The code above works fine, but watch out for encoding in some cases(!), so feel free to use it.
The problem lay in another part of the program, where I mistyped and sent 4 bytes but told the client app to receive 8, so in most cases it padded the value with zeros, but sometimes it picked up bytes from the next pack of data.
It was @Marc Gravell and his blog that made me look over it again and again until I eventually found the source.
I'm having some issues with converting a large byte[] array into a strongly typed array.
I have an array which has been concatenated into one large byte[] array and stored in a table.
I then want to read this byte[] array back, but convert it to a strongly typed array.
As I have stored the entire array as a byte[] array, can I not read that byte array and convert it to my strongly typed version? At the moment it's returning null...
Is this possible in one hit?
Thanks in advance, Onam.
#region Save
public void Save<T>(T[] Array) where T : new()
{
    List<byte[]> _ByteCollection = new List<byte[]>();
    byte[] _Bytes = null;
    int _Length = 0;
    int _Offset = 0;
    foreach (T _Item in Array)
    {
        _ByteCollection.Add(Serialise(_Item));
    }
    foreach (byte[] _Byte in _ByteCollection)
    {
        _Length += _Byte.Length;
    }
    _Bytes = new byte[_Length];
    foreach (byte[] b in _ByteCollection)
    {
        System.Buffer.BlockCopy(b, 0, _Bytes, _Offset, b.Length);
        _Offset += b.Length;
    }
    Customer[] c = BinaryDeserialize<Customer[]>(_Bytes);
}
#endregion

#region BinaryDeserialize
public static T BinaryDeserialize<T>(byte[] RawData)
{
    T _DeserializedContent = default(T);
    BinaryFormatter _Formatter = new BinaryFormatter();
    try
    {
        using (MemoryStream _Stream = new MemoryStream())
        {
            _Stream.Write(RawData, 0, RawData.Length);
            _Stream.Seek(0, SeekOrigin.Begin);
            _DeserializedContent = (T)_Formatter.Deserialize(_Stream);
        }
    }
    catch (Exception ex)
    {
        _DeserializedContent = default(T);
    }
    return _DeserializedContent;
}
#endregion
I think the problem is that you are serializing each item to a list, then concatenating the bytes. When this is deserialized, it just looks like the data for one customer plus some unexpected data (the other customers) at the end.
I don't know how your Serialise method works, but you can probably just change:
foreach (T _Item in Array)
{
    _ByteCollection.Add(Serialise(_Item));
}
To:
_ByteCollection.Add(Serialise(Array));
That should work; then you could probably simplify it a little, along the lines of the sketch below.
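For instance, the simplified shape might look like this (hypothetical; it assumes Serialise can accept the whole array):
public void Save<T>(T[] array)
{
    byte[] bytes = Serialise(array);                  // serialize the whole array in one call
    T[] roundTripped = BinaryDeserialize<T[]>(bytes); // the payload now matches one object graph
}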
Most likely the line
_DeserializedContent = (T)_Formatter.Deserialize(_Stream);
throws an exception. In the catch block you simply swallow and ignore that exception.
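A quick way to surface the real failure while debugging is to log it before falling back (a sketch only, not production error handling):
catch (Exception ex)
{
    Console.Error.WriteLine(ex); // make the swallowed exception visible
    _DeserializedContent = default(T);
}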
OK, so I have converted a file into a binary format using BinaryWriter. The format is: the number of ints, followed by the ints themselves.
So the code will be something like:
// pseudocode
readLineOfNumbers() {
    count = read();             // first the count
    int[] a = read(count ints); // then that many ints
    return a;
}
Do I use a BinaryReader? The closest thing I can see there is to read everything into a byte[], but then how do I make that an int array? This all has to be done very efficiently as well; I need buffering and so on.
If you used BinaryWriter to create the file, it makes sense to read it using BinaryReader.
Something like:
private static int[] ReadArray(BinaryReader reader)
{
    int count = reader.ReadInt32();
    int[] data = new int[count];
    for (int i = 0; i < count; i++)
    {
        data[i] = reader.ReadInt32();
    }
    return data;
}
I don't know of anything within BinaryReader which will read an array of integers, I'm afraid. If you read into a byte array you could then use Buffer.BlockCopy to copy those bytes into an int[], which is probably the fastest form of conversion - although it relies on the endianness of your processor being appropriate for your data.
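For example, a minimal sketch of that BlockCopy conversion (reader and count stand in for whatever your reading code already has):
byte[] raw = reader.ReadBytes(count * sizeof(int)); // read the payload in one call
int[] values = new int[count];
Buffer.BlockCopy(raw, 0, values, 0, raw.Length);    // reinterpret bytes as ints (endianness-sensitive)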
Have you tried just looping round, calling BinaryReader.ReadInt32() as many times as you need to, and letting the file system do the buffering? You could always add a BufferedStream with a large buffer into the mix if you thought that would help.
int[] original = { 1, 2, 3, 4 }, copy;
byte[] bytes;
using (var ms = new MemoryStream())
{
    using (var writer = new BinaryWriter(ms))
    {
        writer.Write(original.Length);
        for (int i = 0; i < original.Length; i++)
            writer.Write(original[i]);
    }
    bytes = ms.ToArray();
}
using (var ms = new MemoryStream(bytes))
using (var reader = new BinaryReader(ms))
{
    int len = reader.ReadInt32();
    copy = new int[len];
    for (int i = 0; i < len; i++)
    {
        copy[i] = reader.ReadInt32();
    }
}
Although personally I'd just read from the stream without BinaryReader.
Actually, strictly speaking, if it were me I would use my own serializer, and just:
[ProtoContract]
public class Foo {
    [ProtoMember(1, Options = MemberSerializationOptions.Packed)]
    public int[] Bar { get; set; }
}
since this will have known endianness, handle buffering, and will use variable-length encoding to help reduce bloat if most of the numbers aren't enormous.
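If memory serves, round-tripping with protobuf-net then looks roughly like this (treat the details as a sketch of its Serializer API):
var foo = new Foo { Bar = new[] { 1, 2, 3 } };
using (var ms = new MemoryStream())
{
    ProtoBuf.Serializer.Serialize(ms, foo); // packed, varint-encoded field
    ms.Position = 0;
    Foo copy = ProtoBuf.Serializer.Deserialize<Foo>(ms);
}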
Please show me optimized solutions for these conversions:
1)
public static byte[] ToBytes(List<Int64> list)
{
    byte[] bytes = null;
    // todo
    return bytes;
}
2)
public static List<Int64> ToList(byte[] bytes)
{
    List<Int64> list = null;
    // todo
    return list;
}
It would be very helpful to see versions with minimized copying and/or with unsafe code (if it can be implemented that way). Ideally, no copying of the data would be needed at all.
Update:
My question is about casting in the C++ manner:
__int64* ptrInt64 = (__int64*)ptrInt8;
and
__int8* ptrInt8 = (__int8*)ptrInt64
Thank you for your help!
Edit: fixed for correct 8-byte conversion; it's also not terribly efficient when converting back to a byte array.
public static List<Int64> ToList(byte[] bytes)
{
    var list = new List<Int64>();
    for (int i = 0; i < bytes.Length; i += sizeof(Int64))
        list.Add(BitConverter.ToInt64(bytes, i));
    return list;
}

public static byte[] ToBytes(List<Int64> list)
{
    var byteList = list.ConvertAll(new Converter<Int64, byte[]>(Int64Converter));
    List<byte> resultList = new List<byte>();
    byteList.ForEach(x => { resultList.AddRange(x); });
    return resultList.ToArray();
}

public static byte[] Int64Converter(Int64 x)
{
    return BitConverter.GetBytes(x);
}
Use Mono.DataConvert. This library has converters to/from most primitive types, for big-endian, little-endian, and host-order byte ordering.
CLR arrays know their types and sizes so you can't just cast an array of one type to another. However, it is possible to do unsafe casting of value types. For example, here's the source to BitConverter.GetBytes(long):
public static unsafe byte[] GetBytes(long value)
{
    byte[] buffer = new byte[8];
    fixed (byte* numRef = buffer)
    {
        *((long*)numRef) = value;
    }
    return buffer;
}
You could write this for a list of longs, like this:
public static unsafe byte[] GetBytes(IList<long> value)
{
    byte[] buffer = new byte[8 * value.Count];
    fixed (byte* numRef = buffer)
    {
        for (int i = 0; i < value.Count; i++)
            *((long*)(numRef + i * 8)) = value[i];
    }
    return buffer;
}
And of course it would be easy to go in the opposite direction if this was how you wanted to go.
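For instance, a sketch of the reverse direction (GetLongs is an illustrative name; it assumes value.Length is a multiple of 8):
public static unsafe long[] GetLongs(byte[] value)
{
    long[] result = new long[value.Length / 8];
    fixed (byte* numRef = value)
    {
        // mirror of GetBytes: reinterpret each 8-byte chunk as a long
        for (int i = 0; i < result.Length; i++)
            result[i] = *((long*)numRef + i);
    }
    return result;
}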
I have a long array. How do I write this array to a binary file?
The problem is that if I convert it into a byte array, some values are changed.
The array is like:
long[] array = new long[160000];
Please give a code snippet.
The BinaryFormatter will be the easiest.
Also, value types (I assume this is what you mean by long) serialize very efficiently.
var array = new[] { 1L, 2L, 3L };
using (var stream = new FileStream("test.bin", FileMode.Create, FileAccess.Write, FileShare.None))
using (var writer = new BinaryWriter(stream))
{
    foreach (long item in array)
    {
        writer.Write(item);
    }
}
How are the values changed? An array of long can be copied into an array of byte very quickly, with no need for serialization.
static void Main(string[] args) {
    System.Random random = new Random();
    long[] arrayOriginal = new long[160000];
    long[] arrayRead = null;
    for (int i = 0; i < arrayOriginal.Length; i++) {
        // cast before multiplying so the product doesn't overflow int
        arrayOriginal[i] = (long)random.Next(int.MaxValue) * random.Next(int.MaxValue);
    }
    byte[] bytesOriginal = new byte[arrayOriginal.Length * sizeof(long)];
    System.Buffer.BlockCopy(arrayOriginal, 0, bytesOriginal, 0, bytesOriginal.Length);
    using (System.IO.MemoryStream stream = new System.IO.MemoryStream()) {
        // write
        stream.Write(bytesOriginal, 0, bytesOriginal.Length);
        // reset
        stream.Flush();
        stream.Position = 0;
        int expectedLength = 0;
        checked {
            expectedLength = (int)stream.Length;
        }
        // read
        byte[] bytesRead = new byte[expectedLength];
        if (expectedLength == stream.Read(bytesRead, 0, expectedLength)) {
            arrayRead = new long[expectedLength / sizeof(long)];
            Buffer.BlockCopy(bytesRead, 0, arrayRead, 0, expectedLength);
        }
        else {
            // exception
        }
        // check
        for (int i = 0; i < arrayOriginal.Length; i++) {
            if (arrayOriginal[i] != arrayRead[i]) {
                throw new System.Exception();
            }
        }
    }
    System.Console.WriteLine("Done");
    System.Console.ReadKey();
}