XmlSerializer formatter = new XmlSerializer(typeof(List<Objects.PIP>));
**MemoryStream stream = new MemoryStream(new byte[1024]);**
formatter.Serialize(stream, Repository.GlobalRepository.PIPInformation);
byte[] bt = stream.ToArray();
foreach (Communication.Client Client in server.ClientList)
{
Client.SendMessage(bt);
}
stream.Flush();
I get an error on the starred line:
There was an error generating the XML document.
InnerException {"Memory stream is not expandable."}
PIP Class
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using ObjectLibrary.Model;
using System.Xml.Serialization;
namespace PIPServer.Objects
{
[XmlRoot(Namespace = "urn:my-namespace")]
public class PIP
{
private long pipID;
public long PipID
{
get { return pipID; }
set { pipID = value; }
}
private CurrencyPair currencyPair;
public CurrencyPair CurrencyPair
{
get { return currencyPair; }
set { currencyPair = value; }
}
...............
}
}
What am I doing wrong?
Also, is my method of converting a Stream to a byte array correct? [Answered]
The MemoryStream has a ToArray() method that you can use. It returns a byte array with the entire contents of the stream. Just make sure you are finished writing to the stream before reading the contents. :-)
As for the error, it is hard to answer without more information. Have you verified that all the objects you try to serialize are valid?
Edit:
From your comments I think I know what the problem is. You define the MemoryStream with a fixed size of 1024 bytes. When you serialize your object the generated xml is larger than that and you end up with an error saying you cannot expand the memory stream. Try to create it without setting an explicit size (just use
new MemoryStream()
), or set a larger size.
Also, if you don't want to serialize a property, just tag it with the XmlIgnore attribute.
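For example, a minimal sketch using the CurrencyPair property from the PIP class above (purely illustrative; apply it to whichever property you actually want to skip):

[XmlIgnore]
public CurrencyPair CurrencyPair
{
    get { return currencyPair; }
    set { currencyPair = value; }
}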
With your update to include the inner exception, it becomes clear: since you are initializing the MemoryStream with a fixed-size byte array, the entire XML must fit in your 1024 bytes. It sounds like you have enough properties to make that a problem.
Did you really mean to impose a maximum of 1024 bytes, or did you intend to set an initial capacity for the stream? The latter would be:
MemoryStream stream = new MemoryStream(1024);
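Putting it together, a minimal sketch of the corrected code from the question, assuming the same PIPInformation list and ClientList as above (the expandable stream grows as the serializer writes):

XmlSerializer formatter = new XmlSerializer(typeof(List<Objects.PIP>));
using (MemoryStream stream = new MemoryStream())
{
    // No fixed-size backing array, so the XML can grow past 1024 bytes.
    formatter.Serialize(stream, Repository.GlobalRepository.PIPInformation);
    byte[] bt = stream.ToArray();
    foreach (Communication.Client client in server.ClientList)
    {
        client.SendMessage(bt);
    }
}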
Related
I have implemented a code block to convert a Stream into a byte array; the code snippet is shown below. But unfortunately it throws an OutOfMemoryException while converting the MemoryStream to an array (return newDocument.ToArray();). Please could someone help me with this?
public byte[] MergeToBytes()
{
using (var processor = new PdfDocumentProcessor())
{
AppendStreamsToDocumentProcessor(processor);
using (var newDocument = new MemoryStream())
{
processor.SaveDocument(newDocument);
return newDocument.ToArray();
}
}
}
public Stream MergeToStream()
{
return new MemoryStream(MergeToBytes());
}
Firstly: how big is the document? If it is too big for the byte[] limit, you're going to have to use a different approach.
However, a MemoryStream is already backed by an (oversized) array; you can get this simply using newDocument.TryGetBuffer(out var buffer), and noting that you must restrict yourself to the portion of the .Array indicated by .Offset (usually, but not always, zero) and .Count (the number of bytes that should be considered "live"). Note that TryGetBuffer can return false, but not in the new MemoryStream() scenario.
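For instance, a rough sketch of that, assuming newDocument is the MemoryStream from the code above (WriteToDatabase is a hypothetical consumer that accepts an offset and count):

if (newDocument.TryGetBuffer(out ArraySegment<byte> buffer))
{
    // Only the bytes from buffer.Offset, for buffer.Count bytes, are live data;
    // the rest of buffer.Array is unused capacity.
    WriteToDatabase(buffer.Array, buffer.Offset, buffer.Count);
}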
It is also interesting that you're converting a MemoryStream to a byte[] and then back to a MemoryStream. An alternative here would have been simply to set the Position back to 0, i.e. rewind it. So:
public Stream MergeToStream()
{
using var processor = new PdfDocumentProcessor();
AppendStreamsToDocumentProcessor(processor);
var newDocument = new MemoryStream();
processor.SaveDocument(newDocument);
newDocument.Position = 0;
return newDocument;
}
I have an array of class data which I'm serialising into a byte array then pushing it into a database. This program runs on a scheduled basis during the night. On the other end I have another program which pulls this data out of the database, processes it into a report - or at least that's the plan.
The class is enclosed in two namespaces: the first is the application name, the second just holds my structures, e.g. below.
namespace FibreTrend
{
namespace Structures
{
[Serializable]
public class Trend
{
public Trend(DateTime date, string reference, int port)
{
Date = date;
Reference = reference;
PortNo = port;
}
public DateTime Date;
public string Reference;
public int PortNo;
}
}
}
{
// Function to take the trendData list, convert it to a byte array
// List<Structures.Trend> trendData;
BinaryFormatter bf = new BinaryFormatter();
using (MemoryStream mStream = new MemoryStream())
{
bf.Serialize(mStream, trendData.ToArray());
byte[] b = mStream.ToArray();
// code that pushes the array into the database...
}
}
I have a completely separate application which reads the data from the database as a byte array. I then convert it from the bytes back to my data class.
using (MemoryStream mStream = new MemoryStream())
{
BinaryFormatter binaryFormat = new BinaryFormatter();
mStream.Write(data, 0, data.Length);
mStream.Seek(0, SeekOrigin.Begin);
Structures.Trend[] obj = (Structures.Trend[])binaryFormat.Deserialize(mStream);
}
And here is my error: it's telling me it wants the FibreTrend binary to deserialize the data. Why? My Trend class is the same size, same data layout; it's an exact copy and paste from my other project. Why is it insisting on needing the other binary file as a companion? When I do put the binary with it and deserialize the data, it comes out as a FibreTrend.Structures.Trend[]. I'm obviously not going to ship the other binary file with it, and I'm also not going to double-handle the data by converting it to a Report.Structures.Trend[]. It's just a stream of 1s and 0s; why can't I just push it into any class I want? Isn't that the purpose of the cast, to tell the compiler how I want the data ordered and structured?
The binary-serialized data stream contains a header with type information in it. You can refer to the Binary Format Data structure here. That's why you are getting that exception about the missing assembly.
One way to solve your issue is by implementing a SerializationBinder that overrides the type to be deserialized into at runtime and set Binder property on BinaryFormatter. Here is a very good example.
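A rough sketch of such a binder, assuming the reporting project defines its own Report.Structures.Trend class (the names here are hypothetical; the linked example covers the details):

using System;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

sealed class TrendBinder : SerializationBinder
{
    public override Type BindToType(string assemblyName, string typeName)
    {
        // Redirect the serialized FibreTrend type (and arrays of it) to the local Trend type.
        if (typeName.Contains("FibreTrend.Structures.Trend"))
        {
            return typeName.EndsWith("[]")
                ? typeof(Report.Structures.Trend[])
                : typeof(Report.Structures.Trend);
        }
        return Type.GetType(string.Format("{0}, {1}", typeName, assemblyName));
    }
}

// Usage when deserializing:
var binaryFormat = new BinaryFormatter { Binder = new TrendBinder() };
var obj = (Report.Structures.Trend[])binaryFormat.Deserialize(mStream);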
A preferred solution would be to use an alternative serialization format such as XML or JSON.
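As a sketch of that route with XmlSerializer and the Trend class above (note that XmlSerializer requires a public parameterless constructor, which Trend does not have yet):

// Writing side: serialize the trend array to XML bytes for the database.
var serializer = new XmlSerializer(typeof(Structures.Trend[]));
byte[] xmlBytes;
using (var stream = new MemoryStream())
{
    serializer.Serialize(stream, trendData.ToArray());
    xmlBytes = stream.ToArray();
}

// Reading side: the report application only needs its own copy of the Trend class,
// since XmlSerializer matches on element names rather than assembly identity.
using (var stream = new MemoryStream(xmlBytes))
{
    var trends = (Structures.Trend[])serializer.Deserialize(stream);
}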
I'm trying to translate a function from ActionScript 3 into C# .NET.
What I have trouble with is how to properly use byte arrays in C#. In AS3 there is a specific class for it that already has most of the functionality I need, but in C# nothing of that sort seems to exist and I can't wrap my head around it.
This is the As3 function:
private function createBlock(type:uint, tag:uint,data:ByteArray):ByteArray
{
var ba:ByteArray = new ByteArray();
ba.endian = Endian.LITTLE_ENDIAN;
ba.writeUnsignedInt(data.length+16);
ba.writeUnsignedInt(0x00);
ba.writeUnsignedInt(type);
ba.writeUnsignedInt(tag);
data.position = 0;
ba.writeBytes(data);
ba.position = 0;
return ba;
}
But from what I gather, in C# I have to use a normal array of the byte type, like this:
byte[] ba = new byte[length];
Now, I have looked into the Encoding class, the BinaryWriter and BinaryFormatter classes, and researched whether somebody had made a class for byte arrays, but with no luck.
Can somebody nudge me in the right direction please?
You should be able to do this using a combination of MemoryStream and BinaryWriter:
public static byte[] CreateBlock(uint type, uint tag, byte[] data)
{
using (var memory = new MemoryStream())
{
// We want 'BinaryWriter' to leave 'memory' open, so we need to specify true for the third
// constructor parameter (leaveOpen). That means we also need to specify the second parameter,
// the encoding. The default encoding is UTF-8, so we specify that here.
var defaultEncoding = new UTF8Encoding(encoderShouldEmitUTF8Identifier: false, throwOnInvalidBytes: true);
using (var writer = new BinaryWriter(memory, defaultEncoding, leaveOpen: true))
{
// BinaryWriter has no Endian setting; it always writes little-endian.
writer.Write((uint)(data.Length + 16));
writer.Write((uint)0x00);
writer.Write(type);
writer.Write(tag);    // the original AS3 function also writes the tag
writer.Write(data);
}
// Note that we must close or flush 'writer' before accessing 'memory', otherwise the bytes written
// to it may not have been transferred to 'memory'.
return memory.ToArray();
}
}
However, note that BinaryWriter always uses little-endian format. If you need to control this, you can use Jon Skeet's EndianBinaryWriter instead.
As an alternative to this approach, you could pass streams around instead of byte arrays (probably using a MemoryStream for implementation), but then you will need to be careful about lifetime management, i.e. who will close/dispose the stream when it's done with? (You might be able to get away with not bothering to close/dispose a memory stream since it uses no unmanaged resources, but that's not entirely satisfactory IMO.)
You want to have a byte stream and then extract the array from it:
using(MemoryStream memory = new MemoryStream())
using(BinaryWriter writer = new BinaryWriter(memory))
{
// write into stream
writer.Write((byte)0); // a byte
writer.Write(0f); // a float
writer.Write("hello"); // a string
return memory.ToArray(); // returns the underlying array
}
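Going the other way, here is a rough sketch of reading such a block back with BinaryReader, assuming the little-endian layout produced above (field names are just for illustration):

public static void ReadBlock(byte[] block)
{
    using (var memory = new MemoryStream(block))
    using (var reader = new BinaryReader(memory))
    {
        uint totalLength = reader.ReadUInt32();              // data.Length + 16
        uint reserved = reader.ReadUInt32();                 // the 0x00 field
        uint type = reader.ReadUInt32();
        uint tag = reader.ReadUInt32();
        byte[] data = reader.ReadBytes((int)(totalLength - 16));
        // use type, tag and data here
    }
}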
I have the following C# code which is supposed to serialize arbitrary objects to a string, and then of course deserialize it.
public static string Pack(Message _message)
{
BinaryFormatter formatter = new BinaryFormatter();
MemoryStream original = new MemoryStream();
MemoryStream outputStream = new MemoryStream();
formatter.Serialize(original, _message);
original.Seek(0, SeekOrigin.Begin);
DeflateStream deflateStream = new DeflateStream(outputStream, CompressionMode.Compress);
original.CopyTo(deflateStream);
byte[] bytearray = outputStream.ToArray();
UTF8Encoding encoder = new UTF8Encoding();
string packed = encoder.GetString(bytearray);
return packed;
}
public static Message Unpack(string _packed_message)
{
UTF8Encoding encoder = new UTF8Encoding();
byte[] bytearray = encoder.GetBytes(_packed_message);
BinaryFormatter formatter = new BinaryFormatter();
MemoryStream input = new MemoryStream(bytearray);
MemoryStream decompressed = new MemoryStream();
DeflateStream deflateStream = new DeflateStream(input, CompressionMode.Decompress);
deflateStream.CopyTo(decompressed); // EXCEPTION
decompressed.Seek(0, SeekOrigin.Begin);
var message = (Message)formatter.Deserialize(decompressed); // EXCEPTION 2
return message;
}
But the problem is that any time the code is run, I get an exception. Using the above code and invoking it as shown below, I receive InvalidDataException: Unknown block type. Stream might be corrupted. at the marked // EXCEPTION line.
After searching for this issue I attempted to ditch the deflation. This was only a small change: in Pack, bytearray gets created from original.ToArray(), and in Unpack, I Seek() on input instead of decompressed and pass input to Deserialize() as well. Only the exception's position and message changed, yet it still happens: I receive a SerializationException: No map for object '201326592'. at // EXCEPTION 2.
I don't see what the problem is. Maybe it is the whole serialization idea... The point is that packing the Message instances somehow is necessary, because these objects hold the information that travels between the server and the client application. (The serialization logic is in a .Shared DLL project which is referenced on both ends; however, right now I'm only developing the server side.) I should also mention that I am only using string outputs because, right now, the TCP connection between the servers and clients is based on string reads and writes on both ends. So somehow it has to be brought down to the level of strings.
This is what the Message object looks like:
[Serializable]
public class Message
{
public MessageType type;
public Client from;
public Client to;
public string content;
}
(Client right now is an empty class only having the Serializable attribute, no properties or methods.)
This is how the pack-unpack gets invoked (from Main()...):
Shared.Message msg = Shared.MessageFactory.Build(Shared.MessageType.DEFAULT, new Shared.Client(), new Shared.Client(), "foobar");
string message1 = Shared.MessageFactory.Pack(msg);
Console.WriteLine(message1);
Shared.Message mess2 = Shared.MessageFactory.Unpack(message1); // Step into... here be exceptions
Console.Write(mess2.content);
Here is an image showing what happens in the IDE. The output in the console window is the value of message1.
Some investigation unfortunately also revealed that the problem could lie with the bytearray variable. When running Pack(), after the encoder creates the string, the array contains 152 values; however, after it gets decoded in Unpack(), the array has 160 values instead.
I would appreciate any help, as I am really out of ideas and this problem is crippling progress. Thank you.
(Update) The final solution:
I would like to thank everyone who answered and commented, as I have reached the solution.
Marc Gravell was right: I missed closing deflateStream, and because of this the result was either empty or corrupted. I took my time, rethought and rewrote the methods, and now it works flawlessly. Even the purpose of sending these bytes over the network stream is working too.
Also, as Eric J. suggested, I have switched to using ASCIIEncoding for the conversion between string and byte[] when the data flows through the Stream.
The fixed code lies below:
public static string Pack(Message _message)
{
using (MemoryStream input = new MemoryStream())
{
BinaryFormatter bformatter = new BinaryFormatter();
bformatter.Serialize(input, _message);
input.Seek(0, SeekOrigin.Begin);
using (MemoryStream output = new MemoryStream())
using (DeflateStream deflateStream = new DeflateStream(output, CompressionMode.Compress))
{
input.CopyTo(deflateStream);
deflateStream.Close();
return Convert.ToBase64String(output.ToArray());
}
}
}
public static Message Unpack(string _packed)
{
using (MemoryStream input = new MemoryStream(Convert.FromBase64String(_packed)))
using (DeflateStream deflateStream = new DeflateStream(input, CompressionMode.Decompress))
using (MemoryStream output = new MemoryStream())
{
deflateStream.CopyTo(output);
deflateStream.Close();
output.Seek(0, SeekOrigin.Begin);
BinaryFormatter bformatter = new BinaryFormatter();
Message message = (Message)bformatter.Deserialize(output);
return message;
}
}
Now everything happens just right, as the screenshot below shows. This was the expected output in the first place. The Server and Client executables communicate with each other, the message travels, and it gets serialized and deserialized properly.
In addition to the existing observations about Encoding vs base-64, note you haven't closed the deflate stream. This is important because compression-streams buffer: if you don't close, it may not write the end. For a short stream, that may mean it writes nothing at all.
using(DeflateStream deflateStream = new DeflateStream(
outputStream, CompressionMode.Compress))
{
original.CopyTo(deflateStream);
}
return Convert.ToBase64String(outputStream.GetBuffer(), 0,
(int)outputStream.Length);
Your problem is most probably in the UTF-8 encoding. Your bytes are not really a character string, and UTF-8 is an encoding with different byte lengths for different characters.
This means the byte array may not correspond to a correctly encoded UTF-8 string (there may be some bytes missing at the end, for instance).
Try using UTF-16 or ASCII, which are constant-length encodings (the resulting string will likely contain control characters, so it won't be printable or transmittable through something like HTTP or email).
But if you want to encode it as a string, it is customary to use something like uuencoding to convert the byte array into a real printable string; then you can use any encoding you want.
When I run the following Main() code against your Pack() and Unpack():
static void Main(string[] args)
{
Message msg = new Message() { content = "The quick brown fox" };
string message1 = Pack(msg);
Console.WriteLine(message1);
Message mess2 = Unpack(message1); // Step into... here be exceptions
Console.Write(mess2.content);
}
I see that the bytearray
byte[] bytearray = outputStream.ToArray();
is empty.
I did modify your serialized class slightly, since you did not post code for the included classes:
public enum MessageType
{
DEFAULT = 0
}
[Serializable]
public class Message
{
public MessageType type;
public string from;
public string to;
public string content;
}
I suggest the following steps to resolve this:
Check the intermediate results along the way. Do you also see 0 bytes in the array? What is the string value returned by Pack()?
Dispose of your streams once you are done with them. The easiest way to do that is with the using keyword.
Edit
As Eli and Marc correctly pointed out, you cannot store arbitrary bytes in a UTF8 string. The mapping is not bijective (you can't go back and forth without loss/distortion of information). You will need a mapping that is bijective, such as the Convert.ToBase64String() approach Marc suggests.
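For instance, a minimal sketch contrasting the two (arbitrary bytes survive a Base64 round trip, but not a UTF-8 one):

byte[] original = { 0x00, 0xC8, 0xFF, 0x10 };             // arbitrary binary data

// Lossy: 0xC8 and 0xFF are not valid UTF-8 sequences, so they become replacement
// characters, and re-encoding yields a different byte array.
string viaUtf8 = Encoding.UTF8.GetString(original);
byte[] utf8RoundTrip = Encoding.UTF8.GetBytes(viaUtf8);   // length and content differ

// Lossless: Base64 maps any byte sequence to printable characters and back.
string viaBase64 = Convert.ToBase64String(original);
byte[] base64RoundTrip = Convert.FromBase64String(viaBase64); // identical to original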
I have a small issue accessing a byte[]:
I have a binary object (byte[] saved to mssql db) which I get from the db and I want to read it. Whenever I access it, for its length or for its Read() method, I get a Cannot access a closed Stream exception.
What's the best way to handle binaries if they have to be updated in the code and then saved back to the db?
Thanks.
Edit - code
In this application we convert a test object to a generic data object we've created to simplify things; this is the data object:
public class DataObject
{
public Stream Content { get; set; }
public Descriptor Descriptor { get; set; }
}
The descriptor contains metadata only (currently only name and description strings) and is not relevant, I think.
The test class is more complicated, so I'll start with the mapping into the data object. The serializer mentioned is NetDataContractSerializer.
public DataObject Map(Test test)
{
using(var stream = new MemoryStream())
{
Serialize(test, stream);
return new DataObject { Content = stream, Descriptor = test.Descriptor };
}
}
private void Serialize(Test test, MemoryStream stream)
{
serializer.WriteObject(stream, test);
stream.Flush();
stream.Position = 0;
}
and vice versa:
public Test Build(DataObject data)
{
using (var stream = data.Content)
{
var test = Deserialize(stream);
test.Descriptor = data.Descriptor;
return test ;
}
}
private Test Deserialize(Stream stream)
{
return serializer.ReadObject(stream) as Test;
}
Edit II - trying to change the test's content:
This is my first attempt at handling streams, and I'm not sure I'm doing it right, so I'll explain first what I want to do: the information in the data field should be saved into the test's data object.
private static void UpdateTestObject(DataObject data, Test test)
{
var testData = new byte[data.Content.Length];
data.Content.Read(testData, 0, (int) data.Content.Length);
test.TestObject = testData;
}
The exception is thrown in UpdateTestObject when accessing data.Content. I get it after creating some test, mapping it, and trying to save it.
data.Content.Read(testData, 0, (int) data.Content.Length);
Here we go: the data object has a Content stream that is already closed.
Result: error.
Reason? Totally unrelated to the rest of your question. Basically, find out why that stream is closed and what the problem is in your data handling.
It could be a design fubar in which the stream is not available after a certain point, and your usage of the object happens past that point.
So the problem is caused by the Map() method; as far as I could tell, it was because it used:
using (var stream = new MemoryStream())
{ ... }
The stream was disposed of at the end of the using block. Changing it to declare the MemoryStream without a using block and use it afterwards worked.
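A minimal sketch of that change, based on the Map() method above (whoever consumes the DataObject is now responsible for disposing the stream):

public DataObject Map(Test test)
{
    // No using block: the stream must stay open so the DataObject can be read later.
    var stream = new MemoryStream();
    Serialize(test, stream);   // writes, flushes and rewinds the stream
    return new DataObject { Content = stream, Descriptor = test.Descriptor };
}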
Thanks to everyone who gave it a thought (not to mention reading all this code)! :)