I'd like to use BinaryFormatter to serialize a FlowDocument, but it throws an exception.
[Serializable]
public class BinFlow
{
public FlowDocument my { get; set; }
}
BinFlow myBF = new BinFlow();
myBF.my = myFlowDocument;
FileStream myFile = File.Create(@"d:\test.bin");
BinaryFormatter myBinaryFormat = new BinaryFormatter();
// exception occurs here!!
myBinaryFormat.Serialize(myFile, myBF);
The exception message says that FlowDocument is not marked as serializable.
P.S. Of course, I can use XamlReader and XamlWriter to serialize the FlowDocument, but I think a binary format would give better performance for this.
I am assuming you are asking because of your related question -- shuttling your FlowDocument from one thread to another. I've never had any success using BinaryFormatter for that. Unless your FlowDocument is very large (say, more than 100 MB), you can easily save it to memory as a MemoryStream and share it between threads like this:
MemoryStream stream = new MemoryStream();
XamlWriter.Save(myFlowDoc, stream);
stream.Position = 0;
You can share the MemoryStream between threads and avoid disk I/O. On your other thread, use XamlReader.Load on the MemoryStream.
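For example, the load side might look like this (a minimal sketch, assuming stream is the MemoryStream populated above, with its Position already reset to 0):
FlowDocument copy = (FlowDocument)XamlReader.Load(stream);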
If you do want to write it to disk in a binary format, I'd say get the XAML, then use the compression libraries to make a ZIP file, as XPS does.
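As a rough sketch of that idea -- using GZipStream for brevity instead of a full ZIP package, and reusing the d:\ path from the question:
// requires using System.IO.Compression;
using (var file = File.Create(@"d:\test.xaml.gz"))
using (var gzip = new GZipStream(file, CompressionMode.Compress))
{
    XamlWriter.Save(myFlowDoc, gzip);
}

// and to read it back:
using (var file = File.OpenRead(@"d:\test.xaml.gz"))
using (var gzip = new GZipStream(file, CompressionMode.Decompress))
{
    var doc = (FlowDocument)XamlReader.Load(gzip);
}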
Related
I was wondering if I can send a whole obj which has 4 members:
string name
string lastname
int StudentID
int CitizenID
using the filestream commands to save it in a .txt file in binary format?
I know of serialization, and it sends an object to a .xml file quite easily, but when I made my program using it and showed it to my teacher, he said he doesn't want me to use it. He said I have to use the base filestream commands like fstream, instream, outstream and ... in C++, and told me he wants a binary-format .txt file.
Now my question is, can I send a whole object to a .txt file in C#, or do I have to send it as a string, which would be quite a lot of work?
Because I would have to turn all the members into strings, then put an indicator or something at the end of each one so I can read them back later and separate them.
now my question is, can I send a whole obj to a .txt file in C# or do I have to send it like a string ... which will be quite a lot of work
Yes, there is a name for this: serialization. Whether you do it manually in your own code, or using one of a range of existing serialization libraries, it is serialization. If your teacher doesn't want you using a serialization library, then you would have to do it in your own code. Which is frankly stupid. I wonder if the teacher wants you to write raw commands to the hard disk as well, to avoid all those silly abstractions like a HAL, OS, or common class libraries...
It may be worth clarifying with your teacher exactly what is, and is not, acceptable. For example, would a binary serializer suffice? I could name several. Would it be OK to use BinaryWriter / BinaryReader? Or is the intent of the exercise to make you think about what you need to do to serialize/deserialize? If the latter: then yes, learn about streams, encodings, and framing protocols (sentinel-values vs length-prefixed data, etc).
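For instance, a manual BinaryWriter/BinaryReader version of this for the four fields above might look like the sketch below (the Student class and the file layout are mine, purely for illustration -- note that BinaryWriter length-prefixes strings for you):
// requires using System.IO;
class Student
{
    public string Name;
    public string LastName;
    public int StudentID;
    public int CitizenID;

    public void Save(string path)
    {
        using (var writer = new BinaryWriter(File.Create(path)))
        {
            // strings are written length-prefixed, ints as 4 raw bytes
            writer.Write(Name);
            writer.Write(LastName);
            writer.Write(StudentID);
            writer.Write(CitizenID);
        }
    }

    public static Student Load(string path)
    {
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            // read the fields back in the same order they were written
            return new Student
            {
                Name = reader.ReadString(),
                LastName = reader.ReadString(),
                StudentID = reader.ReadInt32(),
                CitizenID = reader.ReadInt32()
            };
        }
    }
}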
I'd use this...
void saveInBinary()
{
IFormatter binaryFormatter = new BinaryFormatter();
Stream binaryStream = new FileStream("File Path Here", FileMode.Create, FileAccess.Write, FileShare.None);
binaryFormatter.Serialize(binaryStream, this);
binaryStream.Close();
}
Stick that in the object's member space, and just call it.
Or, something a bit more useful...
void saveInBinary(object o)
{
IFormatter binaryFormatter = new BinaryFormatter();
Stream binaryStream = new FileStream("File Path Here", FileMode.Create, FileAccess.Write, FileShare.None);
binaryFormatter.Serialize(binaryStream, o);
binaryStream.Close();
}
This will serialize any object passed to it.
Last variation (I promise!) :p
void saveInBinary(object o, string filePath)
{
IFormatter binaryFormatter = new BinaryFormatter();
Stream binaryStream = new FileStream(filePath, FileMode.Create, FileAccess.Write, FileShare.None);
binaryFormatter.Serialize(binaryStream, o);
binaryStream.Close();
}
This one allows you to pass an object and a filepath to the method to serialize the passed object to the passed filepath.
Of course, this will not allow you to read the contents of the .txt file. It'll look like mumbo jumbo. If your intent is to write the data to a human readable file, well, there are tons of examples for that. :p I just provided the serialization stuff because it took me a while to figure it out. ;)
Edit: Shoot. I forgot some code...
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;
The code needs those to work (for the IFormatter). Also, for Stream to work, you need to use
using System.IO;
Sorry for forgetting.
Edit 2: Forgot to mention, you'll need to use the [Serializable] attribute on the object(s) you wish to serialize this way. Just put "[Serializable]" on top of the class you want to serialize.
[Serializable]
class SerializableObject
{
string aString;
void aMethod()
{
}
}
Like so.
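And for completeness, a matching read-back sketch (the method name, the using block, and the cast at the call site are my own additions, not part of the original code):
static object loadFromBinary(string filePath)
{
    IFormatter binaryFormatter = new BinaryFormatter();
    using (Stream binaryStream = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read))
    {
        return binaryFormatter.Deserialize(binaryStream);
    }
}
// usage: var restored = (SerializableObject)loadFromBinary("File Path Here");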
I want to serialize/deserialize multiple objects to/from a file. The syntax should be similar to this:
obj.Append(byteArray);
obj.Append(byteArray);
obj.Append(byteArray);
IEnumerable<byte[]> obj.Extract();
While this is very simple to accomplish (e.g., write a class that uses a FileStream and protobuf-net internally), I'm wondering if there is a more elegant way of doing this. Is there any class (from a third-party library) that uses a serializer to write to a FileStream?
Edit: I need this for a file stream that captures video data sent over the network, so the stream must stay open for a dedicated amount of time. My previous solution was to save every video frame to a new file, but that doesn't scale (especially on an HDD, and with an increasing number of video partners).
What about:
using(var stream = File.Open(filename, FileMode.Create))
{
var formatter = new BinaryFormatter();
formatter.Serialize(stream, firstObjectToSerialize);
formatter.Serialize(stream, secondObjectToSerialize);
}
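Reading the objects back works the same way; a minimal sketch (assuming you know the order and types, and using the end of the stream as the stop condition):
using (var stream = File.OpenRead(filename))
{
    var formatter = new BinaryFormatter();
    while (stream.Position < stream.Length)
    {
        var obj = formatter.Deserialize(stream);
        // cast or dispatch on obj as needed
    }
}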
I'm working with C#, Visual Studio 2008, and MVC 1.
I'm creating an XML file by fetching one from a web service and adding some nodes to it. Now I want to deserialize it into a class, which is the model used to strongly type the View.
First of all, I'm having trouble achieving that without storing the XML in the filesystem, because I don't know how this serialization and deserialization work. I guess there's a way and it's just a matter of time.
But while searching the web for the above, I came across LINQ to XML, and now I'm wondering whether it would be better to use that instead.
The XML consists of some client details, and basically I will use all of them.
Any hint?
Thanks!!
You can save an XElement to, and load it from, a MemoryStream (no need to save it to a file stream). In the snippet below, document is your XDocument or XElement:
MemoryStream ms = new MemoryStream();
XmlWriter xw = XmlWriter.Create(ms);
document.Save(xw);
xw.Flush();
Then if you reset the position back to 0 you can deserialize it using the DataContractSerializer.
ms.Position = 0;
DataContractSerializer serializer = new DataContractSerializer(typeof(Model));
Model model = (Model)serializer.ReadObject(ms);
There are other options for how serialization works, so if this is not what you have, let me know what you are using and I will help.
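If your model classes match the XML shape, another option -- shown here only as an assumed alternative, since I don't know exactly what you have -- is XmlSerializer reading straight from the in-memory XML string, so nothing ever touches the filesystem:
var serializer = new XmlSerializer(typeof(Model));
using (var reader = new StringReader(xmlString)) // xmlString is the XML you built from the web service
{
    Model model = (Model)serializer.Deserialize(reader);
}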
try this:
XmlSerializer xmls = new XmlSerializer(typeof(XElement));
FileStream FStream;
try
{
    FStream = new FileStream(doctorsPath, FileMode.Open);
    _Doctors = (XElement)xmls.Deserialize(FStream);
    FStream.Close();

    FStream = new FileStream(patientsPath, FileMode.Open);
    _Patients = (XElement)xmls.Deserialize(FStream);
    FStream.Close();

    FStream = new FileStream(treatmentsPath, FileMode.Open);
    _Treatments = (XElement)xmls.Deserialize(FStream);
    FStream.Close();
}
catch
{ }
This will load all of the XML files into our XElement variables. The try/catch block is a form of exception handling: if one of the calls in the try block throws an exception, the program jumps to the catch section, where (here) nothing happens. When working with files, especially reading files, it is a good idea to use try/catch.
LINQ to XML is an excellent feature, and you can always rely on it. You don't need to write data to or read it from a file; you can feed the XDocument either a string or a stream.
There are several ways to load XML content into an XDocument object; see the various Load (and Parse) overloads. Once you load the content, you can easily add/remove elements, and later you can save to disk if you want.
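A minimal sketch of that flow (the variable names are illustrative):
// Load the XML returned by the web service straight from a string (or a stream)
XDocument doc = XDocument.Parse(xmlFromWebService);   // or XDocument.Load(someStream)

// Add the extra nodes you need
doc.Root.Add(new XElement("ExtraNode", "some value"));

// Only touch the disk if and when you actually want to
// doc.Save(@"c:\temp\clients.xml");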
I'm really stuck; any help and/or comments would be greatly appreciated!
I've written a database program that needs to be able to load contents from a file into a list, so basically I'm trying to use serialization and deserialization for this. The line where the error occurs is marked with a comment:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;
namespace ConsoleApplication1
{
class TheFile
{
//Version 1 serialiser, instance specific, so a bit of a pain
public void SerializeObject(string filename, TheDocument document)
{
Stream stream = File.Open(filename, FileMode.Create);
BinaryFormatter bFormatter = new BinaryFormatter();
bFormatter.Serialize(stream, document);
stream.Close();
}
public TheDocument DeSerializeObject(string filename)
{
TheDocument document;
Stream stream = File.Open(filename, FileMode.Open);
BinaryFormatter bFormatter = new BinaryFormatter();
document = (TheDocument)bFormatter.Deserialize(stream); // <-- the error occurs here
stream.Close();
return document;
}
}
}
The error which i receive is as follows: The input stream is not a valid binary format. The starting contents (in bytes) are: 31-37-2F-30-39-2F-32-30-31-31-20-31-31-3A-30-36-3A ...
I have seen this reported somewhere else recently, and I was unable to find an explanation there either. The code as presented looks like it should be fine (although it would benefit greatly from a few using statements, their absence won't break the success case since you are calling .Close()).
However! I would also warn that IMO BinaryFormatter is not a good choice for storage in a database, since that suggests it is desirable to read it back in the future. The fact that BinaryFormatter is type-dependent makes it very very brittle as you version your application. Any of: creating a new app-version, renaming/adding/removing a field, changing a property to an automatically implemented property, changing .NET version, changing platform, ... could make your data either unreadable, or readable only by adding a lot of custom binder code.
I strongly suggest that you consider using a contract-based serializer instead of BinaryFormatter; any of: DataContractSerializer (but not NetDataContractSerializer), XmlSerializer, JavaScriptSerializer, Json.NET. If you want binary for size and performance, then protocol buffers (with several C# implementations, including protobuf-net) is designed (by Google) to be version tolerant, small, and fast. Since that list is also cross-platform, it also means your data is safe if, say, you switch platform to Java, Mono, WinRT (the new Windows 8 subsystem), PHP, or anything else. BinaryFormatter will not work on any of those.
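As a rough illustration of the contract-based approach, here's what protobuf-net usage might look like (the members on TheDocument are invented purely for this sketch, and it assumes the ProtoBuf namespace is referenced):
[ProtoContract]
public class TheDocument
{
    [ProtoMember(1)] public string Title { get; set; }
    [ProtoMember(2)] public string Body { get; set; }
}

// write
using (var stream = File.Create(filename))
{
    Serializer.Serialize(stream, document);
}

// read back
using (var stream = File.OpenRead(filename))
{
    var restored = Serializer.Deserialize<TheDocument>(stream);
}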
Does this succeed?
var path = "...";
var doc = new TheDocument(...);
var file = new TheFile();
file.SerializeObject(path, doc);
var restored = file.DeSerializeObject(path);
Assert.IsNotNull(restored); // NUnit check; use appropriate code
Too big for a comment
I am getting an intermittent "out of memory" exception at this statement:
return ms.ToArray();
In this method:
public static byte[] Serialize(Object inst)
{
Type t = inst.GetType();
DataContractSerializer dcs = new DataContractSerializer(t);
MemoryStream ms = new MemoryStream();
dcs.WriteObject(ms, inst);
return ms.ToArray();
}
How can I prevent it? Is there a better way to do this?
The length of ms is 182,870,206 bytes (174.4 MB)
I am putting this into a byte array so that I can then run it through compression and store it on disk. The data is (obviously) a large list of a custom class that I download from a WCF server when my Silverlight application starts. I serialize and compress it so it uses only about 6 MB in isolated storage. The next time the user visits and runs the Silverlight application from the web, I check the timestamp, and if it's still good I just open the file from isolated storage, decompress it, deserialize it, and load my structure. I keep the entire structure in memory because the application is mostly geared around manipulating its contents.
@configurator is correct; the size of the array was too big. I rolled my own serializer by declaring a byte array of [list record count * byte count per record], then filled it directly myself using statements like this:
Buffer.BlockCopy(
BitConverter.GetBytes(node.myInt),0,destinationArray,offset,sizeof(int));
offset += sizeof(int);
and this to get it back:
newNode.myInt= BitConverter.ToInt32(sourceByteArray,offset);
offset += sizeof(int);
Then I compressed it and stored it to isolated storage.
My size went from 174MB with the DataContractSerializer to 14MB with mine.
After compression it went from a 6MB to a 1MB file in isolated storage.
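For reference, the compression step might look roughly like this (DeflateStream shown here just as an example; use whatever compression library you have available on your platform):
static byte[] Compress(byte[] data)
{
    using (var output = new MemoryStream())
    {
        // leaveOpen: true so output is still usable after the deflate stream is flushed and closed
        using (var deflate = new DeflateStream(output, CompressionMode.Compress, true))
        {
            deflate.Write(data, 0, data.Length);
        }
        return output.ToArray();
    }
}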
Thanks to Configurator and Filip for their help.
The problem seems to be that you're expecting to return a 180MB byte array. That means the framework would need to find and allocate a consecutive 180MB of free memory to copy the stream data into, which is usually quite hard - hence the OutOfMemoryException. If you need to continue handling this amount of memory, use the memory stream itself (reading and writing to it as you need) to hold the buffer; otherwise, save it to a file (or to whatever other place you need it, e.g. serving it over a network) directly instead of using the memory stream.
I should mention that the memory stream has a 180MB array of its own in there as well, so is also in a bit of trouble and could cause OutOfMemory during serialization - it would likely be better (as in, more robust) if you could serialize it to a temporary file. You might also want to consider a more compact - but possibly less readable - serialization format, like json, binary serialization, or protocol buffers.
In response to the comment: to serialize directly to disk, use a FileStream instead of a MemoryStream:
public static void Serialize(Object inst, string filename)
{
Type t = inst.GetType();
DataContractSerializer dcs = new DataContractSerializer(t);
using (FileStream stream = File.OpenWrite(filename)) {
dcs.WriteObject(stream, inst);
}
}
I don't know how you use that code, but one thing that strikes me is that you don't release your resources. For instance, if you call Serialize(obj) many times with a lot of large objects, you will end up with a lot of memory in use that isn't released directly. The GC should handle that eventually, but you should always release your resources.
I've tried this piece of code:
public static byte[] Serialize(object obj)
{
Type type = obj.GetType();
DataContractSerializer dcs = new DataContractSerializer(type);
using (var stream = new MemoryStream())
{
dcs.WriteObject(stream, obj);
return stream.ToArray();
}
}
With the following Main method in a console application:
static void Main(string[] args)
{
var filipEkberg = new Person { Age = 24, Name = "Filip Ekberg" };
var obj = Serialize(filipEkberg);
}
However, my byte-array is not nearly as big as yours. Having a look at this similar issue, you might want to consider checking out protobuf-net.
It might also be interesting to know what you are intending to do with the serialized data, do you need it as a byte-array or could it just as well be XML written to a text-file?
Try serializing to a stream (e.g. a FileStream) instead of a byte array. This way you can serialize gigabytes of data without an OutOfMemory exception.
public static void Serialize<T>(T obj, string path)
{
    DataContractSerializer serializer = new DataContractSerializer(typeof(T));
    using (Stream stream = File.OpenWrite(path))
    {
        serializer.WriteObject(stream, obj);
    }
}
public static T Deserialize<T>(string path)
{
    DataContractSerializer serializer = new DataContractSerializer(typeof(T));
    using (Stream stream = File.OpenRead(path))
    {
        return (T)serializer.ReadObject(stream);
    }
}
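Example usage (the type and path are just illustrative):
Serialize(myLargeList, @"c:\temp\data.xml");
var restored = Deserialize<List<MyItem>>(@"c:\temp\data.xml");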
Try setting the memory stream's Position to 0 first, and only then call ToArray().
Regards.