What's the best PrefixStyle with ProtoBuf-net for big files? - c#

I need to store big data (on the order of gigabytes) to a stream using protobuf-net 2.4.0.
Currently, I write a small header with SerializeWithLengthPrefix using PrefixStyle.Base128, followed by the large body using the standard protobuf serialization approach, as in the code below, and it works like a charm.
private void Serialize(Stream stream)
{
    Model.SerializeWithLengthPrefix(stream, FileHeader, typeof(FileHeader), PrefixStyle.Base128, 1);
    if (FileHeader.SerializationMode == serializationType.Compressed)
    {
        using (var gzip = new GZipStream(stream, CompressionMode.Compress, true))
        using (var bs = new BufferedStream(gzip, GZIP_BUFFER_SIZE))
        {
            Model.Serialize(bs, FileBody);
        }
    }
    else
        Model.Serialize(stream, FileBody);
}
Now I need to split the body into two different objects, so I have to use the length-prefix approach for them too, but I don't know which PrefixStyle is best in such a scenario. Can I continue to use Base128? And what does "useful for compatibility" mean in the Fixed32 description?
UPDATE
I found this post where Marc Gravell explains that there is an option to use a start-marker and an end-marker, but I'm not sure whether it can be combined with the length-prefix approach.
To be clear: is the approach shown in the code below valid?
[ProtoContract]
public class FileHeader
{
    [ProtoMember(1)]
    public int Version { get; }
    [ProtoMember(2)]
    public string Author { get; set; }
    [ProtoMember(3)]
    public string Organization { get; set; }
}

[ProtoContract(IsGroup = true)] // can IsGroup=true help with LengthPrefix for big data?
public class FileBody1
{
    [ProtoMember(1, DataFormat = DataFormat.Group)]
    public List<Foo1> Foo1s { get; }
    [ProtoMember(2, DataFormat = DataFormat.Group)]
    public List<Foo2> Foo2s { get; }
    [ProtoMember(3, DataFormat = DataFormat.Group)]
    public List<Foo3> Foo3s { get; }
}

[ProtoContract(IsGroup = true)] // can IsGroup=true help with LengthPrefix for big data?
public class FileBody2
{
    [ProtoMember(1, DataFormat = DataFormat.Group)]
    public List<Foo4> Foo4s { get; }
    [ProtoMember(2, DataFormat = DataFormat.Group)]
    public List<Foo5> Foo5s { get; }
    [ProtoMember(3, DataFormat = DataFormat.Group)]
    public List<Foo6> Foo6s { get; }
}

public static class Helper
{
    private static void SerializeFile(Stream stream, FileHeader header, FileBody1 body1, FileBody2 body2)
    {
        var model = RuntimeTypeModel.Create();
        var serializationContext = new ProtoBuf.SerializationContext();
        model.SerializeWithLengthPrefix(stream, header, typeof(FileHeader), PrefixStyle.Base128, 1);
        model.SerializeWithLengthPrefix(stream, body1, typeof(FileBody1), PrefixStyle.Base128, 1, serializationContext);
        model.SerializeWithLengthPrefix(stream, body2, typeof(FileBody2), PrefixStyle.Base128, 1, serializationContext);
    }

    private static void DeserializeFile(Stream stream, ref FileHeader header, ref FileBody1 body1, ref FileBody2 body2)
    {
        var model = RuntimeTypeModel.Create();
        var serializationContext = new ProtoBuf.SerializationContext();
        header = model.DeserializeWithLengthPrefix(stream, null, typeof(FileHeader), PrefixStyle.Base128, 1) as FileHeader;
        body1 = model.DeserializeWithLengthPrefix(stream, null, typeof(FileBody1), PrefixStyle.Base128, 1, null, out _, out _, serializationContext) as FileBody1;
        body2 = model.DeserializeWithLengthPrefix(stream, null, typeof(FileBody2), PrefixStyle.Base128, 1, null, out _, out _, serializationContext) as FileBody2;
    }
}
If so, I suppose I can keep storing big data without worrying about the length prefix (I mean the marker indicating the length of the message).

Base128 is probably the best general-purpose choice, simply because it maintains protocol compatibility (the others: do not). What I would suggest, though, is that for very large files, using "group" mode on the collections (and sub-objects in general) may be highly desirable; this makes serialization faster, by virtue of not having to calculate any length-prefixes for large object graphs.
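To make the suggestion concrete, here is a minimal sketch combining Base128 framing with group-encoded members; Blob and Item are hypothetical types invented for illustration, not from the question:

```csharp
using System.Collections.Generic;
using System.IO;
using ProtoBuf;

[ProtoContract]
public class Item
{
    [ProtoMember(1)] public long Id { get; set; }
}

[ProtoContract]
public class Blob
{
    // Group encoding: the list is framed with start/end markers instead of a
    // length, so the writer never has to buffer it to compute its size.
    [ProtoMember(1, DataFormat = DataFormat.Group)]
    public List<Item> Items { get; set; } = new List<Item>();
}

public static class Demo
{
    public static Blob RoundTrip(Blob body)
    {
        using (var ms = new MemoryStream())
        {
            // The outer Base128 prefix keeps the stream protocol-compatible
            // and lets several messages share one stream.
            Serializer.SerializeWithLengthPrefix(ms, body, PrefixStyle.Base128, 1);
            ms.Position = 0;
            return Serializer.DeserializeWithLengthPrefix<Blob>(ms, PrefixStyle.Base128, 1);
        }
    }
}
```

The field number (1 here) on the write and read sides must match; only the inner collections benefit from group encoding.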

Related

Correctly create a json file in c#

I'm currently setting up my application, and I need to manage a JSON file (that contains some settings) in my Windows Forms app. Once you open it, you can easily choose different settings, and once you're done, you can save (which means I need to overwrite the existing JSON settings file and replace it with the new one).
I tried to follow this guide to correctly create my JSON file, but I ran into two problems:
The solution mentioned above creates square brackets (which I don't need!)
It seems to create all settings on one line. Is that correct, or could it cause problems in the future?
(Screenshots in the original post: the application, the original JSON file, and the JSON file generated by my code.)
My class:
public class data
{
    public bool avx { get; set; }
    public bool memory_pool { get; set; }
    public bool smt { get; set; }
    public bool spectre { get; set; }
    public bool unlock_menu { get; set; }
    public bool vinput { get; set; }
    public double cpu_memory_pool_fraction { get; set; }
    public double gpu_memory_pool_fraction { get; set; }
}
My code:
private void btn_save_Click(object sender, EventArgs e)
{
    string settings_path = general_path + "\\plugins\\cyber_engine_tweaks\\" + "config.json"; //path
    bool avx_set = cb_avx.Checked;
    bool smt_set = cb_smt.Checked;
    bool memory_set = cb_memory.Checked;
    bool spectre_set = cb_spectre.Checked;
    bool debug_set = cb_debug.Checked;
    bool vinput_set = cb_vinput.Checked;
    List<data> _data = new List<data>();
    _data.Add(new data()
    {
        avx = avx_set,
        cpu_memory_pool_fraction = 0.5,
        gpu_memory_pool_fraction = 1.0,
        memory_pool = memory_set,
        smt = smt_set,
        spectre = spectre_set,
        unlock_menu = debug_set,
        vinput = vinput_set
    });
    using (StreamWriter file = File.CreateText(settings_path))
    {
        JsonSerializer serializer = new JsonSerializer();
        //serialize object directly into file stream
        serializer.Serialize(file, _data);
    }
}
Square brackets appear because you are sending an array.
Instead of
List<data> _data = new List<data>(); _data.Add(new data()...
try
var data = new data()...; serializer.Serialize(file, data);
All settings on one line is normal.
Very nice detailed answer here: Can JSON start with "["?
TL;DR:
It's not JSON without either {} indicating an object or [] indicating an array. So no, you can't have a JSON document with multiple keys without one of them.
Newlines are optional in JSON. Since most JSON objects are transferred over the wire, there is no need for newlines (which take up unnecessary bytes).
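The object-vs-array difference can be seen directly; a small sketch using Json.NET (the same library as in the question), with an anonymous type standing in for the settings class:

```csharp
using Newtonsoft.Json;

public static class BracketDemo
{
    public static void Show()
    {
        var one = new { avx = true };
        // A single object serializes to {...}; a collection serializes to [...]
        string asObject = JsonConvert.SerializeObject(one);           // {"avx":true}
        string asArray  = JsonConvert.SerializeObject(new[] { one }); // [{"avx":true}]
    }
}
```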
Issues:
1. This solution creates square brackets (which I don't need!)
Solution:
Create your data object like:
//List<data> _data = new List<data>();
data _data = new data
{
    avx = avx_set,
    cpu_memory_pool_fraction = 0.5,
    gpu_memory_pool_fraction = 1.0,
    memory_pool = memory_set,
    smt = smt_set,
    spectre = spectre_set,
    unlock_menu = debug_set,
    vinput = vinput_set
};
2. It seems to create all settings on one line. Is that correct, or could it cause problems in the future?
Solution:
It's just a formatting issue; you can ignore it.
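If you do want human-readable output anyway, Json.NET can emit indented JSON; a minimal sketch (SettingsWriter is a hypothetical helper wrapping the question's save code):

```csharp
using System.IO;
using Newtonsoft.Json;

public static class SettingsWriter
{
    public static void Save(string path, object settings)
    {
        using (StreamWriter file = File.CreateText(path))
        {
            var serializer = new JsonSerializer
            {
                // One property per line instead of everything on a single line
                Formatting = Formatting.Indented
            };
            serializer.Serialize(file, settings);
        }
    }
}
```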

Empty serialized XML file

I get an empty XML file after serializing an object. I'm using MonoDevelop and Unity 4. I searched for a long time, mostly in this community, but I only found difficult problems with even more difficult answers :) I think mine is simple, please help me. (I am new to C#)
The serialized object is this:
[System.Serializable]
public class information {
    private string data1;
    private string data2;
    private string data3;

    public void Data1(string text)
    {
        data1 = text;
    }
    public string GetData1()
    {
        return data1;
    }
    public void Data2(string text)
    {
        data2 = text;
    }
    public string GetData2()
    {
        return data2;
    }
    public void Data3(string text)
    {
        data3 = text;
    }
}
The serializing class is this; the problem might be here:
public class SaveXml {
    public void SaveData(object obj, string filename)
    {
        XmlSerializer sr = new XmlSerializer(obj.GetType());
        TextWriter writer = new StreamWriter(filename);
        sr.Serialize(writer, obj);
        writer.Close();
    }

    public string Load()
    {
        if (File.Exists("accdata.xml"))
        {
            XmlSerializer xs = new XmlSerializer(typeof(information));
            FileStream read = new FileStream("accdata.xml", FileMode.Open, FileAccess.Read, FileShare.Read);
            information info = (information)xs.Deserialize(read);
            return info.GetData1();
        }
        else
        {
            return "file does not exist";
        }
    }
}
And the serialization and deserialization get triggered by a menu that has these two buttons:
if (GUI.Button(new Rect(10, 50, 300, 100), "Save"))
{
    SaveXml saver = new SaveXml();
    information infol = new information();
    infol.Data1("textone");
    infol.Data2("texttwo");
    infol.Data3("textthree");
    saver.SaveData(infol, "accdata.xml");
}
if (GUI.Button(new Rect(500, 50, 300, 100), "Load"))
{
    SaveXml saver1 = new SaveXml();
    text = saver1.Load();
}
So the variable text that is declared in the menu class should be "textone" after I click the Save button and then the Load button. The Save button creates a file that is empty.
The deserialization seems to work, but of course there is no string in the data1 variable in information, so the variable called text in the menu is empty too. I get no errors and I can work with the object after serialization.
So why doesn't my serialization work? Please help me. I apologize for my bad English and mistakes; I am new to Stack Overflow.
The XmlSerializer serializes public fields and properties, not methods. Change your methods to properties. For example,
public string Data2
{
    set { data2 = value; }
    get { return data2; }
}
So your information class can be
public class Information
{
    public string Data1 { get; set; }
    public string Data2 { get; set; }
    public string Data3 { get; set; }
}
BTW: you don't need the Serializable attribute. It is only used by BinaryFormatter.
I'm not sure, but from what I see you don't have any public fields... Take a look here.
Also, why don't you just use auto getters/setters?
According to this MSDN support article, using XmlSerializer the way you have performs only "shallow" serialization: it serializes only public fields and properties. Serializing private data requires "deep" serialization, which is a whole other animal.
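The accepted fix can be verified with a quick in-memory round trip; a sketch where the Information class mirrors the property-based version from the answer:

```csharp
using System.IO;
using System.Xml.Serialization;

public class Information
{
    public string Data1 { get; set; }
    public string Data2 { get; set; }
    public string Data3 { get; set; }
}

public static class SaveXmlDemo
{
    public static string RoundTrip()
    {
        var serializer = new XmlSerializer(typeof(Information));
        using (var ms = new MemoryStream())
        {
            // Public auto-properties are picked up, so the XML is no longer empty
            serializer.Serialize(ms, new Information { Data1 = "textone" });
            ms.Position = 0;
            var info = (Information)serializer.Deserialize(ms);
            return info.Data1; // "textone"
        }
    }
}
```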

ProtoBuf corrupts byte array during deserialization (extra 0's added)

I'm using ProtoBuf.NET to serialize/deserialize some classes. I'm finding that on deserializing, I get a corrupt byte[] (extra 0's). Before you ask: yes, I need the *WithLengthPrefix() versions of the ProtoBuf API, since the ProtoBuf portion is at the start of a custom stream :)
Anyway, I see
Original object is (JSON depiction):
{"ByteArray":"M+B6q+PXNuF8P5hl","ByteArray2":"DmErxVQ2y87IypSRcfxcWA==","K":2,"V":1.0}
Protobuf: Raw Hex (42 bytes):
29-2A-20-0A-0C-33-E0-7A-AB-E3-D7-36-E1-7C-3F-98-65-12-10-0E-61-2B-C5-54-36-CB-CE-C8-CA-94-91-71-FC-5C-58-08-02-15-00-00-80-3F
Regenerated object is (JSON depiction):
{"ByteArray":"AAAAAAAAAAAAAAAAM+B6q+PXNuF8P5hl","ByteArray2":"DmErxVQ2y87IypSRcfxcWA==","K":2,"V":1.0}
The extra A's in the ByteArray member are hex 0x00's in base64.
The app logic is similar to
static void Main(string[] args)
{
    var parent = new Parent();
    parent.Init();
    Console.WriteLine("\nOriginal object is (JSON depiction):");
    Console.WriteLine(JsonConvert.SerializeObject(parent));
    using (var ms = new MemoryStream())
    {
        Serializer.SerializeWithLengthPrefix(ms, parent, PrefixStyle.Base128);
        byte[] bytes2 = ms.ToArray();
        var hex2 = BitConverter.ToString(bytes2);
        Console.WriteLine("\nProtobuf: Hex ({0} bytes):\n{1}", bytes2.Length, hex2);
        ms.Seek(0, SeekOrigin.Begin);
        var backFirst = Serializer.DeserializeWithLengthPrefix<Parent>(ms, PrefixStyle.Base128);
        Console.WriteLine("\nRegenerated object is (JSON depiction):");
        Console.WriteLine(JsonConvert.SerializeObject(backFirst));
    }
}
The DTO classes are
[DataContract]
[ProtoContract]
internal class Parent : Child
{
    [DataMember(Name = "ByteArray", Order = 10)]
    [ProtoMember(1)]
    public byte[] ByteArray { get; set; }

    [DataMember(Name = "ByteArray2", Order = 30, EmitDefaultValue = false)]
    [ProtoMember(2)]
    public byte[] ByteArray2 { get; set; }

    public Parent()
    {
        ByteArray = new byte[12];
    }

    internal void Init(bool bindRow = false)
    {
        base.Init();
        var rng = new RNGCryptoServiceProvider();
        rng.GetBytes(ByteArray);
        ByteArray2 = new byte[16];
        rng.GetBytes(ByteArray2);
    }
}

[DataContract]
[ProtoContract]
[ProtoInclude(5, typeof(Parent))]
public class Child
{
    [DataMember(Name = "K", Order = 100)]
    [ProtoMember(1)]
    public Int32 K { get; set; }

    [DataMember(Name = "V", Order = 110)]
    [ProtoMember(2)]
    public float V { get; set; }

    internal void Init()
    {
        K = 2;
        V = 1.0f;
    }
}
I do see that when I move ByteArray = new byte[12] out of the Parent constructor into its Init() method, ProtoBuf works fine. However, we have app logic that prevents that in the real version (vs. the trimmed-down SO code you see above).
Are we doing something wrong or is this a bug in ProtoBuf?
Here we go:
public Parent()
{
    ByteArray = new byte[12];
}
Note: protobuf is designed (by google) to be both appendable and mergeable. Where append / merge is synonymous (for lists / arrays etc) with "append".
Two options (both possible via attributes):
disable the constructor: [ProtoContract(SkipConstructor = true)]
disable the append: [ProtoMember(1, OverwriteList = true)]
There are other options too, but those are the ones I'd lean towards.
You remark that the array initialisation is different in the real code, but: I can't comment on code I can't see.
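Applied to the question's Parent class, the second option looks like this (a sketch showing only the relevant member, using the OverwriteList attribute named in the answer):

```csharp
using ProtoBuf;

[ProtoContract]
class Parent
{
    // OverwriteList = true: on deserialization, replace the
    // constructor-initialized array instead of appending to it,
    // so the 12 pre-allocated zero bytes no longer leak into the result.
    [ProtoMember(1, OverwriteList = true)]
    public byte[] ByteArray { get; set; }

    public Parent()
    {
        ByteArray = new byte[12]; // pre-allocation is now harmless
    }
}
```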
I was facing the same issue, and my ByteString data was actually XML data I got from the server. My application already had an XML serializer, so I decided to use that instead of introducing a new serializer for Protobuf and decorating all of my models, which I found very time-consuming.
Here is the function that deserializes my ByteString:
public SceneDTO ParseSceneAsyncFromByteString(ByteString byteStringData)
{
    var serializer = new CustomXmlSerializer(typeof(SceneDTO), _timingManager);
    serializer.UnknownNode += OnUnknownNode;
    serializer.UnknownAttribute += OnUnknownAttribute;
    var ms = new MemoryStream(byteStringData.ToByteArray());
    var scene = (SceneDTO)serializer.Deserialize(ms);
    serializer.UnknownNode -= OnUnknownNode;
    serializer.UnknownAttribute -= OnUnknownAttribute;
    return scene;
}
The other reason I went this way is that I was getting the error below after using the Protobuf serializer:
Unexpected end-group in source data; this usually means the source data is corrupt
And even after fixing this, my model data was always null, and I decided not to spend many more hours fixing that issue as I already had a working solution.

serialize list of huge composite graphs using protobuf-net causing out-of-memory-exception

I am trying to serialize an object containing a list of very large composite object graphs (~200,000 nodes or more) using protobuf-net. Basically, I want to save the complete object to a single file as quickly and as compactly as possible.
My problem is that I get an OutOfMemoryException while trying to serialize the object. On my machine the exception is thrown when the file size is around 1.5 GB. I am running a 64-bit process and using a StreamWriter as input to protobuf-net. Since I am writing directly to a file, I suspect some kind of buffering is taking place within protobuf-net, causing the exception. I have tried the DataFormat = DataFormat.Group attribute, but with no luck so far.
I can avoid the exception by serializing each composite in the list to a separate file, but I would prefer to do it all in one go if possible.
Am I doing something wrong, or is it simply not possible to achieve what I want?
Code to illustrate the problem:
class Program
{
    static void Main(string[] args)
    {
        int numberOfTrees = 250;
        int nodesPrTree = 200000;
        var trees = CreateTrees(numberOfTrees, nodesPrTree);
        var forest = new Forest(trees);
        using (var writer = new StreamWriter("model.bin"))
        {
            Serializer.Serialize(writer.BaseStream, forest);
        }
        Console.ReadLine();
    }

    private static Tree[] CreateTrees(int numberOfTrees, int nodesPrTree)
    {
        var trees = new Tree[numberOfTrees];
        for (int i = 0; i < numberOfTrees; i++)
        {
            var root = new Node();
            CreateTree(root, nodesPrTree, 0);
            var binTree = new Tree(root);
            trees[i] = binTree;
        }
        return trees;
    }

    private static void CreateTree(INode tree, int nodesPrTree, int currentNumberOfNodes)
    {
        Queue<INode> q = new Queue<INode>();
        q.Enqueue(tree);
        while (q.Count > 0 && currentNumberOfNodes < nodesPrTree)
        {
            var n = q.Dequeue();
            n.Left = new Node();
            q.Enqueue(n.Left);
            currentNumberOfNodes++;
            n.Right = new Node();
            q.Enqueue(n.Right);
            currentNumberOfNodes++;
        }
    }
}
}
[ProtoContract]
[ProtoInclude(1, typeof(Node), DataFormat = DataFormat.Group)]
public interface INode
{
    [ProtoMember(2, DataFormat = DataFormat.Group, AsReference = true)]
    INode Parent { get; set; }
    [ProtoMember(3, DataFormat = DataFormat.Group, AsReference = true)]
    INode Left { get; set; }
    [ProtoMember(4, DataFormat = DataFormat.Group, AsReference = true)]
    INode Right { get; set; }
}

[ProtoContract]
public class Node : INode
{
    INode m_parent;
    INode m_left;
    INode m_right;

    public INode Left
    {
        get { return m_left; }
        set
        {
            m_left = value;
            m_left.Parent = null;
            m_left.Parent = this;
        }
    }

    public INode Right
    {
        get { return m_right; }
        set
        {
            m_right = value;
            m_right.Parent = null;
            m_right.Parent = this;
        }
    }

    public INode Parent
    {
        get { return m_parent; }
        set { m_parent = value; }
    }
}

[ProtoContract]
public class Tree
{
    [ProtoMember(1, DataFormat = DataFormat.Group)]
    public readonly INode Root;

    public Tree(INode root)
    {
        Root = root;
    }
}

[ProtoContract]
public class Forest
{
    [ProtoMember(1, DataFormat = DataFormat.Group)]
    public readonly Tree[] Trees;

    public Forest(Tree[] trees)
    {
        Trees = trees;
    }
}
Stack-trace when the exception is thrown:
at System.Collections.Generic.Dictionary`2.Resize(Int32 newSize, Boolean forceNewHashCodes)
at System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add)
at ProtoBuf.NetObjectCache.AddObjectKey(Object value, Boolean& existing) in NetObjectCache.cs:line 154
at ProtoBuf.BclHelpers.WriteNetObject(Object value, ProtoWriter dest, Int32 key, NetObjectOptions options) BclHelpers.cs:line 500
at proto_5(Object , ProtoWriter )
I am trying a workaround where I serialize the trees in the array one at a time to a single file using the SerializeWithLengthPrefix method. Serialization seems to work: I can see the file size increase after each tree in the list is added to the file. However, when I try to deserialize the trees I get an invalid wire-type exception. I create a new file when I serialize the trees, so the file should be garbage-free (unless I am writing garbage, of course ;-)). My serialize and deserialize calls are listed below:
using (var writer = new FileStream("model.bin", FileMode.Create))
{
    foreach (var tree in trees)
    {
        Serializer.SerializeWithLengthPrefix(writer, tree, PrefixStyle.Base128);
    }
}

using (var reader = new FileStream("model.bin", FileMode.Open))
{
    var trees = Serializer.DeserializeWithLengthPrefix<Tree[]>(reader, PrefixStyle.Base128);
}
Am I using the method incorrectly?
It wasn't helping that the AsReference code was only respecting default data-format, which means it was trying to hold data in memory so that it can write the object-length prefix back into the data-stream, which is exactly what we don't want here (hence your quite correct use of DataFormat.Group). That will account for buffering for an individual branch of the tree. I've tweaked it locally, and I can definitely confirm that it is now writing forwards-only (the debug build has a convenient ForwardsOnly flag that I can enable which detects this and shouts).
With that tweak, I have had it work for 250 x 20,000, but I'm getting secondary problems with the dictionary resizing (even in x64) when working on the 250 x 200,000 - like you say, at around the 1.5GB level. It occurs to me, however, that I might be able to discard one of these (forwards or reverse) respectively when doing each of serialization / deserialization. I would be interested in the stack-trace when it breaks for you - if it is ultimately the dictionary resize, I may need to think about moving to a group of dictionaries...
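As for the workaround in the question: trees written one at a time with SerializeWithLengthPrefix form a sequence of messages, not a single serialized Tree[], so they should be read back item by item. A sketch using DeserializeItems (it assumes the question's Tree type is in scope, and that field number 1 matches what the write side used):

```csharp
using System.Collections.Generic;
using System.IO;
using ProtoBuf;

public static class ForestIo
{
    public static List<Tree> LoadTrees(string path)
    {
        using (var reader = new FileStream(path, FileMode.Open))
        {
            // Each SerializeWithLengthPrefix call wrote one Tree, so enumerate
            // the stream one length-prefixed message at a time.
            return new List<Tree>(
                Serializer.DeserializeItems<Tree>(reader, PrefixStyle.Base128, 1));
        }
    }
}
```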

How can I send multiple types of objects across Protobuf?

I'm implementing a client-server application and am looking into various ways to serialize and transmit data. I began with XML serializers, which worked rather well but generate data slowly and produce large payloads, especially when they need to be sent over the net. So I started looking into protobuf and protobuf-net.
My problem lies in the fact that protobuf doesn't send type information with the payload. With XML serializers, I was able to build a wrapper which would send and receive any (serializable) object over the same stream, since an object serialized into XML contains the type name of the object.
ObjectSocket socket = new ObjectSocket();

socket.AddTypeHandler(typeof(string));  // Tells the socket the types
socket.AddTypeHandler(typeof(int));     // of objects we will want
socket.AddTypeHandler(typeof(bool));    // to send and receive.
socket.AddTypeHandler(typeof(Person));  // When it gets data, it looks for
socket.AddTypeHandler(typeof(Address)); // these types in the Xml, then uses
                                        // the appropriate serializer.
socket.Connect(_host, _port);
socket.Send(new Person() { ... });
socket.Send(new Address() { ... });
...
Object o = socket.Read();
Type oType = o.GetType();
if (oType == typeof(Person))
    HandlePerson(o as Person);
else if (oType == typeof(Address))
    HandleAddress(o as Address);
...
I've considered a few solutions to this, including creating a master "state" type class, which is the only type of object sent over my socket. This moves away from the functionality I've worked out with Xml Serializers, though, so I'd like to avoid that direction.
The second option would be to wrap protobuf objects in some type of wrapper, which defines the type of object. (This wrapper would also include information such as packet ID, and destination.) It seems silly to use protobuf-net to serialize an object, then stick that stream between Xml tags, but I've considered it. Is there an easy way to get this functionality out of protobuf or protobuf-net?
I've come up with a third solution, and posted it below, but if you have a better one, please post it too!
Information on field bounds bug (using System.String):
Hashing:
protected static int ComputeTypeField(Type type) // System.String
{
    byte[] data = ASCIIEncoding.ASCII.GetBytes(type.FullName);
    MD5CryptoServiceProvider md5 = new MD5CryptoServiceProvider();
    return Math.Abs(BitConverter.ToInt32(md5.ComputeHash(data), 0));
}
Serialization:
using (MemoryStream stream = new MemoryStream())
{
    Serializer.NonGeneric.SerializeWithLengthPrefix
        (stream, o, PrefixStyle.Base128, field); // field = 600542181
    byte[] data = stream.ToArray();
    _pipe.Write(data, 0, data.Length);
}
Deserializaion:
using (MemoryStream stream = new MemoryStream(_buffer.Peek()))
{
    lock (_mapLock)
    {
        success = Serializer.NonGeneric.TryDeserializeWithLengthPrefix
            (stream, PrefixStyle.Base128, field => _mappings[field], out o);
    }
    if (success)
        _buffer.Clear((int)stream.Position);
    else
    {
        int len;
        if (Serializer.TryReadLengthPrefix(stream, PrefixStyle.Base128, out len))
            _buffer.Clear(len);
    }
}
field => _mappings[field] throws a KeyNotFoundException while looking for 63671269.
If I replace ToInt32 with ToInt16 in the hash function, the field value is set to 29723 and it works. It also works if I explicitly define System.String's field to 1. Explicitly defining the field to 600542181 has the same effect as using the hash function to define it. The value of the string being serialized does not change the outcome.
This functionality is actually built in, albeit not obviously.
In this scenario, it is anticipated that you would designate a unique number per message type. The overload you are using passes them all in as "field 1", but there is an overload that lets you include this extra header information (it is still the job of the calling code to decide how to map numbers to types, though). You can then specify different types as different fields in the stream (note: this only works with the base-128 prefix style).
I'll need to double check, but the intention is that something like the following should work:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using ProtoBuf;

static class Program
{
    static void Main()
    {
        using (MemoryStream ms = new MemoryStream())
        {
            WriteNext(ms, 123);
            WriteNext(ms, new Person { Name = "Fred" });
            WriteNext(ms, "abc");
            ms.Position = 0;
            while (ReadNext(ms)) { }
        }
    }

    // *** you need some mechanism to map types to fields
    static readonly IDictionary<int, Type> typeLookup = new Dictionary<int, Type>
    {
        {1, typeof(int)}, {2, typeof(Person)}, {3, typeof(string)}
    };

    static void WriteNext(Stream stream, object obj)
    {
        Type type = obj.GetType();
        int field = typeLookup.Single(pair => pair.Value == type).Key;
        Serializer.NonGeneric.SerializeWithLengthPrefix(stream, obj, PrefixStyle.Base128, field);
    }

    static bool ReadNext(Stream stream)
    {
        object obj;
        if (Serializer.NonGeneric.TryDeserializeWithLengthPrefix(stream, PrefixStyle.Base128, field => typeLookup[field], out obj))
        {
            Console.WriteLine(obj);
            return true;
        }
        return false;
    }
}
[ProtoContract]
class Person
{
    [ProtoMember(1)] public string Name { get; set; }
    public override string ToString() { return "Person: " + Name; }
}
Note that this doesn't currently work in the v2 build (since the "WithLengthPrefix" code is incomplete), but I'll go and test it on v1. If it works, I'll add the above scenario to the test suite to ensure it works in v2.
Edit:
yes, it does work fine on "v1", with output:
123
Person: Fred
abc
I've come up with another solution, but I decided to post it as an answer instead of in the question, because that makes more sense to me. It's pretty ugly, in my opinion, and I've been warned against using reflection, so please comment on it or provide better answers if you have them. Thanks!
class Program
{
    static void Main(string[] args)
    {
        Person person = new Person
        {
            Id = 12345,
            Name = "Fred",
            Address = new Address
            {
                Line1 = "Flat 1",
                Line2 = "The Meadows"
            }
        };
        object value;
        using (Stream stream = new MemoryStream())
        {
            Send<Person>(stream, person);
            stream.Position = 0;
            value = Read(stream);
            person = value as Person;
        }
    }

    static void Send<T>(Stream stream, T value)
    {
        Header header = new Header()
        {
            Guid = Guid.NewGuid(),
            Type = typeof(T)
        };
        Serializer.SerializeWithLengthPrefix<Header>(stream, header, PrefixStyle.Base128);
        Serializer.SerializeWithLengthPrefix<T>(stream, value, PrefixStyle.Base128);
    }

    static object Read(Stream stream)
    {
        Header header = Serializer.DeserializeWithLengthPrefix<Header>
            (stream, PrefixStyle.Base128);
        MethodInfo m = typeof(Serializer).GetMethod("DeserializeWithLengthPrefix",
            new Type[] { typeof(Stream), typeof(PrefixStyle) }).MakeGenericMethod(header.Type);
        Object value = m.Invoke(null, new object[] { stream, PrefixStyle.Base128 });
        return value;
    }
}

[ProtoContract]
class Header
{
    public Header() { }

    [ProtoMember(1, IsRequired = true)]
    public Guid Guid { get; set; }

    [ProtoIgnore]
    public Type Type { get; set; }

    [ProtoMember(2, IsRequired = true)]
    public string TypeName
    {
        get { return this.Type.FullName; }
        set { this.Type = Type.GetType(value); }
    }
}
[ProtoContract]
class Person
{
    [ProtoMember(1)]
    public int Id { get; set; }
    [ProtoMember(2)]
    public string Name { get; set; }
    [ProtoMember(3)]
    public Address Address { get; set; }
}

[ProtoContract]
class Address
{
    [ProtoMember(1)]
    public string Line1 { get; set; }
    [ProtoMember(2)]
    public string Line2 { get; set; }
}
