Does protobuf-net use BinaryFormatter or another formatter as a base serializer to serialize an object as byte[], and then write to the stream?
Added:
I use protobuf-net to serialize data and want to deserialize it in Go; is there any serializer that can do the work in Go?
Protobuf-net is a ground-up implementation of the "Protocol Buffers" serialization format, with an idiomatic .NET API. It has nothing to do with BinaryFormatter (although it can be used to create custom ISerializable implementations for use with BinaryFormatter, if you still play in that world).
If you want to use Protocol Buffers (protobuf) with Go, just pick one of the Go implementations from this list.
Most protobuf libraries are "contract first", meaning: you need a .proto schema; to get that from protobuf-net, use Serializer.GetProto<T>() for the T that you are using as a root type.
Note: if you are serializing DateTime or TimeSpan, it would be a good idea to make sure that you are using DataFormat.WellKnown on those members - it'll make it much easier to work in a cross-platform way with other libraries. But note that this is not a data-compatible change: it fundamentally changes how those values are stored, so if you have existing data you'll need to think of a migration strategy.
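For example, a minimal sketch (SomeMessage and its members are hypothetical; the point is GetProto<T>() and DataFormat.WellKnown):

using System;
using System.IO;
using ProtoBuf;

[ProtoContract]
public class SomeMessage
{
    [ProtoMember(1)]
    public int Id { get; set; }

    // WellKnown maps DateTime to the cross-platform timestamp representation,
    // which Go protobuf libraries understand
    [ProtoMember(2, DataFormat = DataFormat.WellKnown)]
    public DateTime When { get; set; }
}

class Program
{
    static void Main()
    {
        // emit a .proto schema that the Go tooling can consume
        Console.WriteLine(Serializer.GetProto<SomeMessage>());

        // write a binary payload for the Go side to read back
        using (var file = File.Create("message.bin"))
        {
            Serializer.Serialize(file, new SomeMessage { Id = 1, When = DateTime.UtcNow });
        }
    }
}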
I have been provided a C# class generated through protogen.
I have a JSON response which I want to convert into an object (map it to the class generated from protogen).
How can I achieve this?
The JSON that I have is like:
Test {Id:"0000001" InsertDateTime:"4/12/2018 01:01:01" ModifyDateTime:"4/12/2018 01:05:10" ParentId:"0000001"
Sample{Id:"0000002" InsertDateTime:"4/12/2018 01:01:01" ModifyDateTime:"4/12/2018 01:05:10"}
}
The main class is named 'Test', and it has int and string properties as well as properties that are themselves classes.
I want to convert this JSON into an instance of the 'protogen'-generated class.
protobuf-net only implements the binary Protocol Buffers (ProtoBuf) specification.
It is unclear whether by "json" you mean "general purpose JSON, nothing to do with ProtoBuf", or whether you mean "the Protocol Buffers JSON format added around 3.0".
If you mean general purpose JSON: then usually just about any JSON serializer will work fine - Json.NET is a good default, but other JSON serializers exist. This is because protobuf-net tries to work with idiomatic .NET objects, which means that it tends to play very nicely with other tools that work with idiomatic .NET objects.
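As a minimal sketch (assuming Json.NET, and that the protogen-generated Test class roughly matches the JSON keys):

using System.IO;
using Newtonsoft.Json;

// "Test" is the protogen-generated class; Json.NET binds by property name,
// so as long as the JSON keys match the generated property names,
// no extra configuration is needed
string json = File.ReadAllText("response.json");
Test obj = JsonConvert.DeserializeObject<Test>(json);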
If you mean the ProtoBuf-specific JSON, then I would suggest using Google's official C# ProtoBuf library, which implements this. I simply haven't had need or time to add support for this into protobuf-net, and to date adding it has been a very low priority for me - I simply haven't seen anyone asking for it from me.
If you have generic JSON, but the JSON layout is different to your protobuf model, then frankly I would recommend having two DTO models:
one that is designed to work with your JSON data and your chosen JSON serializer (such as Json.NET)
one that is designed to work with your ProtoBuf data and your chosen ProtoBuf serializer (such as protobuf-net)
and simply map between the two representations with regular C# code (or any auto-mapper tool of your choosing).
You can sometimes fight a serializer library to get it to work with an object model that doesn't match the shape of the data, but in my experience this is a bad use of time and leads to brittle, buggy code.
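A rough sketch of that shape (all type and member names here are hypothetical):

using System;
using System.Globalization;
using ProtoBuf;

// DTO shaped like the incoming JSON, for Json.NET
public class CustomerJson
{
    public string FullName { get; set; }
    public string SignedUp { get; set; }   // e.g. "4/12/2018 01:01:01"
}

// DTO shaped like the wire contract, for protobuf-net
[ProtoContract]
public class CustomerProto
{
    [ProtoMember(1)] public string Name { get; set; }
    [ProtoMember(2, DataFormat = DataFormat.WellKnown)] public DateTime SignedUp { get; set; }
}

// plain mapping code between the two representations
static class Mapping
{
    public static CustomerProto ToProto(CustomerJson source) => new CustomerProto
    {
        Name = source.FullName,
        SignedUp = DateTime.Parse(source.SignedUp, CultureInfo.InvariantCulture)
    };
}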
I wonder if there is any possibility of serializing a class in the situation described below.
Suppose we have someone's library that is shared as a binary DLL file. Additionally, the creator of this lib made a class that is not Serializable. How do you serialize such a class? I know I can create a twin class that contains all the properties etc. that can be serialized. But is there any other, easier solution? How do you serialize classes that are "not yours" and are available as binaries only?
The 3rd party class is an implementation detail; frankly, it is a very bad idea to involve this in your serialization, as you are then completely fenced into a corner, and can never change implementation. You would also face significant risk of versioning issues - something that BinaryFormatter simply doesn't handle well.
It might not be what you want to hear, but I offer two recommendations:
do not serialize implementation details; serialize the data (only); this may indeed require you to write a DTO that mirrors the implementation, but this is usually a trivial job
make sure you understand the implications of BinaryFormatter; frankly, I never recommend it - it has... glitches.
As for workarounds: you can investigate serialization surrogates, but that isn't a trivial thing to do inside BinaryFormatter, and is basically just a re-statement of the first bullet.
If it was me (although I am hugely biased), I would change serializer; protobuf-net (disclosure: I'm the author) works as a binary serializer, and has easy-to-implement support for surrogates if the third-party model is already coupled to your model.
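A minimal sketch of the surrogate approach in protobuf-net (ThirdPartyThing and the surrogate's members are hypothetical):

using ProtoBuf;
using ProtoBuf.Meta;

// surrogate DTO that protobuf-net actually serializes
[ProtoContract]
public class ThingSurrogate
{
    [ProtoMember(1)] public string Name { get; set; }

    // conversion operators between the third-party type and the surrogate
    public static implicit operator ThingSurrogate(ThirdPartyThing value)
        => value == null ? null : new ThingSurrogate { Name = value.Name };

    public static implicit operator ThirdPartyThing(ThingSurrogate value)
        => value == null ? null : new ThirdPartyThing(value.Name);
}

static class ProtoConfig
{
    // one-time configuration: use the surrogate whenever the
    // third-party type is encountered in the model
    public static void Configure()
    {
        RuntimeTypeModel.Default.Add(typeof(ThirdPartyThing), false)
                        .SetSurrogate(typeof(ThingSurrogate));
    }
}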
The .NET Framework is full of examples where a method call will return a Stream that you can then read and use as you need. But how does this work under the covers? What backs the stream?
Say I am writing a parser that takes some inputs and parses some data into a pre-defined format. If, for example, I create a MemoryStream, write my content to it using a StreamWriter, and then have the method return the stream, I will run into issues because the writer closes the underlying stream when it is disposed, so the caller won't be able to read it as expected.
How is this typically managed? Is the content for the stream stored in the object until needed (like a byte[]), and then created when the method returning the Stream is invoked, or what?
A stream is an abstraction of a sequence of bytes, such as a file, an input/output device, an inter-process communication pipe, or a TCP/IP socket. The Stream class and its derived classes provide a generic view of these different types of input and output, isolating the programmer from the specific details of the operating system and the underlying devices.
- as MSDN puts it.
So I suppose you are asking about serialization; to choose among the different serialization formats in .NET that write to a stream, you first need to define your requirements.
Serialization is the process of converting an object into bytes for persistent storage. The deserialization process converts the bytes back to an object without any loss of data. Serialization is used for storing values in files or a database, sending an object over the network, and converting it back to the original object afterwards. The .NET Framework provides a set of Framework Class Libraries (FCL) for making the serialization process easy. It is very useful for sending data between two different applications.
The .NET Framework supports binary serialization and XML serialization formats. XML serialization serializes only public fields, whereas binary serialization serializes all private and public fields. Serialization can be performed either as basic or custom. Basic serialization happens when a class has the SerializableAttribute attribute applied; basic serialization doesn't support versioning. A custom serialization class must be marked with SerializableAttribute and implement the ISerializable interface, and GetObjectData needs to be implemented for custom serialization. The user can implement custom serialization for both binary and XML serialization formats. The sample application in the linked article uses custom serialization for both binary and XML serialization. The .NET Framework also supports designer serialization, which is associated with development tools.
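A quick illustration of that public/private difference (Person is just a throwaway example type):

using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using System.Xml.Serialization;

[Serializable]                  // enables basic (binary) serialization
public class Person
{
    public string Name;         // public field: both serializers include it
    private int secret = 42;    // private field: BinaryFormatter keeps it,
                                // XmlSerializer silently ignores it

    public Person() { }
    public Person(string name) { Name = name; }
}

class Demo
{
    static void Main()
    {
        var person = new Person("Ada");

        // binary serialization: all fields, public and private
        using (var bin = File.Create("person.bin"))
            new BinaryFormatter().Serialize(bin, person);

        // XML serialization: public members only
        using (var xml = File.Create("person.xml"))
            new XmlSerializer(typeof(Person)).Serialize(xml, person);
    }
}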
Custom serialization
Custom serialization is the process of controlling the serialization and deserialization process yourself. Custom serialization can be implemented by running custom methods during and after serialization, or by implementing the ISerializable interface. Custom serialization is used for versioning the serialized object: if the object's state has changed (for example, a new field added in a later version), custom serialization lets you recover the values without loss of data. Versioning of the serialized object may otherwise fail due to missing attributes.
If the user wants to run custom methods during and after serialization, the user should apply the OnDeserializedAttribute, OnDeserializingAttribute, OnSerializedAttribute and OnSerializingAttribute attributes to customize the data during serialization and deserialization. The OptionalFieldAttribute attribute marks a field that old-version data may be missing, so the formatter doesn't raise an error during deserialization. Together, these allow the object to be updated before and after serialization/deserialization.
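For instance, a sketch of those callback attributes (the Account type and its members are made up):

using System;
using System.Runtime.Serialization;

[Serializable]
public class Account
{
    public string Owner;

    [OptionalField(VersionAdded = 2)]   // added in a later version; old payloads may lack it
    public string Currency;

    [OnDeserializing]                   // runs before the fields are populated
    void SetDefaults(StreamingContext context)
    {
        Currency = "USD";               // sensible default when old data has no value
    }

    [OnDeserialized]                    // runs after deserialization completes
    void Validate(StreamingContext context)
    {
        if (string.IsNullOrEmpty(Owner))
            throw new SerializationException("Owner is required");
    }
}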
I think the link below will help you:
http://www.codeproject.com/Articles/422474/Serialization-using-different-formats-in-NET
The stream can be backed by many different things. That's the whole idea of streams deriving from the Stream abstract base class.
The stream can be backed by an OS level file stream, by memory, by an HTTP connection, or anything else that can fulfill the Stream contract.
In the case of a MemoryStream the backing storage is just a block of memory.
In the case of StreamWriter, calling Dispose() on it will close the underlying stream. Make sure you don't dispose the writer as long as you still want to use the stream. Also, if you want to re-access a MemoryStream after writing to it, be sure and set the position to the beginning, e.g.:
memStream.Seek(0, SeekOrigin.Begin);
StreamWriter has an overloaded constructor that you can use to instruct the writer not to close the underlying stream when it is disposed.
Also, Stream itself has Write methods (taking a byte array) that allow you to avoid a StreamWriter altogether.
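Putting those pieces together, a sketch of a method that returns a readable MemoryStream (the content written is obviously a placeholder):

using System.IO;
using System.Text;

// inside your parser class
static Stream BuildStream()
{
    var stream = new MemoryStream();

    // leaveOpen: true keeps the MemoryStream alive after the writer is disposed
    using (var writer = new StreamWriter(stream, Encoding.UTF8, 1024, leaveOpen: true))
    {
        writer.Write("parsed content goes here");
    }

    // rewind so the caller can read from the start
    stream.Seek(0, SeekOrigin.Begin);
    return stream;
}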
Is there any way to serialize/deserialize an object with DataContractJsonSerializer as well as with the ISerializable interface? My requirement is: I get JSON from a web service and deserialize it into a collection of objects, say Message; now I want to write all objects in the list to a file. Is there any way? Please explain.
The DataContractJsonSerializer uses the DataContract / DataMember attributes, and doesn't care about ISerializable. Nothing prevents you from adding the DataContract attribute and also implementing the ISerializable interface, though.
You could also use the System.Web.Script.Serialization.JavaScriptSerializer class instead, I believe that it respects ISerializable. It isn't very fast though, and doesn't support dates, so I would really not recommend it.
Are you sure you cannot just use the DataContract serializer? You can use that with a binary writer if you want to: http://msdn.microsoft.com/en-us/library/ms752244.aspx
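A rough sketch of that combination (the Message contract is hypothetical; only the DataContract/DataMember attributes matter):

using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;
using System.Xml;

[DataContract]
public class Message
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Body { get; set; }
}

class Demo
{
    static void Main()
    {
        // 1) deserialize the JSON payload from the web service
        List<Message> messages;
        var jsonSerializer = new DataContractJsonSerializer(typeof(List<Message>));
        using (var json = File.OpenRead("response.json"))
            messages = (List<Message>)jsonSerializer.ReadObject(json);

        // 2) write the same objects to a file, here via the binary XML
        //    writer described in the linked MSDN page
        var contractSerializer = new DataContractSerializer(typeof(List<Message>));
        using (var file = File.Create("messages.bin"))
        using (var writer = XmlDictionaryWriter.CreateBinaryWriter(file))
            contractSerializer.WriteObject(writer, messages);
    }
}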
We have a project which uses serialization by implementing ISerializable and using BinaryFormatter. We are moving to the .NET Compact Framework, which does not have the BinaryFormatter class. I was wondering if we can use protobuf-net. Can we instruct protobuf-net to use GetObjectData() for serialization? Specifically, I want to find out how to use the Serializer.Serialize(SerializationInfo, ISerializable) function.
Added
I looked at the source code and I see that the function
Serializer.Serialize<T>(SerializationInfo info, T instance) where T : ISerializable
doesn't actually make use of the fact that T is ISerializable at all. Why was T constrained to be ISerializable?
The ISerializable support exists (at the moment) as a mechanism for injecting protobuf-net into the middle of existing APIs that use BinaryFormatter - for example, you can do this to reduce the overhead of .NET remoting or (serialized) session-state. However, this still results in a protobuf-net BLOB in the middle of a BinaryFormatter wrapper (in particular, the BinaryFormatter header specifies the concrete type's assembly-qualified name).
Re the question about the constraint - that is simply to help guide usage; that API is only intended to be used as part of this injection into BinaryFormatter via ISerializable. Otherwise it is preferred to just use the Stream based API.
If you are looking at CF, then of course BinaryFormatter ceases to be useful, but the Stream based API is fine. It works on CF, but you should note that for models with lots of classes there are currently some glitches on CF caused by the overuse of generics ("missing method" or some-such). The good news is that I'm making good progress on some refactoring that ditches this, making it much more stable on the smaller frameworks.
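To make that concrete, a sketch of both approaches (MyDto is hypothetical; the ISerializable plumbing is only needed if you must stay inside a BinaryFormatter-shaped API):

using System;
using System.IO;
using System.Runtime.Serialization;
using ProtoBuf;

[ProtoContract]
[Serializable]
public class MyDto : ISerializable
{
    [ProtoMember(1)] public int Id { get; set; }
    [ProtoMember(2)] public string Name { get; set; }

    public MyDto() { }

    // BinaryFormatter injection: the protobuf-net payload travels
    // inside the SerializationInfo blob
    protected MyDto(SerializationInfo info, StreamingContext context)
    {
        Serializer.Merge(info, this);
    }

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        Serializer.Serialize(info, this);
    }
}

class Demo
{
    static void Main()
    {
        // preferred on the Compact Framework: the plain Stream-based API
        using (var file = File.Create("dto.bin"))
            Serializer.Serialize(file, new MyDto { Id = 1, Name = "abc" });

        using (var file = File.OpenRead("dto.bin"))
        {
            var copy = Serializer.Deserialize<MyDto>(file);
            Console.WriteLine(copy.Name);
        }
    }
}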