I am getting a value of type IList<object[]>. What is the best way to serialize it to XML and then read it back into an IList<object[]>?
I just don't see an easy way to do this.
Thanks for the help.
The XmlSerializer chokes on interfaces, so convert the list to an array or a concrete List<T> before serializing. You should also specify known types, because a plain object[] will simply not work: the serializer must know in advance every type it will be dealing with, so that it can emit type information into the resulting XML:
var data = list.ToArray();
var knownTypes = new[] { typeof(Foo), typeof(Bar) };
var serializer = new XmlSerializer(data.GetType(), knownTypes);
serializer.Serialize(someStream, data);
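To read it back into an IList<object[]>, deserialize with the same root type and the same known types and cast the result; the stream source below is just for illustration:
var deserializer = new XmlSerializer(typeof(object[][]), knownTypes);
using (var stream = File.OpenRead("data.xml"))
{
    // an object[][] implements IList<object[]>, so the cast covers the round trip
    IList<object[]> roundTripped = (object[][])deserializer.Deserialize(stream);
}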
Or, if you don't want to bother with all this and simply want some human-readable persistence for your objects, you could use JSON:
var serializer = new JavaScriptSerializer();
string json = serializer.Serialize(list);
And if you don't care about human readability, a binary serializer should do just fine.
I mostly work with PHP and have recently switched to ASP.NET.
When parsing JSON in PHP, I can simply use -> to get a field, e.g.
foreach (json_decode($_POST['mandrill_events']) as $event) {
    $event_type = $event->event;
    $email_type = $event->msg->metadata->email_type;
}
However, in ASP.NET there is no -> equivalent. This is my attempt:
var post_data = Request.Form["mandrill_events"];
JavaScriptSerializer ser = new JavaScriptSerializer();
var post_data_json = ser.Deserialize<Dictionary<string, string>>(post_data);
foreach (var event_obj in post_data_json) {
    // how to parse the event_obj?
}
Thanks a lot for helping.
Use Newtonsoft Json.NET:
JsonConvert.DeserializeObject<DataModel>(json);
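If you go the typed route, DataModel would just be a set of classes shaped like the JSON. The names below are only a guess based on the fields the PHP code reads, so adjust them to the actual Mandrill payload:
using System.Collections.Generic;
using Newtonsoft.Json;

public class MandrillEvent
{
    [JsonProperty("event")]
    public string Event { get; set; }

    [JsonProperty("msg")]
    public MandrillMessage Msg { get; set; }
}

public class MandrillMessage
{
    [JsonProperty("metadata")]
    public MessageMetadata Metadata { get; set; }
}

public class MessageMetadata
{
    [JsonProperty("email_type")]
    public string EmailType { get; set; }
}

// mandrill_events is posted as a JSON array, so deserialize to a list:
var events = JsonConvert.DeserializeObject<List<MandrillEvent>>(Request.Form["mandrill_events"]);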
Unless you want to write a C# class that represents the JSON you are POSTing (the safest solution), you can use the dynamic type to create an object which will look like your JSON. You can then do something like this answer to access the properties.
This solution doesn't give you type safety and the DLR will resolve the properties of the dynamic object at runtime.
As other answers have mentioned, your life will be made much easier by using Newtonsoft JSON which will allow you to write:
dynamic events = JsonConvert.DeserializeObject<dynamic>(post_data);
foreach (dynamic evt in events)
{
    string emailType = evt.msg.metadata.email_type;
}
I have a generic list of type object and I am trying to serialize it, but deserializing doesn't give me the results I expect.
Here is what I am trying to do:
List<object> sample_list = new List<object>();
sample_list.Add(new Sample() { type = 0, message = "I am the first" });
sample_list.Add(new Sample1() { type = 1, message1 = "I am the 2" });
sample_list.Add(new Sample2() { type = 2, message2 = "I am the 3" });
string serial = JsonConvert.SerializeObject(sample_list);
List<object> list = JsonConvert.DeserializeObject<List<object>>(serial);
lstbox.ItemsSource = list;
foreach (var item in list)
{
    if (item is Sample)
    {
        MessageBox.Show("Item is sample");
    }
}
But the message box never shows.
What should I do to make this work correctly?
You are deserializing into a list of object, so why do you expect the CLR to recognize those objects as Sample or Sample1? The serialized JSON looks like this:
[{"type":0,"message":"I am the first"},{"type":1,"message1":"I am the 2"},{"type":2,"message2":"I am the 3"}]
So how could Json.NET magically work out that they are Sample objects? A simple test shows that list[0].GetType().FullName is Newtonsoft.Json.Linq.JObject, not Sample.
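A quick way to see this for yourself:
// check what the untyped deserialization actually produced
Console.WriteLine(list[0].GetType().FullName);   // Newtonsoft.Json.Linq.JObject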
If you want to deserialize to a list of Sample, write:
var list = JsonConvert.DeserializeObject<List<Sample>>(serial);
and then Json.NET will try to map each JSON property onto the Sample properties (though of course that can only partially succeed, since not all of the objects are actually of type Sample).
If you want to round-trip your list, the JSON has to store information about the types used, and Json.NET has a built-in option for that:
string serial = JsonConvert.SerializeObject(sample_list,
new JsonSerializerSettings {
TypeNameHandling = TypeNameHandling.Objects
});
Then the serialized JSON looks like this:
[{"$type":"ConsolePusher.Sample, ConsolePusher","type":0,"message":"I am the first"},
{"$type":"ConsolePusher.Sample1, ConsolePusher","type":1,"message1":"I am the 2"},
{"$type":"ConsolePusher.Sample2, ConsolePusher","type":2,"message2":"I am the 3"}]
So it will be able to recreate objects when deserializing:
var list = JsonConvert.DeserializeObject<List<object>>(serial,
new JsonSerializerSettings
{
TypeNameHandling = TypeNameHandling.Objects
});
The clue here is the JSON:
[{"type":0,"message":"I am the first"},
{"type":1,"message1":"I am the 2"},
{"type":2,"message2":"I am the 3"}]
You are deserializing that as List<object>. So: there is nothing in either the json or the API call that gives it a hint that you want a Sample. If you want it to store the type name and use that during deserialization, you need to enable that option:
var settings = new JsonSerializerSettings {
TypeNameHandling = TypeNameHandling.Objects
};
string serial = JsonConvert.SerializeObject(sample_list, settings);
List<object> list =
JsonConvert.DeserializeObject<List<object>>(serial, settings);
Note that storing type names is brittle and implementation-specific; it will not work well in all scenarios. The json here becomes:
[{"$type":"Sample, ConsoleApplication43","type":0,"message":"I am the first"},
{"$type":"Sample1, ConsoleApplication43","type":1,"message1":"I am the 2"},
{"$type":"Sample2, ConsoleApplication43","type":2,"message2":"I am the 3"}]
I am looking for an XML serialization solution to serialize a class which contains a List<T>. Reading the MSDN article for System.Xml.Serialization, it states (about 5/8 of the way down the page):
[Serialization of ArrayList and Generic List]
The XmlSerializer cannot serialize or deserialize the following:
Arrays of ArrayList
Arrays of List<T>
Does this mean I cannot serialize my List<T> or does this say I cannot serialize something like
List<int>[] intArr = new List<int>[10];
EDIT: Why isn't it possible to do so?
An array of lists would be
List<int>[]
so you can safely serialize a plain
List<int>
The latter. Serializing List<T> is very common.
You can serialize the whole class, or just the List:
static void SerializeToXml<T>(string file, T value)
{
    var serializer = new XmlSerializer(typeof(T));
    using (var writer = XmlWriter.Create(file))
        serializer.Serialize(writer, value);
}

static T DeserializeFromXml<T>(string file)
{
    var serializer = new XmlSerializer(typeof(T));
    using (var reader = XmlReader.Create(file))
        return (T)serializer.Deserialize(reader);
}
I tried to serialize a DynamicObject class with BinaryFormatter, but:
Output file is too big, not exactly wire-friendly
Circular References not handled (stuck while serializing)
Since serializing a DynamicObject means very little by itself, here's the class I tried to serialize:
[Serializable()]
class Entity
    : DynamicObject, ISerializable
{
    IDictionary<string, object> values = new Dictionary<string, object>();

    public Entity()
    {
    }

    protected Entity(SerializationInfo info, StreamingContext ctx)
    {
        string fieldName = string.Empty;
        object fieldValue = null;

        foreach (var field in info)
        {
            fieldName = field.Name;
            fieldValue = field.Value;

            if (string.IsNullOrWhiteSpace(fieldName))
                continue;
            if (fieldValue == null)
                continue;

            this.values.Add(fieldName, fieldValue);
        }
    }

    public override bool TryGetMember(GetMemberBinder binder, out object result)
    {
        this.values.TryGetValue(binder.Name, out result);
        return true;
    }

    public override bool TrySetMember(SetMemberBinder binder, object value)
    {
        this.values[binder.Name] = value;
        return true;
    }

    void ISerializable.GetObjectData(SerializationInfo info, StreamingContext context)
    {
        foreach (var kvp in this.values)
        {
            info.AddValue(kvp.Key, kvp.Value);
        }
    }
}
(I guess I could have used an ExpandoObject, but that's another story.)
Here's a simple test program:
static void Main(string[] args)
{
    BinaryFormatter binFmt = new BinaryFormatter();
    dynamic obj = new Entity();
    dynamic subObj = new Entity();
    dynamic obj2 = null;

    obj.Value = 100;
    obj.Dictionary = new Dictionary<string, int>() { { "la la la", 1000 } };
    subObj.Value = 200;
    subObj.Name = "SubObject";
    obj.Child = subObj;

    using (var stream = new FileStream("test.txt", FileMode.OpenOrCreate))
    {
        binFmt.Serialize(stream, obj);
    }

    using (var stream = new FileStream("test.txt", FileMode.Open))
    {
        try
        {
            obj2 = binFmt.Deserialize(stream);
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex);
        }
    }

    Console.ReadLine();
}
Putting some breakpoints here and there helped me have a look at obj2 contents and it looks like the original data is correctly deserialized, though with the above shortcomings if you get imaginative and move data around.
I had a look at Marc Gravell's protobuf-net, but I'm not really sure how to use it in such a context (I'm not even sure I picked up the right version from the repository, but hey).
I know it's more code than words, but I don't think I can explain the scenario any better. Please do tell if there's something I can add to make this question clearer.
Any help is much appreciated.
I'm 98% certain that this sequence will work for a dynamic object.
convert object to an Expando Object
cast expando object to be of type Dictionary
use ProtoBuf-net Serializer.Serialize / .Deserialize as per normal
convert dictionary to Expando Object
You can convert objects to a collection of name/value pairs for transferring.
That's just a small subset of what dynamic can do, but perhaps it is enough for you.
There's some custom code to handle some of the conversions above; a rough sketch follows after the references below.
I don't have a solution for when dynamic is a placeholder for a class. For that case I'd suggest getting the type and using a switch statement to serialize/deserialize as required. You'd also need to include something to indicate which kind of deserialization you need (string / id / fully qualified type name / etc.). The assumption is that you are dealing with a list of expected types.
Note: ExpandoObject implements IDictionary. An expando is merely a list of key/value pairs, i.e. the thing you dot into is the key, and the value is whatever the chain of functions implementing it returns. There is a set of dynamic interfaces for customising the syntactic-sugar experience, but most of the time you won't need to look at them.
refs:
Dictionary from IDictionary using the constructor -- http://msdn.microsoft.com/en-us/library/et0ke8sz(v=vs.110).aspx
IDictionary/Dictionary to Expando -- http://theburningmonk.com/2011/05/idictionarystring-object-to-expandoobject-extension-method/
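Not the full protobuf-net round trip, but here is a rough sketch of the Expando <-> Dictionary plumbing from steps 1, 2 and 4 above; the serialization step in the middle is deliberately left out:
using System.Collections.Generic;
using System.Dynamic;

static class ExpandoConversions
{
    // ExpandoObject already implements IDictionary<string, object>,
    // so "casting to a Dictionary" is really just copying the pairs out.
    public static Dictionary<string, object> ToDictionary(ExpandoObject expando)
    {
        return new Dictionary<string, object>((IDictionary<string, object>)expando);
    }

    // Rebuild an ExpandoObject from a plain dictionary after deserialization.
    public static ExpandoObject ToExpando(IDictionary<string, object> source)
    {
        var expando = new ExpandoObject();
        var target = (IDictionary<string, object>)expando;
        foreach (var kvp in source)
            target[kvp.Key] = kvp.Value;
        return expando;
    }
}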
I am not sure if JSON would be acceptable in your scenario, but if it is, I have used Json.NET (http://json.codeplex.com) to serialize dynamic types. It works quite well, it is fast, and the output is small. While Json.NET doesn't return dynamic objects directly, it is very easy to convert its deserialized output to any dynamic type. In the example below I am using ExpandoObject as my dynamic type. The code is what I have used in the Facebook Graph Toolkit; see this link for the original source: http://facebookgraphtoolkit.codeplex.com/SourceControl/changeset/view/48442#904504
public static dynamic Convert(string s) {
    object obj = Newtonsoft.Json.JsonConvert.DeserializeObject(s);
    if (obj is string) {
        return obj as string;
    } else {
        return ConvertJson((JToken)obj);
    }
}

private static dynamic ConvertJson(JToken token) {
    // FROM : http://blog.petegoo.com/archive/2009/10/27/using-json.net-to-eval-json-into-a-dynamic-variable-in.aspx
    // Ideally in the future Json.Net will support dynamic and this can be eliminated.
    if (token is JValue) {
        return ((JValue)token).Value;
    } else if (token is JObject) {
        ExpandoObject expando = new ExpandoObject();
        (from childToken in ((JToken)token)
         where childToken is JProperty
         select childToken as JProperty).ToList().ForEach(property => {
            ((IDictionary<string, object>)expando).Add(property.Name, ConvertJson(property.Value));
        });
        return expando;
    } else if (token is JArray) {
        List<ExpandoObject> items = new List<ExpandoObject>();
        foreach (JToken arrayItem in ((JArray)token)) {
            items.Add(ConvertJson(arrayItem));
        }
        return items;
    }
    throw new ArgumentException(string.Format("Unknown token type '{0}'", token.GetType()), "token");
}
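For illustration, a small usage sketch; the JSON shape here is made up:
// nested objects come back as ExpandoObject, leaf values as their CLR types
dynamic result = Convert("{\"name\":\"test\",\"inner\":{\"value\":1}}");
Console.WriteLine(result.name);           // test
Console.WriteLine(result.inner.value);    // 1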
First off, the size of your file depends on 2 things (if I understand how BinaryFormatter works, please correct me if I'm wrong):
The size of the actual values being serialized, and
The names used to serialize the object's values with the SerializationInfo.AddValue method, which are stored in the output file so the values can be retrieved by the same names during deserialization.
Obviously, #1 is going to be the biggest contributor to file size, and it can only be reduced by optimizing the objects you're trying to serialize.
Because you're using dynamic objects, the almost unnoticeably small size increase caused by #2 is unavoidable. If you knew the types and names of the object's members ahead of time, you could give each member a very short, sequentially-determined name ("1", "2", "3", etc.) as you iterated over the object's members, adding them via SerializationInfo.AddValue. Then, during deserialization, you could use SerializationInfo.GetValue with the same sequentially-determined names, and deserialization would work just fine regardless of the actual member names, as long as you iterated through the members in the same order they were added. Granted, this might only save you an average of 4 or 5 bytes per member, but those little amounts add up in large objects.
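To make the naming trick concrete, here is a rough sketch with a hypothetical class whose member order is fixed and known on both sides (this is not the dynamic Entity from the question, just the idea in isolation):
using System;
using System.Runtime.Serialization;

[Serializable]
class Compact : ISerializable
{
    public int Value;
    public string Name;

    public Compact() { }

    // Read the members back in the same order, regardless of their real names.
    protected Compact(SerializationInfo info, StreamingContext ctx)
    {
        Value = info.GetInt32("1");
        Name = info.GetString("2");
    }

    // Write the members under short, sequentially-determined names.
    void ISerializable.GetObjectData(SerializationInfo info, StreamingContext ctx)
    {
        info.AddValue("1", Value);
        info.AddValue("2", Name);
    }
}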
@Raine: (I guess I could have used an ExpandoObject, but that's another story.)
Not so; I changed your code sample to use ExpandoObject instead of your Entity class and got a SerializationException thrown at me. ExpandoObject is not marked with SerializableAttribute, and it doesn't have the appropriate constructors to be serialized or deserialized. However, this doesn't mean you can't use ExpandoObject if you really want to: it implements IDictionary<string, object>, which in turn implements ICollection<KeyValuePair<string, object>>. Thus, an ExpandoObject instance is a collection of KeyValuePair<string, object> instances, which are marked as serializable. So you could serialize an ExpandoObject, but you'd have to cast it as ICollection<KeyValuePair<string, object>> and serialize each KeyValuePair<string, object> individually. That would be pointless in terms of optimizing your original code sample, though, because it takes just as much file space.
In summary, I really don't think there's any way to optimize serializing a dynamic object: you have to loop through the object's members every time it's serialized, and you have no way to know the object's size beforehand (by definition of dynamic).
I don't know if SharpSerializer supports dynamic objects, but it might be worth a try:
http://www.sharpserializer.com/en/index.html
One suggestion for serializing dynamic objects is to convert them to a string and then serialize that; you can then deserialize the string back into your object, if that is applicable in your case.
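If that route is acceptable, here is a minimal sketch using Json.NET as the string serializer and ExpandoObject as the dynamic type (both are assumptions, not requirements):
using System.Dynamic;
using Newtonsoft.Json;

dynamic obj = new ExpandoObject();
obj.Value = 100;
obj.Name = "SubObject";

string payload = JsonConvert.SerializeObject(obj);                         // dynamic -> string
dynamic restored = JsonConvert.DeserializeObject<ExpandoObject>(payload);  // string -> dynamic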
I've currently got an XML element in my database that maps to an object (Long story short the XML is complicated and dynamic enough to defy a conventional relational data structure without massive performance hits).
Around it I've wrapped a series of C# objects that encapsulate the structure of the XML. They work off a base class and there's multiple different possible classes it'll deserialize to with different data structures and different implemented methods. I'm currently wrapping the functionality to serialize/deserialize these into a partial class of the LINQ-generated database objects.
My current approach to this is:
public Options GetOptions()
{
    if (XmlOptions == null) return null;
    XmlSerializer xs = new XmlSerializer(typeof(Options));
    return (Options)xs.Deserialize(XmlOptions.CreateReader());
}

public void SetOptions(Options options)
{
    if (options == null) XmlOptions = null;
    else
    {
        XmlSerializer xs = new XmlSerializer(typeof(Options));
        using (MemoryStream ms = new MemoryStream())
        {
            xs.Serialize(ms, options);
            XmlOptions = System.Xml.Linq.XElement.Parse(System.Text.UTF8Encoding.UTF8.GetString(ms.ToArray()));
        }
    }
}
(To help with reading, given that the renamed members aren't too clear: XmlOptions is the XElement from LINQ, and Options is the class it deserializes into.)
Now, it works. But that's not really enough to call it "finished" to me :P It just seems incredibly inefficient to serialize XML to a memory stream, convert it to a string, then re-parse it as XML. My question is - Is this the cleanest way to do this? Is there a more efficient/tidy mechanism for doing the serialization? Is there a better approach that'll give me the same benefits (I've only got test data in the system so far, so I can change the XML structure if required)?
PS: I've renamed the fields to be more generic, so I don't really need comments about naming conventions ;)
XmlSerializer.Serialize has an overload that takes an XmlWriter.
You can create an XmlWriter that writes to an existing XElement by calling the CreateWriter method.
You can therefore write the following:
static readonly XmlSerializer xs = new XmlSerializer(typeof(Options));
...
var temp = new XElement("Parent");
using (var writer = temp.CreateWriter())
    xs.Serialize(writer, options);
XmlOptions = (XElement)temp.FirstNode;   // the serializer's root element
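The read side from the question can then reuse the same cached serializer; one way to write it:
public Options GetOptions()
{
    if (XmlOptions == null) return null;
    using (var reader = XmlOptions.CreateReader())
        return (Options)xs.Deserialize(reader);
}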