I have an API method that loads ~100k rows from a DB into memory and each of these rows contain a JSON string. For each request, the JSON object structure will be the same (same object keys) across all of the rows, although I will not know this ahead of time.
At the moment I am calling JObject.Parse(row.Json) for every row to get a JObject. When I examine the heap, I can see a duplicate entry for every single object key string. So if I have the object key id in the JSON of each row and I have 100k rows, I see 100k instances of this string in memory.
I would like to cache (or potentially String.Intern(), depending on lifetime) these object keys and reuse the strings across these JObjects. I can see that JsonConvert.DeserializeObject() lets me supply a custom converter, but AFAIK converters only let you modify the JSON values, not the keys.
Note: I must have all 100k rows in memory at a time as I later run an algo that requires everything at once.
If you know the structure of the JSON, you could always create a class that contains the most common fields. This will save quite a bit of space.
class RowData
{
[JsonProperty("id")]
public int Id { get; set; }
[JsonProperty("anyOtherFixedField")]
public string OtherField { get; set; }
[JsonExtensionData]
public IDictionary<string, JToken> ExtraProperties { get; set; }
}
Fields that map to a typed property will not put a key string on the heap at all.
Any field in the JSON that doesn't have a corresponding property will go into the ExtraProperties dictionary.
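For illustration, each row could then be deserialized like this (a rough sketch; rows and row.Json are names assumed from the question):
var data = rows
    .Select(row => JsonConvert.DeserializeObject<RowData>(row.Json))
    .ToList();
Only keys that fall through to ExtraProperties are still allocated as per-row strings; the mapped properties carry no key strings at all.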
There doesn't seem to be a good way to hook into the default JObject deserialization.
I've made a custom converter, based on the provided ExpandoObjectConverter, that creates a JValue/JObject/JArray instead of an ExpandoObject. All object keys are cached and reused for the lifetime of the converter.
To use this converter, you have to specify that you want to deserialize to a JToken, JObject or JArray.
If you don't specify a target type, this converter will not be used.
var data = JsonConvert.DeserializeObject<JToken>(json, new NameCachingJObjectConverter());
And the implementation.
public class NameCachingJObjectConverter : JsonConverter
{
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
// can write is set to false
}
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
return ReadValue(reader);
}
private JToken ReadValue(JsonReader reader)
{
if (!MoveToContent(reader))
{
throw new Exception("Unexpected end of content");
}
switch (reader.TokenType)
{
case JsonToken.StartObject:
return ReadObject(reader);
case JsonToken.StartArray:
return ReadList(reader);
default:
if (IsPrimitiveToken(reader.TokenType))
{
return new JValue(reader.Value);
}
throw new Exception("Unexpected token when converting object: {reader.TokenType}");
}
}
private static bool IsPrimitiveToken(JsonToken token)
{
switch (token)
{
case JsonToken.Integer:
case JsonToken.Float:
case JsonToken.String:
case JsonToken.Boolean:
case JsonToken.Undefined:
case JsonToken.Null:
case JsonToken.Date:
case JsonToken.Bytes:
return true;
default:
return false;
}
}
private static bool MoveToContent(JsonReader reader)
{
JsonToken t = reader.TokenType;
while (t == JsonToken.None || t == JsonToken.Comment)
{
if (!reader.Read())
{
return false;
}
t = reader.TokenType;
}
return true;
}
private JArray ReadList(JsonReader reader)
{
var list = new JArray();
while (reader.Read())
{
switch (reader.TokenType)
{
case JsonToken.Comment:
break;
default:
object v = ReadValue(reader);
list.Add(v);
break;
case JsonToken.EndArray:
return list;
}
}
throw new Exception("Unexpected end when reading JObject.");
}
private JToken ReadObject(JsonReader reader)
{
var jObject = new JObject();
while (reader.Read())
{
switch (reader.TokenType)
{
case JsonToken.PropertyName:
string propertyName = GetCachedName(reader.Value.ToString());
if (!reader.Read())
{
throw new Exception("Unexpected end when reading JObject.");
}
var v = ReadValue(reader);
jObject[propertyName] = v;
break;
case JsonToken.Comment:
break;
case JsonToken.EndObject:
return jObject;
}
}
throw new Exception("Unexpected end when reading ExpandoObject.");
}
/// <summary>
/// Determines whether this instance can convert the specified object type.
/// </summary>
/// <param name="objectType">Type of the object.</param>
/// <returns>
/// <c>true</c> if this instance can convert the specified object type; otherwise, <c>false</c>.
/// </returns>
public override bool CanConvert(Type objectType)
{
return (typeof(JToken).IsAssignableFrom(objectType));
}
/// <summary>
/// Gets a value indicating whether this <see cref="JsonConverter"/> can write JSON.
/// </summary>
/// <value>
/// <c>true</c> if this <see cref="JsonConverter"/> can write JSON; otherwise, <c>false</c>.
/// </value>
public override bool CanWrite => false;
private string GetCachedName(string value)
{
string ret;
if (!cache.TryGetValue(value, out ret))
{
cache[value] = value;
ret = value;
}
return ret;
}
private readonly Dictionary<string, string> cache = new Dictionary<string, string>();
}
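Because the key cache lives on the converter instance, it is worth creating one converter and reusing it for all rows so every resulting JObject shares the same key strings (a rough sketch; rows and row.Json are names assumed from the question):
var converter = new NameCachingJObjectConverter();
var objects = rows
    .Select(row => JsonConvert.DeserializeObject<JToken>(row.Json, converter))
    .ToList();
If the keys should instead live for the lifetime of the process, GetCachedName could call string.Intern(value) rather than using the per-instance dictionary.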
Before Swashbuckle 5 it was possible to define and register a ISchemaFilter that could provide an example implementation of a model:
public class MyModelExampleSchemaFilter : ISchemaFilter
{
public void Apply(Schema schema, SchemaFilterContext context)
{
if (context.SystemType.IsAssignableFrom(typeof(MyModel)))
{
schema.Example = new MyModel
{
Name = "model name",
value = 42
};
}
}
}
The Schema.Example would take an arbitrary object, and it would properly serialize when generating the OpenApi Schema.
However, with the move to .NET Core 3 and Swashbuckle 5 the Schema.Example property is no longer an object and requires the type Microsoft.OpenApi.Any.IOpenApiAny. There does not appear to be a documented path forward regarding how to provide a new example.
I've attempted, based on looking at code within Microsoft.OpenApi, to build my own implementation of an IOpenApiAny, but any attempt to use it to generate an example fails inside Microsoft.OpenApi.Writers.OpenApiWriterAnyExtensions.WriteObject(IOpenApiWriter writer, OpenApiObject entity) before its Write method is even called. I don't claim that the code below is fully correct, but I would have expected it to at least light up a path toward how to move forward.
/// <summary>
/// A class that recursively adapts a unidirectional POCO tree into an <see cref="IOpenApiAny" />
/// </summary>
/// <remarks>
/// <para>This will fail if a graph is provided (backwards and forwards references)</para>
/// </remarks>
public class OpenApiPoco : IOpenApiAny
{
/// <summary>
/// The model to be converted
/// </summary>
private readonly object _model;
/// <summary>
/// Initializes a new instance of the <see cref="OpenApiPoco" /> class.
/// </summary>
/// <param name="model">the model to convert to an <see cref="IOpenApiAny" /> </param>
public OpenApiPoco(object model)
{
this._model = model;
}
/// <inheritdoc />
public AnyType AnyType => DetermineAnyType(this._model);
#region From Interface IOpenApiExtension
/// <inheritdoc />
public void Write(IOpenApiWriter writer,
OpenApiSpecVersion specVersion)
{
this.Write(this._model, writer, specVersion);
}
#endregion
private static AnyType DetermineAnyType(object model)
{
if (model is null)
{
return AnyType.Null;
}
var modelType = model.GetType();
if (modelType.IsAssignableFrom(typeof(int))
|| modelType.IsAssignableFrom(typeof(long))
|| modelType.IsAssignableFrom(typeof(float))
|| modelType.IsAssignableFrom(typeof(double))
|| modelType.IsAssignableFrom(typeof(string))
|| modelType.IsAssignableFrom(typeof(byte))
|| modelType.IsAssignableFrom(typeof(byte[])) // Binary or Byte
|| modelType.IsAssignableFrom(typeof(bool))
|| modelType.IsAssignableFrom(typeof(DateTimeOffset)) // DateTime
|| modelType.IsAssignableFrom(typeof(DateTime)) // Date
)
{
return AnyType.Primitive;
}
if (modelType.IsAssignableFrom(typeof(IEnumerable))) // test after primitive check so as to avoid catching string and byte[]
{
return AnyType.Array;
}
return AnyType.Object; // Assume object
}
private void Write(object model,
[NotNull] IOpenApiWriter writer,
OpenApiSpecVersion specVersion)
{
if (writer is null)
{
throw new ArgumentNullException(nameof(writer));
}
if (model is null)
{
writer.WriteNull();
return;
}
var modelType = model.GetType();
if (modelType.IsAssignableFrom(typeof(int))
|| modelType.IsAssignableFrom(typeof(long))
|| modelType.IsAssignableFrom(typeof(float))
|| modelType.IsAssignableFrom(typeof(double))
|| modelType.IsAssignableFrom(typeof(string))
|| modelType.IsAssignableFrom(typeof(byte[])) // Binary or Byte
|| modelType.IsAssignableFrom(typeof(bool))
|| modelType.IsAssignableFrom(typeof(DateTimeOffset)) // DateTime
|| modelType.IsAssignableFrom(typeof(DateTime)) // Date
)
{
this.WritePrimitive(model, writer, specVersion);
return;
}
if (modelType.IsAssignableFrom(typeof(IEnumerable))) // test after primitive check so as to avoid catching string and byte[]
{
this.WriteArray((IEnumerable) model, writer, specVersion);
return;
}
this.WriteObject(model, writer, specVersion); // Assume object
}
private void WritePrimitive(object model,
IOpenApiWriter writer,
OpenApiSpecVersion specVersion)
{
switch (model.GetType())
{
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(string)): // string
writer.WriteValue((string) model);
break;
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(byte[])): // assume Binary; can't differentiate from Byte and Binary based on type alone
// if we chose to treat byte[] as Byte we would Base64 it to string. eg: writer.WriteValue(Convert.ToBase64String((byte[]) propertyValue));
writer.WriteValue(Encoding.UTF8.GetString((byte[]) model));
break;
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(bool)): // boolean
writer.WriteValue((bool) model);
break;
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(DateTimeOffset)): // DateTime as DateTimeOffset
writer.WriteValue((DateTimeOffset) model);
break;
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(DateTime)): // Date as DateTime
writer.WriteValue((DateTime) model);
break;
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(double)): // Double
writer.WriteValue((double) model);
break;
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(float)): // Float
writer.WriteValue((float) model);
break;
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(int)): // Integer
writer.WriteValue((int) model);
break;
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(long)): // Long
writer.WriteValue((long) model);
break;
case TypeInfo typeInfo
when typeInfo.IsAssignableFrom(typeof(Guid)): // Guid (as a string)
writer.WriteValue(model.ToString());
break;
default:
throw new ArgumentOutOfRangeException(nameof(model),
model?.GetType()
.Name,
"unexpected model type");
}
}
private void WriteArray(IEnumerable model,
IOpenApiWriter writer,
OpenApiSpecVersion specVersion)
{
writer.WriteStartArray();
foreach (var item in model)
{
this.Write(item, writer, specVersion); // recursive call
}
writer.WriteEndArray();
}
private void WriteObject(object model,
IOpenApiWriter writer,
OpenApiSpecVersion specVersion)
{
var propertyInfos = model.GetType()
.GetProperties();
writer.WriteStartObject();
foreach (var property in propertyInfos)
{
writer.WritePropertyName(property.Name);
var propertyValue = property.GetValue(model);
switch (propertyValue.GetType())
{
case TypeInfo typeInfo // primitives
when typeInfo.IsAssignableFrom(typeof(string)) // string
|| typeInfo.IsAssignableFrom(typeof(byte[])) // assume Binary or Byte
|| typeInfo.IsAssignableFrom(typeof(bool)) // boolean
|| typeInfo.IsAssignableFrom(typeof(DateTimeOffset)) // DateTime as DateTimeOffset
|| typeInfo.IsAssignableFrom(typeof(DateTime)) // Date as DateTime
|| typeInfo.IsAssignableFrom(typeof(double)) // Double
|| typeInfo.IsAssignableFrom(typeof(float)) // Float
|| typeInfo.IsAssignableFrom(typeof(int)) // Integer
|| typeInfo.IsAssignableFrom(typeof(long)) // Long
|| typeInfo.IsAssignableFrom(typeof(Guid)): // Guid (as a string)
this.WritePrimitive(propertyValue, writer, specVersion);
break;
case TypeInfo typeInfo // Array test after primitive check so as to avoid catching string and byte[]
when typeInfo.IsAssignableFrom(typeof(IEnumerable)): // Enumerable as array of objects
this.WriteArray((IEnumerable) propertyValue, writer, specVersion);
break;
case TypeInfo typeInfo // object
when typeInfo.IsAssignableFrom(typeof(object)): // Object
default:
this.Write(propertyValue, writer, specVersion); // recursive call
break;
}
}
writer.WriteEndObject();
}
}
What is the proper way to transition ISchemaFilter examples to Swashbuckle 5.0 so that the appropriate serialization rules are respected?
They have an example on the repo:
https://github.com/domaindrivendev/Swashbuckle.AspNetCore/blob/9bb9be9b318c576d236152f142aafa8c860fb946/test/WebSites/Basic/Swagger/ExamplesSchemaFilter.cs#L8
public class ExamplesSchemaFilter : ISchemaFilter
{
public void Apply(OpenApiSchema schema, SchemaFilterContext context)
{
schema.Example = GetExampleOrNullFor(context.Type);
}
private IOpenApiAny GetExampleOrNullFor(Type type)
{
switch (type.Name)
{
case "Product":
return new OpenApiObject
{
[ "id" ] = new OpenApiInteger(123),
[ "description" ] = new OpenApiString("foobar"),
[ "price" ] = new OpenApiDouble(14.37)
};
default:
return null;
}
}
}
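For the filter to actually run, it still has to be registered with the generator; assuming the usual ConfigureServices setup for Swashbuckle 5, something along these lines:
services.AddSwaggerGen(c =>
{
    c.SwaggerDoc("v1", new OpenApiInfo { Title = "My API", Version = "v1" });
    // Register the schema filter so the example is written into the generated OpenApiSchema.
    c.SchemaFilter<ExamplesSchemaFilter>();
});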
How can one deserialize the following JSON
{
"result" : {
"master" : [
["one", "two"],
["three", "four"],
["five", "six", "seven"],
],
"blaster" : [
["ein", "zwei"],
["drei", "vier"]
],
"surprise" : "nonsense-nonsense-nonsense"
}
}
into the following data structure
class ResultView
{
public Dictionary<string, string[][]> Result { get; set; }
}
with Json.NET?
It has to be a dictionary because key names such as 'master' and 'blaster' are unknown at compile time. What is known is that they always point to an array of arrays of strings. The problem is that the key 'surprise', whose name is known and always the same, points to something that cannot be interpreted as string[][], and this leads to an exception in Json.NET.
Is there any way to make Json.NET ignore specific dictionary key?
You can introduce a custom generic JsonConverter for IDictionary<string, TValue> that filters out invalid dictionary values (i.e. those that cannot be deserialized successfully to the dictionary value type):
public class TolerantDictionaryItemConverter<TDictionary, TValue> : JsonConverter where TDictionary : IDictionary<string, TValue>
{
public override bool CanConvert(Type objectType)
{
return typeof(TDictionary).IsAssignableFrom(objectType);
}
public override object ReadJson(JsonReader reader, Type dictionaryType, object existingValue, JsonSerializer serializer)
{
// Get contract information
var contract = serializer.ContractResolver.ResolveContract(dictionaryType) as JsonDictionaryContract;
if (contract == null)
throw new JsonSerializationException(string.Format("Invalid JsonDictionaryContract for {0}", dictionaryType));
if (contract.DictionaryKeyType != typeof(string))
throw new JsonSerializationException(string.Format("Key type {0} not supported", dictionaryType));
var itemContract = serializer.ContractResolver.ResolveContract(contract.DictionaryValueType);
// Process the first token
var tokenType = reader.SkipComments().TokenType;
if (tokenType == JsonToken.Null)
return null;
if (reader.TokenType != JsonToken.StartObject)
throw new JsonSerializationException(string.Format("Expected {0}, encountered {1} at path {2}", JsonToken.StartArray, reader.TokenType, reader.Path));
// Allocate the dictionary
var dictionary = existingValue as IDictionary<string, TValue> ?? (IDictionary<string, TValue>) contract.DefaultCreator();
// Process the collection items
while (reader.Read())
{
if (reader.TokenType == JsonToken.EndObject)
{
return dictionary;
}
else if (reader.TokenType == JsonToken.PropertyName)
{
var key = (string)reader.Value;
reader.ReadSkipCommentsAndAssert();
// For performance, skip tokens we can easily determine cannot be deserialized to itemContract
if (itemContract.QuickRejectStartToken(reader.TokenType))
{
System.Diagnostics.Debug.WriteLine(string.Format("value for {0} skipped", key));
reader.Skip();
}
else
{
// What we want to do is to distinguish between JSON files that are not WELL-FORMED
// (e.g. truncated) and that are not VALID (cannot be deserialized to the current item type).
// An exception must still be thrown for an ill-formed file.
// Thus we first load into a JToken, then deserialize.
var token = JToken.Load(reader);
try
{
var value = serializer.Deserialize<TValue>(token.CreateReader());
dictionary.Add(key, value);
}
catch (Exception)
{
System.Diagnostics.Debug.WriteLine(string.Format("value for {0} skipped", key));
}
}
}
else if (reader.TokenType == JsonToken.Comment)
{
continue;
}
else
{
throw new JsonSerializationException(string.Format("Unexpected token type {0} object at path {1}.", reader.TokenType, reader.Path));
}
}
// Should not come here.
throw new JsonSerializationException("Unclosed object at path: " + reader.Path);
}
public override bool CanWrite { get { return false; } }
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
throw new NotImplementedException();
}
}
public static partial class JsonExtensions
{
public static JsonReader SkipComments(this JsonReader reader)
{
while (reader.TokenType == JsonToken.Comment && reader.Read())
;
return reader;
}
public static void ReadSkipCommentsAndAssert(this JsonReader reader)
{
if (reader == null)
throw new ArgumentNullException();
while (reader.Read())
{
if (reader.TokenType != JsonToken.Comment)
return;
}
throw new JsonReaderException(string.Format("Unexpected end at path {0}", reader.Path));
}
internal static bool QuickRejectStartToken(this JsonContract contract, JsonToken token)
{
if (contract is JsonLinqContract)
return false;
switch (token)
{
case JsonToken.None:
return true;
case JsonToken.StartObject:
return !(contract is JsonContainerContract) || contract is JsonArrayContract; // reject if not dictionary or object
case JsonToken.StartArray:
return !(contract is JsonArrayContract); // reject if not array
case JsonToken.Null:
return contract.CreatedType.IsValueType && Nullable.GetUnderlyingType(contract.UnderlyingType) == null;
// Primitives
case JsonToken.Integer:
case JsonToken.Float:
case JsonToken.String:
case JsonToken.Boolean:
case JsonToken.Undefined:
case JsonToken.Date:
case JsonToken.Bytes:
return !(contract is JsonPrimitiveContract); // reject if not primitive.
default:
return false;
}
}
}
Then you can add it to settings as follows:
var settings = new JsonSerializerSettings
{
Converters = { new TolerantDictionaryItemConverter<IDictionary<string, string[][]>, string[][]>() },
};
var root = JsonConvert.DeserializeObject<ResultView>(json, settings);
Or add it directly to ResultView with JsonConverterAttribute:
class ResultView
{
[JsonConverter(typeof(TolerantDictionaryItemConverter<IDictionary<string, string[][]>, string[][]>))]
public Dictionary<string, string[][]> Result { get; set; }
}
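As a quick sanity check with the JSON from the question (variable names are assumed), the unparseable key should simply be missing from the result rather than causing an exception:
var result = JsonConvert.DeserializeObject<ResultView>(json); // with the attribute applied
Console.WriteLine(result.Result.Count);                   // 2 ("master" and "blaster")
Console.WriteLine(result.Result.ContainsKey("surprise")); // False
Console.WriteLine(result.Result["master"][2][2]);         // "seven"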
Notes:
I wrote the converter in a general way to handle any type of dictionary value including primitives such as int or DateTime as well as arrays or objects.
While a JSON file with invalid dictionary values (ones that cannot be deserialized to the dictionary value type) should be deserializable, an ill-formed JSON file (e.g. one that is truncated) should still result in an exception being thrown.
The converter handles this by first loading the value into a JToken then attempting to deserialize the token. If the file is ill-formed, JToken.Load(reader) will throw an exception, which is intentionally not caught.
Json.NET's exception handling is reported to be "very flaky" (see e.g. Issue #1580: Regression from Json.NET v6: cannot skip an invalid object value type in an array via exception handling) so I did not rely on it to skip invalid dictionary values.
I'm not 100% sure I got all cases of comment handling correct, so that may need additional testing.
Working sample .Net fiddle here.
I think you could ignore exceptions like this:
ResultView result = JsonConvert.DeserializeObject<ResultView>(jsonString,
new JsonSerializerSettings
{
Error = delegate (object sender, Newtonsoft.Json.Serialization.ErrorEventArgs args)
{
// System.Diagnostics.Debug.WriteLine(args.ErrorContext.Error.Message);
args.ErrorContext.Handled = true;
}
}
);
args.ErrorContext.Error.Message would contain the actual error message.
args.ErrorContext.Handled = true; will tell Json.Net to proceed.
I have a JSON input similar to this simplified example.
{
"model1": {
"$type": "MyType, MyAssembly",
"A": 5
},
"model2": {
"C": "something"
}
}
What I'd like to achieve is a "hybrid" result, by which I mean a top-level ExpandoObject having two properties, model1 and model2, BUT model1 would have the strong type MyType (based on the Json.NET type information). As model2 doesn't have type information, it would be a nested ExpandoObject. This logic should be the same across deeper nesting levels as well (see my update); the example is simplified in this regard.
My problem is that I can't achieve the "hybridness". One way, I could have a completely strongly typed result (if the top-level object were strongly typed); the other way, I can have a completely dynamic result (everything is an ExpandoObject); or the third way, I could have a JObject, which is meaningless in this scenario.
// this will give a fully dynamic result, regardless the child type information
var result = JsonConvert.DeserializeObject<ExpandoObject>(input, new JsonSerializerSettings { TypeNameHandling = TypeNameHandling.Auto });
UPDATE
I've just experimented with deserializing into a generic IDictionary, and that way I can get strongly typed results for top-level child properties, which technically solves my example. However, at lower levels it's still not working, and gives a JObject result for untyped child properties. So overall it's not a good solution for my real use case.
The problem is that Json.NET's ExpandoObjectConverter simply does not handle any of Json.NET's own metadata properties such as "$type", "$id" or "$ref".
However, since Json.NET is open source and its MIT license allows modification, the easiest solution may be to make your own copy of ExpandoObjectConverter and adapt it to your needs, along the lines of Json.NET Deserialization into dynamic object with referencing. You'll need to copy some low-level JSON utilities as well:
/// <summary>
/// Converts an ExpandoObject to and from JSON.
/// Adapted from https://github.com/JamesNK/Newtonsoft.Json/blob/master/Src/Newtonsoft.Json/Converters/ExpandoObjectConverter.cs
/// License: https://github.com/JamesNK/Newtonsoft.Json/blob/master/LICENSE.md
/// </summary>
public class TypeNameHandlingExpandoObjectConverter : JsonConverter
{
/// <summary>
/// Writes the JSON representation of the object.
/// </summary>
/// <param name="writer">The <see cref="JsonWriter"/> to write to.</param>
/// <param name="value">The value.</param>
/// <param name="serializer">The calling serializer.</param>
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
// can write is set to false
}
/// <summary>
/// Reads the JSON representation of the object.
/// </summary>
/// <param name="reader">The <see cref="JsonReader"/> to read from.</param>
/// <param name="objectType">Type of the object.</param>
/// <param name="existingValue">The existing value of object being read.</param>
/// <param name="serializer">The calling serializer.</param>
/// <returns>The object value.</returns>
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
return ReadValue(reader, serializer);
}
private object ReadValue(JsonReader reader, JsonSerializer serializer)
{
if (!reader.MoveToContent())
{
throw JsonSerializationExceptionHelper.Create(reader, "Unexpected end when reading ExpandoObject.");
}
switch (reader.TokenType)
{
case JsonToken.StartObject:
return ReadObject(reader, serializer);
case JsonToken.StartArray:
return ReadList(reader, serializer);
default:
if (JsonTokenUtils.IsPrimitiveToken(reader.TokenType))
{
return reader.Value;
}
throw JsonSerializationExceptionHelper.Create(reader, string.Format("Unexpected token when converting ExpandoObject: {0}", reader.TokenType));
}
}
private object ReadList(JsonReader reader, JsonSerializer serializer)
{
IList<object> list = new List<object>();
while (reader.Read())
{
switch (reader.TokenType)
{
case JsonToken.Comment:
break;
default:
object v = ReadValue(reader, serializer);
list.Add(v);
break;
case JsonToken.EndArray:
return list;
}
}
throw JsonSerializationExceptionHelper.Create(reader, "Unexpected end when reading ExpandoObject.");
}
private object ReadObject(JsonReader reader, JsonSerializer serializer)
{
if (serializer.TypeNameHandling != TypeNameHandling.None)
{
var obj = JObject.Load(reader);
Type polymorphicType = null;
var polymorphicTypeString = (string)obj["$type"];
if (polymorphicTypeString != null)
{
if (serializer.TypeNameHandling != TypeNameHandling.None)
{
string typeName, assemblyName;
ReflectionUtils.SplitFullyQualifiedTypeName(polymorphicTypeString, out typeName, out assemblyName);
polymorphicType = serializer.Binder.BindToType(assemblyName, typeName);
}
obj.Remove("$type");
}
if (polymorphicType == null || polymorphicType == typeof(ExpandoObject))
{
using (var subReader = obj.CreateReader())
return ReadExpandoObject(subReader, serializer);
}
else
{
using (var subReader = obj.CreateReader())
return serializer.Deserialize(subReader, polymorphicType);
}
}
else
{
return ReadExpandoObject(reader, serializer);
}
}
private object ReadExpandoObject(JsonReader reader, JsonSerializer serializer)
{
IDictionary<string, object> expandoObject = new ExpandoObject();
while (reader.Read())
{
switch (reader.TokenType)
{
case JsonToken.PropertyName:
string propertyName = reader.Value.ToString();
if (!reader.Read())
{
throw JsonSerializationExceptionHelper.Create(reader, "Unexpected end when reading ExpandoObject.");
}
object v = ReadValue(reader, serializer);
expandoObject[propertyName] = v;
break;
case JsonToken.Comment:
break;
case JsonToken.EndObject:
return expandoObject;
}
}
throw JsonSerializationExceptionHelper.Create(reader, "Unexpected end when reading ExpandoObject.");
}
/// <summary>
/// Determines whether this instance can convert the specified object type.
/// </summary>
/// <param name="objectType">Type of the object.</param>
/// <returns>
/// <c>true</c> if this instance can convert the specified object type; otherwise, <c>false</c>.
/// </returns>
public override bool CanConvert(Type objectType)
{
return (objectType == typeof(ExpandoObject));
}
/// <summary>
/// Gets a value indicating whether this <see cref="JsonConverter"/> can write JSON.
/// </summary>
/// <value>
/// <c>true</c> if this <see cref="JsonConverter"/> can write JSON; otherwise, <c>false</c>.
/// </value>
public override bool CanWrite
{
get { return false; }
}
}
internal static class JsonTokenUtils
{
// Adapted from https://github.com/JamesNK/Newtonsoft.Json/blob/master/Src/Newtonsoft.Json/Utilities/JsonTokenUtils.cs
public static bool IsPrimitiveToken(this JsonToken token)
{
switch (token)
{
case JsonToken.Integer:
case JsonToken.Float:
case JsonToken.String:
case JsonToken.Boolean:
case JsonToken.Undefined:
case JsonToken.Null:
case JsonToken.Date:
case JsonToken.Bytes:
return true;
default:
return false;
}
}
}
internal static class JsonReaderExtensions
{
// Adapted from internal bool JsonReader.MoveToContent()
// https://github.com/JamesNK/Newtonsoft.Json/blob/master/Src/Newtonsoft.Json/JsonReader.cs#L1145
public static bool MoveToContent(this JsonReader reader)
{
if (reader == null)
throw new ArgumentNullException();
JsonToken t = reader.TokenType;
while (t == JsonToken.None || t == JsonToken.Comment)
{
if (!reader.Read())
{
return false;
}
t = reader.TokenType;
}
return true;
}
}
internal static class JsonSerializationExceptionHelper
{
public static JsonSerializationException Create(this JsonReader reader, string format, params object[] args)
{
// Adapted from https://github.com/JamesNK/Newtonsoft.Json/blob/master/Src/Newtonsoft.Json/JsonPosition.cs
var lineInfo = reader as IJsonLineInfo;
var path = (reader == null ? null : reader.Path);
var message = string.Format(CultureInfo.InvariantCulture, format, args);
if (!message.EndsWith(Environment.NewLine, StringComparison.Ordinal))
{
message = message.Trim();
if (!message.EndsWith(".", StringComparison.Ordinal))
message += ".";
message += " ";
}
message += string.Format(CultureInfo.InvariantCulture, "Path '{0}'", path);
if (lineInfo != null && lineInfo.HasLineInfo())
message += string.Format(CultureInfo.InvariantCulture, ", line {0}, position {1}", lineInfo.LineNumber, lineInfo.LinePosition);
message += ".";
return new JsonSerializationException(message);
}
}
internal static class ReflectionUtils
{
// Utilities taken from https://github.com/JamesNK/Newtonsoft.Json/blob/master/Src/Newtonsoft.Json/Utilities/ReflectionUtils.cs
// I couldn't find a way to access these directly.
public static void SplitFullyQualifiedTypeName(string fullyQualifiedTypeName, out string typeName, out string assemblyName)
{
int? assemblyDelimiterIndex = GetAssemblyDelimiterIndex(fullyQualifiedTypeName);
if (assemblyDelimiterIndex != null)
{
typeName = fullyQualifiedTypeName.Substring(0, assemblyDelimiterIndex.GetValueOrDefault()).Trim();
assemblyName = fullyQualifiedTypeName.Substring(assemblyDelimiterIndex.GetValueOrDefault() + 1, fullyQualifiedTypeName.Length - assemblyDelimiterIndex.GetValueOrDefault() - 1).Trim();
}
else
{
typeName = fullyQualifiedTypeName;
assemblyName = null;
}
}
private static int? GetAssemblyDelimiterIndex(string fullyQualifiedTypeName)
{
int scope = 0;
for (int i = 0; i < fullyQualifiedTypeName.Length; i++)
{
char current = fullyQualifiedTypeName[i];
switch (current)
{
case '[':
scope++;
break;
case ']':
scope--;
break;
case ',':
if (scope == 0)
{
return i;
}
break;
}
}
return null;
}
}
Then use it like:
var settings = new JsonSerializerSettings
{
Formatting = Newtonsoft.Json.Formatting.Indented,
TypeNameHandling = TypeNameHandling.Auto,
Converters = new [] { new TypeNameHandlingExpandoObjectConverter() },
};
var expando2 = JsonConvert.DeserializeObject<ExpandoObject>(input, settings);
Prototype fiddle.
Finally, when using TypeNameHandling, do take note of this caution from the Newtonsoft docs:
TypeNameHandling should be used with caution when your application deserializes JSON from an external source. Incoming types should be validated with a custom SerializationBinder when deserializing with a value other than None.
For a discussion of why this may be necessary, see TypeNameHandling caution in Newtonsoft Json.
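A minimal sketch of such a binder (the whitelist and type names below are placeholders, not part of the original answer):
public class WhitelistSerializationBinder : Newtonsoft.Json.Serialization.ISerializationBinder
{
    // Only types in this set may be created from an incoming "$type" value.
    private static readonly Dictionary<string, Type> AllowedTypes =
        new Dictionary<string, Type> { ["MyType"] = typeof(MyType) };

    public Type BindToType(string assemblyName, string typeName)
    {
        if (AllowedTypes.TryGetValue(typeName, out var type))
            return type;
        throw new JsonSerializationException($"Type '{typeName}' is not allowed.");
    }

    public void BindToName(Type serializedType, out string assemblyName, out string typeName)
    {
        assemblyName = null;
        typeName = serializedType.Name; // written back out as the bare type name
    }
}
It can then be assigned to JsonSerializerSettings.SerializationBinder alongside TypeNameHandling.Auto so that unexpected "$type" values are rejected instead of instantiated.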
Here's how I would do it:
void Main()
{
var json = "{\r\n \"model1\": {\r\n \"$type\": \"MyType, MyAssembly\",\r\n \"A\": 5\r\n },\r\n \"model2" +
"\": {\r\n \"C\": \"something\"\r\n}}";
var result = JsonConvert.DeserializeObject<Result>(json);
}
public class Result
{
public MyType Model1 { get; set; }
public ExpandoObject Model2 { get; set;}
}
public class MyType { public int A { get; set;} }
You can also give Result.Model2 a type of dynamic (which allows you to access its properties using syntax like result.Model2.something), or JSON.NET's JObject, which is more JSON-oriented.
However, if you're saying that you don't want a class like Result, but you want the JSON's $type to be able to determine a specific instance type, you can use the TypeNameHandling setting.
var result = JsonConvert.DeserializeObject<ExpandoObject>(
json,
new JsonSerializerSettings {TypeNameHandling = TypeNameHandling.Auto});
Just be aware that there are security implications if you're allowing client-provided JSON values to instantiate arbitrary types in your .NET environment.
A bit of an old thread, but here is what can be done.
First construct your ExpandoObject. For instance:
dynamic someObject = new ExpandoObject();
someObject.Name = "My Expando Object";
someObject.SomeProperty = 123;
I would then suggest using JsonConvert to serialise this object (I know you want to do it the other way around, but bear with me).
So let's serialise someObject into a test.json file as follows:
JsonSerializerSettings settings = new JsonSerializerSettings
{
TypeNameHandling = TypeNameHandling.All // or use .Auto for more lightweight output
};
File.WriteAllText(@"e:\test.json", JsonConvert.SerializeObject(someObject, settings));
Now, if you open the resulting Json file, you will be able to see the exact syntax required by this converter. You are now able to write your own Json and do the inverse operation, i.e. deserialize your own Json file into a dynamic object, as shown below:
dynamic test = JsonConvert.DeserializeObject<ExpandoObject>(File.ReadAllText(@"e:\yourOwnJsonFile.json"), settings);
Finally, in order to access your object's properties, do something of the sort:
((dynamic)test).Name = "My Own Expando Object";
Referencing this SO question about custom serialization of strings to enums and vice versa in Json.NET, decorating the enum members using the EnumMember attribute - is there a way to get MongoDB to perform the same feat?
I have just refactored some previously string fields to enums and was wondering if there is any way to instruct Mongo to also read the EnumMember values when (de-)serializing and avoid me having to go through the database and update all the current text values.
I'm using this package: <PackageReference Include="MongoDB.Bson" Version="2.12.1" />
my map class:
public class OfferMap
{
public static void Configure()
{
BsonClassMap.RegisterClassMap<Offer>(map => //Offer is a class
{
map.AutoMap();
map.SetIgnoreExtraElements(true);
map
.SetIsRootClass(true);
map
.MapMember(x => x.OfferType)
.SetSerializer(new EnumSerializer<OfferType>(MongoDB.Bson.BsonType.String)) // OfferType is an Enum
.SetElementName("offerType")
.SetIgnoreIfNull(false)
.SetIsRequired(true);
});
}
}
I used a CustomEnumSerializer to handle the EnumMember attribute
public class CustomEnumSerializer<TEnum> : StructSerializerBase<TEnum>, IRepresentationConfigurable<CustomEnumSerializer<TEnum>> where TEnum : struct
{
private static readonly Dictionary<Type, Dictionary<string, object>> _fromValueMap = new Dictionary<Type, Dictionary<string, object>>(); // string representation to Enum value map
private static readonly Dictionary<Type, Dictionary<object, string>> _toValueMap = new Dictionary<Type, Dictionary<object, string>>(); // Enum value to string map
// private fields
private readonly BsonType _representation;
// constructors
/// <summary>
/// Initializes a new instance of the <see cref="EnumSerializer{TEnum}"/> class.
/// </summary>
public CustomEnumSerializer()
: this((BsonType)0) // 0 means use underlying type
{
}
/// <summary>
/// Initializes a new instance of the <see cref="EnumSerializer{TEnum}"/> class.
/// </summary>
/// <param name="representation">The representation.</param>
public CustomEnumSerializer(BsonType representation)
{
switch (representation)
{
case 0:
case BsonType.Int32:
case BsonType.Int64:
case BsonType.String:
break;
default:
var message = string.Format("{0} is not a valid representation for an EnumSerializer.", representation);
throw new ArgumentException(message);
}
// don't know of a way to enforce this at compile time
var enumTypeInfo = typeof(TEnum).GetTypeInfo();
if (!enumTypeInfo.IsEnum)
{
var message = string.Format("{0} is not an enum type.", typeof(TEnum).FullName);
throw new BsonSerializationException(message);
}
_representation = representation;
if (representation == BsonType.String)
{
var enumType = typeof(TEnum);
if (!_fromValueMap.ContainsKey(enumType))
{
Dictionary<string, object> fromMap = new Dictionary<string, object>(StringComparer.InvariantCultureIgnoreCase);
Dictionary<object, string> toMap = new Dictionary<object, string>();
FieldInfo[] fields = enumType.GetFields(BindingFlags.Static | BindingFlags.Public);
foreach (FieldInfo field in fields)
{
string name = field.Name;
object enumValue = Enum.Parse(enumType, name);
// use EnumMember attribute if exists
EnumMemberAttribute enumMemberAttribute = field.GetCustomAttribute<EnumMemberAttribute>();
if (enumMemberAttribute != null)
{
string enumMemberValue = enumMemberAttribute.Value;
fromMap[enumMemberValue] = enumValue;
toMap[enumValue] = enumMemberValue;
}
else
{
toMap[enumValue] = name;
}
fromMap[name] = enumValue;
}
_fromValueMap[enumType] = fromMap;
_toValueMap[enumType] = toMap;
}
}
}
// public properties
/// <summary>
/// Gets the representation.
/// </summary>
/// <value>
/// The representation.
/// </value>
public BsonType Representation
{
get { return _representation; }
}
// public methods
/// <summary>
/// Deserializes a value.
/// </summary>
/// <param name="context">The deserialization context.</param>
/// <param name="args">The deserialization args.</param>
/// <returns>A deserialized value.</returns>
public override TEnum Deserialize(BsonDeserializationContext context, BsonDeserializationArgs args)
{
var bsonReader = context.Reader;
var bsonType = bsonReader.GetCurrentBsonType();
switch (bsonType)
{
case BsonType.Int32: return (TEnum)Enum.ToObject(typeof(TEnum), bsonReader.ReadInt32());
case BsonType.Int64: return (TEnum)Enum.ToObject(typeof(TEnum), bsonReader.ReadInt64());
case BsonType.Double: return (TEnum)Enum.ToObject(typeof(TEnum), (long)bsonReader.ReadDouble());
case BsonType.String:
var fromValue = FromValue(typeof(TEnum), bsonReader.ReadString());
return (TEnum)Enum.Parse(typeof(TEnum), fromValue.ToString());
default:
throw CreateCannotDeserializeFromBsonTypeException(bsonType);
}
}
/// <summary>
/// Serializes a value.
/// </summary>
/// <param name="context">The serialization context.</param>
/// <param name="args">The serialization args.</param>
/// <param name="value">The object.</param>
public override void Serialize(BsonSerializationContext context, BsonSerializationArgs args, TEnum value)
{
var bsonWriter = context.Writer;
switch (_representation)
{
case 0:
var underlyingTypeCode = Type.GetTypeCode(Enum.GetUnderlyingType(typeof(TEnum)));
if (underlyingTypeCode == TypeCode.Int64 || underlyingTypeCode == TypeCode.UInt64)
{
goto case BsonType.Int64;
}
else
{
goto case BsonType.Int32;
}
case BsonType.Int32:
bsonWriter.WriteInt32(Convert.ToInt32(value));
break;
case BsonType.Int64:
bsonWriter.WriteInt64(Convert.ToInt64(value));
break;
case BsonType.String:
var val = ToValue(typeof(TEnum), value);
bsonWriter.WriteString(val);
break;
default:
throw new BsonInternalException("Unexpected EnumRepresentation.");
}
}
private string ToValue(Type enumType, object obj)
{
Dictionary<object, string> map = _toValueMap[enumType];
return map[obj];
}
private object FromValue(Type enumType, string value)
{
Dictionary<string, object> map = _fromValueMap[enumType];
if (!map.ContainsKey(value))
return value;
return map[value];
}
/// <summary>
/// Returns a serializer that has been reconfigured with the specified representation.
/// </summary>
/// <param name="representation">The representation.</param>
/// <returns>The reconfigured serializer.</returns>
public CustomEnumSerializer<TEnum> WithRepresentation(BsonType representation)
{
if (representation == _representation)
{
return this;
}
else
{
return new CustomEnumSerializer<TEnum>(representation);
}
}
// explicit interface implementations
IBsonSerializer IRepresentationConfigurable.WithRepresentation(BsonType representation)
{
return WithRepresentation(representation);
}
}
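To tie it together, the enum carries the EnumMember values (from System.Runtime.Serialization) and the class map points the member at the custom serializer; the OfferType values below are only an assumed example:
public enum OfferType
{
    [EnumMember(Value = "special-offer")]
    SpecialOffer,

    [EnumMember(Value = "bundle")]
    Bundle
}

// inside OfferMap.Configure()
map.MapMember(x => x.OfferType)
    .SetSerializer(new CustomEnumSerializer<OfferType>(MongoDB.Bson.BsonType.String))
    .SetElementName("offerType");
With that in place, documents containing "special-offer" deserialize to OfferType.SpecialOffer, and values previously written as plain member names still round-trip because the member name is also added to the lookup map.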
I needed a custom deserializer that would return the default value when encountering an unexpected value in the data, rather than the default behavior of throwing a deserialization exception.
public class CustomEnumSerializer<TEnum>: MongoDB.Bson.Serialization.Serializers.EnumSerializer<TEnum>
where TEnum : struct, IComparable, IFormattable, IConvertible
{
public override TEnum Deserialize(BsonDeserializationContext context, BsonDeserializationArgs args)
{
var bsonReader = context.Reader;
var bsonType = bsonReader.GetCurrentBsonType();
var val = "";
switch (bsonType)
{
case BsonType.String:
val = bsonReader.ReadString() ?? "";
break;
case BsonType.Int32:
val = bsonReader.ReadInt32().ToString();
break;
case BsonType.Int64:
val = bsonReader.ReadInt64().ToString();
break;
case BsonType.Null:
return default(TEnum);
default:
return base.Deserialize(context, args);
}
if (Enum.TryParse(val, true, out TEnum result))
{
return result;
}
return default(TEnum);
}
}
To implement it in your repository:
static MyRepository()
{
BsonClassMap.RegisterClassMap<MyDataType>(ms =>
{
ms.AutoMap();
ms.GetMemberMap(i => i.MyEnum)
.SetSerializer(new CustomEnumSerializer<MyEnumType>());
});
}
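If the same fallback behaviour is wanted everywhere the enum appears, rather than per class map, the serializer can (as far as I know) also be registered globally once at startup:
// Register once, before any class maps or queries touch MyEnumType.
BsonSerializer.RegisterSerializer(new CustomEnumSerializer<MyEnumType>());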
I have the following app that shows that the key part of a Dictionary is not sent to the JsonConverter; instead, ToString() is called on it. This is an issue for me as I can't deserialize my Json string.
Any ideas?
class Program
{
static void Main(string[] args)
{
var coll = new Dictionary<Tuple<string,string>, string>();
coll.Add(Tuple.Create("key1", "KEY1"), "Value1");
coll.Add(Tuple.Create("key2", "KEY2"), "Value2");
string json = JsonConvert.SerializeObject(coll);
Dictionary<Tuple<string, string>, string> coll2;
Console.WriteLine(json);
//coll2 = JsonConvert.DeserializeObject<Dictionary<Tuple<string, string>, string>>(json);
// It throws an exception here
//foreach (var k in coll2)
//{
// Console.WriteLine("<{0}|{1}>",k.Key, k.Value);
//}
var t = Tuple.Create("key1", "key2");
Console.WriteLine(t.ToString());
string json2 = JsonConvert.SerializeObject(t);
Console.WriteLine(json2);
}
}
Output:
{"(key1, KEY1)":"Value1","(key2, KEY2)":"Value2"}
(key1, key2)
{"Item1":"key1","Item2":"key2"}
Press any key to continue . . .
I also had the same problem with deserializing a Dictionary with a Tuple as key. Json.NET converts the tuple into a mere string. In my case I cannot avoid using a Tuple as the dictionary key, so I made a custom JSON converter to deserialize the dictionary with a Tuple key, and it worked well.
I have adapted it to your code. Hopefully it works for you and gives you an idea of how a custom JsonConverter works; the comments explain the details.
public class TupleKeyConverter : JsonConverter
{
/// <summary>
/// Override ReadJson to read the dictionary key and value
/// </summary>
/// <param name="reader"></param>
/// <param name="objectType"></param>
/// <param name="existingValue"></param>
/// <param name="serializer"></param>
/// <returns></returns>
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
Tuple<string, string> _tuple = null;
string _value = null;
var _dict = new Dictionary<Tuple<string, string>, string>();
//loop through the JSON string reader
while (reader.Read())
{
// check whether it is a property
if (reader.TokenType == JsonToken.PropertyName)
{
string readerValue = reader.Value.ToString();
if (reader.Read())
{
// check if the property is tuple (Dictionary key)
if (readerValue.Contains('(') && readerValue.Contains(')'))
{
string[] result = ConvertTuple(readerValue);
if (result == null)
continue;
// Custom Deserialize the Dictionary key (Tuple)
_tuple = Tuple.Create<string, string>(result[0].Trim(), result[1].Trim());
// Custom Deserialize the Dictionary value
_value = (string)serializer.Deserialize(reader, typeof(string));
_dict.Add(_tuple, _value);
}
else
{
// Deserialize the remaining data from the reader
serializer.Deserialize(reader);
break;
}
}
}
}
return _dict;
}
/// <summary>
/// To convert Tuple
/// </summary>
/// <param name="_string"></param>
/// <returns></returns>
public string[] ConvertTuple(string _string)
{
string tempStr = null;
// remove the first character, which is an opening parenthesis '('
if (_string.Contains('('))
tempStr = _string.Remove(0, 1);
// remove the last character, which is a closing parenthesis ')'
if (_string.Contains(')'))
tempStr = tempStr.Remove(tempStr.Length - 1, 1);
// separate Item1 and Item2
if (_string.Contains(','))
return tempStr.Split(',');
return null;
}
/// <summary>
/// WriteJson needs to be implemented since it is an abstract function.
/// </summary>
/// <param name="writer"></param>
/// <param name="value"></param>
/// <param name="serializer"></param>
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
serializer.Serialize(writer, value);
}
/// <summary>
/// Check whether to convert or not
/// </summary>
/// <param name="objectType"></param>
/// <returns></returns>
public override bool CanConvert(Type objectType)
{
return true;
}
}
Now declare the property as follows; the JsonConverter attribute is the important part.
[JsonConverter(typeof(TupleKeyConverter))]
public Dictionary<Tuple<int,string>,string> MyDict {get; set;}
Or you could try replacing the call in your code with this, though I never tested it:
coll2 = JsonConvert.DeserializeObject<Dictionary<Tuple<string, string>, string>>(json, new TupleKeyConverter());
Based on the information you have provided, I would suggest that instead of using a Tuple as your key, use a custom struct or object and override the ToString method. Then you can serialize/deserialize as you wish.
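A rough sketch of that approach, assuming Json.NET's usual behaviour of using the key type's TypeConverter to read non-primitive dictionary keys back from strings (all names here are illustrative and untested):
[TypeConverter(typeof(CompositeKeyConverter))]
public struct CompositeKey
{
    public string First { get; set; }
    public string Second { get; set; }

    // Used when the dictionary is serialized: the key becomes "First|Second".
    public override string ToString() => $"{First}|{Second}";

    public static CompositeKey Parse(string s)
    {
        var parts = s.Split('|');
        return new CompositeKey { First = parts[0], Second = parts[1] };
    }
}

// TypeConverter and ITypeDescriptorContext come from System.ComponentModel.
public class CompositeKeyConverter : TypeConverter
{
    public override bool CanConvertFrom(ITypeDescriptorContext context, Type sourceType)
        => sourceType == typeof(string) || base.CanConvertFrom(context, sourceType);

    public override object ConvertFrom(ITypeDescriptorContext context, CultureInfo culture, object value)
        => value is string s ? CompositeKey.Parse(s) : base.ConvertFrom(context, culture, value);
}
A Dictionary<CompositeKey, string> should then serialize to keys like "key1|KEY1" and deserialize back without a custom JsonConverter.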