Line-delimited JSON serializing and de-serializing - C#

I am using JSON.NET and C# 5. I need to serialize/deserialize a list of objects into line-delimited JSON (http://en.wikipedia.org/wiki/Line_Delimited_JSON). For example:
{"some":"thing1"}
{"some":"thing2"}
{"some":"thing3"}
and
{"kind": "person", "fullName": "John Doe", "age": 22, "gender": "Male", "citiesLived": [{ "place": "Seattle", "numberOfYears": 5}, {"place": "Stockholm", "numberOfYears": 6}]}
{"kind": "person", "fullName": "Jane Austen", "age": 24, "gender": "Female", "citiesLived": [{"place": "Los Angeles", "numberOfYears": 2}, {"place": "Tokyo", "numberOfYears": 2}]}
I need this because it is a Google BigQuery requirement: https://cloud.google.com/bigquery/preparing-data-for-bigquery
Update: One way I found is to serialize each object separately and join them at the end with a newline.
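A minimal sketch of that approach (assuming Json.NET and a hypothetical items collection of POCOs, with System.Linq imported):
// Sketch: serialize each object separately and join with newlines (NDJSON).
var ndjson = string.Join(
    Environment.NewLine,
    items.Select(item => JsonConvert.SerializeObject(item)));
File.WriteAllText("data.json", ndjson);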

You can do so by manually parsing your JSON using JsonTextReader and setting the SupportMultipleContent flag to true.
If we look at your first example, and create a POCO called Foo:
public class Foo
{
[JsonProperty("some")]
public string Some { get; set; }
}
This is how we parse it:
var json = "{\"some\":\"thing1\"}\r\n{\"some\":\"thing2\"}\r\n{\"some\":\"thing3\"}";
var jsonReader = new JsonTextReader(new StringReader(json))
{
SupportMultipleContent = true // This is important!
};
var jsonSerializer = new JsonSerializer();
while (jsonReader.Read())
{
Foo foo = jsonSerializer.Deserialize<Foo>(jsonReader);
}
If you want a list of items as the result, simply add each deserialized item to a list inside the while loop:
listOfFoo.Add(jsonSerializer.Deserialize<Foo>(jsonReader));
Note: with Json.NET 10.0.4 and later, the same code also supports comma-separated JSON entries (see How to deserialize dodgy JSON (with improperly quoted strings, and missing brackets)?).

To implement this with .NET 5 (C# 9) and the System.Text.Json.JsonSerializer class, and to handle "big" data, I wrote code for streaming processing.
Using the System.IO.Pipelines extension package, this is quite efficient.
using System;
using System.Buffers;
using System.Collections.Generic;
using System.IO;
using System.IO.Pipelines;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
class Program
{
static readonly byte[] NewLineChars = {(byte)'\r', (byte)'\n'};
static readonly byte[] WhiteSpaceChars = {(byte)'\r', (byte)'\n', (byte)' ', (byte)'\t'};
private static async Task Main()
{
JsonSerializerOptions jsonOptions = new(JsonSerializerDefaults.Web);
var json = "{\"some\":\"thing1\"}\r\n{\"some\":\"thing2\"}\r\n{\"some\":\"thing3\"}";
var contentStream = new MemoryStream(Encoding.UTF8.GetBytes(json));
var pipeReader = PipeReader.Create(contentStream);
await foreach (var foo in ReadItemsAsync<Foo>(pipeReader, jsonOptions))
{
Console.WriteLine($"foo: {foo.Some}");
}
}
static async IAsyncEnumerable<TValue> ReadItemsAsync<TValue>(PipeReader pipeReader, JsonSerializerOptions jsonOptions = null)
{
while (true)
{
var result = await pipeReader.ReadAsync();
var buffer = result.Buffer;
bool isCompleted = result.IsCompleted;
SequencePosition bufferPosition = buffer.Start;
while (true)
{
var(value, advanceSequence) = TryReadNextItem<TValue>(buffer, ref bufferPosition, isCompleted, jsonOptions);
if (value != null)
{
yield return value;
}
if (advanceSequence)
{
pipeReader.AdvanceTo(bufferPosition, buffer.End); //advance our position in the pipe
break;
}
}
if (isCompleted)
yield break;
}
}
static (TValue, bool) TryReadNextItem<TValue>(ReadOnlySequence<byte> sequence, ref SequencePosition sequencePosition, bool isCompleted, JsonSerializerOptions jsonOptions)
{
var reader = new SequenceReader<byte>(sequence.Slice(sequencePosition));
while (!reader.End) // loop until we've come to the end or read an item
{
if (reader.TryReadToAny(out ReadOnlySpan<byte> itemBytes, NewLineChars, advancePastDelimiter: true))
{
sequencePosition = reader.Position;
if (itemBytes.TrimStart(WhiteSpaceChars).IsEmpty)
{
continue;
}
return (JsonSerializer.Deserialize<TValue>(itemBytes, jsonOptions), false);
}
else if (isCompleted)
{
// read last item
var remainingReader = sequence.Slice(reader.Position);
ReadOnlySpan<byte> remainingSpan = remainingReader.IsSingleSegment ? remainingReader.First.Span : remainingReader.ToArray();
reader.Advance(remainingReader.Length); // advance reader to the end
sequencePosition = reader.Position;
if (!remainingSpan.TrimStart(WhiteSpaceChars).IsEmpty)
{
return (JsonSerializer.Deserialize<TValue>(remainingSpan, jsonOptions), true);
}
else
{
return (default, true);
}
}
else
{
// no more items in sequence
break;
}
}
// PipeReader needs to read more
return (default, true);
}
}
public class Foo
{
public string Some { get; set; }
}
Run at https://dotnetfiddle.net/M5cNo1

Related

Parse JSON elegantly

I have a web api controller in .NET Core 2.1, which receives
JToken jsonBody
The JSON has the following structure:
{
"id": "xxx",
"payload": {
"TelephoneNumber": "1111",
"Name": "Hans"
}
}
There are more fields, but they're irrelevant here.
I want to retrieve the TelephoneNumber and Name elegantly. Currently I do the following, which I'm sure could be done in a nicer way:
var payload = JObject.Parse(jsonBody.SelectToken("Payload").ToString());
telephoneNumber = new TelephoneNumber(payload.SelectToken("TelephoneNumber").ToString());
I've tried just doing
jsonBody.SelectToken("Payload.TelephoneNumber")
but that doesn't work. I think this is because the jsonBody that the controller receives has only parsed the top-level nodes as JSON, so it may be regarding the value of
jsonBody.SelectToken("Payload")
as a string.
As per the official documentation, you can do something like this:
var phone = jsonBody["payload"]["TelephoneNumber"].ToString();
var name = jsonBody["payload"]["Name"].ToString();
See a live demo on rextester.
This is at least a little bit more elegant:
var jsonBody = JObject.Parse(@"{
'id': 'xxx',
'payload': {
'TelephoneNumber': '1111',
'Name': 'Hans'
}
}");
var phone = jsonBody["payload"]["TelephoneNumber"].Value<string>();
var name = jsonBody["payload"]["Name"].Value<string>();
If you don't want to deserialize your full JSON, you can create a class with the properties you need:
public class Payload
{
public string TelephoneNumber { get; set; }
public string Name { get; set; }
}
And then use JsonTextReader to deserialize the string:
private static Payload DeserializePayload(JToken token)
{
var serializer = new JsonSerializer();
using (JsonTextReader reader = new JsonTextReader(new StringReader(token.ToString())))
{
reader.CloseInput = true;
while (reader.Read())
{
if (reader.TokenType == JsonToken.StartObject && reader.Path.Equals("payload"))
{
var payload = serializer.Deserialize<Payload>(reader);
return payload;
}
}
}
// not found - return null? throw exception?
return null;
}
Testing the code:
var token = JToken.Parse(@"{
""id"": ""xxx"",
""payload"": {
""TelephoneNumber"": ""1111"",
""Name"": ""Hans""
}
}");
Payload payload = DeserializePayload(token);
Console.WriteLine($"Name: {payload.Name}, Phone number: {payload.TelephoneNumber}");

How to export JSON string to CSV file?

I have an ASP.NET site. I have a JSON string that needs to be exported to a physical CSV file.
private String JsonToCsv(string jsonData, string delimiter)
{
try
{
StringWriter swObj = new StringWriter();
using (var csv = new CsvWriter(swObj))
{
csv.Configuration.SkipEmptyRecords = true;
csv.Configuration.WillThrowOnMissingField = false;
csv.Configuration.Delimiter = delimiter;
using (var dt = jsonStringToTable(jsonData))
{
foreach (DataColumn col in dt.Columns)
{
csv.WriteField(col.ColumnName);
}
csv.NextRecord();
foreach(DataRow row in dt.Rows)
{
for (var i = 0; i < dt.Columns.Count; i++)
{
csv.WriteField(row[i]);
}
csv.NextRecord();
}
}
}
return swObj.ToString();
}
catch (Exception ex)
{
//handle exception
return null;
}
}
private DataTable jsonStringToTable(string jsonContent)
{
DataTable dt = JsonConvert.DeserializeObject<DataTable>(jsonContent);
return dt;
}
public HttpResponseMessage ExportToCSV(string jsonData)
{
string csv = JsonToCsv(jsonData, ",");
HttpResponseMessage res = new HttpResponseMessage(HttpStatusCode.OK);
res.Content = new StringContent(csv);
res.Content.Headers.ContentType = new MediaTypeHeaderValue("text/csv");
res.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment") { FileName = "export.csv" };
return res;
}
But I am neither getting any exception nor is the CSV file getting any data.
The export.csv file is located in the root folder.
How do I export the JSON and auto-download the file?
It seems you do have an error, but you are suppressing it in your catch block.
Your first error is that your jsonContent is not actually JSON. You have a variable assignment jsonContent = [...] in your sample; the [...] section is your actual JSON.
To handle that, you need only compose the string better by not assigning it to a variable (the recommended approach), or handle such instances with jsonContent.Split(new [] {'='}).Last() (a declarative vs. imperative approach/strategy).
Also, you are attempting to deserialize into an incorrect type, since it does not reflect your JSON data structure.
Although there are other ways to convert and process one string into another, I agree the proper thing to do here is to deserialize your object (or not serialize it beforehand, which is recommended).
I'm providing a sample console application so you can review two implementations of a JsonToCsv operation:
one using dynamic (imperative),
and one providing a type and using System.Reflection on that type (declarative up to the JsonConvert.DeserializeObject<T>() method, imperative afterward).
There is a dependency on the Newtonsoft.Json assembly (install it via NuGet) in this implementation; it reflects your provided code.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Dynamic;
using System.Reflection;
using Newtonsoft;
namespace JsonToCsvTests
{
using Newtonsoft.Json;
using System.IO;
using System.Threading.Tasks;
class Program
{
static void Main(string[] args)
{
TestJsonToCsv();
Console.ReadLine();
}
static void TestJsonToCsv()
{
string jsonData = @"jsonData = [
{
""DocumentName"": ""Test Document"",
""ActionDate"": ""2015-09-25T16:06:25.083"",
""ActionType"": ""View"",
""ActionPerformedBy"": ""Sreeja SJ""
},
{
""DocumentName"": ""Test Document"",
""ActionDate"": ""2015-09-25T16:12:02.497"",
""ActionType"": ""View"",
""ActionPerformedBy"": ""Sreeja SJ""
},
{
""DocumentName"": ""Test Document"",
""ActionDate"": ""2015-09-25T16:13:48.013"",
""ActionType"": ""View"",
""ActionPerformedBy"": ""Sreeja SJ""
}]";
Console.WriteLine("...using System.Dynamic and casts");
Console.WriteLine();
Console.WriteLine(JsonToCsv(jsonData, ","));
Console.WriteLine();
Console.WriteLine();
Console.WriteLine("...using a provided StrongType with System.Reflection.");
Console.WriteLine();
Console.WriteLine(JsonToCsv<JsonData>(jsonData, ","));
}
static private string JsonToCsv(string jsonContent, string delimiter)
{
var data = jsonStringToTable(jsonContent);
var headers = ((IEnumerable<dynamic>)((IEnumerable<dynamic>)data).First()).Select((prop) => prop.Name).ToArray();
var csvList = new List<string>
{
string.Join(delimiter, headers.Select((prop) => string.Format(@"""{0}""", prop)).ToArray())
};
var lines = ((IEnumerable<dynamic>)data)
.Select(row => row)
.Cast<IEnumerable<dynamic>>()
.Select((instance) => string.Join(delimiter, instance.Select((v) => string.Format(@"""{0}""", v.Value))))
.ToArray();
csvList.AddRange(lines);
return string.Join(Environment.NewLine, csvList );
}
static private string JsonToCsv<T>(string jsonContent, string delimiter) where T : class
{
var data = jsonStringToTable<T>(jsonContent);
var properties = data.First().GetType().GetProperties();
var lines = string.Join(Environment.NewLine,
string.Join(delimiter, properties.Select((propInfo) => string.Format(@"""{0}""", propInfo.Name))),
string.Join(Environment.NewLine, data.Select((row) => string.Join(delimiter, properties.Select((propInfo) => string.Format(@"""{0}""", propInfo.GetValue(row)))))));
return lines;
}
static private dynamic jsonStringToTable(string jsonContent)
{
var json = jsonContent.Split(new[] { '=' }).Last();
return JsonConvert.DeserializeObject<dynamic>(json);
}
static private IEnumerable<T> jsonStringToTable<T>(string jsonContent) where T : class
{
var json = jsonContent.Split(new[] { '=' }).Last();
return JsonConvert.DeserializeObject<IEnumerable<T>>(json);
}
public class JsonData
{
public string DocumentName { get; set; }
public DateTime ActionDate { get; set; }
public string ActionType { get; set; }
public string ActionPerformedBy { get; set; }
}
}
}
Console.Output
...using System.Dynamic and casts
"DocumentName","ActionDate","ActionType","ActionPerformedBy"
"Test Document","9/25/2015 4:06:25 PM","View","Sreeja SJ"
"Test Document","9/25/2015 4:12:02 PM","View","Sreeja SJ"
"Test Document","9/25/2015 4:13:48 PM","View","Sreeja SJ"
...using a provided StrongType with System.Reflection.
"DocumentName","ActionDate","ActionType","ActionPerformedBy"
"Test Document","9/25/2015 4:06:25 PM","View","Sreeja SJ"
"Test Document","9/25/2015 4:12:02 PM","View","Sreeja SJ"
"Test Document","9/25/2015 4:13:48 PM","View","Sreeja SJ"
This is what I use to generate a CSV file on my ASP.NET website:
public static class CSVUtils
{
public static void AddCsvLine(bool isFrenchSeparator, StringBuilder csv, params object[] values)
{
foreach (var value in values)
{
csv.Append('"').Append(value).Append('"');
if (isFrenchSeparator)
{
csv.Append(';');
}
else
{
csv.Append(',');
}
}
csv.Append('\r'); // AppendLine() adds a double line break with UTF32Encoding
}
}
public FileContentResult ExportCSV()
{
StringBuilder csv = new StringBuilder();
CSVUtils.AddCsvLine(false, csv, "Field1", "Field2", "Field3");
CSVUtils.AddCsvLine(false, csv, "value1", "value2", "value3");
return this.File(new UTF32Encoding().GetBytes(csv.ToString()), "text/csv", "myfile.csv");
}
I basically call the ExportCSV action from my website, on a button click for example, and it downloads the file. Make sure to clean all commas from your JSON values beforehand, otherwise they will mess up your CSV file.
EDIT: Specifically for JSON, you'd also have to escape every double quote, otherwise it would mess up the StringBuilder, I guess.
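For reference, the usual CSV convention is to double an embedded double quote inside a quoted field rather than backslash-escaping it. A minimal helper sketch along those lines (not part of the original code; the name EscapeCsvField is hypothetical):
// Hypothetical helper: wrap a value in quotes and double any embedded
// double quotes, per the common CSV convention (RFC 4180).
public static string EscapeCsvField(object value)
{
    var text = value?.ToString() ?? string.Empty;
    return "\"" + text.Replace("\"", "\"\"") + "\"";
}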

Combine all JSON paths into a string in C#

I have a large JSON document and I want to get all paths from the root down to each value, then store the result in a string as I described below.
Here is my JSON, for example:
{
"root":{
"first":{
"first1":{
"value":"1"
},
"first2":{
"value":"2"
},
"first3":{
"value":"3"
}
},
"second":{
"second1":{
"value":"1"
},
"second2":{
"value":"2"
},
"second3":{
"value":"3"
}
},
"third":{
"third1":{
"value":"1"
},
"third2":{
"value":"2"
},
"third3":{
"value":"3"
}
},
"four":{
"value":"4"
},
"five":{
"five1":{
"five11":{
"value":"five11"
},
"five12":{
"value":"five12"
}
},
"five2":{
"five21":{
"five211":{
"value":"five211"
}
}
}
}
}
}
Then I want to build each path like below dynamically in C# and show it on screen. Please tell me a way to do this.
root.first.first1.value
root.first.first2.value
root.first.first3.value
root.second.second1.value
......
root.four.value
root.five.five1.five11.value
root.five.five1.five12.value
....
root.five2.five21.five211.value
Use JSON.NET and iterate recursively through each token's Children; if the current token's HasValues is false, add that token's Path property to a StringBuilder or whatever you prefer. That should give you exactly what you want.
Edit: Code sample
I was lazy and just included the whole console application code.
Example on dotnetfiddle.net
using System;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System.Text;
public class Program
{
public static void Main()
{
var json = @"
{
""root"":{
""first"":{
""first1"":{
""value"":""1""
},
""first2"":{
""value"":""2""
},
""first3"":{
""value"":""3""
}
}
}
}";
var jobject = JObject.Parse (json);
var sb = new StringBuilder ();
RecursiveParse (sb, jobject);
Console.WriteLine (sb.ToString());
}
public static void RecursiveParse(StringBuilder sb, JToken token)
{
foreach (var item in token.Children()) {
if (item.HasValues)
{
RecursiveParse (sb, item);
} else {
sb.AppendLine (item.Path);
}
}
}
}
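For comparison, a more compact LINQ variant of the same idea, sketched as a replacement for the RecursiveParse call inside Main (it assumes a using System.Linq; directive):
// Sketch: every token with no children (HasValues == false) is a leaf value,
// and its Path property is the dotted path the question asks for.
var paths = jobject.Descendants()
    .Where(t => !t.HasValues)
    .Select(t => t.Path);
foreach (var path in paths)
{
    sb.AppendLine(path);
}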

Writing enumerable to csv file

I'm sure it's very straightforward, but I am struggling to figure out how to write an array to a file using CsvHelper.
I have a class, for example:
public class Test
{
public Test()
{
data = new float[]{0,1,2,3,4};
}
public float[] data{get;set;}
}
I would like the data to be written with each array value in a separate cell. I have a custom converter below, which instead puts all the values into one cell.
What am I doing wrong?
public class DataArrayConverter<T> : ITypeConverter
{
public string ConvertToString(TypeConverterOptions options, object value)
{
var data = (T[])value;
var s = string.Join(",", data);
return s;
}
public object ConvertFromString(TypeConverterOptions options, string text)
{
throw new NotImplementedException();
}
public bool CanConvertFrom(Type type)
{
return type == typeof(string);
}
public bool CanConvertTo(Type type)
{
return type == typeof(string);
}
}
To further detail the answer from Josh Close, here is what you need to do to write any IEnumerable (including arrays and generic lists) in a recent version (anything above 3.0) of CsvHelper.
Here is the class under test:
public class Test
{
public int[] Data { get; set; }
public Test()
{
Data = new int[] { 0, 1, 2, 3, 4 };
}
}
And a method to show how this can be saved:
static void Main()
{
using (var writer = new StreamWriter("db.csv"))
using (var csv = new CsvWriter(writer))
{
var list = new List<Test>
{
new Test()
};
csv.Configuration.HasHeaderRecord = false;
csv.WriteRecords(list);
writer.Flush();
}
}
The important configuration here is csv.Configuration.HasHeaderRecord = false; only with this configuration will you be able to see the data in the CSV file.
Further details can be found in the related unit test cases from CsvHelper.
In case you are looking for a solution to store properties of type IEnumerable with varying numbers of elements, the following example might be of help:
using CsvHelper;
using CsvHelper.Configuration;
using System.Collections.Generic;
using System.IO;
using System.Linq;
namespace CsvHelperSpike
{
class Program
{
static void Main(string[] args)
{
using (var writer = new StreamWriter("db.csv"))
using (var csv = new CsvWriter(writer))
{
csv.Configuration.Delimiter = ";";
var list = new List<AnotherTest>
{
new AnotherTest("Before String") { Tags = new List<string> { "One", "Two", "Three" }, After="After String" },
new AnotherTest("This is still before") {After="after again", Tags=new List<string>{ "Six", "seven","eight", "nine"} }
};
csv.Configuration.RegisterClassMap<TestIndexMap>();
csv.WriteRecords(list);
writer.Flush();
}
using(var reader = new StreamReader("db.csv"))
using(var csv = new CsvReader(reader))
{
csv.Configuration.IncludePrivateMembers = true;
csv.Configuration.RegisterClassMap<TestIndexMap>();
var result = csv.GetRecords<AnotherTest>().ToList();
}
}
private class AnotherTest
{
public string Before { get; private set; }
public string After { get; set; }
public List<string> Tags { get; set; }
public AnotherTest() { }
public AnotherTest(string before)
{
this.Before = before;
}
}
private sealed class TestIndexMap : ClassMap<AnotherTest>
{
public TestIndexMap()
{
Map(m => m.Before).Index(0);
Map(m => m.After).Index(1);
Map(m => m.Tags).Index(2);
}
}
}
}
By using the ClassMap it is possible to enable HasHeaderRecord (the default) again. It is important to note that this solution will only work if the collection with a varying number of elements is the last property. Otherwise the collection needs to have a fixed number of elements and the ClassMap needs to be adapted accordingly.
This example also shows how to handle properties with a private set. For this to work it is important to use the csv.Configuration.IncludePrivateMembers = true; configuration and have a default constructor on your class.
Unfortunately, it doesn't work like that. Since the string you return from the converter contains commas, the field will be quoted, because those commas are part of a single field.
Currently the only way to accomplish what you want is to write the fields manually, which isn't too horrible:
foreach( var test in list )
{
foreach( var item in test.Data )
{
csvWriter.WriteField( item );
}
csvWriter.NextRecord();
}
Update
Version 3 has support for reading and writing IEnumerable properties.

How to convert a List<T> to a specific JSON format

I want to be able to convert a List<T> into a specific JSON table-like format. In my case, the T will always be a simple object (no nested properties). Here are two examples to illustrate what I want.
Example #1: List<Person> to JSON
// C# list of Persons
var list = new List<Person>() {
new Person() { First = "Jesse", Last = "Gavin", Twitter = "jessegavin" },
new Person() { First = "John", Last = "Sheehan", Twitter = "johnsheehan" }
};
// I want to transform the list above into a JSON object like so
{
columns : ["First", "Last", "Twitter"],
rows: [
["Jesse", "Gavin", "jessegavin"],
["John", "Sheehan", "johnsheehan"]
]
}
Example #2: List<Address> to JSON
// C# list of Locations
var list = new List<Location>() {
new Location() { City = "Los Angeles", State = "CA", Zip = "90210" },
new Location() { City = "Saint Paul", State = "MN", Zip = "55101" },
};
// I want to transform the list above into a JSON object like so
{
columns : ["City", "State", "Zip"],
rows: [
["Los Angeles", "CA", "90210"],
["Saint Paul", "MN", "55101"]
]
}
Is there a way to tell JSON.net to serialize an object in this manner? If not, how could I accomplish this? Thanks.
UPDATE:
Thanks to #Hightechrider's answer, I was able to write some code that solves the problem.
You can view a working example here https://gist.github.com/1153155
Using reflection you can get a list of properties for the type:
var props = typeof(Person).GetProperties();
Given an instance of a Person p you can get an enumeration of the property values thus:
props.Select(prop => prop.GetValue(p, null))
Wrap those up in a generic method, add your favorite Json serialization and you have the format you want.
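A minimal sketch of such a generic method (the name ToTableJson is hypothetical; it assumes Json.NET plus using System.Linq; and using System.Collections.Generic;):
// Sketch: build the { columns, rows } shape via reflection and let
// Json.NET serialize it. Suitable for flat POCOs such as Person or Location.
public static string ToTableJson<T>(IEnumerable<T> items)
{
    var props = typeof(T).GetProperties();
    var table = new
    {
        columns = props.Select(p => p.Name),
        rows = items.Select(item => props.Select(p => p.GetValue(item, null)))
    };
    return JsonConvert.SerializeObject(table);
}
Calling ToTableJson(list) on the Person list above should produce the columns/rows shape shown in the question.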
Assuming you're using .NET 4, this should do everything you want. The class actually lets you convert to either XML or JSON. The enum for CommunicationType is at the bottom. The serializer works best if the class you're passing it has been decorated with DataContract and DataMember attributes. I've included a sample at the bottom. It will also take an anonymous type so long as it's all simple types.
Reflection would work as well, but then you have to understand all the JSON nuances to output complex data types, etc. This uses the built-in JSON serializer in .NET 4. One more note: because JSON does not define a date type, .NET puts dates in a funky ASP.NET custom format. As long as you're deserializing using the built-in deserializer, it works just fine. I can dig up the documentation on that if you need it.
using System;
using System.Xml.Serialization;
using System.Text;
using System.Runtime.Serialization.Json;
using System.IO;
using System.Xml.Linq;
internal class Converter
{
public static string Convert<T>(T obj, CommunicationType format, bool indent = false, bool includetype = false)
{
if (format == CommunicationType.XML)
{
return ToXML<T>(obj, includetype, indent);
}
else if (format == CommunicationType.JSON)
{
return ToJSON<T>(obj);
}
else
{
return string.Empty;
}
}
private static string ToXML<T>(T obj, bool includetype, bool indent = false)
{
if (includetype)
{
XElement xml = XMLConverter.ToXml(obj, null, includetype);
if(indent) {
return xml.ToString();
}
else
{
return xml.ToString(SaveOptions.DisableFormatting);
}
}
else
{
System.Xml.Serialization.XmlSerializerNamespaces ns = new System.Xml.Serialization.XmlSerializerNamespaces();
XmlSerializer xs = new XmlSerializer(typeof(T));
StringBuilder sbuilder = new StringBuilder();
var xmlws = new System.Xml.XmlWriterSettings() { OmitXmlDeclaration = true, Indent = indent };
ns.Add(string.Empty, string.Empty);
using (var writer = System.Xml.XmlWriter.Create(sbuilder, xmlws))
{
xs.Serialize(writer, obj, ns);
}
string result = sbuilder.ToString();
ns = null;
xs = null;
sbuilder = null;
xmlws = null;
return result;
}
}
private static string ToJSON<T>(T obj)
{
DataContractJsonSerializer ser = new DataContractJsonSerializer(typeof(T));
using (MemoryStream ms = new MemoryStream())
{
string result = string.Empty;
System.Text.UTF8Encoding encoding = new System.Text.UTF8Encoding();
ser.WriteObject(ms, obj);
result = encoding.GetString(ms.ToArray());
ms.Close();
encoding = null;
ser = null;
return result;
}
}
}
[DataContract()]
public enum CommunicationType : int
{
[XmlEnum("0"), EnumMember(Value = "0")]
XML = 0,
[XmlEnum("1"), EnumMember(Value = "1")]
JSON = 1
}
[DataContract(Namespace = "")]
public partial class AppData
{
[DataMember(Name = "ID")]
public string ID { get; set; }
[DataMember(Name = "Key")]
public string Key { get; set; }
[DataMember(Name = "Value")]
public string Value { get; set; }
[DataMember(Name = "ObjectType")]
public string ObjectType { get; set; }
}
Any specific reason why you don't need the standard format?
To actually answer the question:
Since this format is outside of standard JSON serialization, I can't think of a way to implement it within the default framework.
One solution would be to leverage attributes: decorate the properties you want transported over the wire with a custom attribute, then use reflection to cycle through the properties and output their property names as the column headers, and finally cycle through the objects and write the values. That is generic enough that it could be applied across other objects as well.
public class Location
{
[JsonFooAttribute("City")]
public string city {get;set;}
[JsonFooAttribute("State")]
public string state {get;set;}
}
