Dynamic creation of columns using csvHelper - c#

I have a worker with various fields that are fetched from a server. I am using the CsvHelper package to convert this class to a CSV file that can be opened in Excel.
Worker has Fields like :
class Worker
{
    // public properties so CsvHelper can map them
    public string name { get; set; }
    public string phone { get; set; }
    public string age { get; set; }
    public Dictionary<string, object> customerField { get; set; }
}
I can map the name, phone and age like:
class WorkerMap : CsvClassMap<Worker>
{
    public WorkerMap()
    {
        Map(m => m.name);
        Map(m => m.phone);
        Map(m => m.age);
    }
}
And I register the map by:
csv.Configuration.RegisterClassMap<WorkerMap>();
and write the list of workers by:
csv.WriteRecords(workerList);
How can I map the customerField dictionary to the CSV such that each key (string) becomes another column name and the value (object) becomes the value of that column?
Does CsvHelper support this at runtime? I looked through the documentation but couldn't find anything that worked for me.

I don't think that writing a dictionary is supported at this time. For one thing, CsvHelper would have a difficult time knowing what headers to write. Fortunately, it's not too complex to use CsvWriter manually, writing a field at a time. If we assume that each Worker has the same keys in customerField then your code might look something like this.
var firstWorker = workerList.First();
var keys = firstWorker.customerField.Keys.ToList();
var headers = new[] { "name", "phone", "age" }.Concat(keys).ToList();

var csv = new CsvWriter(textWriter);

// Write the headers
foreach (var header in headers)
{
    csv.WriteField(header);
}
csv.NextRecord();

// Write the rows
foreach (var item in workerList)
{
    csv.WriteField(item.name);
    csv.WriteField(item.phone);
    csv.WriteField(item.age);
    var dict = item.customerField; // was "worker", which is not in scope here
    foreach (var key in keys)
    {
        csv.WriteField(dict[key]);
    }
    csv.NextRecord();
}
This code is untested, but should get you pretty close to the behavior you need. If the customerField dictionary keys are not consistent in the list then the code would be a bit more complicated but it's still solvable.
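For the inconsistent-keys case, one workable option is to build the union of every worker's custom-field keys up front and emit an empty field wherever a worker lacks a key. A minimal sketch; the Worker shape and both helper names are illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Worker
{
    public string name;
    public string phone;
    public string age;
    public Dictionary<string, object> customerField = new Dictionary<string, object>();
}

static class Program
{
    // Union of all custom-field keys, in first-seen order, so every
    // worker's values land in a consistent set of columns.
    public static List<string> UnionKeys(IEnumerable<Worker> workers)
    {
        return workers
            .SelectMany(w => w.customerField.Keys)
            .Distinct()
            .ToList();
    }

    // Value for a column, or an empty field when the worker lacks the key.
    public static string ValueOrEmpty(Worker w, string key)
    {
        object value;
        return w.customerField.TryGetValue(key, out value)
            ? (value ?? "").ToString()
            : "";
    }

    static void Main()
    {
        var a = new Worker { name = "A" };
        a.customerField["shift"] = "night";
        var b = new Worker { name = "B" };
        b.customerField["badge"] = 42;

        var keys = UnionKeys(new[] { a, b });
        Console.WriteLine(string.Join(",", keys));    // shift,badge
        Console.WriteLine(ValueOrEmpty(b, "badge"));  // 42
    }
}
```

The per-row loop from the answer stays the same; only the keys list and the dict[key] lookup change.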

Dictionary isn't supported but ExpandoObject is supported.
https://github.com/JoshClose/CsvHelper/blob/48e70742e06007dae3a635c418b7e3358f667c4f/src/CsvHelper.Tests/Writing/MultipleHeadersTest.cs
https://github.com/JoshClose/CsvHelper/blob/b74a2f95a101158f4cdedd25fae6e8392b58855b/src/CsvHelper.Tests/Writing/DynamicTests.cs
If you follow the first link above you'll find the WriteDynamicHeader method in use on lines 50 & 57.
With the help of an extension method I create an ExpandoObject for each record and use CsvHelper to write that object. The Dictionary<string, object> parameter named document is what I wish to create the CSV record from.
public static class DictionaryCsvExtensions
{
    public static dynamic BuildCsvObject(this Dictionary<string, object> document)
    {
        dynamic csvObj = new ExpandoObject();
        foreach (var p in document)
        {
            AddProperty(csvObj, p.Key, p.Value);
        }
        return csvObj;
    }

    private static void AddProperty(ExpandoObject expando, string propertyName, object propertyValue)
    {
        var expandoDict = expando as IDictionary<string, object>;
        if (expandoDict.ContainsKey(propertyName))
        {
            expandoDict[propertyName] = propertyValue;
        }
        else
        {
            expandoDict.Add(propertyName, propertyValue);
        }
    }
}
Now I can create an ExpandoObject from my dictionary like this:
var csvObj = myDictionary.BuildCsvObject();
and with that, following Josh's tests in the links above, we have all we need to use a dictionary fairly seamlessly with CsvHelper. I don't think this is a better solution than Michael's, just a different approach.
Credit where credit is due: the basic ExpandoObject-from-dictionary code is from here (where there is a lot more explanation!): https://www.oreilly.com/learning/building-c-objects-dynamically
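What makes this work is that ExpandoObject implements IDictionary<string, object>, which is how CsvHelper can discover "columns" that only exist at runtime. A stdlib-only sketch of that round trip (the field names are invented for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Dynamic;

static class Demo
{
    static void Main()
    {
        var source = new Dictionary<string, object>
        {
            ["name"] = "Ann",
            ["badge"] = 7,
        };

        // Populate the expando through its dictionary interface...
        dynamic expando = new ExpandoObject();
        var asDict = (IDictionary<string, object>)expando;
        foreach (var pair in source)
            asDict[pair.Key] = pair.Value;

        // ...and read it back either dynamically or as a dictionary.
        Console.WriteLine(expando.name);     // Ann
        Console.WriteLine(asDict["badge"]);  // 7
    }
}
```
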

Recently had the same problem and, for the sake of completeness, am posting another answer. We had a fairly complex ClassMap<Worker>-like class and did not want to lose it. Also, we needed both CSV write and read, hence the ExpandoObject from robs' answer was needed as well. Eventually, the approach is a combination of the Michael Richardson and robs answers that should bring the best of both worlds.
On top of that, to distinguish the dictionary fields in the CSV file during read, it is a good idea to prefix them with something like "customerField.".
First we need conversion of the dictionary to/from Worker.customerField:
public static class WorkerExtensions
{
    const string CustomerFieldPrefix = nameof(Worker.customerField) + ".";

    public static dynamic GetCustomerFieldExpando(this Worker worker)
    {
        var expando = new ExpandoObject() as IDictionary<string, object>;
        foreach (var fieldPair in worker.customerField)
        {
            expando[CustomerFieldPrefix + fieldPair.Key] = fieldPair.Value ?? "";
        }
        return expando;
    }

    public static void SetCustomerField(this Worker worker, ExpandoObject expando)
    {
        var columnsToValues = expando as IDictionary<string, object>;
        foreach (var columnValuePair in columnsToValues)
        {
            if (columnValuePair.Key.StartsWith(CustomerFieldPrefix)
                && columnValuePair.Key.Length > CustomerFieldPrefix.Length)
            {
                string key = columnValuePair.Key.Substring(CustomerFieldPrefix.Length);
                worker.customerField[key] = columnValuePair.Value;
            }
        }
    }
}
Next, CSV write can use both ClassMap and ExpandoObject and looks like this:
csv.Configuration.HasHeaderRecord = true;
csv.Configuration.RegisterClassMap<WorkerMap>();

csv.WriteHeader<Worker>();
(workers.First().GetCustomerFieldExpando() as IDictionary<string, object>)
    .Keys.ToList().ForEach(key => csv.WriteField(key));
csv.NextRecord();

foreach (var worker in workers)
{
    csv.WriteRecord(worker);
    csv.WriteRecord(worker.GetCustomerFieldExpando());
    csv.NextRecord();
}
Finally, CSV read can also combine both ClassMap and ExpandoObject:
List<Worker> workers = new List<Worker>();
csv.Configuration.HasHeaderRecord = true;
csv.Configuration.RegisterClassMap<WorkerMap>();

csv.Read();
csv.ReadHeader();
var columns = csv.Context.HeaderRecord;
while (csv.Read())
{
    var worker = csv.GetRecord<Worker>();
    workers.Add(worker);
    ExpandoObject expando = csv.GetRecord<dynamic>();
    worker.SetCustomerField(expando);
}
In the case of CSV read, things get more complicated if you want to read real types into the dictionary values (not just strings). We needed predefined associations between dictionary keys and data types to be able to convert to the proper types from the ExpandoObject.
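One way to sketch those predefined associations is a simple key-to-Type map consulted while copying values out of the ExpandoObject; keys without an association stay strings. All field names and types below are invented for illustration:

```csharp
using System;
using System.Collections.Generic;

static class CustomerFieldTypes
{
    // Which CLR type each custom field should be converted to on read.
    static readonly Dictionary<string, Type> Types = new Dictionary<string, Type>
    {
        ["badgeNumber"] = typeof(int),
        ["nickname"] = typeof(string),
    };

    public static object Convert(string key, string raw)
    {
        Type target;
        if (!Types.TryGetValue(key, out target))
            return raw;  // no association known: keep the raw string
        return System.Convert.ChangeType(raw, target);
    }

    static void Main()
    {
        object n = Convert("badgeNumber", "42");
        Console.WriteLine(n.GetType().Name + " " + n);  // Int32 42
        Console.WriteLine(Convert("nickname", "Bo"));   // Bo
    }
}
```
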

Related

How to read Dictionary from FireBase Database in Unity

public void AddDatabase(User user)
{
    //User user = new User("pç", 2);
    Dictionary<string, object> result = new Dictionary<string, object>();
    result["score"] = user.score;
    result["username"] = user.username;
    reference.Child("datas").Child(user.username).SetValueAsync(result);
}

public void GetDatabase()
{
    FirebaseDatabase.DefaultInstance.GetReference("datas").GetValueAsync().ContinueWith(task =>
    {
        if (task.IsFaulted)
        {
            // Handle the error...
        }
        else if (task.IsCompleted)
        {
            DataSnapshot snapshot = task.Result;
            ArrayList childs = new ArrayList();
            //Dictionary<string, object> xyz = new Dictionary<string, object>();
            foreach (var item in snapshot.Children)
            {
                childs.Add(item.Value);
                //xyz.Add("dat", item.Value);
            }
            // this does not work, I can't see values and keys in the console
            foreach (KeyValuePair<string, object> item in childs)
            {
                Debug.Log(item.Key);
                Debug.Log(item.Value);
            }
        }
    });
}
I write data into Firebase with AddDatabase, and I want to read every score and username in order with GetDatabase so I can put them on a leaderboard, but I can't figure out how. I have already checked the Firebase database docs.
Thank you
Depending on the data returned in GetValueAsync, DataSnapshot may have a Dictionary<string, object> in it already:
public void GetDatabase()
{
    FirebaseDatabase.DefaultInstance.GetReference("datas").GetValueAsync().ContinueWith(task =>
    {
        if (task.IsFaulted)
        {
            // Handle the error...
        }
        else if (task.IsCompleted)
        {
            DataSnapshot snapshot = task.Result;
            var dictionary = snapshot.Value as Dictionary<string, object>;
            if (dictionary != null)
            {
                // dictionary stuff here
            }
        }
    });
}
From the docs:
Value returns the data contained in this snapshot as native types. The possible types returned are:
bool
string
long
double
IDictionary{string, object}
List{object}
This list is recursive; the possible types for object in the above list are given by the same list. These types correspond to the types available in JSON.
Now a quick note: the Unity SDK does guess at whether the Value should be a List or a Dictionary if it has children. The general rule is:
It's a List if the elements are sequential and roughly 50% of the keys in the range are filled (ex: 0="hi", 1="I'm", 3="a", 4="list" is a list even though it's missing 2). Otherwise it will try to make a Dictionary. If you are dealing with integer keys, you may want to have mirrored logic for dealing with a List or Dictionary just to be safe.
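That mirrored logic can be sketched without any Firebase types: normalize whatever the SDK handed back, list or dictionary, into key/value pairs. The helper name is invented; in real code the argument would come from snapshot.Value:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class SnapshotShim
{
    public static IEnumerable<KeyValuePair<string, object>> AsPairs(object value)
    {
        var dict = value as IDictionary<string, object>;
        if (dict != null)
            return dict;

        var list = value as IList<object>;
        if (list != null)  // integer keys were collapsed into list indexes
            return list.Select((item, i) =>
                new KeyValuePair<string, object>(i.ToString(), item));

        return Enumerable.Empty<KeyValuePair<string, object>>();
    }

    static void Main()
    {
        var fromList = AsPairs(new List<object> { "hi", "there" }).ToList();
        Console.WriteLine(fromList[1].Key + "=" + fromList[1].Value);  // 1=there

        var fromDict = AsPairs(new Dictionary<string, object> { ["a"] = 1 }).ToList();
        Console.WriteLine(fromDict[0].Key + "=" + fromDict[0].Value);  // a=1
    }
}
```
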
What is the type of snapshot.Children, and what is the type of its contents? If the contents are not of type KeyValuePair<string, object>, then the second foreach loop won't work properly (I am not familiar enough with non-generic collections to say what would actually happen).
I strongly recommend using types from the System.Collections.Generic namespace, such as List<T>, rather than non-generic collections like ArrayList. This will make it easier to see what your code is doing. In addition to being easier to work with, they also have better performance because they don't require casting.
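Following that advice, here is a sketch of turning the per-user dictionaries that AddDatabase writes into an ordered leaderboard using only generic collections. The data shape is assumed from the question, and the tuple syntax needs C# 7 or later (older Unity scripting runtimes may not have it):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class Leaderboard
{
    // datas mirrors the structure written by AddDatabase:
    // username -> { "username": ..., "score": ... }
    public static List<(string Name, long Score)> Build(
        Dictionary<string, Dictionary<string, object>> datas)
    {
        return datas.Values
            .Select(d => ((string)d["username"], Convert.ToInt64(d["score"])))
            .OrderByDescending(e => e.Item2)
            .ToList();
    }

    static void Main()
    {
        var datas = new Dictionary<string, Dictionary<string, object>>
        {
            ["ann"] = new Dictionary<string, object> { ["username"] = "ann", ["score"] = 10L },
            ["bo"]  = new Dictionary<string, object> { ["username"] = "bo",  ["score"] = 25L },
        };
        foreach (var entry in Build(datas))
            Console.WriteLine(entry.Name + " " + entry.Score);
        // bo 25
        // ann 10
    }
}
```
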
Try to follow the two examples:
How to use Firebase Database in Unity (REST API)
Retrieving Data

How to access the values in a dynamic dictionary in C#?

I have a dictionary which is dynamically generated. In the dictionary, for a single Key, there are multiple values. I am trying to access one of the values in those multiple values. My code so far:
var result = new Dictionary<string, object>();
var dictionary = deserialisedResult as IDictionary<string, object>;
foreach (var item in dictionary)
{
    result.Add(item.Key, item.Value);
}

object o = result;
string[] names = o.GetType().GetProperties().Select(p => p.Name).ToArray();
foreach (var prop in names)
{
    object propValue = o.GetType().GetProperty(prop).GetValue(o, null);
    var value = propValue.GetValue(o, null);
}
But this is not working.
I need to get the values for 'resource'.
When I add watch, I see it nested as such:
The following line is causing the issue here:
string[] names = o.GetType().GetProperties().Select(p => p.Name).ToArray();
Calling GetType() on a reference of type Dictionary<string, object> will not yield a Type whose PropertyInfo[] contains what you are looking for: a dictionary stores its data as entries, not as properties, so reflecting over its properties can never find the relevant information. The Type will always just be Dictionary.
What you need is to fetch the key collection from the Dictionary and use the keys to fetch the values stored in the Dictionary:
foreach (var key in result.Keys)
{
    var value = result[key];
}
The solution is based on the code provided in the question; I am not sure if you have further requirements.
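If the "resource" entry can sit at an arbitrary depth, a small recursive search over the nested dictionaries avoids the reflection approach entirely. A sketch, with the key name and nesting shape invented for illustration:

```csharp
using System;
using System.Collections.Generic;

static class NestedLookup
{
    // Depth-first search for the first entry whose key matches.
    public static object Find(IDictionary<string, object> dict, string wanted)
    {
        foreach (var pair in dict)
        {
            if (pair.Key == wanted)
                return pair.Value;

            var nested = pair.Value as IDictionary<string, object>;
            if (nested != null)
            {
                var hit = Find(nested, wanted);
                if (hit != null)
                    return hit;
            }
        }
        return null;
    }

    static void Main()
    {
        var data = new Dictionary<string, object>
        {
            ["outer"] = new Dictionary<string, object> { ["resource"] = "r1" },
        };
        Console.WriteLine(NestedLookup.Find(data, "resource"));  // r1
    }
}
```
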

Uppercase all keys in a series of C# KeyValuePairs

I have a list of JSON objects, each of which I'm iterating over and I simply want to process all of the KeyValuePairs therein so that the keys are uppercased, then return the revised objects.
This is what I attempted:
public static List<JObject> normalizeKeys(List<JObject> jObjList)
{
    jObjList.ForEach(jobj =>
    {
        foreach (var kvp in jobj)
        {
            kvp.Key = kvp.Key.ToUpper();
        }
    });
    return jObjList;
}
but it doesn't execute and instead I get KeyValuePair<string, JToken>.Key cannot be assigned to -- it is readonly. Does anyone have any suggestions for how to accomplish what I'm after?
Though the given answer works, I would be inclined to say that it is both not as elegant as it could be, and not as generally useful as it could be. We can make it more of both. To begin with, solve the problem for one object:
public static JObject NormalizeKeys(JObject originalObject)
{
    var newObject = new JObject();
    foreach (var kvp in originalObject)
        newObject.Add(kvp.Key.ToUpper(), kvp.Value);
    return newObject;
}
And now to solve the problem for a list:
var newList = oldList.Select(x=>NormalizeKeys(x)).ToList();
Or, even shorter:
var newList = oldList.Select(NormalizeKeys).ToList();
Isn't that much more pleasant than all that nested loop stuff? Say how to normalize one object, and then say that you want to apply that function to the list, and you're done.
Guess you know the answer yourself: you must create a new collection and copy the items over. A key is a key, it's case-sensitive, and it cannot be changed in place; otherwise you might end up with two sets of data, one with the normal-case keys and the other with the uppercase ones.
The solution is as follows:
public static List<JObject> normalizeKeys(List<JObject> jObjList)
{
    // create a new list for return and copy over the existing items
    var jObjReturnList = new List<JObject>(jObjList.Count);
    jObjList.ForEach(jobj =>
    {
        var jObjNew = new JObject();
        foreach (var kvp in jobj)
        {
            jObjNew.Add(kvp.Key.ToUpper(), kvp.Value);
        }
        jObjReturnList.Add(jObjNew);
    });
    return jObjReturnList;
}

Nested Dictionary collection in .NET

The .NET Dictionary<TKey, TValue> object allows assignment of key/values like so:
Dictionary<string, string> dict = new Dictionary<string, string>();
dict["1"] = "foo";
dict["2"] = "bar";
but I cannot use a Dictionary like so:
Dictionary<string, string> dict = new Dictionary<string, string>();
dict["F1"]["F2"]["F3"] = "foo";
dict["2"]["X"] = "bar";
Is there a collection in .NET which allows me to nest [], or would I have to create my own?
If I have to create my own, how would I do this?
EDIT:
It would also be useful if I could have implementations which expect unique keys, like so:
dict["F1"]["F2"]["F3"] = "foo";
dict["F1"]["F2"]["F3"] = "bar"; //result is "bar" because "foo" was overridden
and an implementation where a key can be used more than once
dict["F1"]["F2"]["F3"] = "foo";
dict["F1"]["F2"]["F3"] = "bar"; //result can be "foo" and "bar"
Is this possible?
EDIT (as per Jon Skeet's question):
I want to use the structure like so (as a very rough example):
json["data"]["request"]["name"] = "username";
json["data"]["request"]["pass"] = "password";
resolves to
{ data: { request: { name: "username", pass: "password" } } }
and equally there would be an equivalent for XML etc.
Having a requirement for arbitrarily deep nesting, I have come up with the following solution which, as far as I can see, doesn't break according to my tests:
public class NestedDictionary<K, V> : Dictionary<K, NestedDictionary<K, V>>
{
    public V Value { set; get; }

    public new NestedDictionary<K, V> this[K key]
    {
        set { base[key] = value; }
        get
        {
            if (!base.ContainsKey(key))
            {
                base[key] = new NestedDictionary<K, V>();
            }
            return base[key];
        }
    }
}
TEST:
NestedDictionary<string, string> dict = new NestedDictionary<string, string>();
dict["one"].Value = "Nest level 1";
dict["one"]["two"]["three"].Value = "Nest level 3";
dict["FieldA"]["FieldB"].Value = "Hello World";
Console.WriteLine(dict["one"].Value);
Console.WriteLine(dict["one"]["two"]["three"].Value);
Console.WriteLine(dict["FieldA"]["FieldB"].Value);
You can do this using the standard Dictionary, you just have to declare the nesting:
Dictionary<string, Dictionary<string, string>> dict = ...
string test = dict["first"]["second"];
Dictionary<string, Dictionary<string, Dictionary<string, string>>> dict = ...
string test = dict["first"]["second"]["third"];
etc.
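One caveat with plain nesting: a write like dict["first"]["second"] = "foo" throws a KeyNotFoundException unless the inner dictionary was created first. A small helper (the name is invented) makes such writes safe:

```csharp
using System;
using System.Collections.Generic;

static class NestedWrite
{
    // Returns the inner dictionary for a key, creating it on first use.
    public static Dictionary<string, string> Inner(
        Dictionary<string, Dictionary<string, string>> outer, string key)
    {
        Dictionary<string, string> inner;
        if (!outer.TryGetValue(key, out inner))
        {
            inner = new Dictionary<string, string>();
            outer[key] = inner;
        }
        return inner;
    }

    static void Main()
    {
        var dict = new Dictionary<string, Dictionary<string, string>>();
        Inner(dict, "first")["second"] = "foo";
        Console.WriteLine(dict["first"]["second"]);  // foo
    }
}
```
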
The original Dictionary COM object which was created to work with vb6 would respond to an attempt to access a non-existent item by creating a new item of type Dictionary with the corresponding name. This approach allows something to be stored to MyDict["Foo"]["Bar"] without having to first create MyDict["Foo"]. The problem with this approach is that while one would want to add "Foo" to MyDict when performing a write to MyDict["Foo"]["Bar"], one would rather not create such an item if one was attempting to e.g. evaluate MyDict["Foo"]["Bar"].ValueOrDefault(someDefaultValue).
I've used such collections, since they can be handy for modeling certain things (conceptually they're a lot like XML documents). One workable approach is to declare that dictionaries which contain nothing but other dictionaries are considered semantically as non-entities which may be removed at any opportunity. When implicitly adding a subcollection, set a flag in the item to which it's added, indicating that it should be checked for items that may be deleted (or keep a counter of how many such items may exist). Then with some reasonable frequency, scan through the dictionaries and remove such "dead" items.
An alternative approach is to have the indexer from the dictionary not return an actual item, but instead return an "ephemeral indexer" type, which keeps a reference to the parent object and has internal methods GetNestedForReading, GetNestedForWriting, GetValue, and SetValue, which chain back to it. Then a statement Foo["Bar"]["Boz"] = "George"; will end up effectively performing Foo.GetNestedForWriting("Bar").SetValue("Boz", "George"); while z = Foo["Bar"]["Boz"]; will effectively perform Foo.GetNestedForReading("Bar").GetValue("Boz");. Calling the GetNestedForWriting method with a non-existent key will create and return a new nested item; the GetNestedForReading method will return an immutable "empty" item. Using this approach will thus avoid creating empty items.
Although the latter approach is more complicated than the former, it has another advantage. It's possible to have each node individually hold its collection as either a shared deeply-immutable dictionary or an unshared mutable one; if a GetNestedForWriting call sees that the nested object is immutable, it can construct a new shallowly-mutable object holding the same items. If one defines the cloning method for a mutable node as creating a new immutable node with (immutable) clones of all subnodes, and the cloning method of an immutable node as returning itself, cloning trees that are mostly immutable becomes very cheap. If one had a newly-cloned (thus immutable) four-level tree with sixteen items on each level (65,536 leaf nodes total) and all the nodes were shared-immutable, updating a leaf node would only require replacing one leaf and four other nodes with mutable ones. Cloning the tree again would only require creating new immutable objects for the nodes which had been replaced with mutable ones (e.g. copying five things). Although one would have the convenience of a fully-mutable tree, one would have the efficiency advantages of an immutable one.
The biggest "problem" I see with this approach is that to avoid some weird behaviors one must require the use of syntax like MyDict["Foo"]["Bar"].Value = "George". If implicit conversion operators were used to avoid that requirement, someone would expect a statement like var st = MyThing["Foo"]["Bar"]; to define st as a string snapshot of whatever MyThing["Foo"]["Bar"] holds at that moment; instead it would define it as something that will index MyThing["Foo"]["Bar"]. If one had to use .Value to read or write strings from such a type, the fact that the variable wasn't a string would be apparent. If one used implicit operators to allow such assignments, the behavior would be odd. It's too bad there's no way a function can specify "do not allow this return value to be used for type inference".
Incidentally, it's possible to have the indexer type be a class or a generic struct. If it's a class, an access to foo["Bar"]["boz"]["baz"]... nested N deep would likely require the creation of N temporary heap objects. If it's a generic struct, it would entail the creation of N structs, but the more-deeply-nested structs would get bigger. For reasonable levels of nesting, generic structs would probably be slightly more efficient, but classes would probably be easier to work with.
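To make the ephemeral-indexer idea concrete, here is a minimal sketch using a class for the indexer (per the closing observation that classes are easier to work with). All type and member names are invented; reads never create nodes, while writes materialize the path on demand:

```csharp
using System;
using System.Collections.Generic;

class Node
{
    internal readonly Dictionary<string, Node> Children = new Dictionary<string, Node>();
    internal string Stored;

    public Accessor this[string key]
    {
        get { return new Accessor(this, new[] { key }); }
    }
}

class Accessor
{
    readonly Node root;
    readonly string[] path;

    internal Accessor(Node root, string[] path)
    {
        this.root = root;
        this.path = path;
    }

    // Chaining only extends the remembered path; no nodes are created yet.
    public Accessor this[string key]
    {
        get
        {
            var longer = new string[path.Length + 1];
            path.CopyTo(longer, 0);
            longer[path.Length] = key;
            return new Accessor(root, longer);
        }
    }

    public string Value
    {
        get  // read: walk the tree without creating anything
        {
            var node = root;
            foreach (var key in path)
                if (!node.Children.TryGetValue(key, out node))
                    return null;
            return node.Stored;
        }
        set  // write: create the path on demand
        {
            var node = root;
            foreach (var key in path)
            {
                Node child;
                if (!node.Children.TryGetValue(key, out child))
                    node.Children[key] = child = new Node();
                node = child;
            }
            node.Stored = value;
        }
    }
}

static class Demo
{
    static void Main()
    {
        var tree = new Node();
        tree["Bar"]["Boz"].Value = "George";
        Console.WriteLine(tree["Bar"]["Boz"].Value);         // George
        Console.WriteLine(tree["No"]["Such"].Value == null); // True
        Console.WriteLine(tree.Children.ContainsKey("No"));  // False: the read created nothing
    }
}
```
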
You have to decide on either supporting a fixed number of string keys to look up, or provide a more general key mechanism if the number of keys can vary. For the first case try the following:
Dictionary<string, Dictionary<string, string>> dict =
    new Dictionary<string, Dictionary<string, string>>();
dict["F1"] = new Dictionary<string, string>();   // the inner dictionary must exist first
dict["F1"]["F2"] = "foo";

Dictionary<string, Dictionary<string, Dictionary<string, string>>> dict2 =
    new Dictionary<string, Dictionary<string, Dictionary<string, string>>>();
dict2["F1"] = new Dictionary<string, Dictionary<string, string>>();
dict2["F1"]["F2"] = new Dictionary<string, string>();
dict2["F1"]["F2"]["F3"] = "bar";
For the second case, you could do the following:
Dictionary<string[],string> dict = new Dictionary<string[],string>(new MyEqualityComparer());
dict[new string[] {"F1","F2"}] = "foo";
dict[new string[] {"F1","F2","F3"}] = "bar";
where the class MyEqualityComparer would be something like:
public class MyEqualityComparer : IEqualityComparer<string[]>
{
    public int GetHashCode(string[] item)
    {
        int hashcode = 0;
        foreach (string s in item)
        {
            hashcode ^= s.GetHashCode();  // combine the per-string hashes
        }
        return hashcode;
    }

    public bool Equals(string[] a, string[] b)
    {
        if (a.Length != b.Length)
            return false;
        for (int i = 0; i < a.Length; ++i)
        {
            if (a[i] != b[i])
                return false;
        }
        return true;
    }
}
Use a Dictionary as TValue:
var dict2 = new Dictionary<string, Dictionary<string, string>>();
var dict3 = new Dictionary<string, Dictionary<string, Dictionary<string, string>>>();
for example:
var dict = new Dictionary<string, Dictionary<string, string>>
{
    {
        "F1", new Dictionary<string, string>
        {
            { "F2", "foo" }
        }
    }
};
dict["F1"]["F2"] = "bar";
I think your case is a good place to use DynamicObject. I'll create an example for JSON using Dictionary<string, object> internally.
The same idea can be used for XML as well.
string json = @"{""Name"":""Joe"",
                ""Age"":30,
                ""Address"":{ ""City"":""NY"" }}";

dynamic dynObj = new DynamicJson(json);
Console.WriteLine(dynObj.Name);
Console.WriteLine(dynObj.Age);
Console.WriteLine(dynObj.Address.City);
--
public class DynamicJson : DynamicObject
{
    Dictionary<string, object> _Dict;

    // JavaScriptSerializer lives in System.Web.Script.Serialization
    // (System.Web.Extensions assembly).
    public DynamicJson(string json)
    {
        _Dict = (Dictionary<string, object>)new JavaScriptSerializer().DeserializeObject(json);
    }

    DynamicJson(Dictionary<string, object> dict)
    {
        _Dict = dict;
    }

    public override bool TryGetMember(GetMemberBinder binder, out object result)
    {
        result = null;
        object obj;
        if (!_Dict.TryGetValue(binder.Name, out obj)) return false;

        if (obj is Dictionary<string, object>)
        {
            result = new DynamicJson((Dictionary<string, object>)obj);
        }
        else
        {
            result = obj;
        }
        return true;
    }
}

C# foreach over properties of objects contained within a Dictionary<String, ObjectType>

Is it possible to iterate directly over the properties of the objects stored within a Dictionary collection in C#?
For example, I have a Dictionary called Fields of type Dictionary<String, Field>. The Field object has a property of Data which is of type XmlDataDocument so I would like to do something like,
foreach(XmlDataDocument fieldData in Fields.Values.Data){
}
I know it's pretty trivial since all I would need to do when iterating over Field objects instead would be,
XmlDataDocument fieldData = field.Data;
within the Field iteration; however, if there is a quicker way to do it I'd like to know :-)
In C# 3.0:
foreach (var data in Fields.Values.Select(x => x.Data))
{
}
It's not any "quicker" though.
yes, create a custom collection based on Dictionary and then add your own iterator to it...
public class MyFieldCollection : Dictionary<string, Field>
{
    public IEnumerable<XmlDataDocument> Data
    {
        get
        {
            foreach (Field f in this.Values)
                yield return f.Data;
        }
    }
}
Then in client code all you need to do is
MyFieldCollection MFC = new MyFieldCollection();
foreach (XmlDataDocument doc in MFC.Data)
{
    // DoWhatever(doc);
}
