What is the equivalent of csvWriter.Configuration.ReferenceHeaderPrefix in the newer version of CsvHelper? I'm trying this:
csvWriter.Configuration.ReferenceHeaderPrefix = (memberType, memberName) => $"{memberName}_";
but it's not letting me, because ReferenceHeaderPrefix only has a get method since version 20.0.0.
The usual workflow is to construct an instance of the CsvConfiguration class and pass that into the constructor for the reader or writer. And CsvConfiguration.ReferenceHeaderPrefix does have a set method.
var config = new CsvConfiguration(CultureInfo.InvariantCulture)
{
ReferenceHeaderPrefix = (args) => $"{args.MemberName}_",
};
using (var writer = new StreamWriter("path\\to\\file.csv"))
using (var csv = new CsvWriter(writer, config))
{
// Write your CSV records here.
csv.WriteRecords(records);
}
Note also that, in the current version (27.2.0), the ReferenceHeaderPrefix takes a single ReferenceHeaderPrefixArgs argument, which contains MemberType and MemberName fields:
public readonly struct ReferenceHeaderPrefixArgs
{
public readonly Type MemberType;
public readonly string MemberName;
public ReferenceHeaderPrefixArgs(Type memberType, string memberName)
{
MemberType = memberType;
MemberName = memberName;
}
}
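For context, the prefix only applies to reference (nested) members. A minimal sketch of what it changes, using hypothetical Person and Address classes that are not part of the question:
public class Address
{
    public string Street { get; set; }
    public string City { get; set; }
}

public class Person
{
    public string Name { get; set; }
    public Address Address { get; set; }
}

// With the configuration above, writing Person records should produce headers like:
// Name,Address_Street,Address_City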
I'm importing a row with CSVHelper into the following class:
public class RequestMonitoring
{
public string PNdsPn { get; set; }
[Required]
public virtual Stamp PUsrCrStatus { get; set; }
// many other properties
}
with Stamp looking like this:
public class Stamp
{
public int Weight { get; set; }
[Required]
public string MachineName { get; set; }
// many other properties
}
And a CSV looking like this:
$P_NDS_PN;#P_USR_CR_Status; ...
CTD_037453;SYS complete; ...
Where "SYS complete" is the "MachineName" of the stamp.
But the type conversion doesn't work as I expected. If I use a custom TypeConverter, its ConvertFromString method doesn't fire unless I change the "PUsrCrStatus" property to type string, which makes no sense in my opinion.
So I tried an inline conversion with ConvertUsing like this:
classMap.Map(requestMonitoring => requestMonitoring.PUsrCrStatus)
.ConvertUsing(row => _context.Stamps.SingleOrDefault(stamp => stamp.MachineName == row.GetField("#P_USR_CR_Status")));
But this works only if I put [Ignore] attributes on every property except "MachineName" in the Stamp model, which is bad if I want to import Stamps from a different CSV later on.
So I tried to define these ignores in the class map like:
classMap.Map(requestMonitoring => requestMonitoring.PUsrCrStatus.Weight).Ignore();
and also tried:
classMap.Map(typeof(Stamp),typeof(Stamp).GetProperties().SingleOrDefault(p => p.Name == "Id") ).Ignore();
but neither worked; it still tells me that there is no column named "Weight" in the CSV.
Can anyone please help me to get this working?
Thanks in advance
I think I understand what you are trying to do. You might want to rethink the var classMap = csv.Configuration.AutoMap<RequestMonitoring>(); you had in your other question if the only field you want to map in the Stamp class is MachineName, because AutoMap maps all the properties of PUsrCrStatus. See if this gets you closer to what you are trying to do.
static void Main(string[] args)
{
using (var stream = new MemoryStream())
using (var writer = new StreamWriter(stream))
using (var reader = new StreamReader(stream))
using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
{
writer.WriteLine("$P_NDS_PN,#P_USR_CR_Status");
writer.WriteLine("test,StampName");
writer.Flush();
stream.Position = 0;
var classMap = new DefaultClassMap<RequestMonitoring>();
foreach (var property in typeof(RequestMonitoring).GetProperties())
{
if (property.Name == "PUsrCrStatus") { continue; }
var columnName = property.Name switch
{
"PNdsPn" => "$P_NDS_PN",
{ } x when x.StartsWith("PUsrCr") => property.Name.Replace("PUsrCr", "#P_USR_CR_"),
_ => property.Name
};
classMap.Map(typeof(RequestMonitoring), property).Name(columnName);
}
classMap.Map(requestMonitoring => requestMonitoring.PUsrCrStatus.MachineName).Name("#P_USR_CR_Status");
csv.Configuration.RegisterClassMap(classMap);
var records = csv.GetRecords<RequestMonitoring>().ToList();
}
}
Or you could use a custom TypeConverter to map PUsrCrStatus
static void Main(string[] args)
{
using (var stream = new MemoryStream())
using (var writer = new StreamWriter(stream))
using (var reader = new StreamReader(stream))
using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
{
writer.WriteLine("$P_NDS_PN,#P_USR_CR_Status");
writer.WriteLine("test,StampName");
writer.Flush();
stream.Position = 0;
var classMap = new DefaultClassMap<RequestMonitoring>();
foreach (var property in typeof(RequestMonitoring).GetProperties())
{
var columnName = property.Name switch
{
"PNdsPn" => "$P_NDS_PN",
{ } x when x.StartsWith("PUsrCr") => property.Name.Replace("PUsrCr", "#P_USR_CR_"),
_ => property.Name
};
classMap.Map(typeof(RequestMonitoring), property).Name(columnName);
}
classMap.Map(requestMonitoring => requestMonitoring.PUsrCrStatus).TypeConverter<StampTypeConverter>();
csv.Configuration.RegisterClassMap(classMap);
var records = csv.GetRecords<RequestMonitoring>().ToList();
}
}
public class StampTypeConverter : DefaultTypeConverter
{
public override object ConvertFromString(string text, IReaderRow row, MemberMapData memberMapData)
{
return new Stamp { MachineName = text };
}
}
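If, as in the original question, the Stamp should be resolved from the database rather than created fresh, the same converter pattern can do the lookup. A rough sketch, assuming a hypothetical _context that exposes a Stamps set (the context type is not part of CsvHelper):
public class StampLookupConverter : DefaultTypeConverter
{
    private readonly MyDbContext _context; // hypothetical EF context from the question

    public StampLookupConverter(MyDbContext context)
    {
        _context = context;
    }

    public override object ConvertFromString(string text, IReaderRow row, MemberMapData memberMapData)
    {
        // Resolve the existing Stamp by its MachineName instead of creating a new one.
        return _context.Stamps.SingleOrDefault(stamp => stamp.MachineName == text);
    }
}
It would then be registered with an instance so the context can be passed in, e.g. classMap.Map(requestMonitoring => requestMonitoring.PUsrCrStatus).TypeConverter(new StampLookupConverter(_context));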
I am using JoshClose's CsvHelper library in order to serialize and deserialize data into CSV.
I am facing the following problem: when deserializing my data, I want to use a specific function to create a new instance of my class being deserialized.
I saw I can use a specific constructor as indicated here, but in my case, I would like to use a function which is not a constructor (but still returns a new instance of my class). Is this even possible?
Another solution would be to allow CSVHelper to "override" the values of an existing instance of my class. Is this possible?
Here is my current code so far:
private MyClass[] DeserializeFromCSV( string csv )
{
MyClass[] output = null;
using ( System.IO.TextReader textReader = new System.IO.StringReader( csv ) )
using ( CsvHelper.CsvReader csvReader = new CsvHelper.CsvReader( textReader ) )
{
// Here, I would like to provide the function to call
// in order to instantiate a new instance of the class
// The function I want to call does not return a `ConstructorInfo`
// as needed by the property here
//csvReader.Configuration.GetConstructor = type =>
//{
// return null;
//};
// CSVClassMap is a getter to get a ClassMap
csvReader.Configuration.RegisterClassMap( CSVClassMap );
csvReader.Read();
output = csvReader.GetRecords<MyClass>().ToArray();
}
return output;
}
You can use CsvHelper.ObjectResolver to do this.
void Main()
{
var s = new StringBuilder();
s.Append("Id,Name\r\n");
s.Append("1,one\r\n");
s.Append("2,two\r\n");
using (var reader = new StringReader(s.ToString()))
using (var csv = new CsvReader(reader))
{
CsvHelper.ObjectResolver.Current = new ObjectResolver(CanResolve, Resolve);
csv.Configuration.RegisterClassMap<TestMap>();
csv.GetRecords<Test>().ToList().Dump();
}
}
public bool CanResolve(Type type)
{
return type == typeof(Test);
}
public object Resolve(Type type, object[] constructorArgs)
{
// Get a dependency from somewhere.
var someDependency = new object();
return new Test(someDependency);
}
public class Test
{
public int Id { get; set; }
public string Name { get; set; }
public Test(object someDependency) { }
}
public class TestMap : ClassMap<Test>
{
public TestMap()
{
Map(m => m.Id);
Map(m => m.Name);
}
}
I want to write an XML document to disk in a compact format. To this end, I use the .NET Framework method XmlDictionaryWriter.CreateBinaryWriter(Stream stream, IXmlDictionary dictionary).
This method writes a custom compact binary xml representation, that can later be read by XmlDictionaryWriter.CreateBinaryReader. The method accepts an XmlDictionary that can contain common strings, so that those strings do not have to be printed in the output each time. Instead of the string, the dictionary index will be printed in the file. CreateBinaryReader can later use the same dictionary to reverse the process.
However the dictionary I pass is apparently not used. Consider this code:
using System.IO;
using System.Xml;
using System.Xml.Linq;
class Program
{
public static void Main()
{
XmlDictionary dict = new XmlDictionary();
dict.Add("myLongRoot");
dict.Add("myLongAttribute");
dict.Add("myLongValue");
dict.Add("myLongChild");
dict.Add("myLongText");
XDocument xdoc = new XDocument();
xdoc.Add(new XElement("myLongRoot",
new XAttribute("myLongAttribute", "myLongValue"),
new XElement("myLongChild", "myLongText"),
new XElement("myLongChild", "myLongText"),
new XElement("myLongChild", "myLongText")
));
using (Stream stream = File.Create("binaryXml.txt"))
using (var writer = XmlDictionaryWriter.CreateBinaryWriter(stream, dict))
{
xdoc.WriteTo(writer);
}
}
}
The produced output is this (binary control characters not shown)
#
myLongRootmyLongAttribute˜myLongValue#myLongChild™
myLongText#myLongChild™
myLongText#myLongChild™
myLongText
So apparently the XmlDictionary has not been used. All strings appear in their entirety in the output, even multiple times.
This is not a problem limited to XDocument. In the above minimal example I used a XDocument to demonstrate the problem, but originally I stumbled upon this while using XmlDictionaryWriter in conjunction with a DataContractSerializer, as it is commonly used. The results were the same:
[Serializable]
public class myLongChild
{
public double myLongText = 0;
}
...
using (Stream stream = File.Create("binaryXml.txt"))
using (var writer = XmlDictionaryWriter.CreateBinaryWriter(stream, dict))
{
var dcs = new DataContractSerializer(typeof(myLongChild));
dcs.WriteObject(writer, new myLongChild());
}
The resulting output did not use my XmlDictionary.
How can I get XmlDictionaryWriter to use the supplied XmlDictionary?
Or have I misunderstood how this works?
With the DataContractSerializer approach, I tried debugging the .NET Framework code (Visual Studio / Options / Debugging / Enable .NET Framework source stepping). Apparently the writer does attempt to look up each of the above strings in the dictionary, as expected. However, the lookup fails in line 356 of XmlBinaryWriter.cs, for reasons that are not clear to me.
Alternatives I have considered:
There is an overload of XmlDictionaryWriter.CreateBinaryWriter that also accepts an XmlBinaryWriterSession. The writer then adds any new strings it encounters to the session dictionary. However, I want to use only a static dictionary for reading and writing, known beforehand.
I could wrap the whole thing in a GZipStream and let the compression take care of the repeated strings. However, this would not compress the first instance of each string, and it seems like a clumsy workaround overall (a rough sketch of this follows below).
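For concreteness, that GZip workaround would look roughly like this (a sketch of the rejected alternative, not of the dictionary-based approach; GZipStream lives in System.IO.Compression):
using (Stream file = File.Create("binaryXml.gz"))
using (var gzip = new System.IO.Compression.GZipStream(file, System.IO.Compression.CompressionMode.Compress))
using (var writer = XmlDictionaryWriter.CreateBinaryWriter(gzip))
{
    xdoc.WriteTo(writer);
}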
Yes, there is a misunderstanding. XmlDictionaryWriter is primarily used for serialization of objects, and it is a child class of XmlWriter. XDocument.WriteTo(XmlWriter something) takes an XmlWriter as its argument. The call to XmlDictionaryWriter.CreateBinaryWriter will create an instance of System.Xml.XmlBinaryNodeWriter internally. This class has methods both for "regular" writing:
// override of XmlWriter
public override void WriteStartElement(string prefix, string localName)
{
// plain old "xml" for me please
}
and for dictionary based approach:
// override of XmlDictionaryWriter
public override void WriteStartElement(string prefix, XmlDictionaryString localName)
{
// I will use dictionary to hash element names to get shorter output
}
The latter is mostly used if you serialize an object via DataContractSerializer (notice its WriteObject method takes arguments of both XmlDictionaryWriter and XmlWriter type), while XDocument takes just an XmlWriter.
As for your problem - if I were you I'd make my own XmlWriter:
class CustomXmlWriter : XmlWriter
{
private readonly XmlDictionaryWriter _writer;
public CustomXmlWriter(XmlDictionaryWriter writer)
{
_writer = writer;
}
// override XmlWriter methods to use the dictionary-based approach instead
}
UPDATE (based on your comment)
If you are indeed using DataContractSerializer, you have a few mistakes in your code.
1) POCO classes have to be decorated with the [DataContract] and [DataMember] attributes, and the serialized value should be a property, not a field; also set the namespace to an empty value, or you'll have to deal with namespaces in your dictionary as well. Like:
namespace XmlStuff {
[DataContract(Namespace = "")]
public class myLongChild
{
[DataMember]
public double myLongText { get; set; }
}
[DataContract(Namespace = "")]
public class myLongRoot
{
[DataMember]
public IList<myLongChild> Items { get; set; }
}
}
2) Provide an instance of a session as well; for a null session the dictionary writer uses the default (XmlWriter-like) implementation:
// order matters - add new items only at the bottom
static readonly string[] s_Terms = new string[]
{
"myLongRoot", "myLongChild", "myLongText",
"http://www.w3.org/2001/XMLSchema-instance", "Items"
};
public class CustomXmlBinaryWriterSession : XmlBinaryWriterSession
{
private bool m_Lock;
public void Lock() { m_Lock = true; }
public override bool TryAdd(XmlDictionaryString value, out int key)
{
if (m_Lock)
{
key = -1;
return false;
}
return base.TryAdd(value, out key);
}
}
static void InitializeWriter(out XmlDictionary dict, out XmlBinaryWriterSession session)
{
dict = new XmlDictionary();
var result = new CustomXmlBinaryWriterSession();
var key = 0;
foreach(var term in s_Terms)
{
result.TryAdd(dict.Add(term), out key);
}
result.Lock();
session = result;
}
static void InitializeReader(out XmlDictionary dict, out XmlBinaryReaderSession session)
{
dict = new XmlDictionary();
var result = new XmlBinaryReaderSession();
for (var i = 0; i < s_Terms.Length; i++)
{
result.Add(i, s_Terms[i]);
}
session = result;
}
static void Main(string[] args)
{
XmlDictionary dict;
XmlBinaryWriterSession session;
InitializeWriter(out dict, out session);
var root = new myLongRoot { Items = new List<myLongChild>() };
root.Items.Add(new myLongChild { myLongText = 24 });
root.Items.Add(new myLongChild { myLongText = 25 });
root.Items.Add(new myLongChild { myLongText = 27 });
byte[] buffer;
using (var stream = new MemoryStream())
{
using (var writer = XmlDictionaryWriter.CreateBinaryWriter(stream, dict, session))
{
var dcs = new DataContractSerializer(typeof(myLongRoot));
dcs.WriteObject(writer, root);
}
buffer = stream.ToArray();
}
XmlBinaryReaderSession readerSession;
InitializeReader(out dict, out readerSession);
using (var stream = new MemoryStream(buffer, false))
{
using (var reader = XmlDictionaryReader.CreateBinaryReader(stream, dict, new XmlDictionaryReaderQuotas(), readerSession))
{
var dcs = new DataContractSerializer(typeof(myLongRoot));
var rootCopy = dcs.ReadObject(reader);
}
}
}
I'm sure it's very straightforward, but I am struggling to figure out how to write an array to a file using CsvHelper.
I have a class, for example:
public class Test
{
public Test()
{
data = new float[]{0,1,2,3,4};
}
public float[] data { get; set; }
}
I would like the data to be written with each array value in a separate cell. I have a custom converter below, which instead produces one cell with all the values in it.
What am I doing wrong?
public class DataArrayConverter<T> : ITypeConverter
{
public string ConvertToString(TypeConverterOptions options, object value)
{
var data = (T[])value;
var s = string.Join(",", data);
return s;
}
public object ConvertFromString(TypeConverterOptions options, string text)
{
throw new NotImplementedException();
}
public bool CanConvertFrom(Type type)
{
return type == typeof(string);
}
public bool CanConvertTo(Type type)
{
return type == typeof(string);
}
}
To further detail the answer from Josh Close, here is what you need to do to write any IEnumerable (including arrays and generic lists) in a recent version (anything above 3.0) of CsvHelper.
Here is the class under test:
public class Test
{
public int[] Data { get; set; }
public Test()
{
Data = new int[] { 0, 1, 2, 3, 4 };
}
}
And a method to show how this can be saved:
static void Main()
{
using (var writer = new StreamWriter("db.csv"))
using (var csv = new CsvWriter(writer))
{
var list = new List<Test>
{
new Test()
};
csv.Configuration.HasHeaderRecord = false;
csv.WriteRecords(list);
writer.Flush();
}
}
The important configuration here is csv.Configuration.HasHeaderRecord = false;. Only with this configuration will you be able to see the data in the CSV file.
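With that configuration, the resulting db.csv should contain a single data row along the lines of (assuming the five-element array above):
0,1,2,3,4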
Further details can be found in the related unit test cases from CsvHelper.
In case you are looking for a solution to store properties of type IEnumerable with different numbers of elements, the following example might help:
using CsvHelper;
using CsvHelper.Configuration;
using System.Collections.Generic;
using System.IO;
using System.Linq;
namespace CsvHelperSpike
{
class Program
{
static void Main(string[] args)
{
using (var writer = new StreamWriter("db.csv"))
using (var csv = new CsvWriter(writer))
{
csv.Configuration.Delimiter = ";";
var list = new List<AnotherTest>
{
new AnotherTest("Before String") { Tags = new List<string> { "One", "Two", "Three" }, After="After String" },
new AnotherTest("This is still before") {After="after again", Tags=new List<string>{ "Six", "seven","eight", "nine"} }
};
csv.Configuration.RegisterClassMap<TestIndexMap>();
csv.WriteRecords(list);
writer.Flush();
}
using(var reader = new StreamReader("db.csv"))
using(var csv = new CsvReader(reader))
{
csv.Configuration.IncludePrivateMembers = true;
csv.Configuration.RegisterClassMap<TestIndexMap>();
var result = csv.GetRecords<AnotherTest>().ToList();
}
}
private class AnotherTest
{
public string Before { get; private set; }
public string After { get; set; }
public List<string> Tags { get; set; }
public AnotherTest() { }
public AnotherTest(string before)
{
this.Before = before;
}
}
private sealed class TestIndexMap : ClassMap<AnotherTest>
{
public TestIndexMap()
{
Map(m => m.Before).Index(0);
Map(m => m.After).Index(1);
Map(m => m.Tags).Index(2);
}
}
}
}
By using the ClassMap it is possible to enable HasHeaderRecord (the default) again. It is important to note that this solution will only work if the collection with a varying number of elements is the last property. Otherwise the collection needs to have a fixed number of elements, and the ClassMap needs to be adapted accordingly.
This example also shows how to handle properties with a private setter. For this to work it is important to use the csv.Configuration.IncludePrivateMembers = true; configuration and to have a default constructor on your class.
Unfortunately, it doesn't work like that. Since the converter returns a string containing the delimiter, the field will be quoted, because the whole thing is treated as a single field.
Currently the only way to accomplish what you want is to write the fields manually, which isn't too horrible.
foreach( var test in list )
{
foreach( var item in test.Data )
{
csvWriter.WriteField( item );
}
csvWriter.NextRecord();
}
Update
Version 3 has support for reading and writing IEnumerable properties.
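For example, with version 3 and later the array from the question can be mapped to a range of column indexes through a ClassMap. A rough sketch, assuming the Test class with the five-element float[] data from the question:
public sealed class TestMap : ClassMap<Test>
{
    public TestMap()
    {
        // Map the float[] to five consecutive columns (indexes 0 through 4).
        Map(m => m.data).Index(0, 4);
    }
}
Register it with csv.Configuration.RegisterClassMap<TestMap>(); before writing or reading.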
How would I automate the creation of a default implementation of a class from an interface using conventions? In other words, if I have an interface:
public interface ISample
{
int SampleID {get; set;}
string SampleName {get; set;}
}
Is there a snippet, T4 template, or some other means of automatically generating the class below from the interface above? As you can see, I want to put an underscore before the name of the field and give the field the same name as the property, but with the first letter lower-cased:
public class Sample
{
private int _sampleID;
public int SampleID
{
get { return _sampleID;}
set { _sampleID = value; }
}
private string _sampleName;
public string SampleName
{
get { return _sampleName;}
set { _sampleName = value; }
}
}
I am not sure T4 would be the easiest solution here in terms of readability, but you can also use another code generation tool at your disposal: the CodeDom provider.
The concept is very straightforward: code consists of building blocks that you put together.
When the time is ripe, these building blocks are generated into the language of your choice. What you end up with is a string that contains the source code of your newly created program. Afterwards you can write this to a text file for further use.
As you have noticed, there is no compile-time result; everything happens at runtime. If you really want compile-time generation, then you should use T4 instead.
The code:
using System;
using System.CodeDom;
using System.CodeDom.Compiler;
using System.IO;
using System.Reflection;
using System.Text;
namespace TTTTTest
{
internal class Program
{
private static void Main(string[] args)
{
new Program();
}
public Program()
{
// Create namespace
var myNs = new CodeNamespace("MyNamespace");
myNs.Imports.AddRange(new[]
{
new CodeNamespaceImport("System"),
new CodeNamespaceImport("System.Text")
});
// Create class
var myClass = new CodeTypeDeclaration("MyClass")
{
TypeAttributes = TypeAttributes.Public
};
// Add properties to class
var interfaceToUse = typeof (ISample);
foreach (var prop in interfaceToUse.GetProperties())
{
ImplementProperties(ref myClass, prop);
}
// Add class to namespace
myNs.Types.Add(myClass);
Console.WriteLine(GenerateCode(myNs));
Console.ReadKey();
}
private string GenerateCode(CodeNamespace ns)
{
var options = new CodeGeneratorOptions
{
BracingStyle = "C",
IndentString = " ",
BlankLinesBetweenMembers = false
};
var sb = new StringBuilder();
using (var writer = new StringWriter(sb))
{
CodeDomProvider.CreateProvider("C#").GenerateCodeFromNamespace(ns, writer, options);
}
return sb.ToString();
}
private void ImplementProperties(ref CodeTypeDeclaration myClass, PropertyInfo property)
{
// Add private backing field
var backingField = new CodeMemberField(property.PropertyType, GetBackingFieldName(property.Name))
{
Attributes = MemberAttributes.Private
};
// Add new property
var newProperty = new CodeMemberProperty
{
Attributes = MemberAttributes.Public | MemberAttributes.Final,
Type = new CodeTypeReference(property.PropertyType),
Name = property.Name
};
// Get reference to backing field
var backingRef = new CodeFieldReferenceExpression(new CodeThisReferenceExpression(), backingField.Name);
// Add statement to getter
newProperty.GetStatements.Add(new CodeMethodReturnStatement(backingRef));
// Add statement to setter
newProperty.SetStatements.Add(
new CodeAssignStatement(
new CodeFieldReferenceExpression(new CodeThisReferenceExpression(), backingField.Name),
new CodePropertySetValueReferenceExpression()));
// Add members to class
myClass.Members.Add(backingField);
myClass.Members.Add(newProperty);
}
private string GetBackingFieldName(string name)
{
return "_" + name.Substring(0, 1).ToLower() + name.Substring(1);
}
}
internal interface ISample
{
int SampleID { get; set; }
string SampleName { get; set; }
}
}
This produces:
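The generated source should look roughly like the following (the exact whitespace and member ordering depend on the CodeDom provider; this is an approximation, not captured output):
namespace MyNamespace
{
    using System;
    using System.Text;

    public class MyClass
    {
        private int _sampleID;
        private string _sampleName;

        public int SampleID
        {
            get { return this._sampleID; }
            set { this._sampleID = value; }
        }

        public string SampleName
        {
            get { return this._sampleName; }
            set { this._sampleName = value; }
        }
    }
}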
Magnificent, isn't it?
Sidenote: a property is given Attributes = MemberAttributes.Public | MemberAttributes.Final because omitting the MemberAttributes.Final would make it become virtual.
And last but not least, the inspiration for this awesomeness: Metaprogramming in .NET by Kevin Hazzard and Jason Bock, Manning Publications.