I am using CsvHelper to write data to a CSV file, using C# and VS2010. The object I wish to write is a complex data type of type Dictionary<long, List<HistoricalProfile>>.
Below is the code I am using to write the data to the CSV file:
Dictionary<long, List<HistoricalProfile>> profiles = historicalDataGateway.GetAllProfiles();
var fileName = CSVFileName + ".csv";
CsvWriter writer = new CsvWriter(new StreamWriter(fileName));
foreach (var items in profiles.Values)
{
    writer.WriteRecords(items);
}
writer.Dispose();
When it loops the second time I get an error:
The header record has already been written. You can't write it more than once.
Can you tell me what I am doing wrong here? My final goal is to have a single CSV file with a huge list of records.
Thanks in advance!
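For reference, one way to reach that goal with CsvHelper itself is to flatten the dictionary's lists with SelectMany, so WriteRecords runs exactly once and writes a single header. This is an untested sketch assuming the same types and names as in the question:

```csharp
// Sketch only: SelectMany flattens all the lists into one sequence, so
// WriteRecords is called once and the header is written once.
using System.Collections.Generic;
using System.IO;
using System.Linq;
using CsvHelper;

Dictionary<long, List<HistoricalProfile>> profiles = historicalDataGateway.GetAllProfiles();
using (var streamWriter = new StreamWriter(CSVFileName + ".csv"))
using (var writer = new CsvWriter(streamWriter))
{
    writer.WriteRecords(profiles.Values.SelectMany(list => list));
}
```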
Have you seen the FileHelpers library (http://www.filehelpers.net/)? It makes it very easy to read and write CSV files.
Then your code would just be:
var profiles = historicalDataGateway.GetAllProfiles(); // should return a List<HistoricalProfile>
var engine = new FileHelperEngine<HistoricalProfile>();
// To write, use:
engine.WriteFile("FileOut.txt", profiles);
I would go more low-level and iterate through the collections myself:
var fileName = CSVFileName + ".csv";
using (var writer = new StreamWriter(fileName))
{
    writer.WriteLine(/* header goes here, if needed */);
    foreach (var items in profiles.Values)
    {
        foreach (var item in items)
        {
            writer.WriteLine(item.Property1 + "," + item.Property2 /* ... */);
        }
    }
}
If you want to make the routine more useful, you could use reflection to get the list of properties you wish to write out and construct your record from there.
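A minimal sketch of that reflection idea (the helper name and the property handling are illustrative, not from the original post): enumerate the public properties of T, emit their names as the header, then emit one line per record.

```csharp
// Hypothetical sketch: builds CSV lines from the public properties of T.
// Note there is no quoting/escaping, so this only suits values that
// contain no commas or newlines.
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Reflection;

static class ReflectionCsv
{
    public static void Write<T>(TextWriter writer, IEnumerable<T> records)
    {
        PropertyInfo[] props = typeof(T).GetProperties();
        // Header row from the property names.
        writer.WriteLine(string.Join(",", props.Select(p => p.Name)));
        foreach (T record in records)
        {
            // One line per record, values in property order.
            writer.WriteLine(string.Join(",", props.Select(p => p.GetValue(record, null))));
        }
    }
}
```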
I have a method whose input is a list of file paths; I want to open these files and process them. The paths contain the file extensions, and I know for sure that there are only three extensions (txt, xlsx, xls).
In the code, pathWithFilesName is the input list of file paths.
I then want to send them to methods that will open and process them.
pathWithFilesName.Add("ds.xlsx");
pathWithFilesName.Add("ds.txt");
var listExcel = new List<string>();
var listTxt = new List<string>();
var validExcelFileTypes = new List<string> { ".xls", ".xlsx" };
foreach (var path in pathWithFilesName)
{
    foreach (var valid in validExcelFileTypes)
    {
        if (path.EndsWith(valid))
        {
            listExcel.Add(path);
        }
        else
        {
            listTxt.Add(path);
        }
    }
}
This variant is not optimal at all, but it works. I know how to take the Excel files in one query:
var list = (from path in pathWithFilesName from valid in validExcelFileTypes where path.EndsWith(valid) select path).ToList();
But with this approach I then need to compare the two lists, for example with some kind of Intersect.
What is the best way to do this?
Here is a variation using LINQ and lambdas. It should not be more efficient, nor better or worse; it may just be more readable.
The listExcel can be found this way:
var listExcel = pathWithFilesName.Where(path => validExcelFileTypes.Any(ext => path.EndsWith(ext)));
Enumerable.Any
Enumerable.Where
If you need both lists in one go, you can group the source on the same condition:
var listGrp = pathWithFilesName.GroupBy(path=>validExcelFileTypes.Any(ext=> path.EndsWith(ext)));
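As an illustrative continuation (sample data assumed from the question), the two lists can be pulled back out of the bool-keyed groups; a missing group just means an empty list:

```csharp
// Group the paths by whether they end in a valid Excel extension,
// then extract each group by its bool key.
using System.Collections.Generic;
using System.Linq;

var pathWithFilesName = new List<string> { "ds.xlsx", "ds.txt", "a.xls" };
var validExcelFileTypes = new List<string> { ".xls", ".xlsx" };

var listGrp = pathWithFilesName
    .GroupBy(path => validExcelFileTypes.Any(ext => path.EndsWith(ext)));

// Key == true holds the Excel paths, key == false the rest.
var listExcel = listGrp.FirstOrDefault(g => g.Key)?.ToList() ?? new List<string>();
var listTxt = listGrp.FirstOrDefault(g => !g.Key)?.ToList() ?? new List<string>();
```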
You can use MoreLINQ's Partition: "Partitions a sequence by a predicate, ...".
var (listExcel, listTxt) = pathWithFilesName
.Partition(p =>
validExcelFileTypes.Any(ext => p.EndsWith(ext))
);
Under the hood it's just a GroupBy (see the source code), unrolled into a named tuple.
Live demo
How can I read multiple CSV files in a folder?
I have a program that maps a CSV file into its correct format using the CsvHelper library. Here is my code:
static void Main()
{
    var filePath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "test.csv");
    var tempFilePath = Path.GetTempFileName();
    using (var reader = new StreamReader(filePath))
    using (var csvReader = new CsvReader(reader))
    using (var writer = new StreamWriter(tempFilePath))
    using (var csvWriter = new CsvWriter(writer))
    {
        csvReader.Configuration.RegisterClassMap<TestMapOld>();
        csvWriter.Configuration.RegisterClassMap<TestMapNew>();
        csvReader.Configuration.Delimiter = ",";
        csvReader.Configuration.PrepareHeaderForMatch = header =>
        {
            var newHeader = Regex.Replace(header, @"\s", string.Empty);
            newHeader = newHeader.Trim();
            newHeader = newHeader.ToLower();
            return newHeader;
        };
        var records = csvReader.GetRecords<Test>();
        csvWriter.WriteRecords(records);
    }
    File.Delete(filePath);
    File.Move(tempFilePath, filePath);
}
Since this is homework (and/or you seem to be new to coding), I'll give you a very suitable answer that will give you many hours of fun and excitement as you go through the documentation and samples provided.
You need
Directory.GetFiles(String, String)
Returns the names of files (including their paths) that match the
specified search pattern in the specified directory.
searchPattern
String The search string to match against the names of files in path.
This parameter can contain a combination of valid literal path and
wildcard (* and ?) characters, but it doesn't support regular
expressions.
foreach, in (C# reference)
The foreach statement executes a statement or a block of statements
for each element in an instance of a type that implements the
System.Collections.IEnumerable or
System.Collections.Generic.IEnumerable<T> interface.
I'll leave the details up to you.
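For orientation only, the two pieces above fit together roughly like this; the folder choice and the per-file processing body are placeholders, not a full solution:

```csharp
// Sketch: enumerate every CSV file in a folder and hand each one to a
// processing method. The folder and ProcessCsvFile body are placeholders.
using System;
using System.IO;

class Program
{
    static void Main()
    {
        string folder = Environment.GetFolderPath(Environment.SpecialFolder.Desktop);
        // "*.csv" is a wildcard pattern, not a regular expression.
        foreach (string file in Directory.GetFiles(folder, "*.csv"))
        {
            ProcessCsvFile(file);
        }
    }

    static void ProcessCsvFile(string path)
    {
        // Placeholder: the question's CsvReader/CsvWriter mapping would go here.
        Console.WriteLine(path);
    }
}
```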
One Module reads some files using
File.ReadAllBytes("path")
and stores the result in a database table.
Later I take the result from the table and normally use
File.WriteAllBytes("outputPath", binary.Data)
to write the file back.
Now I have to change the content of the file. Of course I could write the file to a temp folder, read it back in as a file, change it, and write it back to the destination folder.
But is there a smarter way to do that? Can I work with the file content directly from the binary data?
Got a solution. It's the Encoding class:
var binary = GetBinaryFileFromDatabase();
var inputLines = Encoding.Default.GetString(binary).Split('\n');
var outputLines = new List<string>();
foreach (var line in inputLines)
{
    var columns = line.Split(';');
    if (columns.Length > 28 && columns[28] != null && columns[28].Length > 0)
    {
        columns[28] = columns[28].Replace(Path.GetFileName(columns[28]), "SomeValue");
    }
    outputLines.Add(string.Join(";", columns));
}
File.WriteAllLines(outputPath, outputLines);
Thanks everybody!
So I've been reading that I shouldn't write my own CSV reader/writer, and I've been trying to use the CsvHelper library installed via NuGet. The CSV file is a grayscale image, with the number of rows being the image height and the number of columns the width. I would like to read the values row-wise into a single List<string> or List<byte>.
The code I have so far is:
using CsvHelper;
public static List<string> ReadInCSV(string absolutePath)
{
    IEnumerable<string> allValues;
    using (TextReader fileReader = File.OpenText(absolutePath))
    {
        var csv = new CsvReader(fileReader);
        csv.Configuration.HasHeaderRecord = false;
        allValues = csv.GetRecords<string>();
    }
    return allValues.ToList<string>();
}
But allValues.ToList<string>() is throwing a:
CsvConfigurationException was unhandled by user code
An exception of type 'CsvHelper.Configuration.CsvConfigurationException' occurred in CsvHelper.dll but was not handled in user code
Additional information: Types that inherit IEnumerable cannot be auto mapped. Did you accidentally call GetRecord or WriteRecord which acts on a single record instead of calling GetRecords or WriteRecords which acts on a list of records?
GetRecords is probably expecting my own custom class, but I just want the values as some primitive type or string. Also, I suspect the entire row is being converted to a single string, instead of each value being a separate string.
Following @Marc L's post, you can try this:
public static List<string> ReadInCSV(string absolutePath) {
    List<string> result = new List<string>();
    string value;
    using (TextReader fileReader = File.OpenText(absolutePath)) {
        var csv = new CsvReader(fileReader);
        csv.Configuration.HasHeaderRecord = false;
        while (csv.Read()) {
            for (int i = 0; csv.TryGetField<string>(i, out value); i++) {
                result.Add(value);
            }
        }
    }
    return result;
}
If all you need is the string values for each row in an array, you could use the parser directly.
var parser = new CsvParser(textReader);
while (true)
{
    string[] row = parser.Read();
    if (row == null)
    {
        break;
    }
}
http://joshclose.github.io/CsvHelper/#reading-parsing
Update
Version 3 has support for reading and writing IEnumerable properties.
The whole point here is to read all lines of a CSV and deserialize them to a collection of objects. I'm not sure why you want to read it as a collection of strings. A generic ReadAll() would probably work best for you in that case, as stated before. This library shines when you use it for that purpose:
using System.Linq;
...
using (var reader = new StreamReader(path))
using (var csv = new CsvReader(reader))
{
    var yourList = csv.GetRecords<YourClass>().ToList();
}
If you don't use ToList(), it will return a single record at a time (for better performance); please read https://joshclose.github.io/CsvHelper/examples/reading/enumerate-class-records
Please try this. It worked for me.
TextReader reader = File.OpenText(filePath);
CsvReader csvFile = new CsvReader(reader);
csvFile.Configuration.HasHeaderRecord = true;
csvFile.Read();
var records = csvFile.GetRecords<Server>().ToList();
Server is an entity class. This is how I created it:
public class Server
{
    private string details_Table0_ProductName;
    public string Details_Table0_ProductName
    {
        get { return details_Table0_ProductName; }
        set { this.details_Table0_ProductName = value; }
    }

    private string details_Table0_Version;
    public string Details_Table0_Version
    {
        get { return details_Table0_Version; }
        set { this.details_Table0_Version = value; }
    }
}
You are close. It isn't that it's trying to convert the row to a string. CsvHelper tries to map each field in the row to the properties on the type you give it, using names given in a header row. Further, it doesn't understand how to do this with IEnumerable types (which string implements), so it just throws when its auto-mapping gets to that point in testing the type.
That is a whole lot of complication for what you're doing. If your file format is sufficiently simple, which yours appears to be (well-known field format, neither escaped nor quoted delimiters), I see no reason why you need to take on the overhead of importing a library. You should be able to enumerate the values as needed with System.IO.File.ReadLines() and String.Split().
//pseudo-code...you don't need CsvHelper for this
IEnumerable<string> GetFields(string filepath)
{
    foreach (string row in File.ReadLines(filepath))
    {
        foreach (string field in row.Split(',')) yield return field;
    }
}
static void WriteCsvFile(string filename, IEnumerable<Person> people)
{
    // using blocks ensure the CsvWriter is flushed and the file is closed.
    using (var textWriter = File.CreateText(filename))
    using (var csvWriter = new CsvWriter(textWriter, System.Globalization.CultureInfo.CurrentCulture))
    {
        csvWriter.WriteRecords(people);
    }
}
I am attempting to read in a CSV dynamically via FileHelpers and work with the CSV data as a DataTable. My CSV files will not be the same: they will have different column headers and different numbers of columns. I am using the ReadStreamAsDT method, but it seems to still want a structured class to initialize the FileHelperEngine. Any ideas?
I had to use FileHelpers.RunTime and the DelimitedClassBuilder to create a DataTable from the file. Here is my method. If I get more time, I will explain this better.
private static DataTable CreateDataTableFromFile(byte[] importFile) {
    var cb = new DelimitedClassBuilder("temp", ",") { IgnoreFirstLines = 0, IgnoreEmptyLines = true, Delimiter = "," };
    var ms = new MemoryStream(importFile);
    var sr = new StreamReader(ms);
    var headerArray = sr.ReadLine().Split(',');
    foreach (var header in headerArray) {
        cb.AddField(header, typeof(string));
        cb.LastField.FieldQuoted = true;
        cb.LastField.QuoteChar = '"';
    }
    var engine = new FileHelperEngine(cb.CreateRecordClass());
    return engine.ReadStreamAsDT(sr);
}
Obviously, there is a lot of validation along with other logic taking place around this method, but I do not have much time right now to dive into it. Hope this helps!
Have you tried using http://www.codeproject.com/KB/database/CsvReader.aspx ? You could leverage this library's parsing. It's fast and reliable.
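For example, usage of that library might look roughly like this. This is a sketch from memory of the LumenWorks CsvReader API described in the article, so verify the member names against the linked page:

```csharp
// Hedged sketch: the CsvReader constructor's second argument indicates
// whether the file has a header row.
using System;
using System.IO;
using LumenWorks.Framework.IO.Csv;

using (var csv = new CsvReader(new StreamReader("data.csv"), true))
{
    int fieldCount = csv.FieldCount;
    string[] headers = csv.GetFieldHeaders();
    while (csv.ReadNextRecord())
    {
        for (int i = 0; i < fieldCount; i++)
        {
            // Fields are accessed by index on the current record.
            Console.Write(csv[i] + (i < fieldCount - 1 ? "," : "\n"));
        }
    }
}
```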