Output not correctly being written to CSV - c#

So I am having an issue with my output being written to a CSV file. The output of this code is in the correct format when it is written to the CSV file, but only a single row ends up in the file. There should be many more, around 150 lines. The current output is:
(859.85 7N830127 185)
which is correct, but there should be more of these. It seems to me that it is only writing the first line of the parsed EDI file and then stopping. I need to find a way to make sure it writes all of the data that is being parsed. Can anyone help me?
static void Main(string[] args)
{
    StreamReader sr = new StreamReader("edifile.txt");
    string[] ediMapTemp = sr.ReadLine().Split('|');
    List<string[]> ediMap = new List<string[]>();
    List<object[]> outputMatrix = new List<object[]>();
    foreach (var line in ediMapTemp)
    {
        ediMap.Add(line.Split('~'));
    }
    DetailNode node = new DetailNode(0, null, 0);
    int hierarchicalDepth = 0;
    int hierarchicalIdNumber;
    int hierarchicalParentIdNumber;
    int hierarchicalLevelCode;
    int hierarchicalChildCode = 0;
    for (int i = 0; i < ediMap.Count; i++)
    {
        string segmentHeader = ediMap[i][0];
        if (segmentHeader == "HL")
        {
            hierarchicalIdNumber = Convert.ToInt32(ediMap[i][1]);
            hierarchicalParentIdNumber = Convert.ToInt32(ediMap[i][2]);
            hierarchicalLevelCode = Convert.ToInt32(ediMap[i][3]);
            hierarchicalChildCode = Convert.ToInt32(ediMap[i][4]);
            List<string[]> levelDetails = new List<string[]>();
            for (int v = i + 1; v < ediMap.Count; v++)
            {
                if (ediMap[v][0] == "HL") break;
                levelDetails.Add(ediMap[v]);
            }
            DetailNode getNode = node.Find(node, hierarchicalParentIdNumber);
            getNode.headList.Add(new DetailNode(hierarchicalIdNumber, levelDetails, getNode.depth + 1));
        }
    }
    node.Traversal(new VID(), node);
    foreach (var vid in VIDList.vidList)
        using (StreamWriter writer = new StreamWriter("Import.csv"))
        {
            //probably a loop here
            writer.WriteLine(String.Join(",", vid.totalCurrentCharges, vid.assetId, vid.componentName, vid.recurringCharge));
        }
}

Quick review: should the code be the following?
using (StreamWriter writer = new StreamWriter("Import.csv"))
{
    //a loop here
    foreach (var vid in VIDList.vidList)
    {
        writer.WriteLine(String.Join(",", vid.totalCurrentCharges, vid.assetId, vid.componentName, vid.recurringCharge));
    }
}
You would open the file once and then loop through your collection, writing each one.

It looks to me like you're re-opening the output file for each line you're trying to write, so you're overwriting the output file with a new file for each line. That would mean only the last entry remains in the file. Try moving this line
using (StreamWriter writer = new StreamWriter("Import.csv"))
outside the foreach loop.
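If you really did need to open the writer inside the loop for some reason, the StreamWriter(path, append) overload would at least stop each iteration from truncating the file, but moving the using outside the loop, as above, is the better fix. A minimal sketch of the append overload, for illustration only:
// Illustration only: passing true makes each open append rather than overwrite.
// Note that repeated runs of the program would then keep growing Import.csv,
// which is why opening the file once per run is preferable.
using (StreamWriter writer = new StreamWriter("Import.csv", true))
{
    writer.WriteLine(String.Join(",", vid.totalCurrentCharges, vid.assetId, vid.componentName, vid.recurringCharge));
}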

CsvHelper doesn't write to file as expected

Below is the code I am using. I expect each line to be written to the file during each iteration of the loop. What I am finding is that if I end the program early, using Ctrl-C or just closing the window, the file is empty.
I need to write to the file quite often; I have 33k items to add to the file, and it's almost inevitable that something goes wrong before all 33k items are added to the CsvWriter. If I short-circuit the loop to, say, 20 items and let it finish, the output file is correct. How do I solve this issue?
int i = 0;
IList<string> pcodes = FindAutobidPcodes();
using (StreamWriter sw = new StreamWriter($"HPH_Catalog_{DateTime.Now.Ticks}.csv"))
using (CsvWriter cw = new CsvWriter(sw, CultureInfo.InvariantCulture))
{
    cw.Configuration.RegisterClassMap<ProductMap>();
    foreach (string code in pcodes)
    {
        string search = $"&search=%7b\"catalog\"%3a\"{catalog}\"%2c\"text\"%3a%7b\"{code}\"%3a\"{code}\"%7d%7d";
        searchQuery.Query = $"{RandomString()}{limit}{page}{search}{gtemplate}&token={token}";
        WebPage webPage = QueryProductData(searchQuery.ToString());
        cw.WriteRecord(Product.ParseJSON(webPage));
        cw.NextRecord();
        i++;
        if (i == 20) { return; }
    }
}
What you can do is flush the StreamWriter after each iteration, e.g.:
static void Main(string[] args)
{
    using (StreamWriter sw = new StreamWriter($"HPH_Catalog_{DateTime.Now.Ticks}.csv"))
    using (CsvWriter cw = new CsvWriter(sw, CultureInfo.InvariantCulture))
    {
        for (int i = 0; i < 34000; i++)
        {
            cw.WriteField(i);
            cw.NextRecord();
            sw.Flush();
            System.Threading.Thread.Sleep(100);
        }
    }
}
You can also wrap the inner foreach loop in a try statement: if something goes wrong while looping, you can flush and dispose the writer in the catch block (see the sketch below).
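A minimal sketch of that idea, reusing the loop from the question (BuildSearchQuery is a hypothetical stand-in for the search-string code in the question):
using (StreamWriter sw = new StreamWriter($"HPH_Catalog_{DateTime.Now.Ticks}.csv"))
using (CsvWriter cw = new CsvWriter(sw, CultureInfo.InvariantCulture))
{
    cw.Configuration.RegisterClassMap<ProductMap>();
    try
    {
        foreach (string code in pcodes)
        {
            WebPage webPage = QueryProductData(BuildSearchQuery(code)); // hypothetical helper
            cw.WriteRecord(Product.ParseJSON(webPage));
            cw.NextRecord();
            sw.Flush(); // push each record to disk as soon as it is written
        }
    }
    catch
    {
        sw.Flush(); // make sure everything buffered so far reaches the file before rethrowing
        throw;
    }
}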

create lines with strings for each blank line until it reaches line 100

I'm new to C#. I need help reading a file that currently has 7 lines of text; I need it to write "Line PlaceHolder" after those 7 lines until it reaches line 100 in the text file. This is what I have so far, and I know it's my failed attempt. EDIT: It's good, but the only issue is that an exception is thrown saying a process is already using the text file. How do I solve this so I can read and write that file at the same time?
public void ReadFile()
{
    if (File.Exists(AccountsFile))
    {
        using (StreamReader Reader = new StreamReader(AccountsFile))
        using (StreamWriter Writer = new StreamWriter(AccountsFile))
        {
            for (int i = 0; i < 100; i++)
            {
                string line;
                if ((line = Reader.ReadLine()) == null)
                {
                    Writer.WriteLine("Line Placeholder");
                }
            }
        }
    }
    else
    {
        File.Create(AccountsFile);
    }
}
You could first read the file contents into an array using File.ReadAllLines, get the array .Length (representing the number of lines in the file), and subtract that number from 100 to see how many lines you need to write. If the number is greater than zero, then create a List<string> with that many empty lines and write those lines to the end of the file using File.AppendAllLines:
// See how many lines we need to add
var newLinesNeeded = 100 - File.ReadAllLines(AccountsFile).Length;

// Add them if needed
if (newLinesNeeded > 0)
{
    // Create a list of empty lines
    var blankLines = new List<string>();
    for (int i = 0; i < newLinesNeeded; i++)
    {
        blankLines.Add("");
    }

    // Append them to our file
    File.AppendAllLines(AccountsFile, blankLines);
}
Looks like you are just missing an else:
public void ReadFile()
{
    if (File.Exists(AccountsFile))
    {
        using (StreamReader Reader = new StreamReader(AccountsFile))
        using (StreamWriter Writer = new StreamWriter(AccountsFile))
        {
            for (int i = 0; i < 100; i++)
            {
                string line;
                if ((line = Reader.ReadLine()) == null)
                {
                    Writer.WriteLine("Line Placeholder");
                }
                else
                {
                    Writer.WriteLine(line);
                }
            }
        }
    }
    else
    {
        File.Create(AccountsFile);
    }
}
This may work if you do not mind opening the file as read/write:
using (FileStream fileStream = File.Open(AccountsFile, FileMode.OpenOrCreate, FileAccess.ReadWrite))
{
    var streamWriter = new StreamWriter(fileStream);
    var streamReader = new StreamReader(fileStream);
    var i = 0;

    // read and count the lines
    while (streamReader.ReadLine() != null)
    {
        i++;
    }

    // if any more lines are needed write them
    while (i++ < 100)
    {
        streamWriter.WriteLine("Line Placeholder");
    }
    streamWriter.Flush();
}

How to write back to the CSV file

I am trying to write back to a CSV file. How can I change this script to write back to the CSV file? I am new to C#, please help me.
public void Main()
{
    List<String> lines = new List<string>();
    string line;
    System.IO.StreamReader file = new System.IO.StreamReader("C:\\NewFolder\\Test.csv");
    while ((line = file.ReadLine()) != null)
    {
        lines.Add(line);
    }
    lines.RemoveAll(l => l.Contains(",,,,,"));
    using (System.IO.StreamWriter writer = new System.IO.StreamWriter("C:\\NewFolder\\Test.csv", false))
    {
        for (int i = 0; i < lines.Count; i++)
        {
            writer.WriteLine(lines[i]);
        }
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
Simply this:
// Join your lines back into a single string, one row per line
string toWrite = string.Join(Environment.NewLine, lines);
File.WriteAllText(YourFilePath, toWrite);
You're using a StreamReader to read from the file, so use a StreamWriter to write back to it. This code should do it:
using (System.IO.StreamWriter writer = new System.IO.StreamWriter("c:\\test.txt", false))
{
    for (int i = 0; i < lines.Count; i++)
    {
        writer.WriteLine(lines[i]);
    }
}
Note this assumes two things:
You've closed your reader (see the sketch below)
You're wanting to replace the original file
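For the first point, here is a minimal sketch of closing the reader before opening the writer on the same file, using the question's own path and filtering:
List<string> lines = new List<string>();

// Read everything first; the using block closes the reader
// before the writer opens the same file below.
using (var file = new System.IO.StreamReader("C:\\NewFolder\\Test.csv"))
{
    string line;
    while ((line = file.ReadLine()) != null)
    {
        lines.Add(line);
    }
}

lines.RemoveAll(l => l.Contains(",,,,,"));

using (var writer = new System.IO.StreamWriter("C:\\NewFolder\\Test.csv", false))
{
    foreach (string line in lines)
    {
        writer.WriteLine(line);
    }
}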
EDIT: Alternatively you can rewrite the whole thing as:
string fileName = "C:\\NewFolder\\Test.csv";
List<string> lines = System.IO.File.ReadAllLines(fileName).ToList();
lines.RemoveAll(l => l.Contains(",,,,,"));
System.IO.File.WriteAllLines(fileName, lines.ToArray());

c# Remove rows from csv

I have two CSV files. In the first file I have a list of users, and in the second file I have a list of duplicate users. I'm trying to remove the rows in the first file that match rows in the second file.
Here's the code I have so far:
StreamWriter sw = new StreamWriter(path3);
StreamReader sr = new StreamReader(path2);
string[] lines = File.ReadAllLines(path);
foreach (string line in lines)
{
    string user = sr.ReadLine();
    if (line != user)
    {
        sw.WriteLine(line);
    }
}
File 1 example:
Modify,ABAMA3C,Allpay - Free State - HO,09072701
Modify,ABCG327,Processing Centre,09085980
File 2 Example:
Modify,ABAA323,Group HR Credit Risk & Finance
Modify,ABAB959,Channel Sales & Service,09071036
Any suggestions?
Thanks.
All you'd have to do is change the following file paths in the code below and you will get a file back (file one) without the duplicate users from file 2. This code was written with the idea in mind that you want something that is easy to understand. Sure there are other more elegant solutions, but I wanted to make it as basic as possible for you:
(Paste this in the main method of your program)
string line;
StreamReader sr = new StreamReader(@"C:\Users\J\Desktop\texts\First.txt");
StreamReader sr2 = new StreamReader(@"C:\Users\J\Desktop\texts\Second.txt");
List<String> fileOne = new List<string>();
List<String> fileTwo = new List<string>();

while (sr.Peek() >= 0)
{
    line = sr.ReadLine();
    if (line != "")
    {
        fileOne.Add(line);
    }
}
sr.Close();

while (sr2.Peek() >= 0)
{
    line = sr2.ReadLine();
    if (line != "")
    {
        fileTwo.Add(line);
    }
}
sr2.Close();

var t = fileOne.Except(fileTwo);
StreamWriter sw = new StreamWriter(@"C:\Users\justin\Desktop\texts\First.txt");
foreach (var z in t)
{
    sw.WriteLine(z);
}
sw.Flush();
sw.Close(); // close the writer so the file handle is released
If this is not homework but a production thing, and you can install assemblies, you'll save 3 hours of your life if you swallow your pride and use a piece of the VB library:
There are many edge cases (a CR/LF between commas is legal inside quotes; different types of quotes; etc.). This will handle anything Excel will export/import.
Sample code that loads a 'Person' class, pulled from a program I used it in:
Using Reader As New Microsoft.VisualBasic.FileIO.TextFieldParser(CSVPath)
    Reader.TextFieldType = Microsoft.VisualBasic.FileIO.FieldType.Delimited
    Reader.Delimiters = New String() {","}
    Reader.TrimWhiteSpace = True
    Reader.HasFieldsEnclosedInQuotes = True
    While Not Reader.EndOfData
        Try
            Dim st2 As New List(Of String)
            st2.AddRange(Reader.ReadFields())
            If iCount > 0 Then ' ignore first row = field names
                Dim p As New Person
                p.CSVLine = st2
                p.FirstName = st2(1).Trim
                If st2.Count > 2 Then
                    p.MiddleName = st2(2).Trim
                Else
                    p.MiddleName = ""
                End If
                p.LastNameSuffix = st2(0).Trim
                If st2.Count >= 5 Then
                    p.TestCase = st2(5).Trim
                End If
                If st2(3) > "" Then
                    p.AccountNumbersFromCase.Add(st2(3))
                End If
                While p.CSVLine.Count < 15
                    p.CSVLine.Add("")
                End While
                cases.Add(p)
            End If
        Catch ex As Microsoft.VisualBasic.FileIO.MalformedLineException
            MsgBox("Line " & ex.Message & " is not valid and will be skipped.")
        End Try
        iCount += 1
    End While
End Using
Use this to close the streams properly:
using (var sw = new StreamWriter(path3))
using (var sr = new StreamReader(path2))
{
    string[] lines = File.ReadAllLines(path);
    foreach (string line in lines)
    {
        string user = sr.ReadLine();
        if (line != user)
        {
            sw.WriteLine(line);
        }
    }
}
For help on the real logic of the removal or comparison, answer the comment of El Ronnoco above...
You need to close the streams, or use a using clause:
sw.Close();
using (StreamWriter sw = new StreamWriter(@"c:\test3.txt"))
You can use LINQ...
class Program
{
    static void Main(string[] args)
    {
        var fullList = "TextFile1.txt".ReadAsLines();
        var removeThese = "TextFile2.txt".ReadAsLines();

        //Change this line if you need to change the filter results.
        //Note: this assumes you are wanting to remove results from the first
        //      list when the entire record matches. If you want to match on
        //      only part of the record you will need to split/parse the records
        //      and then filter your results.
        var cleanedList = fullList.Except(removeThese);

        cleanedList.WriteAsLinesTo("result.txt");
    }
}

public static class Tools
{
    public static IEnumerable<string> ReadAsLines(this string filename)
    {
        using (var reader = new StreamReader(filename))
            while (!reader.EndOfStream)
                yield return reader.ReadLine();
    }

    public static void WriteAsLinesTo(this IEnumerable<string> lines, string filename)
    {
        using (var writer = new StreamWriter(filename) { AutoFlush = true, })
            foreach (var line in lines)
                writer.WriteLine(line);
    }
}
using (var sw = new StreamWriter(path3))
using (var sr = new StreamReader(path))
{
    // Build a lookup of the user IDs (second column) to remove
    string[] arrRemove = File.ReadAllLines(path2);
    HashSet<string> listRemove = new HashSet<string>();
    foreach (string s in arrRemove)
    {
        string[] sa = s.Split(',');
        if (sa.Length < 2) continue;
        listRemove.Add(sa[1].ToUpper());
    }

    // Copy every line whose user ID is not in the removal set
    string line = sr.ReadLine();
    while (line != null)
    {
        string[] sa = line.Split(',');
        if (sa.Length < 2)
            sw.WriteLine(line);
        else if (!listRemove.Contains(sa[1].ToUpper()))
            sw.WriteLine(line);
        line = sr.ReadLine();
    }
}

Reading/writing CSV/tab delimited files in c#

I need to read from a CSV/tab-delimited file, and write to such a file as well, from .NET.
The difficulty is that I don't know the structure of each file and need to write the csv/tab file to a DataTable, which the FileHelpers library doesn't seem to support.
I've already written it for Excel using OLEDB, but can't really see a way to write a tab file for this, so I will go back to a library.
Can anyone help with suggestions?
.NET comes with a CSV/tab-delimited file parser called the TextFieldParser class.
http://msdn.microsoft.com/en-us/library/microsoft.visualbasic.fileio.textfieldparser.aspx
It supports the full RFC for CSV files and has really good error reporting.
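Here is a minimal sketch of using it from C# to load a file of unknown structure into a DataTable (the helper name, path, and delimiter are placeholders; add a reference to the Microsoft.VisualBasic assembly/package to get TextFieldParser):
using System.Data;
using Microsoft.VisualBasic.FileIO;

static DataTable LoadDelimitedFile(string path, string delimiter)
{
    var table = new DataTable();
    using (var parser = new TextFieldParser(path))
    {
        parser.TextFieldType = FieldType.Delimited;
        parser.SetDelimiters(delimiter);            // "," or "\t"
        parser.HasFieldsEnclosedInQuotes = true;

        // Assumes the first row is a header row; its fields become the column names.
        string[] headers = parser.ReadFields();
        foreach (string header in headers)
            table.Columns.Add(header);

        // Each remaining row becomes a DataRow.
        while (!parser.EndOfData)
            table.Rows.Add(parser.ReadFields());
    }
    return table;
}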
I used this CsvReader; it is really great and very configurable. It behaves well with all kinds of escaping for strings and separators. The escaping in other quick-and-dirty implementations was poor, but this lib is really great at reading. With a few additional lines of code you can also add a cache if you need to.
Writing is not supported, but it is rather trivial to implement yourself (see the sketch below), or take inspiration from this code.
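For example, here's a minimal hand-rolled writer (a sketch, not a full RFC 4180 implementation) that quotes fields containing the delimiter, quotes, or line breaks:
using System.Collections.Generic;
using System.IO;
using System.Linq;

static void WriteDelimitedFile(string path, IEnumerable<string[]> rows, char delimiter = ',')
{
    using (var writer = new StreamWriter(path))
    {
        foreach (var row in rows)
        {
            // Quote a field if it contains the delimiter, a quote, or a line break,
            // doubling any embedded quotes.
            var cells = row.Select(field =>
                field.IndexOfAny(new[] { delimiter, '"', '\r', '\n' }) >= 0
                    ? "\"" + field.Replace("\"", "\"\"") + "\""
                    : field);
            writer.WriteLine(string.Join(delimiter.ToString(), cells));
        }
    }
}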
A simple example with CsvHelper:
using (TextWriter writer = new StreamWriter(filePath))
{
    var csvWriter = new CsvWriter(writer);
    csvWriter.Configuration.Delimiter = "\t";
    csvWriter.Configuration.Encoding = Encoding.UTF8;
    csvWriter.WriteRecords(exportRecords);
}
Here are a couple of CSV reader implementations:
http://www.codeproject.com/KB/database/CsvReader.aspx
http://www.heikniemi.fi/jhlib/ (just one part of the library; includes a CSV writer too)
I doubt there is a standard way to convert CSV to DataTable or database 'automatically', you'll have to write code to do that. How to do that is a separate question.
You'll create your datatable in code, and (presuming a header row) can create columns based on your first line in the file. After that, it will simply be a matter of reading the file and creating new rows based on the data therein.
You could use something like this:
DataTable Tbl = new DataTable();
using (StreamReader sr = new StreamReader(path))
{
    // Build the columns from the header row
    string headerRow = sr.ReadLine();
    string[] headers = headerRow.Split('\t'); // or ','
    foreach (string h in headers)
    {
        Tbl.Columns.Add(new DataColumn(h));
    }

    // Add one row per remaining line
    while (sr.Peek() > -1)
    {
        string data = sr.ReadLine();
        string[] cells = data.Split('\t');
        DataRow row = Tbl.NewRow();
        for (int i = 0; i < cells.Length && i < Tbl.Columns.Count; i++)
        {
            row[i] = cells[i];
        }
        Tbl.Rows.Add(row);
    }
}
The above code has not been compiled, so it may have some errors, but it should get you on the right track.
You can read and write a CSV file with the code below; it may be helpful for you. Pass the split character in via the "serparationChar" parameter. Example:
private DataTable dataTable = null;
private bool IsHeader = true;
private string headerLine = string.Empty;
private List<string> AllLines = new List<string>();
private StringBuilder sb = new StringBuilder();
private char seprateChar = ',';

public DataTable ReadCSV(string path, bool IsReadHeader, char serparationChar)
{
    seprateChar = serparationChar;
    IsHeader = IsReadHeader;
    using (StreamReader sr = new StreamReader(path, Encoding.Default))
    {
        while (!sr.EndOfStream)
        {
            AllLines.Add(sr.ReadLine());
        }
        createTemplate(AllLines);
    }
    return dataTable;
}

public void WriteCSV(string path, DataTable dtable, char serparationChar)
{
    AllLines = new List<string>();
    seprateChar = serparationChar;
    List<string> StableHeadrs = new List<string>();
    int colCount = 0;
    using (StreamWriter sw = new StreamWriter(path))
    {
        // header row
        sb.Clear();
        foreach (DataColumn col in dtable.Columns)
        {
            sb.Append(col.ColumnName);
            if (dtable.Columns.Count - 1 > colCount)
                sb.Append(seprateChar);
            colCount++;
        }
        AllLines.Add(sb.ToString());

        // data rows
        for (int i = 0; i < dtable.Rows.Count; i++)
        {
            sb.Clear();
            for (int j = 0; j < dtable.Columns.Count; j++)
            {
                sb.Append(Convert.ToString(dtable.Rows[i][j]));
                if (dtable.Columns.Count - 1 > j)
                    sb.Append(seprateChar);
            }
            AllLines.Add(sb.ToString());
        }

        foreach (string dataline in AllLines)
        {
            sw.WriteLine(dataline);
        }
    }
}

private DataTable createTemplate(List<string> lines)
{
    List<string> headers = new List<string>();
    dataTable = new DataTable();
    if (lines.Count > 0)
    {
        string[] argHeaders = null;
        for (int i = 0; i < lines.Count; i++)
        {
            if (i > 0)
            {
                // other lines add rows
                DataRow newRow = dataTable.NewRow();
                string[] argLines = lines[i].Split(seprateChar);
                for (int b = 0; b < argLines.Length; b++)
                {
                    newRow[b] = argLines[b];
                }
                dataTable.Rows.Add(newRow);
            }
            else
            {
                // header line adds the columns
                argHeaders = lines[0].Split(seprateChar);
                foreach (string c in argHeaders)
                {
                    DataColumn column = new DataColumn(c, typeof(string));
                    dataTable.Columns.Add(column);
                }
            }
        }
    }
    return dataTable;
}
I have found the best solution here:
http://www.codeproject.com/Articles/415732/Reading-and-Writing-CSV-Files-in-Csharp
I just had to rewrite ReadTest slightly:
void ReadTest()
{
    // Read sample data from CSV file
    using (CsvFileReader reader = new CsvFileReader("ReadTest.csv"))
    {
        CsvRow row = new CsvRow();
        while (reader.ReadRow(row))
        {
            foreach (string s in row)
            {
                Console.Write(s);
                Console.Write(" ");
            }
            Console.WriteLine();
            row = new CsvRow(); //this line added
        }
    }
}
Well, there is another library, Cinchoo ETL, an open-source one for reading and writing CSV files.
There are a couple of ways you can read CSV files. Given a sample file:
Id, Name
1, Tom
2, Mark
This is how you can use this library to read it:
using (var reader = new ChoCSVReader("emp.csv").WithFirstLineHeader())
{
    foreach (dynamic item in reader)
    {
        Console.WriteLine(item.Id);
        Console.WriteLine(item.Name);
    }
}
If you have a POCO object defined to match up with the CSV file, like below:
public class Employee
{
    public int Id { get; set; }
    public string Name { get; set; }
}
You can parse the same file using this POCO class as below:
using (var reader = new ChoCSVReader<Employee>("emp.csv").WithFirstLineHeader())
{
    foreach (var item in reader)
    {
        Console.WriteLine(item.Id);
        Console.WriteLine(item.Name);
    }
}
Please check out articles at CodeProject on how to use it.
Disclaimer: I'm the author of this library
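Writing follows a similar pattern. A rough sketch using the Employee POCO above; the exact ChoCSVWriter API may differ slightly from this, so verify it against the CodeProject articles:
// Assumed API: ChoCSVWriter<T> mirroring the reader shown earlier.
var employees = new List<Employee>
{
    new Employee { Id = 1, Name = "Tom" },
    new Employee { Id = 2, Name = "Mark" }
};

using (var writer = new ChoCSVWriter<Employee>("emp_out.csv").WithFirstLineHeader())
{
    writer.Write(employees);
}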
