I would like to export several DataTable objects into separate .csv files using a foreach loop and a "WriteDataTable" method. The DataTable objects are very small (just 15 columns and 2 rows each).
This is the foreach loop:
foreach (var dt in DataTableList)
{
    // Name of the file
    string fileName = DateTime.Now.Year.ToString();
    // Call method
    WriteDataTable(fileName, dt);
}
This is the method:
public void WriteDataTable(string fileName, DataTable dt)
{
    StringBuilder sb = new StringBuilder();

    IEnumerable<string> columnNames = dt.Columns.Cast<DataColumn>()
        .Select(column => column.ColumnName);
    sb.AppendLine(string.Join(";", columnNames));

    foreach (DataRow row in dt.Rows)
    {
        IEnumerable<string> fields = row.ItemArray.Select(field => field.ToString());
        sb.AppendLine(string.Join(";", fields));
    }

    File.WriteAllText(fileName, sb.ToString());
}
The program runs properly, but only some of the DataTable objects are actually exported. It looks like the previous DataTable object is still being exported when the next one comes. If I pause the process using "Thread.Sleep(500)", all objects are exported:
foreach (var dt in DataTableList)
{
    // Name of the file
    string fileName = DateTime.Now.Year.ToString();
    // Call method
    WriteDataTable(fileName, dt);
    // Pause the process
    Thread.Sleep(500);
}
Any suggestion on how to export all the DataTable objects without slowing down the process?
Thank you in advance.
I found the problem. Since I use "DateTime.Now" (with second precision) for the file names, when I export two files within the same second, the first file is overwritten by the second one.
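A straightforward fix is to make each file name unique no matter how fast the loop runs, for example by appending a running index (or a higher-resolution timestamp). A minimal sketch using the same loop and WriteDataTable method as above:
int index = 0;
foreach (var dt in DataTableList)
{
    // Timestamp plus a running index keeps names unique even when
    // several tables are exported within the same second.
    string fileName = $"{DateTime.Now:yyyyMMdd_HHmmss}_{index++}.csv";
    WriteDataTable(fileName, dt);
}
If the tables have meaningful names, dt.TableName could also be used as part of the file name instead of (or in addition to) the index.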
What I'm trying to do is use the DocumentNode method to find a specific table on the internet and put it into a DataGridView. My code can be seen below:
List<string> list = new List<string>();
DataTable dt1 = new DataTable();
var table = doc.DocumentNode.SelectNodes("xpath link")
    .Descendants("tr")
    .Where(tr => tr.Elements("td").Count() > 1)
    .Select(td => td.InnerText.Trim())
    .ToList();
foreach (var tables in table)
{
    list.Add(tables.ToString());
}
dataGridView1.DataSource = list;
The result I get in the DataGridView is a list of numbers instead of text. To check whether the text actually comes through, I changed the foreach to the following code:
foreach (var tables in table)
{
    list.Add(tables.ToString());
    richTextBox1.Text += tables;
}
The result of the change is the table's text in richTextBox1, but dataGridView1 still shows a table of numbers. So I am getting the right table from the internet and it is being loaded correctly, but I am still missing something for dataGridView1, because it shows a list of numbers instead of the text that appears in richTextBox1. I also tried changing the DocumentNode call by removing parts of the .Select section (adding .ToString(), .ToList(), etc.), and dataGridView1 still showed numbers.
What exactly have I missed in my code that makes this happen, and what should I add to make it show the text instead of numbers?
Edit:
New code.
List<string> list = new List<string>();
DataTable dt1 = new DataTable();
dt1.Columns.Add("td", typeof(int));
var table = doc.DocumentNode.SelectNodes("//div[@id=\"cr_cashflow\"]/div[2]/div/table")
    .Descendants("tr")
    .Select(td => td.InnerText.Trim())
    .ToList();
foreach (var tables in table)
{
    dt1.Rows.Add(new object[] { int.Parse(tables) });
}
dataGridView1.DataSource = dt1;
Try something like this
List<string> list = new List<string>();
DataTable dt1 = new DataTable();
dt1.Columns.Add("td", typeof(int));
var rows = doc.DocumentNode.SelectNodes("xpath link")
    .Descendants("tr")
    .Where(tr => tr.Elements("td").Count() > 1)
    .Select(td => td.InnerText.Trim())
    .ToList();
foreach (var row in rows)
{
    dt1.Rows.Add(new object[] { int.Parse(row) });
}
dataGridView1.DataSource = dt1;
So I'll explain my situation first.
I have a WPF View for my customer that is populated based on SQL strings that the customer defines. They can change these and add/remove these at any point and the structure of the result set is not in my control.
My expected outcome for this is:
Populate the DataGrid at runtime without prior knowledge of the structure, so I use AutoGenerateColumns and provide dataTable.DefaultView as the ItemsSource, which is bound to my DataGrid:
GetItemsSource = dataTable.DefaultView;
Export this DataGrid to a CSV file for the customer to check whenever they want.
Now I already have a generic list function to save to CSV, but since the structure is not known, I can't convert my DataTable to a list to use it.
My current solution is a Save To CSV function that uses a DataTable instead of a List.
Is there some other type of data structure I could use instead of a DataTable that would make using my generic function possible, or do I just have to have an extra Save To CSV function for this scenario?
UPDATE
My generic list function
public static void SaveToCsv<T>(List<T> data, string filePath) where T : class
{
    CreateDirectoryIfNotExists(filePath);

    List<string> lines = new();
    StringBuilder line = new();

    if (data == null || data.Count == 0)
    {
        throw new ArgumentNullException("data", "You must populate the data parameter with at least one value.");
    }

    // Header row: one column per public property of T
    var cols = data[0].GetType().GetProperties();
    foreach (var col in cols)
    {
        line.Append(col.Name);
        line.Append(",");
    }
    lines.Add(line.ToString().Substring(0, line.Length - 1));

    // One CSV line per item, reading each property via reflection
    foreach (var row in data)
    {
        line = new StringBuilder();
        foreach (var col in cols)
        {
            line.Append(col.GetValue(row));
            line.Append(",");
        }
        lines.Add(line.ToString().Substring(0, line.Length - 1));
    }

    System.IO.File.WriteAllLines(filePath, lines);
}
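For reference, a minimal usage sketch of the generic function with a hypothetical Customer class (the class, values, and file path are only illustrative, and SaveToCsv is assumed to be accessible from where this runs):
var customers = new List<Customer>
{
    new Customer { Name = "Alice", City = "Oslo" },
    new Customer { Name = "Bob", City = "Bergen" }
};

// Produces a header line "Name,City" followed by one line per customer.
SaveToCsv(customers, @"C:\Temp\customers.csv");

public class Customer
{
    public string Name { get; set; }
    public string City { get; set; }
}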
My current DataTable function
public static void SaveToCsv(DataTable data, string filePath)
{
    CreateDirectoryIfNotExists(filePath);

    if (data == null)
    {
        throw new ArgumentNullException("data", "The DataTable has no values to save to CSV.");
    }

    List<string> lines = new();

    // Header row from the column names
    IEnumerable<string> columnNames = data.Columns.Cast<DataColumn>().Select(column => column.ColumnName);
    lines.Add(string.Join(",", columnNames));

    // One CSV line per DataRow
    foreach (DataRow row in data.Rows)
    {
        IEnumerable<string> fields = row.ItemArray.Select(field => field.ToString());
        lines.Add(string.Join(",", fields));
    }

    File.WriteAllLines(filePath, lines);
}
Is it possible to convert a DataTable to IEnumerable<T> where T cannot be defined at compile time and is not known beforehand?
You can create generic objects at runtime, but it is not simple, so I would avoid it if possible.
Is there some other type of data structure I could use instead of dataTable that would make using my generic function possible or do I have just have an extra Save To CSV function just for this scenario?
You could simply take the Rows property of your DataTable, convert it to a List<DataRow>, and pass that to your function. But it would probably not do what you want (see the sketch below).
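Roughly what that would look like, and why it falls short (the call below is hypothetical and assumes the SaveToCsv<T> posted above):
// Hypothetical call, reusing the generic SaveToCsv<T>:
List<DataRow> rows = data.Rows.Cast<DataRow>().ToList();
SaveToCsv(rows, @"C:\Temp\export.csv");

// Inside SaveToCsv<T>, data[0].GetType().GetProperties() now returns DataRow's own
// members (RowError, RowState, Table, ItemArray, HasErrors, plus the Item indexer),
// so the CSV would describe the DataRow objects rather than your columns -- and
// reading the indexer without an index argument would most likely throw.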
What you need is some way to convert a DataRow into an object of a class with a property for each column, and while it is possible to generate classes from a database model, doing that at runtime is a lot of work, probably far more than your current solution.
To conclude, keep your current solution if it works. Messing around with reflection and runtime code generation will just make things more complicated.
I've been struggling with a small piece of code for a little while now. I have a CSV file with one column that contains a string of numbers. I can import that file without issues and display it.
My goal is to take the number in each row, put it into a separate string, run that string through a function, and then put the result back into my data grid in column two. Is there a way I should be doing this with the code below? The foreach statement is where I believe this should happen.
Edit: I tweaked the code and it now works the way I want, but I can't insert my result into any column except the first one. Is there a way I should be targeting the results so they go into the second column?
using (var fs = File.OpenRead(Dialog.FileName))
using (var reader = new StreamReader(fs))
{
    List<string> lista = new List<string>();
    List<string> listb = new List<string>();

    while (!reader.EndOfStream)
    {
        var line = reader.ReadLine();
        var values = line.Split(',');
        lista.Add(values[0]);
        dt1.Rows.Add(values[0]);
    }

    foreach (var item in lista)
    {
        string temp;
        GetLuhnCheckDigit(item);
        listb.Add(last.ToString());
        temp = item + last.ToString();
        dt1.Rows.Add(temp); // This only adds to the first column
    }

    dataGridView1.DataSource = dt1;
}
Without knowing what the GetLuhnCheckDigit method does, it is not possible to determine what values you want the second column to contain. Looking at the posted code, there are many things missing, like how many columns the data table has, where the Dialog variable is defined, and what last is.
Assuming there are at least two columns in the DataTable dt1, I am not sure why you are adding the items to the first column and then looping through a list of those items to set the second column. It appears adding both columns at the same time would be easier.
You could do all this while reading the file like below:
try {
    using (var fs = File.OpenRead(Dialog.FileName)) {
        using (var reader = new StreamReader(fs)) {
            List<string> lista = new List<string>();
            List<string> listb = new List<string>();
            string temp;
            while (!reader.EndOfStream) {
                var line = reader.ReadLine();
                var values = line.Split(',');
                lista.Add(values[0]);
                GetLuhnCheckDigit(values[0]); // <-- What is this method doing???
                listb.Add(last.ToString());
                temp = values[0] + last.ToString();
                dt1.Rows.Add(values[0], temp); // <-- this adds both columns
            }
            dataGridView1.DataSource = dt1;
        }
    }
}
catch (Exception e) {
    MessageBox.Show("Error: " + e.Message);
}
Let me know if I am missing something, as I am clueless as to what the GetLuhnCheckDigit method could be doing.
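For reference, if GetLuhnCheckDigit is meant to produce a standard Luhn check digit, one common implementation looks roughly like the sketch below. This is only an assumption about what the method does; the original code apparently stores the result in a field called last rather than returning it.
// Computes the Luhn check digit for a string of digits (e.g. "7992739871" -> 3).
static int GetLuhnCheckDigit(string number)
{
    int sum = 0;
    bool doubleIt = true; // start doubling with the rightmost digit of the payload

    for (int i = number.Length - 1; i >= 0; i--)
    {
        int digit = number[i] - '0';
        if (doubleIt)
        {
            digit *= 2;
            if (digit > 9) digit -= 9;
        }
        sum += digit;
        doubleIt = !doubleIt;
    }

    return (10 - sum % 10) % 10;
}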
This question already has answers here:
Which is faster: clear collection or instantiate new
(8 answers)
Closed 8 years ago.
I have a performance-related question where I am trying to understand which approach would be faster.
My code is something like this:
List<string> test = new List<string>();
foreach (DataTable Dt in Product_DataSet.Tables)
{
    foreach (DataRow dr in Dt.Rows)
    {
        test.Add(dr["ColumnName"].ToString());
        // Code to concatenate all strings in list test into a single variable
        // Code to pass the single variable to the DB
    }
}
Here I ran into a problem because the list "test" kept growing. So I modified my inner loop like below:
foreach (DataRow dr in Dt.Rows)
{
    test.Add(dr["ColumnName"].ToString());
    // Code to concatenate all strings in list test into a single variable
    // Code to pass the single variable to the DB
    test.Clear();
}
Is this a better option than
foreach (DataRow dr in Dt.Rows)
{
    List<string> test = new List<string>();
    test.Add(dr["ColumnName"].ToString());
    // Code to concatenate all strings in list test into a single variable
    // Code to pass the single variable to the DB
}
The last option is not comparable to the first, because you are always creating a new list instead of filling the same one, so at the end it contains only the rows of the last table and not of all tables.
So it depends on your needs. If you only need one list for each DataTable, it is slightly better to use the list constructor instead of List.Clear, since the latter has to do more work.
And if you need one large list containing all strings of all tables, why not fill the list directly instead of loading the tables first? You could have used a DataReader.
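A rough sketch of the DataReader idea, assuming SQL Server and a placeholder connection string, query, and column name (none of these come from the original post):
using System.Collections.Generic;
using System.Data.SqlClient;

var test = new List<string>();

using (var connection = new SqlConnection("your-connection-string"))
using (var command = new SqlCommand("SELECT ColumnName FROM Product", connection))
{
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        // Stream the values straight into the list without materializing DataTables.
        while (reader.Read())
        {
            test.Add(reader.GetString(0));
        }
    }
}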
It looks to me as though your two options are:
foreach (DataTable Dt in Product_DataSet.Tables)
{
    List<string> test = new List<string>();
    foreach (DataRow dr in Dt.Rows)
    {
        test.Add(dr["ColumnName"].ToString());
    }
    // code to concat and send to db
}
or
List<string> test = new List<string>();
foreach (DataTable Dt in Product_DataSet.Tables)
{
    test.Clear();
    foreach (DataRow dr in Dt.Rows)
    {
        test.Add(dr["ColumnName"].ToString());
    }
    // code to concat and send to db
}
I very much doubt there's much in it, performance-wise: in each case, all the contents will have to be garbage collected. But you could test it, with a lot of data, to see what happens.
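If you do want to measure it, a crude Stopwatch comparison along these lines would do; the counts and the dummy workload are made up purely for illustration:
using System;
using System.Collections.Generic;
using System.Diagnostics;

const int tables = 1_000;
const int rowsPerTable = 10_000;

// Option A: reuse one list and Clear() it per table.
var sw = Stopwatch.StartNew();
var reused = new List<string>();
for (int t = 0; t < tables; t++)
{
    reused.Clear();
    for (int r = 0; r < rowsPerTable; r++) reused.Add("value");
}
sw.Stop();
Console.WriteLine($"Clear(): {sw.ElapsedMilliseconds} ms");

// Option B: allocate a fresh list per table.
sw.Restart();
for (int t = 0; t < tables; t++)
{
    var fresh = new List<string>();
    for (int r = 0; r < rowsPerTable; r++) fresh.Add("value");
}
sw.Stop();
Console.WriteLine($"new List: {sw.ElapsedMilliseconds} ms");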
I have a DataSet. I would like to write it to a tab-delimited text file, with the dataset's column names as the header and the row data as the data.
Is there any technique I can use for this, or do I have to do the looping manually?
Sincerely Thanks,
- Sel
private static string GetTextFromDataTable(DataTable dataTable)
{
    var stringBuilder = new StringBuilder();

    // Header: tab-separated column names
    stringBuilder.AppendLine(string.Join("\t", dataTable.Columns.Cast<DataColumn>().Select(arg => arg.ColumnName)));

    // One tab-separated line per row
    foreach (DataRow dataRow in dataTable.Rows)
        stringBuilder.AppendLine(string.Join("\t", dataRow.ItemArray.Select(arg => arg.ToString())));

    return stringBuilder.ToString();
}
Usage:
var text = GetTextFromDataTable(dataSet.Tables[0]);
File.WriteAllText(filePath, text);
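One caveat: if any field value can itself contain a tab or a line break, it will break the layout. A small optional helper, assuming it is acceptable to replace such characters with spaces:
// Replace embedded tabs and line breaks so each record stays on one line.
static string Sanitize(object value) =>
    value.ToString().Replace("\t", " ").Replace("\r", " ").Replace("\n", " ");

// ...then inside GetTextFromDataTable:
// stringBuilder.AppendLine(string.Join("\t", dataRow.ItemArray.Select(Sanitize)));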
Exporting to XML is built right in, but for exporting to CSV you can use the following code, adapted from http://social.msdn.microsoft.com/Forums/en-US/vbgeneral/thread/d2071fd4-8c7d-4d0e-94c3-9586df754df8/
The original only writes the data, not the columns, so you'll need to loop over the column headers first.
Edit: Updated to include column names. I have not run this, and it is an edit of the code from the link above, so it may or may not work, but the concept is here:
StringBuilder str = new StringBuilder();

// get the column headers
foreach (DataColumn c in NorthwindDataSet.Customers.Columns) {
    str.Append("\"" + c.ColumnName + "\"\t");
}
str.Append("\r\n");

// write the data here
foreach (DataRow dr in this.NorthwindDataSet.Customers.Rows) {
    foreach (var field in dr.ItemArray) {
        str.Append("\"" + field.ToString() + "\"\t");
    }
    str.Append("\r\n");
}

try {
    File.WriteAllText("C:\\temp\\testcsv.csv", str.ToString());
} catch (Exception ex) {
    MessageBox.Show("Write Error: " + ex.Message);
}
Note: you will need LINQ for this solution to work. Add the following using directive to your code:
using System.Linq;