I have a large text file (e.g. Samples.txt) in which every 8 lines form one row to be inserted into a table in SQL Server. The data in the text file is in the following format:
Company Name:Xpress Care
Sector:Transportation and storage
Operation Type:Logistic Services
License Number:D-39277
Expiry Date:2012-07-18
Contact Numbers:0771709155 / 0789444211
Email:naikmalemail@hotmail.com
Address:House 119, Street 4, Taemany, District 4
So far I have written the following code in an attempt to bring the data into a format I can insert into the table, like this:
insert into table(company, sector, operation, license, expiry, contact, email, address) Values ('Xpress Care','Transportation and storage','Logistic Services','D-39277','2012-07-18', '0771709155 / 0789444211','naikmalemail@hotmail.com','House 119, Street 4, Taemany, District 4');
Here is the code I wrote:
static void Main(string[] args)
{
    int counter = 0;
    int linecounter = 1;
    string line;

    // Read the file and display it line by line.
    System.IO.StreamReader file =
        new System.IO.StreamReader("c:\\sample.txt");
    while ((line = file.ReadLine()) != null)
    {
        Console.WriteLine(line);
        // split on the ':' delimiter
        string[] values = line.Split(':');
        //Console.WriteLine("column name:- {0} value:- {1}", values[0], values[1]);

        // hashtable to store the key/value pairs from the text file
        Hashtable myHT = new Hashtable();
        // I AM STUCK here!!! I want to add to and keep the values for 8 lines
        myHT.Add(values[0], values[1]);

        // if linecounter is 8 then I have the values for one new row to be inserted in the table
        if (linecounter == 8)
        {
            Console.WriteLine("\n\r code to insert the values in the query example below from the hashtable\n\r");
            // insert into table(company, sector, operation, license, expiry, contact, email, address) Values ('Xpress Care','Transportation and storage','Logistic Services','D-39277','2012-07-18', '0771709155 / 0789444211','naikmalemail@hotmail.com','House 119, Street 4, Taemany, District 4');
            // reset the linecounter and empty the hashtable here for the next row to insert
            linecounter = 0;
        }
        linecounter++;
        counter++;
    }
    file.Close();

    // Suspend the screen.
    Console.ReadLine();
}
What I am trying to do is add and keep the key/value pairs in a Hashtable for 8 lines, so that I can use the 8 values to insert into the 8 columns of the table inside the if (linecounter == 8) block, but right now it only keeps the value from the last line.
I would really appreciate your help and ideas. If anything about the problem is unclear, please let me know and I will explain in more detail, or suggest another way to do this.
It seems that you need to move the declaration and initialization of your Hashtable outside the loop, and clear it when you have finished reading an eight-line block:
static void Main(string[] args)
{
    int counter = 0;
    int linecounter = 1;
    string line;

    // hashtable to store the key/value pairs from the text file
    Hashtable myHT = new Hashtable();

    // Read the file and display it line by line.
    System.IO.StreamReader file =
        new System.IO.StreamReader("c:\\sample.txt");
    while ((line = file.ReadLine()) != null)
    {
        ....
        // Continue to add values to the hashtable until you reach the 8 row boundary
        myHT.Add(values[0], values[1]);
        if (linecounter == 8)
        {
            ..... insert ...
            // reset the linecounter and empty the hashtable here for the next row to insert
            linecounter = 0;
            myHT.Clear();
        }
        linecounter++;
        counter++;
    }
Apart from this, I suggest using a different class for your work. In your situation I would use a Dictionary<string, string> to have a strongly typed approach to the problem.
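For illustration, a minimal sketch of that change: the collection is declared outside the loop, filled for eight lines, and cleared after each block. The key strings are taken from the sample data, and the actual insert is only indicated by a comment:

// strongly typed dictionary, declared outside the loop
Dictionary<string, string> record = new Dictionary<string, string>();

while ((line = file.ReadLine()) != null)
{
    // Split only on the first ':' so values that themselves contain ':' are not cut short.
    string[] values = line.Split(new[] { ':' }, 2);
    record[values[0].Trim()] = values.Length > 1 ? values[1].Trim() : "";

    if (linecounter == 8)
    {
        // record["Company Name"], record["Sector"], ... now hold one complete row;
        // build and execute the insert here, then reset for the next block.
        linecounter = 0;
        record.Clear();
    }
    linecounter++;
}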
You have to create the table with all the destination fields first, then:
LOAD DATA INFILE '../sample.txt' INTO TABLE PerformanceReport;
By default LOAD DATA INFILE expects tab-delimited data, one row per line, so it should take it in just fine. (Note that LOAD DATA INFILE is MySQL syntax; the closest SQL Server equivalent is BULK INSERT.)
If the format in the TXT file is always the same, why not use this:
while ((line = file.ReadLine()) != null)
{
    Console.WriteLine(line);
    // split on the ':' delimiter
    string[] values = line.Split(':');
    if (values[0] == "Company Name") company = values[1];

    if ((line = file.ReadLine()) != null)
    {
        values = line.Split(':');
        if (values[0] == "Sector") Sector = values[1];
    }
    ...
    ...
    insert into table(company, sector, operation, license, expiry, contact, email, address)
    values (@company, @sector, .....
    /// please ensure injection protection
}
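To address the injection-protection comment, here is a brief sketch of how those @company-style placeholders could be bound as SqlCommand parameters. The table name, the connection object cn, and the variable names are assumptions for illustration only:

// cn is an open SqlConnection; "Companies" is a placeholder table name.
using (SqlCommand cmd = new SqlCommand(
    "insert into Companies(company, sector, operation, license, expiry, contact, email, address) " +
    "values (@company, @sector, @operation, @license, @expiry, @contact, @email, @address)", cn))
{
    cmd.Parameters.AddWithValue("@company", company);
    cmd.Parameters.AddWithValue("@sector", Sector);
    // ... one AddWithValue call per remaining column ...
    cmd.ExecuteNonQuery();
}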
If it's a big file, you can store your data in a DataTable (System.Data.DataTable) and then write it quickly using SqlBulkCopy (System.Data.SqlClient.SqlBulkCopy). The code would look something like this:
System.IO.StreamReader file = new System.IO.StreamReader(@"c:\sample.txt");
string line = null;
int linecounter = 0;

// structure to hold data to be written to the database
System.Data.DataTable table = new System.Data.DataTable();
table.Columns.Add("company");
table.Columns.Add("sector");
table.Columns.Add("operation");
table.Columns.Add("license");
table.Columns.Add("expiry");
table.Columns.Add("contact");
table.Columns.Add("email");
table.Columns.Add("address");

System.Data.DataRow row = null;
while ((line = file.ReadLine()) != null)
{
    // create a new table row if the line is {0, 8, 16, ...}
    if (linecounter % 8 == 0)
        row = table.NewRow();

    string[] values = line.Split(':');
    // put the data in the appropriate column based on "linecounter % 8"
    row[linecounter % 8] = values[1];

    // add the row to the table once it has been fully populated
    if (linecounter % 8 == 7)
        table.Rows.Add(row);

    linecounter++;
}
file.Close();

string connectionString = "<CONNECTION STRING GOES HERE>";
using (System.Data.SqlClient.SqlBulkCopy copy = new System.Data.SqlClient.SqlBulkCopy(connectionString))
{
    copy.DestinationTableName = "MyTable";
    copy.WriteToServer(table);
}
You can get help creating your connection string at: Connection strings Reference.
NOTE: This method assumes that you've already created a table called "MyTable" in SQL Server, and that it has the 8 varchar columns specified in the DataTable.
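If the destination table's column names or order differ from the DataTable above, you can map them explicitly with ColumnMappings; a small sketch (column names taken from the example, destination names assumed to match):

using (System.Data.SqlClient.SqlBulkCopy copy = new System.Data.SqlClient.SqlBulkCopy(connectionString))
{
    copy.DestinationTableName = "MyTable";
    // Map each DataTable column to a destination column of the same name.
    foreach (System.Data.DataColumn col in table.Columns)
        copy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
    copy.WriteToServer(table);
}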
Related
I am new to C# and I have taken on a small task. The Stack Overflow entry for reading a text file and saving to a list is a great start for me. I need to read a text file and send the data to an SQL database.
How to Read This Text File and store in a list using C#
The Data in List<Data> list = new List<Data>(); just stays highlighted in red.
How can I fix this, please?
I am a PLC engineer and I'm trying to collate data that cannot be handled by the PLC. I am just trying to read the file so that I can then show the data in a grid, with a view to populating the SQL database later.
The text file is held on a remote Linux machine. The collator is a Windows 10 panel. The SQL database can reside on the Windows panel or remotely.
static void Main(string[] args)
{
    List<Data> list = new List<Data>();
    var dd = File.ReadAllLines(@"C:\Users\XXXX\Desktop\test.txt")
        .Skip(1)
        .Where(s => s.Length > 1).ToList();

    foreach (var item in dd)
    {
        var columns = item.Split('\t').Where(c => c.Trim() != string.Empty).ToList();
        if (columns != null && columns.Count > 0)
        {
            int id;
            if (int.TryParse(columns[0], out id))
            {
                list.Add(new Data()
                {
                    id = Convert.ToInt32(columns[0]),
                    Name = columns[1],
                    Description = columns[2],
                    Quantity = Convert.ToInt32(columns[3]),
                    Rate = Convert.ToDouble(columns[4]),
                    Discount = Convert.ToInt32(columns[5]),
                    Amount = int.Parse(columns[6])
                });
            }
            else
            {
                list.Last().Description += columns[0];
            }
        }
    }
    Console.ReadLine();
}
I just keep getting red squiggly lines on List<Data> in Visual Studio.
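The red squiggles on Data usually just mean that no Data class is defined anywhere in the project. A minimal sketch of a class matching the properties the code assigns (names and types are inferred from the snippet above, so adjust as needed), along with the usings the Main method relies on (System.Collections.Generic, System.IO, System.Linq):

class Data
{
    public int id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public int Quantity { get; set; }
    public double Rate { get; set; }
    public int Discount { get; set; }
    public int Amount { get; set; }
}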
I got the code to work and I read the DAT/text file straight into a DataGridView. I am now writing the grid to SQL.
Many thanks, and sorry for the delay in replying; I've been away on-site.
String sLine = "";
try
{
    // Pass the file you selected with the OpenFileDialog control to
    // the StreamReader constructor.
    System.IO.StreamReader FileStream = new System.IO.StreamReader(openFileDialog1.FileName);

    // You must set the value to false when you are programmatically adding rows to
    // a DataGridView. If you need to allow the user to add rows, you
    // can set the value back to true after you have populated the DataGridView.
    dataGridView1.AllowUserToAddRows = false;

    // Clear the DataGridView prior to reading a new text file.
    dataGridView1.Rows.Clear();
    dataGridView1.Columns.Clear();

    // Read the first line of the text file.
    sLine = FileStream.ReadLine();

    // The Split command splits a string into an array, based on the delimiter you pass.
    // I chose to use a tab for the text delimiter here.
    // Any character can be used as a delimiter in the Split command.
    //string[] s = sLine.Split(';');
    string[] s = sLine.Split('\t');

    // In this example, I placed the field names in the first row.
    // The for loop below is used to create the columns and use the text values in
    // the first row for the column headings.
    for (int i = 0; i <= s.Count() - 1; i++)
    {
        DataGridViewColumn colHold = new DataGridViewTextBoxColumn();
        colHold.Name = "col" + System.Convert.ToString(i);
        colHold.HeaderText = s[i].ToString();
        dataGridView1.Columns.Add(colHold);
    }

    // Read the next line in the text file in order to pass it to the
    // while loop below.
    sLine = FileStream.ReadLine();

    // The while loop reads each line of text.
    while (sLine != null)
    {
        // Adds a new row to the DataGridView for each line of text.
        dataGridView1.Rows.Add();

        // This for loop loops through the array in order to retrieve each
        // field of the current line.
        for (int i = 0; i <= s.Count() - 1; i++)
        {
            // Splits each line in the text file into a string array.
            //s = sLine.Split(';');
            s = sLine.Split('\t');
            // Sets the value of the cell to the value of the text retrieved from the text file.
            dataGridView1.Rows[dataGridView1.Rows.Count - 1].Cells[i].Value = s[i].ToString();
        }
        sLine = FileStream.ReadLine();
    }

    // Close the selected text file.
    FileStream.Close();
}
catch (Exception err)
{
    // Display any errors in a message box.
    System.Windows.Forms.MessageBox.Show("Error " + err.Message, "Program Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
}
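For the "writing the grid to SQL" step mentioned above, one rough approach is to copy the grid's contents into a DataTable and push it with SqlBulkCopy. This is only a sketch: the connection string and destination table name are placeholders, and the destination table is assumed to already exist with matching columns.

// Build a DataTable mirroring the DataGridView's columns and rows.
System.Data.DataTable gridData = new System.Data.DataTable();
foreach (DataGridViewColumn col in dataGridView1.Columns)
    gridData.Columns.Add(col.HeaderText);

foreach (DataGridViewRow gridRow in dataGridView1.Rows)
{
    System.Data.DataRow dataRow = gridData.NewRow();
    for (int i = 0; i < dataGridView1.Columns.Count; i++)
        dataRow[i] = gridRow.Cells[i].Value;
    gridData.Rows.Add(dataRow);
}

// Bulk copy into SQL Server; "MyTable" and the connection string are placeholders.
using (System.Data.SqlClient.SqlBulkCopy copy =
    new System.Data.SqlClient.SqlBulkCopy("<CONNECTION STRING GOES HERE>"))
{
    copy.DestinationTableName = "MyTable";
    copy.WriteToServer(gridData);
}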
I am trying to develop a tool that will take a CSV file and import it into a datatable with the first column in the datatable being a row counter.
The CSV files are from different customers and so have different structures. Some have a header line; some have several header lines; some have no header line. They also have varying columns.
So far, I have the code below.
public void Import_CSV()
{
    OpenFileDialog dialog = new OpenFileDialog();
    dialog.Filter = "CSV Files (*.csv)|*.csv";
    bool? result = dialog.ShowDialog();

    if (result ?? false)
    {
        string[] headers;
        string CSVFilePathName = dialog.FileName;
        string delimSelect = cboDelimiter.Items.GetItemAt(cboDelimiter.SelectedIndex).ToString();
        // If user hasn't selected a delimiter, assume comma
        if (delimSelect == "")
        {
            delimSelect = ",";
        }
        // use the (possibly defaulted) delimiter chosen above
        string[] delimiterType = new string[] { delimSelect };

        DataTable dt = new DataTable();

        // Read first line of file to get number of fields and create columns and column numbers in data table
        using (StreamReader sr1 = new StreamReader(CSVFilePathName))
        {
            headers = sr1.ReadLine().Split(delimiterType, StringSplitOptions.None);
            //dt.Columns.Add("ROW", typeof(int));
            //dt.Columns["ROW"].AutoIncrement = true;
            //dt.Columns["ROW"].AutoIncrementSeed = 1;
            //dt.Columns["ROW"].AutoIncrementStep = 1;
            int colCount = 1;
            foreach (string header in headers)
            {
                dt.Columns.Add("C" + colCount.ToString());
                colCount++;
            }
        }

        using (StreamReader sr = new StreamReader(CSVFilePathName))
        {
            while (!sr.EndOfStream)
            {
                string[] rows = sr.ReadLine().Split(delimiterType, StringSplitOptions.None);
                DataRow dr = dt.NewRow();
                for (int i = 0; i < headers.Length; i++)
                {
                    dr[i] = rows[i];
                }
                dt.Rows.Add(dr);
            }
        }

        dtGrid.ItemsSource = dt.DefaultView;
        txtColCount.Text = dtGrid.Columns.Count.ToString();
        txtRowCount.Text = dtGrid.Items.Count.ToString();
    }
}
This works, inasmuch as it creates column headers (C1, C2, ... according to how many there are in the CSV file) and then the rows are written in, but I want to add a column at the far left with a row number as the rows are added. In the code, you can see I've got a section commented out that creates an auto-number column, but I'm totally stuck on how the rows are written into the DataTable. If I uncomment that section, I get errors because the first column in the CSV file tries to write into an int field. I know you can specify which field in each row goes in which column, but that won't help here as the columns are unknown at this point. I just need it to be able to read ANY file in, regardless of structure, but with the row counter.
Hope that makes sense.
You write in your question that uncommenting the code that adds the first column leads to errors. This is because of your loop: it starts at 0, but column 0 is the one you added manually. So you just need to skip it when writing into the row, while the source array is still read from element 0.
So the solution is:
First, uncomment the code that adds the ROW column.
Then, in your loop, introduce an offset to leave the first column untouched:
for (int i = 0; i < headers.Length; i++)
{
    dr[i + 1] = rows[i];
}
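Putting the two pieces together, a sketch of just the affected parts of the method from the question (everything else stays the same):

// Uncommented: auto-incrementing row counter as the first column.
dt.Columns.Add("ROW", typeof(int));
dt.Columns["ROW"].AutoIncrement = true;
dt.Columns["ROW"].AutoIncrementSeed = 1;
dt.Columns["ROW"].AutoIncrementStep = 1;

int colCount = 1;
foreach (string header in headers)
{
    dt.Columns.Add("C" + colCount.ToString());
    colCount++;
}

// ... later, when filling rows:
string[] rows = sr.ReadLine().Split(delimiterType, StringSplitOptions.None);
DataRow dr = dt.NewRow();        // ROW is filled automatically by AutoIncrement
for (int i = 0; i < headers.Length; i++)
{
    dr[i + 1] = rows[i];         // shift by one so the data lands after the ROW column
}
dt.Rows.Add(dr);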
I'm trying to import a CSV file into my C# site and save it in the database. While doing research I learned about CSV parsing; I've tried to implement it but I've run into some trouble. Here is a portion of my code so far:
string fileext = Path.GetExtension(fupcsv.PostedFile.FileName);
if (fileext == ".csv")
{
    string csvPath = Server.MapPath("~/CSVFiles/") + Path.GetFileName(fupcsv.PostedFile.FileName);
    fupcsv.SaveAs(csvPath);

    // Add columns to DataTable to bind data
    DataTable dtCSV = new DataTable();
    dtCSV.Columns.AddRange(new DataColumn[2] { new DataColumn("ModuleId", typeof(int)), new DataColumn("CourseId", typeof(int)) });

    // Read all the lines of the text file and close it.
    string[] csvData = File.ReadAllLines(csvPath);

    // iterate over each row
    foreach (string row in csvData)
    {
        // Check for a null or empty row record
        if (!string.IsNullOrEmpty(row))
        {
            using (TextFieldParser parser = new TextFieldParser(csvPath))
            {
                parser.TextFieldType = FieldType.Delimited;
                parser.SetDelimiters(",");
                while (!parser.EndOfData)
                {
                    // Process row
                    string[] fields = parser.ReadFields();
                    int i = 1;
                    foreach (char cell in row)
                    {
                        dtCSV.NewRow()[i] = cell;
                        i++;
                    }
                }
            }
        }
    }
}
I keep getting the error "There is no row at position -1" at dtCSV.Rows[dtCSV.Rows.Count - 1][i] = cell;
Any help would be greatly appreciated, thanks.
You are trying to index rows that you have not created. Instead of
dtCSV.Rows[dtCSV.Rows.Count - 1][i] = cell;
use
dtCSV.NewRow()[i] = cell;
I also suggest you start indexing i from 0 and not from 1.
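Note that NewRow() only creates a detached row with the table's schema; to keep the data you also have to attach it with Rows.Add. A minimal sketch of that pattern, using the fields returned by the parser rather than the characters of the line:

// Process one parsed record: fill a new row and attach it to the table.
string[] fields = parser.ReadFields();
DataRow newRow = dtCSV.NewRow();      // detached row, not yet part of the table
for (int i = 0; i < fields.Length && i < dtCSV.Columns.Count; i++)
{
    newRow[i] = fields[i];
}
dtCSV.Rows.Add(newRow);               // without this the row never reaches the table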
All right, so it turns out there were a bunch of errors in your code, so I made some edits.
string fileext = Path.GetExtension(fupcsv.PostedFile.FileName);
if (fileext == ".csv")
{
    string csvPath = Server.MapPath("~/CSVFiles/") + Path.GetFileName(fupcsv.PostedFile.FileName);
    fupcsv.SaveAs(csvPath);

    DataTable dtCSV = new DataTable();
    dtCSV.Columns.AddRange(new DataColumn[2] { new DataColumn("ModuleId", typeof(int)), new DataColumn("CourseId", typeof(int)) });

    var csvData = File.ReadAllLines(csvPath);
    bool headersSkipped = false;
    foreach (string line in csvData)
    {
        if (!headersSkipped)
        {
            headersSkipped = true;
            continue;
        }
        // Check for a null or empty row record
        if (!string.IsNullOrEmpty(line))
        {
            // Process row
            int i = 0;
            var row = dtCSV.NewRow();
            foreach (var cell in line.Split(','))
            {
                row[i] = Int32.Parse(cell);
                i++;
            }
            dtCSV.Rows.Add(row);
            dtCSV.AcceptChanges();
        }
    }
}
I ditched the TextFieldParser solution solely because I'm not familiar with it, but if you want to stick with it, it shouldn't be hard to reintegrate it.
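For reference, if you do want to keep TextFieldParser, a minimal sketch of the usual pattern (the parser reads the file itself, so the outer ReadAllLines loop would not be needed; the column handling here assumes the two int columns from the question):

// Requires a reference to Microsoft.VisualBasic and:
// using Microsoft.VisualBasic.FileIO;
using (TextFieldParser parser = new TextFieldParser(csvPath))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters(",");
    parser.HasFieldsEnclosedInQuotes = true;    // handles quoted fields containing commas

    while (!parser.EndOfData)
    {
        string[] fields = parser.ReadFields();  // one parsed record per call
        var row = dtCSV.NewRow();
        row["ModuleId"] = Int32.Parse(fields[0]);
        row["CourseId"] = Int32.Parse(fields[1]);
        dtCSV.Rows.Add(row);
    }
}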
Here are some of the things you got wrong:
Not calling NewRow() to create a new row, or adding it to the table with Rows.Add(row)
Iterating through the characters in row instead of the fields you parsed
Not parsing the value of cell - its value type is string and you are trying to add it to an int column
Some other things worth noting (just to improve your code's performance and readability :))
Consider using var when declaring new variables; it takes away a lot of the stress of having to worry about exactly what type of variable you are creating
As others in the comments said, use ReadAllLines(); it splits your text file into lines neatly, making them easier to iterate through
Most of the time when working with arrays or lists, you need to index from 0, not from 1
You have to use AcceptChanges() to commit all the changes you've made
Hi all, I have my database structure as follows:
Field        Type
-----------  --------
FileHeader   longblob
BatchHeader  longblob
Entry        longblob
BtchEntry    longblob
FileControl  longblob
The data to be inserted is as follows:
101 111111111 1111111111104021031A094101
52201 1 1 PPD1 110402110402 1111000020000001
6221110000251 00000000011 1 1 0111000020000001
822000000100111000020000000000000000000000011 111000020000001
52251 1 1 CCD1 110402110402 1111000020000002
6281110000251 00000000011 1 1 0111000020000002
822500000100111000020000000000010000000000001 111000020000002
9000006000001000000060066600012000000000003000000000003
As you can observe, there are multiple lines that start with 5, 6 and 8. I would like to save those individually to the corresponding columns of my table. Is it possible to do this? If so, can anyone suggest the best method? If anything is unclear, please say so.
The code I have written is:
using (StreamReader srRead = new StreamReader(filePath))
{
    while (srRead.Peek() >= 0)
    {
        strLine = srRead.ReadLine();
        if (strLine.StartsWith("1"))
        {
            strFileHeader = strLine;
        }
        if (strLine.StartsWith("5"))
        {
            strBatchHeader = strLine;
        }
        if (strLine.StartsWith("6"))
        {
            strEntry = strLine;
        }
        if (strLine.StartsWith("8"))
        {
            strBtchcntrl = strLine;
        }
        if (strLine.StartsWith("9"))
        {
            strFileCntrl = strLine;
        }
    }
}

string strQuery = "insert into tblfiles(FName, FData, FileHeader, BatchHeader, Entry, BtchEntry, FileControl) values (@_FName, @_FData, @_FileHeader, @_BtchHeader, @_EntryDets, @_BtchCntrl, @_FileCntrl)";
MySqlCommand cmd = new MySqlCommand(strQuery);
cmd.Parameters.Add("@_FName", MySqlDbType.VarChar).Value = filename;
cmd.Parameters.Add("@_FData", MySqlDbType.LongBlob).Value = bytes;
cmd.Parameters.Add("@_FileHeader", MySqlDbType.LongBlob).Value = strFileHeader;
cmd.Parameters.Add("@_BtchHeader", MySqlDbType.LongBlob).Value = strBatchHeader;
cmd.Parameters.Add("@_EntryDets", MySqlDbType.LongBlob).Value = strEntry;
cmd.Parameters.Add("@_BtchCntrl", MySqlDbType.LongBlob).Value = strBtchcntrl;
cmd.Parameters.Add("@_FileCntrl", MySqlDbType.LongBlob).Value = strFileCntrl;
InsertUpdateData(cmd);
But this only inserts the latest line of each type into the DB, whereas I would like to save each and every line as I stated above.
No - a column can only store one value per row. You could combine all your batch headers into one blob and store that as a single value, but you would have to be able to split them apart again when you read the data.
Instead - it looks as though:
each file starts with a '1' record and ends with a '9' record
each file contains zero or more batches
each batch starts with a '5' record and ends with an '8' record
each batch contains zero or more entries ('6' records)
If that is all correct, then you need 3 tables that would look something like:
File table:
Field Type
----------- --------
FileID integer # unique file ID - see AUTO_INCREMENT in the MySQL reference
FName varchar
FData longblob
FileHeader longblob # '1' record
FileControl longblob # '9' record
Batch table:
Field Type
----------- --------
FileID integer # references a row in the File table
BatchID integer # unique batch ID
BatchHeader longblob # '5' record
BatchControl longblob # '8' record
BatchEntry table:
Field Type
----------- --------
BatchID integer # references a row in the Batch table
EntryId integer # unique entry ID
Entry longblob # '6' record
That should get you started. Good luck.
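A rough sketch of how the parent/child inserts could be wired up with MySqlCommand, using LastInsertedId to link each batch back to its file row. The table and column names follow the layout above, conn is assumed to be an open MySqlConnection, and this is an outline rather than tested code:

long fileId;
using (MySqlCommand cmd = new MySqlCommand(
    "insert into File(FName, FData, FileHeader, FileControl) values (@name, @data, @hdr, @ctl)", conn))
{
    cmd.Parameters.AddWithValue("@name", filename);
    cmd.Parameters.AddWithValue("@data", bytes);
    cmd.Parameters.AddWithValue("@hdr", strFileHeader);
    cmd.Parameters.AddWithValue("@ctl", strFileCntrl);
    cmd.ExecuteNonQuery();
    fileId = cmd.LastInsertedId;   // AUTO_INCREMENT value of the new File row
}

// For each '5'/'8' pair found while reading the file, insert one Batch row that
// references fileId, then insert that batch's '6' records into BatchEntry the same way.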
Why don't you use a StringBuilder and append the required lines to it, then write that to the DB, instead of using plain strings? Storing each line in a separate column would make the data tough to retrieve later if you need it. So declare a StringBuilder for each record type, append the matching lines to it, and after reading the whole file write everything to the DB:
string strFileHeader = string.Empty;
StringBuilder strBatchHeader = new StringBuilder();
StringBuilder strEntry = new StringBuilder();
StringBuilder strBtchcntrl = new StringBuilder();
string strFileCntrl = string.Empty;

using (StreamReader srRead = new StreamReader(filePath))
{
    while (srRead.Peek() >= 0)
    {
        strLine = srRead.ReadLine();
        if (strLine.StartsWith("1"))
        {
            strFileHeader = strLine;
        }
        if (strLine.StartsWith("5"))
        {
            strBatchHeader.AppendLine(strLine);
        }
        if (strLine.StartsWith("6"))
        {
            strEntry.AppendLine(strLine);
        }
        if (strLine.StartsWith("8"))
        {
            strBtchcntrl.AppendLine(strLine);
        }
        if (strLine.StartsWith("9"))
        {
            strFileCntrl = strLine;
        }
    }
}

string strQuery = "insert into tblfiles(FName, FData, FileHeader, BatchHeader, Entry, BtchEntry, FileControl) values (@_FName, @_FData, @_FileHeader, @_BtchHeader, @_EntryDets, @_BtchCntrl, @_FileCntrl)";
MySqlCommand cmd = new MySqlCommand(strQuery);
cmd.Parameters.Add("@_FName", MySqlDbType.VarChar).Value = filename;
cmd.Parameters.Add("@_FData", MySqlDbType.LongBlob).Value = bytes;
cmd.Parameters.Add("@_FileHeader", MySqlDbType.LongBlob).Value = strFileHeader;
cmd.Parameters.Add("@_BtchHeader", MySqlDbType.LongBlob).Value = strBatchHeader.ToString();
cmd.Parameters.Add("@_EntryDets", MySqlDbType.LongBlob).Value = strEntry.ToString();
cmd.Parameters.Add("@_BtchCntrl", MySqlDbType.LongBlob).Value = strBtchcntrl.ToString();
cmd.Parameters.Add("@_FileCntrl", MySqlDbType.LongBlob).Value = strFileCntrl;
InsertUpdateData(cmd);
I have a text file that looks like this:
1 \t a
2 \t b
3 \t c
4 \t d
I have a dataset: DataSet ZX = new DataSet();
Is there any way to insert the text file values into this dataset?
Thanks in advance.
You will have to parse the file manually. Maybe like this:
string data = System.IO.File.ReadAllText("myfile.txt");

DataRow row = null;
DataSet ds = new DataSet();
DataTable tab = new DataTable();
tab.Columns.Add("First");
tab.Columns.Add("Second");

string[] rows = data.Split(new char[] { '\n' }, StringSplitOptions.RemoveEmptyEntries);
foreach (string r in rows)
{
    string[] columns = r.Split(new char[] { '\t' }, StringSplitOptions.RemoveEmptyEntries);
    if (columns.Length <= tab.Columns.Count)
    {
        row = tab.NewRow();
        for (int i = 0; i < columns.Length; i++)
            row[i] = columns[i];
        tab.Rows.Add(row);
    }
}
ds.Tables.Add(tab);
UPDATE
If you don't know how many columns are in the text file, you can modify my original example as follows (assuming that the number of columns is constant for all rows):
// ...
string[] columns = r.Split(new char[] { '\t' }, StringSplitOptions.RemoveEmptyEntries);
if (tab.Columns.Count == 0)
{
    for (int i = 0; i < columns.Length; i++)
        tab.Columns.Add("Column" + (i + 1));
}
if (columns.Length <= tab.Columns.Count)
{
    // ...
Also remove the initial creation of table columns:
// tab.Columns.Add("First");
// tab.Columns.Add("Second")
-- Pavel
Sure there is.
Define a DataTable and add DataColumns with the data types that you want.
ReadLine the file, split the values by tab, and add each line to the DataTable as a DataRow by calling NewRow.
There is a nice code sample at MSDN; take a look and follow the steps.
Yes, create the data table on the fly; refer to this article for how to do it.
Read your file line by line and add those values to your data table; refer to this article for how to read a text file.
Try this
private DataTable GetTextToTable(string path)
{
    try
    {
        DataTable dataTable = new DataTable
        {
            Columns = {
                { "MyID", typeof(int) },
                "MyData"
            },
            TableName = "MyTable"
        };

        // Create an instance of StreamReader to read from a file.
        // The using statement also closes the StreamReader.
        using (StreamReader sr = new StreamReader(path))
        {
            String line;
            // Read and display lines from the file until the end of
            // the file is reached.
            while ((line = sr.ReadLine()) != null)
            {
                // Note: "\\t" splits on a literal backslash followed by 't';
                // use "\t" instead if the file contains real tab characters.
                string[] words = line.Split(new string[] { "\\t" }, StringSplitOptions.RemoveEmptyEntries);
                dataTable.Rows.Add(words[0], words[1]);
            }
        }
        return dataTable;
    }
    catch (Exception e)
    {
        // Let the user know what went wrong.
        throw new Exception(e.Message);
    }
}
Call it like
GetTextToTable(Path.Combine(Server.MapPath("."), "TextFile.txt"));
You could also check out CSV File Imports in .NET
I'd also like to add the following to volpan's code:
String _source = System.IO.File.ReadAllText(FilePath, Encoding.GetEncoding(1253));
It's good to specify the encoding of your text file, so that you can read the data correctly and, in my case, export it after modification to another file.
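For example, a small sketch of that round trip; the code page (1253, Greek) and the file names are just placeholders:

using System.IO;
using System.Text;

// Read the source file with its actual encoding.
Encoding enc = Encoding.GetEncoding(1253);
string source = File.ReadAllText("TextFile.txt", enc);

// ... modify the text as needed ...

// Write the result back out using the same encoding.
File.WriteAllText("TextFile_modified.txt", source, enc);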