I have a method I'm using for a test that takes the data in a table row, turns it into a string, then uses the Split() function to assign each piece to a property of an object of my class. I then add that object to a list of the same class. But each time it runs through the foreach loop it just updates every object to the new data, so the data is the same in each index when it is supposed to be different. How can I fix this?
Here is the method:
public List<IMUItemModel> tableData()
{
    _webDriver.Wait.UntilPageLoadIsComplete(60);
    List<IMUItemModel> IMUDataList = new List<IMUItemModel>();
    IMUItemModel IMUData = new IMUItemModel();
    foreach (var row in IMUDataTableRows)
    {
        string column = row.Text;
        var split = column.Split(' ');
        IMUData.MedVendID = split[0];
        IMUData.DeviceSerialNumber = split[1];
        IMUData.ItemID = split[2];
        IMUData.Station = split[3];
        IMUData.Console = split[4];
        IMUDataList.Add(IMUData);
        Console.WriteLine(IMUData.MedVendID);
    }
    return IMUDataList;
}
You are creating a single instance of IMUData for the entire list, so the same reference is assigned new values in each iteration. At the end of the loop, all the items hold the data of the last row, because every index in the list points to that one object you kept updating.
To fix this, move the line below inside the foreach loop:
IMUItemModel IMUData = new IMUItemModel();
This ensures you create a new instance each time you add an item to the list.
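Applied to the method above, the corrected loop would look like the sketch below. It reuses the original property names; the IMUItemModel class body and the plain-string input are assumptions made so the example is self-contained:

```csharp
using System;
using System.Collections.Generic;

public class IMUItemModel
{
    public string MedVendID { get; set; }
    public string DeviceSerialNumber { get; set; }
    public string ItemID { get; set; }
    public string Station { get; set; }
    public string Console { get; set; }
}

public static class Demo
{
    public static List<IMUItemModel> TableData(IEnumerable<string> rowTexts)
    {
        var list = new List<IMUItemModel>();
        foreach (var text in rowTexts)
        {
            var split = text.Split(' ');
            // A new instance per iteration, so each list entry is a distinct object.
            var item = new IMUItemModel
            {
                MedVendID = split[0],
                DeviceSerialNumber = split[1],
                ItemID = split[2],
                Station = split[3],
                Console = split[4]
            };
            list.Add(item);
        }
        return list;
    }
}
```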
The txt file has a specific format: it uses ';' as the delimiter and has a fixed number of columns. I also have a table, created code-first with Entity Framework, which has the same number of columns.
So far I have been able to import such txt files into tables using "raw" SQL queries like BULK INSERT, but I am trying to learn how to do this from a web app using C# (or LINQ if needed).
I came across this solution in another question, but it creates a new table named tbl; what I would like to do instead is insert the data into an existing one.
public DataTable ConvertToDataTable(string filePath, int numberOfColumns)
{
    DataTable tbl = new DataTable();
    for (int col = 0; col < numberOfColumns; col++)
        tbl.Columns.Add(new DataColumn("Column" + (col + 1).ToString()));

    string[] lines = System.IO.File.ReadAllLines(filePath);
    foreach (string line in lines)
    {
        var cols = line.Split(';'); // the file uses ';' as its delimiter
        DataRow dr = tbl.NewRow();
        for (int cIndex = 0; cIndex < numberOfColumns; cIndex++)
        {
            dr[cIndex] = cols[cIndex];
        }
        tbl.Rows.Add(dr);
    }
    return tbl;
}
First of all, my advice would be not to parse the CSV file yourself. Use a NuGet CSV serializer like CsvHelper.
With CsvHelper you convert the lines directly into your destination type:
using (TextReader txtReader = new StreamReader(sourceFileName))
{
    var csvReader = new CsvReader(txtReader);
    IEnumerable<MyClass> result = csvReader.GetRecords<MyClass>();
    // TODO: put result into database
}
One of the constructors of CsvReader takes a configuration object in which you can define your delimiter (";"), header rows, comment lines, what to do with empty lines, etc.
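For illustration, here is a sketch of that configuration against a recent CsvHelper version. The exact configuration API has changed across CsvHelper versions, so treat the property names as assumptions to check against the version you install; MyClass and its properties are placeholders:

```csharp
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using CsvHelper;
using CsvHelper.Configuration;

public class MyClass
{
    public string Codigo { get; set; }
    public string Nombre { get; set; }
    public decimal Sueldo { get; set; }
}

public static class CsvImport
{
    public static List<MyClass> Read(string sourceFileName)
    {
        var config = new CsvConfiguration(CultureInfo.InvariantCulture)
        {
            Delimiter = ";",        // the file's delimiter
            HasHeaderRecord = false // the txt file has no header row
        };
        using (var txtReader = new StreamReader(sourceFileName))
        using (var csvReader = new CsvReader(txtReader, config))
        {
            // Without a header row, a ClassMap with explicit column indexes
            // may be needed so fields map to the right properties.
            // Materialize before the reader is disposed.
            return new List<MyClass>(csvReader.GetRecords<MyClass>());
        }
    }
}
```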
If you decide not to use CsvHelper you will need to convert your lines into MyClass objects:
IEnumerable<MyClass> ConvertTxtFile(string fileName)
{
    // TODO: checks to see if fileName is a proper file
    IEnumerable<string> lines = System.IO.File.ReadAllLines(fileName);
    foreach (string line in lines)
    {
        yield return StringToMyClass(line);
    }
}

MyClass StringToMyClass(string line)
{
    // TODO: code to convert your line into a MyClass.
}
As you didn't ask how to convert a line into a MyClass, I'll leave that to you.
After a while, you have a sequence of MyClass objects. Your question is how to add them to your database using Entity Framework and LINQ.
Well, that is the easy part (once you've learned how to use Entity Framework).
Supposing your DbContext has a DbSet<MyClass>, representing a table of MyClass objects:
IEnumerable<MyClass> readItems = ConvertTxtFile(fileName);
using (var dbContext = new MyDbContext())
{
    dbContext.MyClasses.AddRange(readItems.ToList());
    dbContext.SaveChanges();
}
With the following code I'm trying to read a CSV file that contains double values and convert it into a list. If I print that list, the output just contains "System.Collections.Generic.List`1[System.String]". What is wrong with my code?
var filePath = @"C:\Users\amuenal\Desktop\Uni\test.csv";
var contents = File.ReadAllText(filePath).Split(';');
var csv = from line in contents
          select line.Split(';').ToList();
foreach (var i in csv)
{
Console.WriteLine(i);
}
You got a couple of things wrong in your code. First, you should most likely be using ReadAllLines() instead of ReadAllText(). Secondly, your LINQ query was returning a sequence of List<string> (effectively a List<List<string>>), which I imagine is not what you wanted. I would try something like this:
var filePath = @"C:\Users\amuenal\Desktop\Uni\test.csv";
// iterate through all the rows
foreach (var row in File.ReadAllLines(filePath))
{
    // iterate through each column in each row
    foreach (var col in row.Split(';'))
    {
        Console.WriteLine(col);
    }
}
This should do the trick. Hope this helps.
var filePath = @"C:\Users\amuenal\Desktop\Uni\test.csv";
var contents = File.ReadAllLines(filePath);
var csv = (from line in contents
           select line.Split(';')).SelectMany(x1 => x1);
foreach (var i in csv)
{
    Console.WriteLine(i);
}
csv is an IEnumerable of List<string> (in other words, each "i" is a list of strings), so Console.WriteLine(i) prints the type name rather than the contents.
You need two loops:
foreach (var list in csv)
{
    foreach (var str in list)
    {
        Console.WriteLine(str);
    }
}
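Since the file contains double values, here is a sketch that goes one step further and parses the fields into a flat List<double>. It assumes invariant-culture number formatting (a '.' decimal separator); adjust the CultureInfo if the file uses ',' for decimals:

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;

public static class CsvDoubles
{
    public static List<double> ParseAll(IEnumerable<string> lines)
    {
        return lines
            .SelectMany(line => line.Split(';'))               // flatten rows into fields
            .Where(field => !string.IsNullOrWhiteSpace(field)) // skip empty trailing fields
            .Select(f => double.Parse(f, CultureInfo.InvariantCulture))
            .ToList();
    }
}

// Usage with the file from the question:
// var values = CsvDoubles.ParseAll(File.ReadAllLines(filePath));
```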
I have an Excel sheet with a single column of data.
I've only really done this with a CSV, and I've forgotten how to do it with a single column; I can't seem to get it to work. Here is what I have now, which I think is right, but it is not working.
List<string> strList = new List<string>();
var reader = new StreamReader(File.OpenRead(@"C:\vcl\CompletionTest.xlsx"));
while (!reader.EndOfStream)
{
    var line = reader.ReadLine();
    var value = reader[0].ToString();
    strList.Add(values[0]);
}
Please and thank you!
public partial class Form1 : Form
{
    public SqlConnection con = new SqlConnection(@"Data Source=(LocalDB)\v11.0;AttachDbFilename=C:\Users\Alexander\Desktop\Archivos.mdf;Integrated Security=True;Connect Timeout=30");

    private void Readbtn_Click(object sender, EventArgs e)
    {
        con.Open();
        Propiedades prop = new Propiedades();
        List<string> myValues = new List<string>();
        string line;
        StreamReader file = new StreamReader(@"c:\temp\archivo.txt");
        if ((line = file.ReadLine()) != null)
        {
            string[] fields = line.Split(',');
            prop.matricula = fields[0].ToString();
            prop.nombre = fields[1].ToString();
            prop.sueldo = decimal.Parse(fields[2]);
            for (int i = 0; i < fields.Length; i++)
            {
                listBox1.Items.Add(fields[i]);
            }
        }
        SqlCommand cmd = new SqlCommand("INSERT INTO Archivos(Codigo, Nombre, Sueldo) VALUES (@Matricula, @Nombre, @Sueldo)", con);
        cmd.Parameters.AddWithValue("@Matricula", prop.matricula);
        cmd.Parameters.AddWithValue("@Nombre", prop.nombre);
        cmd.Parameters.AddWithValue("@Sueldo", prop.sueldo);
        cmd.ExecuteNonQuery();
        con.Close();
    }
Hello guys, I just need a little help modifying this code. It already saves the first line of the text file, but I don't have any ideas on how to make it read the other lines. I would also like to validate against the SQL table: if it already contains that info, the code should throw an exception or show a message box letting the user know the record already exists. Please help.
There are a few things you need to do. The first that jumps out at me is that you should use a while loop in place of your first if statement:
while ((line = file.ReadLine()) != null)
{
    string[] fields = line.Split(',');
    prop.matricula = fields[0].ToString();
    prop.nombre = fields[1].ToString();
    prop.sueldo = decimal.Parse(fields[2]);
    for (int i = 0; i < fields.Length; i++)
    {
        listBox1.Items.Add(fields[i]);
    }
}
Also, depending on how you want to display the items in the list box, you may want to use this instead of that inner for loop:
listBox1.Items.Add(prop.matricula + "," + prop.nombre + "," + prop.sueldo.ToString());
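The question also asked how to warn the user when a record already exists. One way to do that, sketched here under the assumption that Codigo is the column that should be unique (the table and column names come from the question), is to check before inserting:

```csharp
using System.Data.SqlClient;

public static class ArchivoChecks
{
    // Returns true if a row with this Codigo is already in the Archivos table.
    // 'con' is assumed to be an already-open SqlConnection.
    public static bool RecordExists(SqlConnection con, string matricula)
    {
        using (var check = new SqlCommand(
            "SELECT COUNT(*) FROM Archivos WHERE Codigo = @Matricula", con))
        {
            check.Parameters.AddWithValue("@Matricula", matricula);
            return (int)check.ExecuteScalar() > 0;
        }
    }
}

// Usage inside the while loop, before the INSERT:
// if (ArchivoChecks.RecordExists(con, prop.matricula))
// {
//     MessageBox.Show("The record already exists: " + prop.matricula);
//     continue; // skip the insert for this line
// }
```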
I would recommend reading the file line by line, adding the values to a DataTable, and then using SqlBulkCopy to store it in the database.
It would be something like:
int batchSize = 10000;
SqlBulkCopy bc = new SqlBulkCopy(con)
{
    DestinationTableName = "Archivos" // required so WriteToServer knows the target table
};
DataTable dtArchivos = new DataTable();
dtArchivos.Columns.AddRange(new[]
{
    new DataColumn("Codigo", typeof(string)),
    new DataColumn("Nombre", typeof(string)),
    new DataColumn("Sueldo", typeof(decimal)),
});
StreamReader file = new StreamReader(@"c:\temp\archivo.txt");
string line = file.ReadLine();
while (line != null)
{
    string[] fields = line.Split(',');
    DataRow dr = dtArchivos.NewRow();
    dr["Codigo"] = fields[0];
    dr["Nombre"] = fields[1];
    dr["Sueldo"] = decimal.Parse(fields[2]);
    dtArchivos.Rows.Add(dr);
    if (dtArchivos.Rows.Count == batchSize)
    {
        bc.WriteToServer(dtArchivos);
        dtArchivos.Clear();
    }
    line = file.ReadLine();
}
bc.WriteToServer(dtArchivos);
I added some logic to do batching, as a large file might cause the application to run out of memory due to the size of the DataTable.
First of all, consider using LINQ and Entity Framework for your code. If you are not overly concerned about a little extra overhead, they will make your life a whole lot easier: Entity Framework tracks the entities for you, can be more efficient for a lot of things, and the code is often much easier to understand. For this you would need to create an ADO.NET entity model of the database in Visual Studio, which shouldn't take too long. The precise way, I think, is:
File -> Add Item -> Add New Item
or something like that, then searching for "ADO.NET entity". Then just fill in the details it asks for. You can then use this entity model (if you can't figure out how, please reply so I can fix this answer as needed) to add data the way you specified.
Now, the reason it may not be working could be a problem in the code. I think the separator may be the problem, and the following amended code may be helpful:
StreamReader file = new StreamReader(@"c:\temp\archivo.txt");
string line;
while ((line = file.ReadLine()) != null)
{
    // YOUR CODE FOR PROCESSING EACH LINE HERE
}
The reason is that ReadLine() returns null at the end of the stream, so this loop keeps reading until the file is exhausted. Your code currently stops after the first line, and this should solve it.