Boolean value always coming back false in CSV import - C#

I am importing a CSV file with 10 columns. One of them is a Boolean column whose values can be 1, true, false, or 0.
I am reading the CSV file and creating a DataTable. Everything works fine except the Boolean field, which always comes back false.
Below is the code for creating the DataTable from the CSV.
public static DataTable GetDataTable(bool firstRowColumnName, string path)
{
    StreamReader sr = new StreamReader(path);
    string line = null;
    DataTable dtResult = new DataTable();
    if (firstRowColumnName)
    {
        line = sr.ReadLine();
        string[] headers = line.Split(',');
        for (int i = 0; i < headers.Length; i++)
        {
            dtResult.Columns.Add(headers[i]);
        }
    }
    while ((line = sr.ReadLine()) != null)
    {
        if (line.Trim() != string.Empty)
        {
            string[] lineData = line.Split(',');
            if (dtResult.Columns.Count == 0)
            {
                for (int i = 0; i < lineData.Length; i++)
                {
                    dtResult.Columns.Add("Column" + i.ToString());
                }
            }
            DataRow drNew = dtResult.NewRow();
            for (int i = 0; i < lineData.Length; i++)
            {
                drNew[i] = lineData[i];
            }
            dtResult.Rows.Add(drNew);
        }
    }
    return dtResult;
}
Below is the code where I am reading the DataTable. machine is an instance of a class.
machine.IsLaptop = dt.Rows[i]["Laptop"].ToString() == "1" || dt.Rows[i]["Laptop"].ToString().ToLower() == "true";
Kindly suggest why the value is always false.

Since you're working with string values, the value being read is probably " true ", including the spaces.

Try using Convert.ToBoolean and/or Convert.ToInt32 to translate your values.
http://msdn.microsoft.com/en-us/library/system.convert.toboolean%28v=vs.110%29.aspx
You may need some logic to handle cases where the value is not Boolean.TrueString or Boolean.FalseString; in that case you can use Boolean.TryParse.
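Putting both suggestions together, here is a minimal sketch of a helper that trims the raw cell text and accepts 1/0 as well as true/false in any casing. The ParseFlag name is only an illustration, not part of the original code:

    private static bool ParseFlag(object cell)
    {
        // Convert.ToString handles DBNull/null by returning an empty string,
        // and Trim removes the stray spaces that survive a plain Split(',')
        string text = (Convert.ToString(cell) ?? string.Empty).Trim();

        // Accept numeric flags as well as textual ones
        if (text == "1") return true;
        if (text == "0") return false;

        bool parsed;
        return bool.TryParse(text, out parsed) && parsed;
    }

    // usage:
    machine.IsLaptop = ParseFlag(dt.Rows[i]["Laptop"]);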

Related

Import an Excel.Range from a large Excel file to a DataTable (MS.Interop or ExcelDataReader)

One condition: I can't use File.Open or anything like it, because I have already opened the Excel file.
So I tried to use the AsDataSet() method as below, but it throws an exception (access denied, even though I am using FileShare.ReadWrite).
//ExcelDataReader - Config with filter range
var i = 0;
ExcelDataSetConfiguration excelconfig = new ExcelDataSetConfiguration
{
ConfigureDataTable = _ => new ExcelDataTableConfiguration
{
UseHeaderRow = addHeaders,
FilterRow = rowreader => rangeExcel.Row <= ++i - 1,
FilterColumn = (rowreader, colindex) => rangeExcel.Column <= colindex,
},
UseColumnDataType = true,
};
//ExcelDataReader using AsDataSet() below
private static DataTable CreateDataReader(Excel.Workbook book, Excel.Range range, ExcelDataSetConfiguration configuration, string sheetName)
{
var dataSet = new DataSet(sheetName);
using (var stream = File.Open(book.Path, FileMode.OpenOrCreate, FileAccess.Read, FileShare.ReadWrite))
{
using (IExcelDataReader reader = ExcelReaderFactory.CreateReader(stream))
{
return reader.AsDataSet(configuration).Tables[sheetName];
}
}
}
So the second variant is:
private static DataTable CreateDT(Excel.Range range, bool addHeaders)
{
var dt = new DataTable();
int i = 1;
int row = range.Rows.Count;
int col = range.Columns.Count;
if (addHeaders)
{
//Add headers to dt
i++;
}
else
{
//just add columns to dt
}
// Two variants below:
// 1st: the working variant on prod
// here it throws an "Out of Range" exception when the Excel file is large (40 MB)
var arrayRowValue = (object[,])range.Cells.Value;
for (; i < range.Rows.Count + 1; i++)
{
var dataRow = dt.NewRow();
object[] arrayToValue = new object[dt.Columns.Count];
for (int iterator = 0; iterator < arrayToValue.Length; iterator++)
{
arrayToValue[iterator] = arrayRowValue[i, iterator + 1];
}
dataRow.ItemArray = arrayToValue;
dt.Rows.Add(dataRow);
}
}
//2nd variant: I want something like this
DataRow dataRow;
for (; i < row + 1; i++)
{
dataRow = dt.NewRow();
for (int iterator = 0; iterator < col; iterator++)
{
dataRow.ItemArray = range.Rows[i, iterator + 1].Value; // does not contain a Value member
}
dt.Rows.Add(dataRow);
}
What can help me import a large Excel range into a DataTable better and, most importantly, faster?
I already read this thread: Using ExcelDataReader to read Excel data starting from a particular cell.
It doesn't help, because every example uses File.Open, which I can't do in my case :) I already have the Excel Application object, but so far it has been a useless property for me :(
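One direction that fits the stated constraint (no File.Open, only the interop objects already in hand) is to pull the range down in row blocks instead of one giant object[,]. The following is only a sketch under that assumption: the ReadRangeInBlocks name, blockSize and the header handling are illustrative, it assumes the usual Excel = Microsoft.Office.Interop.Excel alias, and it assumes the range is wider than one column so Value2 returns a two-dimensional array.

    private static DataTable ReadRangeInBlocks(Excel.Range range, bool addHeaders, int blockSize)
    {
        var dt = new DataTable();
        int totalRows = range.Rows.Count;
        int totalCols = range.Columns.Count;
        var sheet = range.Worksheet;

        // Column names: either the first row of the range or generated names,
        // mirroring what CreateDT's "add headers" branch would do.
        Excel.Range headerRow = sheet.Range[(Excel.Range)range.Cells[1, 1], (Excel.Range)range.Cells[1, totalCols]];
        object[,] headerValues = (object[,])headerRow.Value2;   // assumes totalCols > 1
        for (int c = 1; c <= totalCols; c++)
        {
            dt.Columns.Add(addHeaders ? Convert.ToString(headerValues[1, c]) : "Column" + c);
        }
        int firstDataRow = addHeaders ? 2 : 1;

        for (int start = firstDataRow; start <= totalRows; start += blockSize)
        {
            int end = Math.Min(start + blockSize - 1, totalRows);

            // One COM round trip per block instead of one per cell or one 40 MB array.
            Excel.Range block = sheet.Range[(Excel.Range)range.Cells[start, 1], (Excel.Range)range.Cells[end, totalCols]];
            object[,] values = (object[,])block.Value2;          // 1-based in both dimensions

            for (int r = 1; r <= end - start + 1; r++)
            {
                object[] rowValues = new object[totalCols];
                for (int c = 1; c <= totalCols; c++)
                {
                    rowValues[c - 1] = values[r, c];
                }
                dt.Rows.Add(rowValues);
            }
        }
        return dt;
    }

Something like ReadRangeInBlocks(rangeExcel, addHeaders, 5000) would then trade memory per call against the number of COM round trips; the right block size for a 40 MB sheet would have to be found by experiment.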

Reading .txt file content to DataTable with Column headers on first line

I am trying to load data from a .txt file which looks like this:
|ABC|DEF|GHI|
|111|222|333|
|444|555|666|
With code:
using (StringReader reader = new StringReader(new StreamReader(fileStream, Encoding.Default).ReadToEnd()))
{
string line;
//reader.ReadLine(); //skip first line
while (reader.Peek() != -1)
{
line = reader.ReadLine();
if (line == null || line.Length == 0)
continue;
string[] values = line.Split('|').Skip(1).ToArray();
if (!isColumnCreated)
{
for (int i = 0; i < values.Count(); i++)
{
table.Columns.Add(values[i]);
}
isColumnCreated = true;
}
DataRow row = table.NewRow();
for (int i = 0; i < values.Count(); i++)
{
row[i] = values[i];
}
table.Rows.Add(row);
products++;
}
}
The problem is that when I generate the DataTable, the first line becomes the columns, but the first line:
|ABC|DEF|GHI|
is also visible in the rows.
How can I put the first line as column headers and the rest as rows?
I do not want to use CSVHelper for that, if possible.
You just need to skip the first line once the header has been created:
string line;
bool bheader = false;
//reader.ReadLine(); //skip first line
while (reader.Peek() != -1)
{
    line = reader.ReadLine();
    if (line == null || line.Length == 0)
        continue;

    string[] values = line.Split('|').Skip(1).ToArray();
    if (!isColumnCreated)
    {
        for (int i = 0; i < values.Count(); i++)
        {
            table.Columns.Add(values[i]);
        }
        isColumnCreated = true;
        bheader = true;
    }

    if (bheader == false)
    {
        DataRow row = table.NewRow();
        for (int i = 0; i < values.Count(); i++)
        {
            row[i] = values[i];
        }
        table.Rows.Add(row);
        products++;
    }
    bheader = false;
}
}
The issue with your current code is that you handle the case where isColumnCreated is false, but not where it is true. If you change this:
if (!isColumnCreated)
{
for (int i = 0; i < values.Count(); i++)
{
table.Columns.Add(values[i]);
}
isColumnCreated = true;
}
DataRow row = table.NewRow();
for (int i = 0; i < values.Count(); i++)
{
row[i] = values[i];
}
table.Rows.Add(row);
products++;
to this
if (!isColumnCreated)
{
    for (int i = 0; i < values.Count(); i++)
    {
        table.Columns.Add(values[i]);
    }
    isColumnCreated = true;
}
else if (isColumnCreated)
{
    DataRow row = table.NewRow();
    for (int i = 0; i < values.Count(); i++)
    {
        row[i] = values[i];
    }
    table.Rows.Add(row);
    products++;
}
it should work just fine. By only creating a row once the column headers already exist, the first pass uses the first line for the headers only; that line is then discarded instead of also being added as a data row.
I would think you want to either add the columns or add a row, not both for the same line.
if (!isColumnCreated)
{
    for (int i = 0; i < values.Count(); i++)
    {
        table.Columns.Add(values[i]);
    }
    isColumnCreated = true;
}
else
{
    DataRow row = table.NewRow();
    for (int i = 0; i < values.Count(); i++)
    {
        row[i] = values[i];
    }
    table.Rows.Add(row);
}
This would work
DataTable dt = new DataTable();
using (System.IO.StreamReader sr = new System.IO.StreamReader("PathToFile"))
{
string currentline = string.Empty;
bool doneHeader = false;
while ((currentline = sr.ReadLine()) != null)
{
if (!doneHeader)
{
foreach (string item in currentline.Split('|')) // '|' is the delimiter in the sample data
{
dt.Columns.Add(item);
}
doneHeader = true;
continue;
}
dt.Rows.Add();
int colCount = 0;
foreach (string item in currentline.Split('|'))
{
dt.Rows[dt.Rows.Count - 1][colCount] = item;
colCount++;
}
}
}
Another method, more LINQ oriented:
Use File.ReadAllLines to read all the file lines into a string array.
Create a List<string[]> containing all the data rows; the column values are produced by splitting each row string on the provided delimiter.
The first row's values are used to build the DataTable Columns.
The first row is then removed from the list.
All the other rows are added to the DataTable.Rows collection.
Finally, set a DataGridView.DataSource to the new DataTable.
char Delimiter = '|';
string[] Lines = File.ReadAllLines("[SomeFilePath]", Encoding.Default);
List<string[]> FileRows = Lines.Select(line =>
line.Split(new[] { Delimiter }, StringSplitOptions.RemoveEmptyEntries)).ToList();
DataTable dt = new DataTable();
dt.Columns.AddRange(FileRows[0].Select(col => new DataColumn() { ColumnName = col }).ToArray());
FileRows.RemoveAt(0);
FileRows.ForEach(row => dt.Rows.Add(row));
dataGridView1.DataSource = dt;

Value was either too large or too small for an Int32. Reading and parsing textfile

I have been unable to apply any solutions to this issue. The exception happens on this line: currentMap[row, col] = Int32.Parse(s); What I want to do is pass this method a specific file storing rows of numbers like this:
1,1,1
1,0,1
1,1,1
I then want each number to be stored in int[,] currentMap, which gets returned. The file I am using contains no large numbers. I think the size of the array I am creating is right, so I don't understand why this isn't working. I am used to doing similar things with nextInt in Java, but I couldn't find an alternative for C#.
Thanks for any help.
private int[,] LoadMapArray(String filename)
{
int[,] currentMap;
int rows = 0;
int cols = 0;
StreamReader sizeReader = new StreamReader(filename);
using (var reader = File.OpenText(filename))
{
while (reader.ReadLine() != null)
{
string line = sizeReader.ReadLine();
cols = line.Length;
rows++;
}
}
currentMap = new int[rows,cols];
StreamReader sr = new StreamReader(filename);
for (int row = 0; row < rows + 1; row++)
{
string line = sr.ReadLine();
string[] split = new string[] {","};
string[] result;
result = line.Split(split, StringSplitOptions.None);
int col = 0;
foreach (string s in result)
{
currentMap[row, col] = Int32.Parse(s);
col++;
}
}
return currentMap;
}
Edit: the code was fixed after changing how I was accessing the file. I then had to change it like this to guard against null:
for (int row = 0; row < rows + 1; row++)
{
string line = sr.ReadLine();
string[] split = new string[] { "," };
string[] result;
if (line != null)
{
result = line.Split(split, StringSplitOptions.None);
int col = 0;
foreach (string s in result)
{
currentMap[row, col] = Int32.Parse(s);
col++;
}
}
}
No, the size of your array is not correct. You read two lines on each loop iteration but increment the rows counter only once.
using (var reader = File.OpenText(filename))
{
string line = string.Empty;
while ((line = reader.ReadLine()) != null)
{
rows++;
}
}
And I am sure the cols count is not correct either, but it doesn't raise an exception because you are dimensioning the cols dimension larger than required (you also count the commas, not just the numbers).
A simpler approach (if your file is not very big) is to use File.ReadAllLines():
string[] split = new string[] {","};
string[] lines = File.ReadAllLines(filename);
int rows = lines.Length;
int cols = lines[0].Split(split, StringSplitOptions.RemoveEmptyEntries).Count();
currentMap = new int[rows,cols];
for (int row = 0; row < rows; row++)
{
string line = lines(row);
string[] result = line.Split(split, StringSplitOptions.None);
int col = 0;
foreach (string s in result)
{
int value;
Int32.TryParse(s, out value)
currentMap[row, col] = value;
col++;
}
}
Now the entire file is in memory after just one disk operation, and you can work with the in-memory strings. The integer parsing should be changed to use Int32.TryParse to avoid an exception in case the retrieved value is not a valid integer.

Incorrect date format while exporting to csv

I am a newbie to C# and I am trying to export table data from an SQLite 3 database to CSV, but the datetime format changes in the CSV. The format in SQLite 3 is:
2013-04-16 21:33:42.000
while the datetime in the CSV shows up like this:
16/04/2013 21:33:42,
This is my code:
SQLiteConnection m_dbConnection;
//m_dbConnection = new SQLiteConnection("Data Source= C:/Users/IT-Administrator/Desktop/WebMobility.db; Version=3;");
m_dbConnection = new SQLiteConnection("Data source = F:/Explor/final test/WebMobility.db; Version=3;");
m_dbConnection.Open();
SQLiteCommand myCommand = new SQLiteCommand();
myCommand.Connection = m_dbConnection;
myCommand.CommandText = "select CompanyId,`DateTime`,Serial,ShortDeviceId,MatricolaA,Upper(Targa),CommonRoadDescription,RoadCivicNumber,GpsAddress,VerbaliVehicleTypeDescription,VehicleBrandDescription,VehicleModelDescription,CommonColorVehicleDescription,VerbaliRuleOneCode,VerbaliRuleOneDescription,VerbaliClosedNoteDescription,VerbaliRuleOnePoints,VerbaliMissedNotificationDescription from VerbaliData";
//myCommand.Connection = myConn;
DataTable data = new DataTable();
SQLiteDataAdapter myAdapter = new SQLiteDataAdapter(myCommand);
//myAdapter.SelectCommand = myCommand;
myAdapter.Fill(data);
dataGridView1.DataSource = data;
this.dataGridView1.Refresh();
if (dataGridView1.RowCount > 0)
{
string value = "";
DataGridViewRow dr = new DataGridViewRow();
StreamWriter swOut = new StreamWriter("F:/Explor/final test/finaltest12.csv");
//write header rows to csv
for (int i = 0; i <= dataGridView1.Columns.Count - 1; i++)
{
if (i > 0)
{
swOut.Write(",");
}
swOut.Write(dataGridView1.Columns[i].HeaderText);
}
swOut.WriteLine();
//write DataGridView rows to csv
for (int j = 0; j <= dataGridView1.Rows.Count - 1; j++)
{
if (j > 0)
{
swOut.WriteLine();
}
dr = dataGridView1.Rows[j];
for (int i = 0; i <= dataGridView1.Columns.Count - 1; i++)
{
if (i > 0)
{
swOut.Write(",");
}
value = dr.Cells[i].Value.ToString();
//replace comma's with spaces
value = value.Replace(',', ' ');
//replace embedded newlines with spaces
value = value.Replace(Environment.NewLine, " ");
swOut.Write(value);
}
}
swOut.Close();
}
m_dbConnection.Close();
DateTime columns have no format. It is the tool used to display the column content that renders the value in a particular format.
That said, it is clear that you need to do the same thing to print your value to the CSV file.
Inside your inner loop, check whether the column is the datetime one and then format its value as you like.
for (int i = 0; i <= dataGridView1.Columns.Count - 1; i++)
{
if (i > 0)
{
swOut.Write(",");
}
// Datetime column content transformed in a formatted string....
if(i == 1)
value = Convert.ToDateTime(dr.Cells[i].Value).ToString("yyyy-MM-dd HH:mm:ss.fff"); // HH = 24-hour clock, matching the 21:33:42 sample
else
....
EDIT: in case you have null or empty values in your cells, you need to check for that before trying to convert.
// Datetime column content transformed in a formatted string....
if(i == 1)
{
object cellValue = dr.Cells[i].Value;
value = (cellValue == DBNull.Value ?
    string.Empty : Convert.ToDateTime(cellValue).ToString("yyyy-MM-dd HH:mm:ss.fff"));
}
This will write an empty column where the date is expected. Whether that is correct depends on how the CSV file will be consumed. Instead of an empty string you could save a predefined value.
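For example, writing a fixed placeholder instead of an empty field. The placeholder date below is only an illustration; pick whatever the CSV consumer expects:

    if (i == 1)
    {
        object cellValue = dr.Cells[i].Value;
        value = (cellValue == DBNull.Value || cellValue == null)
            ? "1900-01-01 00:00:00.000"   // placeholder for missing dates
            : Convert.ToDateTime(cellValue).ToString("yyyy-MM-dd HH:mm:ss.fff");
    }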
if (i == 1)
{
    value = dr.Cells[i].Value.ToString();
    if (value != null)
    {
        string dt = DateTime.Parse(value).ToString("yyyy-MM-dd HH:mm:ss");
        swOut.Write(dt);
    }
    ......
}

Reading from a CSV file and storing to an array

Can you please help me with C#?
I am trying to create a function in C# that opens a CSV file and saves its values to an array:
FileStream fileStream = new FileStream(guid.ToString(), FileMode.Open);
for (int i = 1; i > 200; i++) // it checks the first 200 lines
{
int j = 0;
string[] str = new string[j];
do
{
// saving each character to the variable until comma is found
} while(str == '\n'); // read each character in a for loop until new line character found
}
Can you please help me out?
Something like this:
using (StreamReader r = new StreamReader(guid.ToString()))
{
    string line;
    int linesCount = 0;            // must be initialized before it is incremented
    ArrayList result = new ArrayList();
    while ((line = r.ReadLine()) != null && ++linesCount <= 200)
    {
        result.AddRange(line.Split(','));
    }
}
Parsing CSV by hand is actually pretty tricky. You might be better off reusing the TextFieldParser (add a reference to the Microsoft.VisualBasic assembly).
using Microsoft.VisualBasic.FileIO;
....
string[,] parsedCsv;
List<string[]> csvLines = new List<string[]>();
TextFieldParser parser = new TextFieldParser(new FileStream(guid.ToString(), FileMode.Open));
parser.Delimiters = new string[] { "," };
parser.TextFieldType = FieldType.Delimited;
int maxLines = 200, lineCount = 0;
try
{
while (!parser.EndOfData && lineCount++ < maxLines)
{
csvLines.Add(parser.ReadFields());
}
}
catch (MalformedLineException)
{
Console.WriteLine("Line Number: {0} Value: {1}", parser.ErrorLineNumber, parser.ErrorLine);
return;
}
parsedCsv = new string[csvLines.Count, csvLines[0].Length];
for (int i = 0; i < csvLines.Count; i++)
{
for (int j = 0; j < csvLines[i].Length; j++)
{
parsedCsv[i, j] = csvLines[i][j];
}
}
I have assumed here that the output is going to be a 2-D array of strings - you may need to adjust this code depending on what you are after, especially if you have to cope with the situation where each line does not have the same number of fields (perhaps unlikely, but still).
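If the lines really can have different field counts, one illustrative way to keep the 2-D array output is to size the second dimension by the widest line instead of the first one. This is only a sketch; cells past the end of a short line simply stay null:

    int maxCols = csvLines.Max(fields => fields.Length);   // requires using System.Linq;
    parsedCsv = new string[csvLines.Count, maxCols];
    for (int i = 0; i < csvLines.Count; i++)
    {
        for (int j = 0; j < csvLines[i].Length; j++)
        {
            parsedCsv[i, j] = csvLines[i][j];
        }
    }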
The really useful thing about TextFieldParser is that it will cope with different kinds of delimiters. By setting parser.Delimiters = new string[] { "\t" };, for example, this same code could parse tab-delimited text.
What about:
string[] lines = File.ReadAllLines(path);
int count = Math.Min(200, lines.Length);   // also handles files shorter than 200 lines
for (int i = 0; i < count; i++)
{
    string[] str = lines[i].Split(',');
    //do something here
}
You can just use string.Split(',').
using (StreamReader streamReader = new StreamReader(File.OpenRead(guid.ToString())))
{
    for (int i = 0; i < 200; ++i)
    {
        string currentLine = streamReader.ReadLine();
        if (currentLine == null)   // fewer than 200 lines in the file
            break;
        string[] str = currentLine.Split(',');
    }
}
Split returns a string array of the individual values that were separated by commas.
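Since the goal is to keep the values in an array, here is a hedged variant of the same loop that collects each line's fields into a list; the rows name and the choice of List<string[]> as the container are illustrative:

    // using System.Collections.Generic; using System.IO;
    var rows = new List<string[]>();
    using (StreamReader streamReader = new StreamReader(File.OpenRead(guid.ToString())))
    {
        string currentLine;
        // Read at most 200 lines, stopping early at end of file
        for (int i = 0; i < 200 && (currentLine = streamReader.ReadLine()) != null; i++)
        {
            rows.Add(currentLine.Split(','));
        }
    }
    // rows[lineIndex][fieldIndex] now holds each comma-separated value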
