DateTime.Parse date conversion is not consistent - C#

I have an Excel sheet where one column has a date. I have C# code to take that column, convert it to a date and then insert it to a SQL database.
The conversion is done like this:
r.transactionDate = DateTime.Parse(Convert.ToString(xlRange.Cells[i, 1].Value));
The data transfer is working without an issue.
When the date in the Excel sheet is something like 25/05/2022 (25th of May), it is converted properly and the date I get in the SQL database is 25/05/2022.
However, the problem is when the date is something like 12/05/2022 (12th of May); it is converted to 05/12/2022 (5th of December).
How can I fix this issue?

It's an Excel sheet downloaded from PayPal that has all the dates of payments, in a date column.
PayPal does not offer a download in XLS or XLSX format. What you have is, most likely, a CSV file that Excel can open. It appears as an Excel file in your file system because Excel has registered the CSV file extension.
CSV (Comma-Separated Values) is a text file with all values represented as strings, separated by the , character. It has been around for a very long time, but was never formally specified. Or more precisely, formal specifications for CSV have been created several times, and nobody really knows which one is the right one. CSV files that work perfectly with one program can be unreadable to another.
Excel's CSV import is notorious for using US date formats regardless of your computer's regional settings. Each value is separated and examined individually. If the value looks like a date format then Excel attempts to parse it as a US date. If the first set of digits is in the range 1-12 then it is interpreted as the month. If the first set of digits is 13 or more then it tries the regional date strings, and may fall back to day-first if that fails. (This varies apparently.)
For this reason (among others) I strongly recommend that you never open a CSV file via Excel automation. Ever. In fact you should never open a CSV file with dates in it using Excel unless you know that the file was generated with month-first date formats only. Even then it is a needless waste of resources to open a text file with Excel.
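Whichever way you read the file, the direct fix for the day/month swap is to parse the date text with an explicit format instead of relying on the machine's culture. A minimal sketch (the "dd/MM/yyyy" format is an assumption; match it to whatever the file actually contains):

```csharp
using System;
using System.Globalization;

class Program
{
    static void Main()
    {
        // "12/05/2022" must mean the 12th of May regardless of the
        // machine's regional settings, so state the format explicitly.
        bool ok = DateTime.TryParseExact(
            "12/05/2022", "dd/MM/yyyy",
            CultureInfo.InvariantCulture, DateTimeStyles.None,
            out DateTime date);

        Console.WriteLine($"{ok}: {date:yyyy-MM-dd}");  // True: 2022-05-12
    }
}
```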
You can do better.
There are plenty of libraries out there that will help you with CSV import. I've used a few (CsvHelper is a reasonable starting place), but usually I just write something simple to do the job. Read the file a line at a time (StreamReader.ReadLine), split each line into a collection of values (via my SplitCSV method), then write a ParseCSV method for the type that takes a collection of strings and returns a configured object. That way I have direct control over the way the input data is interpreted without having Excel mess things up for me.
Here's a simple example:
const string InputFile = @"C:\Temp\test.csv";
static void Main()
{
var rows = ReadCSV(InputFile, RowData.ParseCSV);
foreach (var row in rows)
{
Console.WriteLine($"#{row.RowNum}, Date: {row.SomeDate}, Amount: ${row.Amount:#,0.00}");
}
}
class RowData
{
public int RowNum { get; set; }
public DateTime SomeDate { get; set; }
public decimal Amount { get; set; }
public static RowData ParseCSV(string[] values)
{
if (values is null || values.Length < 3)
return null;
if (!int.TryParse(values[0], out var rownum) ||
!DateTime.TryParse(values[1], out var somedate) ||
!decimal.TryParse(values[2], out var amount)
)
return null;
return new RowData
{
RowNum = rownum,
SomeDate = somedate,
Amount = amount
};
}
}
static IEnumerable<T> ReadCSV<T>(string filename, Func<string[], T> parser, bool skipHeaders = true)
where T : class
{
bool first = true;
foreach (var line in Lines(filename))
{
if (first)
{
first = false;
if (skipHeaders)
continue;
}
T curr = null;
try
{
var values = SplitCSV(line);
curr = parser(values);
}
catch
{
// Do something here if you care about bad data.
}
if (curr != null)
yield return curr;
}
}
static string[] SplitCSV(string line)
{
var sb = new StringBuilder();
return internal_split().ToArray();
// The actual split, done as an enumerator.
IEnumerable<string> internal_split()
{
bool inQuote = false;
foreach (char c in line)
{
if (c == ',' && !inQuote)
{
// yield value
yield return sb.ToString();
sb.Clear();
}
else if (c == '"')
inQuote = !inQuote;
else
sb.Append(c);
}
// yield last field
yield return sb.ToString();
}
}
static IEnumerable<string> Lines(string filename)
{
string line;
using var reader = File.OpenText(filename);
while ((line = reader.ReadLine()) != null)
{
yield return line;
}
}
(It's not the best way, but it's a way to do it.)

Related

C# Export ListView to CSV

I need to export a ListView to CSV and I found a solution from this answer: Here.
The problem is that I am getting the full line in one column instead of one value per column (screenshots of the actual and expected results omitted).
Code below:
public static void ListViewToCSV(ListView listView, string filePath, bool includeHidden)
{
//make header string
StringBuilder result = new StringBuilder();
WriteCSVRow(result, listView.Columns.Count, i => includeHidden || listView.Columns[i].Width > 0, i => listView.Columns[i].Text);
//export data rows
foreach (ListViewItem listItem in listView.Items)
WriteCSVRow(result, listView.Columns.Count, i => includeHidden || listView.Columns[i].Width > 0, i => listItem.SubItems[i].Text);
File.WriteAllText(filePath, result.ToString());
}
private static void WriteCSVRow(StringBuilder result, int itemsCount, Func<int, bool> isColumnNeeded, Func<int, string> columnValue)
{
bool isFirstTime = true;
for (int i = 0; i < itemsCount; i++)
{
if (!isColumnNeeded(i))
continue;
if (!isFirstTime)
result.Append(",");
isFirstTime = false;
result.Append(String.Format("\"{0}\"", columnValue(i)));
}
result.AppendLine();
}
Any ideas what I have to do?
This has nothing to do with CSVs. The first screenshot shows a perfectly good CSV. It also shows an attempt to open that file in Excel.
Excel will use the end user's locale to decide which column separator, decimal separator and date formats to use. When you double-click on a text file, Excel will import that data using the end user's locale settings as the default.
In most European countries , is the decimal separator, so it can't be used as a list separator too. Otherwise you wouldn't know what 100,00,234 means. Is that 2 or 3 columns?
In such cases the typical list separator is ;. This is specified in the end user's locale settings. In most European countries you'd see 100,00;234
If you want your end users to be able to open the generated files in Excel with double-clicking you'll have to use the list and decimal separators that match their locales.
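To illustrate, a small sketch that builds one CSV line with the current user's list separator and matching number formatting (the values are made up):

```csharp
using System;
using System.Globalization;

class Program
{
    static void Main()
    {
        CultureInfo culture = CultureInfo.CurrentCulture;
        string sep = culture.TextInfo.ListSeparator;  // e.g. "," for en-US, ";" for de-DE

        // Format the amount with the same culture so its decimal separator
        // can never be confused with the list separator.
        string line = string.Join(sep, 100.00m.ToString("0.00", culture), "234");
        Console.WriteLine(line);  // e.g. "100.00,234" (en-US) or "100,00;234" (de-DE)
    }
}
```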
A better option would be to generate real Excel files with, e.g., EPPlus. It doesn't require installing Excel on the client or server, and it generates real Excel (xlsx) files.
Exporting your data can be as easy as:
using (var pck = new ExcelPackage())
{
var sheet = pck.Workbook.Worksheets.Add("sheet");
sheet.Cells["C1"].LoadFromCollection(items);
pck.SaveAs(new FileInfo(@"c:\workbooks\myworkbook.xlsx"));
}

Use the continue keyword to proceed with the loop

I am reading data from an Excel file (which is actually a comma-separated CSV file) line by line; this file is sent by an external entity. Among the columns to be read is the time, which is in 00.00 format, so a Split call is used to read the different columns. However, the file sometimes comes with extra columns (extra commas between the elements), so the split elements are not always correct. Below is the code used to read and split the different columns; these elements will be saved in the database.
public void SaveFineDetails()
{
List<string> erroredFines = new List<string>();
try
{
log.Debug("Start : SaveFineDetails() - Saving Downloaded files fines..");
if (!this.FileLines.Any())
{
log.Info(string.Format("End : SaveFineDetails() - DataFile was Empty"));
return;
}
using (RAC_TrafficFinesContext db = new RAC_TrafficFinesContext())
{
this.FileLines.RemoveAt(0);
this.FileLines.RemoveAt(FileLines.Count - 1);
int itemCnt = 0;
int errorCnt = 0;
int duplicateCnt = 0;
int count = 0;
foreach (var line in this.FileLines)
{
count++;
log.DebugFormat("Inserting {0} of {1} Fines..", count.ToString(), FileLines.Count.ToString());
string[] bits = line.Split(',');
int bitsLength = bits.Length;
if (bitsLength == 9)
{
string fineNumber = bits[0].Trim();
string vehicleRegistration = bits[1];
string offenceDateString = bits[2];
string offenceTimeString = bits[3];
int trafficDepartmentId = this.TrafficDepartments.Where(tf => tf.DepartmentName.Trim().Equals(bits[4], StringComparison.InvariantCultureIgnoreCase)).Select(tf => tf.DepartmentID).FirstOrDefault();
string proxy = bits[5];
decimal fineAmount = GetFineAmount(bits[6]);
DateTime fineCreatedDate = DateTime.Now;
DateTime offenceDate = GetOffenceDate(offenceDateString, offenceTimeString);
string username = Constants.CancomFTPServiceUser;
bool isAartoFine = bits[7] == "1" ? true : false;
string fineStatus = "Sent";
try
{
var dupCheck = db.GetTrafficFineByNumber(fineNumber);
if (dupCheck != null)
{
duplicateCnt++;
string ExportFileName = (base.FileName == null) ? string.Empty : base.FileName;
DateTime FileDate = DateTime.Now;
db.CreateDuplicateFine(ExportFileName, FileDate, fineNumber);
}
else
{
var adminFee = db.GetAdminFee();
db.UploadFTPFineData(fineNumber, fineAmount, vehicleRegistration, offenceDate, offenceDateString, offenceTimeString, trafficDepartmentId, proxy, false, "Imported", username, adminFee, isAartoFine, dupCheck != null, fineStatus);
}
itemCnt++;
}
catch
{
errorCnt++;
}
}
else
{
erroredFines.Add(line);
continue;
}
}
Now the problem is that this file doesn't always come with the 9 elements we expect. For example, in this image the lines are not the same (ignore the first line, it's headers).
On the first line, FM is supposed to be part of 36DXGP instead of being two separate elements, which means there are now extra columns. This brings us to the issue at hand, the time element: because of the extra comma, the time is now something else; it is read as 20161216, so the split on the time element is not working at all. So what I did was read the incorrect line, check its length, and if the length is not 9, add it to the error list and continue.
But my continue keyword doesn't seem to work; it gets into the else part and then goes back to read the very same error line.
I have checked the answers on Break vs Continue and they provide good examples of how continue works. I introduced the else because the format in those examples did not work for me (the else did not make any difference either). Here is the sample data,
NOTE the first line to be read starts with 96
H,1789,,,,,,,,
96/17259/801/035415,FM,36DXGP,20161216,17.39,city hall-cape town,Makofane,200,0,0
MA/80/034808/730,CA230721,20170117,17.43,malmesbury,PATEL,200,0,0,
What is it that I am doing so wrong here?
I have found a way to solve my problem. There was an issue with the length of the line because of the trailing comma, which caused an empty element. I got rid of this empty element with the following code and determined the new length:
bits = bits.Where(x => !string.IsNullOrEmpty(x)).ToArray();
int length = bits.Length;
All is well now
I suggest you use the following overload for performance and readability reasons:
line.Split(new char[] { ',' }, StringSplitOptions.RemoveEmptyEntries);

Read Specific Strings from Text File

I'm trying to get certain strings out of a text file and put it in a variable.
This is what the structure of the text file looks like keep in mind this is just one line and each line looks like this and is separated by a blank line:
Date: 8/12/2013 12:00:00 AM Source Path: \\build\PM\11.0.64.1\build.11.0.64.1.FileServerOutput.zip Destination Path: C:\Users\Documents\.NET Development\testing\11.0.64.1\build.11.0.55.5.FileServerOutput.zip Folder Updated: 11.0.64.1 File Copied: build.11.0.55.5.FileServerOutput.zip
I wasn't entirely sure what to use as a delimiter for this text file, or even whether I should be using a delimiter at all, since the format could be subject to change.
So just a quick example of what I want to happen with this, is I want to go through and grab the Destination Path and store it in a variable such as strDestPath.
Overall the code I came up with so far is this:
//find the variables from the text file
string[] lines = File.ReadAllLines(GlobalVars.strLogPath);
Yeah, not much. I thought perhaps I could just read one line at a time and search through that line for what I'm looking for, but honestly I'm not 100% sure whether I should stick with that approach or not.
If you are unsure how large your file might be, you should use ReadLines, which uses deferred execution, instead of ReadAllLines:
var lines = File.ReadLines(GlobalVars.strLogPath);
The ReadLines and ReadAllLines methods differ as follows:
When you use ReadLines, you can start enumerating the collection of strings before the whole collection is returned; when you use ReadAllLines, you must wait for the whole array of strings be returned before you can access the array. Therefore, when you are working with very large files, ReadLines can be more efficient.
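For the log format shown in the question, you can stream the file and pull a named field out of each line with plain string searching. A sketch (the marker strings are taken from the sample line, and the file path is illustrative; adjust both to the real file):

```csharp
using System;
using System.IO;

class Program
{
    // Return the text between two markers on a line, or null if absent.
    static string GetField(string line, string marker, string nextMarker)
    {
        int start = line.IndexOf(marker);
        if (start < 0) return null;
        start += marker.Length;
        int end = line.IndexOf(nextMarker, start);
        return (end < 0 ? line.Substring(start)
                        : line.Substring(start, end - start)).Trim();
    }

    static void Main()
    {
        // File.ReadLines streams lazily, so even a huge log is fine.
        foreach (string line in File.ReadLines(@"C:\Temp\copylog.txt"))
        {
            string strDestPath = GetField(line, "Destination Path:", "Folder Updated:");
            if (strDestPath != null)
                Console.WriteLine(strDestPath);
        }
    }
}
```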
As weird as it might sound, you should take a look at Log Parser. If you are free to set the file format, you could use one that fits Log Parser and, believe me, it will make your life a lot easier.
Once you load the file with Log Parser you can use queries to get the information you want. If you don't mind using interop in your project you can even add a COM reference and use it from any .NET project.
This sample reads a HUGE CSV file and makes a bulk copy to the DB to perform the final steps there. This is not really your case, but it shows you how easy it is to do this with Log Parser:
COMTSVInputContextClass logParserTsv = new COMTSVInputContextClass();
COMSQLOutputContextClass logParserSql = new COMSQLOutputContextClass();
logParserTsv.separator = ";";
logParserTsv.fixedSep = true;
logParserSql.database = _sqlDatabaseName;
logParserSql.server = _sqlServerName;
logParserSql.username = _sqlUser;
logParserSql.password = _sqlPass;
logParserSql.createTable = false;
logParserSql.ignoreIdCols = true;
// query shortened for clarity purposes
string SelectPattern = @"Select TO_STRING(UserName),TO_STRING(UserID) INTO {0} From {1}";
string query = string.Format(SelectPattern, _sqlTable, _csvPath);
var logParser = new LogQueryClass(); // from the MSUtil COM interop
logParser.ExecuteBatch(query, logParserTsv, logParserSql);
Log Parser is one of those hidden gems Microsoft has that most people don't know about. I have used it to read IIS logs, CSV files, txt files, etc. You can even generate graphics!
Just check it here: http://support.microsoft.com/kb/910447/en
Looks like you need to create a Tokenizer. Try something like this:
Define a list of token values:
List<string> gTkList = new List<string>() {"Date:","Source Path:" }; //...etc.
Create a Token class:
public class Token
{
private readonly string _tokenText;
private string _val;
private int _begin, _end;
public Token(string tk, int beg, int end)
{
this._tokenText = tk;
this._begin = beg;
this._end = end;
this._val = String.Empty;
}
public string TokenText
{
get{ return _tokenText; }
}
public string Value
{
get { return _val; }
set { _val = value; }
}
public int IdxBegin
{
get { return _begin; }
}
public int IdxEnd
{
get { return _end; }
}
}
Create a method to Find your Tokens:
List<Token> FindTokens(string str)
{
List<Token> retVal = new List<Token>();
if (!String.IsNullOrWhiteSpace(str))
{
foreach(string cd in gTkList)
{
int fIdx = str.IndexOf(cd);
if(fIdx > -1)
retVal.Add(new Token(cd, fIdx, fIdx + cd.Length));
}
}
return retVal;
}
Then just do something like this:
foreach(string ln in lines)
{
//returns ordered list of tokens
var tkns = FindTokens(ln);
for(int i=0; i < tkns.Count; i++)
{
int len = (i == tkns.Count - 1) ? ln.Length - tkns[i].IdxEnd : tkns[i+1].IdxBegin - tkns[i].IdxEnd;
tkns[i].Value = ln.Substring(tkns[i].IdxEnd, len).Trim();
}
}
//Do something with the gathered values
foreach(Token tk in tkns)
{
//stuff
}
}

Convert a single datarow in a CSV like string in C#

How can I convert a single datarow (of a datatable) into a CSV like string with only few C# commands.
In other words into a string like "value_1;value_2;...;value_n"
Quick and dirty for a single DataRow:
System.Data.DataRow row = ...;
string csvLine = String.Join(
CultureInfo.CurrentCulture.TextInfo.ListSeparator,
row.ItemArray);
If you do not care about the culture specific separator you may do this to convert a full DataTable:
public static string ToCsv(System.Data.DataTable table)
{
StringBuilder csv = new StringBuilder();
foreach (DataRow row in table.Rows)
csv.AppendLine(String.Join(";", row.ItemArray));
return csv.ToString();
}
Here a more complex example if you need to handle a little bit of formatting for the values (in case they're not just numbers).
public static string ToCsv(DataTable table)
{
StringBuilder csv = new StringBuilder();
foreach (DataRow row in table.Rows)
{
for (int i = 0; i < row.ItemArray.Length; ++i)
{
if (i > 0)
csv.Append(CultureInfo.CurrentCulture.TextInfo.ListSeparator);
csv.Append(FormatValue(row.ItemArray[i]));
}
csv.AppendLine();
}
return csv.ToString();
}
Or, if you prefer LINQ (and assuming table is not empty):
public static string ToCsv(DataTable table, string separator = null)
{
if (separator == null)
separator = CultureInfo.CurrentCulture.TextInfo.ListSeparator;
return table.Rows
.Cast<DataRow>()
.Select(r => String.Join(separator, r.ItemArray.Select(c => FormatValue(c))))
.Aggregate(new StringBuilder(), (result, line) => result.AppendLine(line))
.ToString();
}
This private function formats a value. It's a very naive implementation; for non-primitive types you should use a TypeConverter (if any; see this nice library: Universal Type Converter) and quote the text only if needed (2.6):
private static string FormatValue(object value)
{
if (Object.ReferenceEquals(value, null))
return "";
Type valueType = value.GetType();
if (valueType.IsPrimitive || valueType == typeof(DateTime))
return value.ToString();
return String.Format("\"{0}\"",
value.ToString().Replace("\"", "\"\""));
}
Notes
Even though there is an RFC for CSV, many applications do not follow its rules and handle special cases in their very own way (Microsoft Excel, for example, uses the locale list separator instead of the comma and doesn't handle newlines in strings as required by the standard).
Here is a start:
StringBuilder line = new StringBuilder();
bool first = true;
foreach (object o in theDataRow.ItemArray) {
string s = o.ToString();
if (s.Contains("\"") || s.Contains(",")) {
s = "\"" + s.Replace("\"", "\"\"") + "\"";
}
if (first) {
first = false;
} else {
line.Append(',');
}
line.Append(s);
}
String csv = line.ToString();
It will handle quoted values and values containing the separator, i.e. a value that contains quotation marks or the separator needs to be surrounded by quotation marks, and the quotation marks inside it needs to be escaped by doubling them.
Note that the code uses comma as separator, as that's what the C in CSV stands for. Some programs might be more comfortable using semicolon.
Room for improvement: there are also other characters that should trigger quoting, like padding spaces and line breaks.
Note: Even if there is a standard defined for CSV files now, it's rarely followed because many programs were developed long before the standard existed. You just have to adapt to the peculiarities of any program that you need to communicate with.

How do i parse a text file in c# [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Want to improve this question? Update the question so it focuses on one problem only by editing this post.
Closed 9 years ago.
Improve this question
How do i parse a text file in c#?
Check out this interesting approach, LINQ to Text Files. Very nice: you only need an IEnumerable<string> method that yields every file.ReadLine(), and then you write the query.
Here is another article that better explains the same technique.
using (TextReader rdr = new StreamReader(fullFilePath))
{
string line;
while ((line = rdr.ReadLine()) != null)
{
// use line here
}
}
Set the variable fullFilePath to the full path, e.g. C:\temp\myTextFile.txt.
The algorithm might look like this:
Open Text File
For every line in the file:
Parse Line
There are several approaches to parsing a line.
The easiest from a beginner standpoint is to use the String methods.
System.String at MSDN
If you are up for more of a challenge, then you can use the System.Text.RegularExpression library to parse your text.
RegEx at MSDN
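For example, a line in a simple `key=value;key=value` format (a made-up format, just to show the String methods at work) can be parsed like this:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Hypothetical input line; a real file will need its own delimiters.
        string line = "name=widget;qty=3;price=4.50";

        foreach (string pair in line.Split(';'))
        {
            // Split on the first '=' only, in case a value contains '='.
            string[] kv = pair.Split(new[] { '=' }, 2);
            Console.WriteLine($"{kv[0]} -> {kv[1]}");
        }
    }
}
```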
You might want to use a helper class such as the one described at http://www.blackbeltcoder.com/Articles/strings/a-text-parsing-helper-class.
From years of analyzing CSV files, including ones that are broken or have edge cases, here is my code that passes virtually all of my unit tests:
/// <summary>
/// Read in a line of text, and use the Add() function to add these items to the current CSV structure
/// </summary>
/// <param name="s"></param>
public static bool TryParseCSVLine(string s, char delimiter, char text_qualifier, out string[] array)
{
bool success = true;
List<string> list = new List<string>();
StringBuilder work = new StringBuilder();
for (int i = 0; i < s.Length; i++) {
char c = s[i];
// If we are starting a new field, is this field text qualified?
if ((c == text_qualifier) && (work.Length == 0)) {
int p2;
while (true) {
p2 = s.IndexOf(text_qualifier, i + 1);
// for some reason, this text qualifier is broken
if (p2 < 0) {
work.Append(s.Substring(i + 1));
i = s.Length;
success = false;
break;
}
// Append this qualified string
work.Append(s.Substring(i + 1, p2 - i - 1));
i = p2;
// If this is a double quote, keep going!
if (((p2 + 1) < s.Length) && (s[p2 + 1] == text_qualifier)) {
work.Append(text_qualifier);
i++;
// otherwise, this is a single qualifier, we're done
} else {
break;
}
}
// Does this start a new field?
} else if (c == delimiter) {
list.Add(work.ToString());
work.Length = 0;
// Test for special case: when the user has written a casual comma, space, and text qualifier, skip the space
// Checks if the second parameter of the if statement will pass through successfully
// e.g. "bob", "mary", "bill"
if (i + 2 <= s.Length - 1) {
if (s[i + 1].Equals(' ') && s[i + 2].Equals(text_qualifier)) {
i++;
}
}
} else {
work.Append(c);
}
}
list.Add(work.ToString());
// If we have nothing in the list, and it's possible that this might be a tab delimited list, try that before giving up
if (list.Count == 1 && delimiter != DEFAULT_TAB_DELIMITER) {
TryParseCSVLine(s, DEFAULT_TAB_DELIMITER, DEFAULT_QUALIFIER, out string[] tab_delimited_array);
if (tab_delimited_array.Length > list.Count) {
array = tab_delimited_array;
return success;
}
}
// Return the array we parsed
array = list.ToArray();
return success;
}
However, this function does not actually parse every valid CSV file out there! Some files have embedded newlines in them, and you need to enable your stream reader to parse multiple lines together to return an array. Here's a tool that does that:
/// <summary>
/// Parse a line whose values may include newline symbols or CR/LF
/// </summary>
/// <param name="sr"></param>
/// <returns></returns>
public static string[] ParseMultiLine(StreamReader sr, char delimiter, char text_qualifier)
{
StringBuilder sb = new StringBuilder();
string[] array = null;
while (!sr.EndOfStream) {
// Read in a line
sb.Append(sr.ReadLine());
// Does it parse?
string s = sb.ToString();
if (TryParseCSVLine(s, delimiter, text_qualifier, out array)) {
return array;
}
// Restore the newline that ReadLine() stripped before reading more
sb.Append('\n');
}
// Fails to parse - return the best array we were able to get
return array;
}
For reference, I placed my open source CSV code on code.google.com.
If you have more than a trivial language, use a parser generator. It drove me nuts, but I've heard good things about ANTLR. (Note: get the manual and read it before you start. If you have used a parser generator other than ANTLR before, you will not approach it correctly right off the bat; at least I didn't.)
Other tools also exist.
What do you mean by parse? Parse usually means to split the input into tokens, which you might do if you're trying to implement a programming language. If you're just wanting to read the contents of a text file, look at System.IO.FileInfo.
Without really knowing what sort of text file you're on about, its hard to answer. However, the FileHelpers library has a broad set of tools to help with fixed length file formats, multirecord, delimited etc.
A small improvement on Pero's answer:
FileInfo txtFile = new FileInfo(@"c:\myfile.txt");
if(!txtFile.Exists) { // error handling }
using (TextReader rdr = txtFile.OpenText())
{
// use the text file as Pero suggested
}
The FileInfo class gives you the opportunity to "do stuff" with the file before you actually start reading from it. You can also pass it around between functions as a better abstraction of the file's location (rather than using the full path string). FileInfo canonicalizes the path so it's absolutely correct (e.g. turning / into \ where appropriate) and lets you extract extra data about the file -- parent directory, extension, name only, permissions, etc.
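A few of those FileInfo members in action (the path is illustrative; the comments show what you'd see on Windows):

```csharp
using System;
using System.IO;

class Program
{
    static void Main()
    {
        // Constructing a FileInfo doesn't touch the disk, so a
        // nonexistent path is fine; Exists just reports false.
        var txtFile = new FileInfo(@"C:\temp\myTextFile.txt");

        Console.WriteLine(txtFile.Name);           // myTextFile.txt (on Windows)
        Console.WriteLine(txtFile.Extension);      // .txt
        Console.WriteLine(txtFile.DirectoryName);  // C:\temp (on Windows)
        Console.WriteLine(txtFile.Exists);         // false unless the file is really there
    }
}
```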
To begin with, make sure that you have the following namespaces:
using System.Data;
using System.IO;
using System.Text.RegularExpressions;
Next, we build a function that parses any CSV input string into a DataTable:
public DataTable ParseCSV(string inputString) {
DataTable dt=new DataTable();
// declare the Regular Expression that will match versus the input string
Regex re=new Regex("((?<field>[^\",\\r\\n]+)|\"(?<field>([^\"]|\"\")+)\")(,|(?<rowbreak>\\r\\n|\\n|$))");
ArrayList colArray=new ArrayList();
ArrayList rowArray=new ArrayList();
int colCount=0;
int maxColCount=0;
string rowbreak="";
string field="";
MatchCollection mc=re.Matches(inputString);
foreach(Match m in mc) {
// retrieve the field and replace two double-quotes with a single double-quote
field=m.Result("${field}").Replace("\"\"","\"");
rowbreak=m.Result("${rowbreak}");
if (field.Length > 0) {
colArray.Add(field);
colCount++;
}
if (rowbreak.Length > 0) {
// add the column array to the row Array List
rowArray.Add(colArray.ToArray());
// create a new Array List to hold the field values
colArray=new ArrayList();
if (colCount > maxColCount)
maxColCount=colCount;
colCount=0;
}
}
if (rowbreak.Length == 0) {
// this is executed when the last line doesn't
// end with a line break
rowArray.Add(colArray.ToArray());
if (colCount > maxColCount)
maxColCount=colCount;
}
// create the columns for the table
for(int i=0; i < maxColCount; i++)
dt.Columns.Add(String.Format("col{0:000}",i));
// convert the row Array List into an Array object for easier access
Array ra=rowArray.ToArray();
for(int i=0; i < ra.Length; i++) {
// create a new DataRow
DataRow dr=dt.NewRow();
// convert the column Array List into an Array object for easier access
Array ca=(Array)(ra.GetValue(i));
// add each field into the new DataRow
for(int j=0; j < ca.Length; j++)
dr[j]=ca.GetValue(j);
// add the new DataRow to the DataTable
dt.Rows.Add(dr);
}
// in case no data was parsed, create a single column
if (dt.Columns.Count == 0)
dt.Columns.Add("NoData");
return dt;
}
Now that we have a parser for converting a string into a DataTable, all we need now is a function that will read the content from a CSV file and pass it to our ParseCSV function:
public DataTable ParseCSVFile(string path) {
string inputString="";
// check that the file exists before opening it
if (File.Exists(path)) {
StreamReader sr = new StreamReader(path);
inputString = sr.ReadToEnd();
sr.Close();
}
return ParseCSV(inputString);
}
And now you can easily fill a DataGrid with data coming off the CSV file:
protected System.Web.UI.WebControls.DataGrid DataGrid1;
private void Page_Load(object sender, System.EventArgs e) {
// call the parser
DataTable dt=ParseCSVFile(Server.MapPath("./demo.csv"));
// bind the resulting DataTable to a DataGrid Web Control
DataGrid1.DataSource=dt;
DataGrid1.DataBind();
}
Congratulations! You are now able to parse CSV into a DataTable. Good luck with your programming.
