How to efficiently write to a file from a SQL data reader in C#?

I have a remote SQL connection in C# that needs to execute a query and save its results to the user's local hard disk. The query can return a fairly large amount of data, so I need an efficient way of storing it. I've read before that pulling the whole result into memory and then writing it out is not a good idea, so if someone could help, that would be great!
I am currently storing the SQL result data in a DataTable, although I am thinking it might be better to do the work in a while (myReader.Read()) { ... } loop.
Below is the code that gets the results:
DataTable t = new DataTable();
string myQuery = QueryLoader.ReadQueryFromFileWithBdateEdate(@"Resources\qrs\qryssysblo.q", newdate, newdate);
using (SqlDataAdapter a = new SqlDataAdapter(myQuery, sqlconn.myConnection))
{
    a.Fill(t);
}

var result = string.Empty;
for (int i = 0; i < t.Rows.Count; i++)
{
    for (int j = 0; j < t.Columns.Count; j++)
    {
        result += t.Rows[i][j] + ",";
    }
    result += "\r\n";
}
So now I have this huge result string, and I still have the DataTable. There has to be a much better way of doing it?
Thanks.

You are on the right track yourself. Use a while (myReader.Read()) { ... } loop and write each record to the text file inside the loop. The .NET Framework and operating system will take care of flushing the buffers to disk in an efficient way.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = conn.CreateCommand())
{
    conn.Open();
    cmd.CommandText = QueryLoader.ReadQueryFromFileWithBdateEdate(
        @"Resources\qrs\qryssysblo.q", newdate, newdate);

    using (SqlDataReader reader = cmd.ExecuteReader())
    using (StreamWriter writer = new StreamWriter(@"c:\temp\file.txt"))
    {
        while (reader.Read())
        {
            // Using Name and Phone as example columns.
            writer.WriteLine("Name: {0}, Phone: {1}",
                reader["Name"], reader["Phone"]);
        }
    }
}

I came up with this; it's a better CSV writer than the ones in the other answers:
using System;
using System.Data;
using System.IO;
using System.Text;

public static class DataReaderExtension
{
    public static void ToCsv(this IDataReader dataReader, string fileName, bool includeHeaderAsFirstRow)
    {
        const string Separator = ",";
        using (StreamWriter streamWriter = new StreamWriter(fileName))
        {
            StringBuilder sb;
            if (includeHeaderAsFirstRow)
            {
                sb = new StringBuilder();
                for (int index = 0; index < dataReader.FieldCount; index++)
                {
                    if (dataReader.GetName(index) != null)
                        sb.Append(dataReader.GetName(index));
                    if (index < dataReader.FieldCount - 1)
                        sb.Append(Separator);
                }
                streamWriter.WriteLine(sb.ToString());
            }
            while (dataReader.Read())
            {
                sb = new StringBuilder();
                for (int index = 0; index < dataReader.FieldCount; index++)
                {
                    if (!dataReader.IsDBNull(index))
                    {
                        string value = dataReader.GetValue(index).ToString();
                        if (dataReader.GetFieldType(index) == typeof(String))
                        {
                            // Escape embedded quotes and quote values containing the separator.
                            if (value.IndexOf("\"") >= 0)
                                value = value.Replace("\"", "\"\"");
                            if (value.IndexOf(Separator) >= 0)
                                value = "\"" + value + "\"";
                        }
                        sb.Append(value);
                    }
                    if (index < dataReader.FieldCount - 1)
                        sb.Append(Separator);
                }
                streamWriter.WriteLine(sb.ToString());
            }
        }
        dataReader.Close();
    }
}
Usage: mydataReader.ToCsv("myfile.csv", true);
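For context, a minimal sketch of calling the extension end to end (the connection string, query, and column names are placeholders, not from the original answer):

// Hypothetical usage; connection string and query are placeholders.
using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
using (var cmd = new SqlCommand("SELECT Name, Phone FROM Customers", conn))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        reader.ToCsv("myfile.csv", true);
    }
}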

Rob Sedgwick's answer is more like it, but it can be improved and simplified. This is how I did it:
string separator = ";";
string fieldDelimiter = "";
bool useHeaders = true;
bool first;
string line;
string connectionString = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";

using (SqlConnection conn = new SqlConnection(connectionString))
{
    using (SqlCommand cmd = conn.CreateCommand())
    {
        conn.Open();

        string query = @"SELECT whatever";
        cmd.CommandText = query;

        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            if (!reader.Read())
            {
                return;
            }

            List<string> columnNames = GetColumnNames(reader);

            // Write headers if required
            if (useHeaders)
            {
                first = true;
                foreach (string columnName in columnNames)
                {
                    // response is e.g. HttpContext.Response, streaming the CSV to the client.
                    response.Write(first ? string.Empty : separator);
                    line = string.Format("{0}{1}{2}", fieldDelimiter, columnName, fieldDelimiter);
                    response.Write(line);
                    first = false;
                }
                response.Write("\n");
            }

            // Write all records
            do
            {
                first = true;
                foreach (string columnName in columnNames)
                {
                    response.Write(first ? string.Empty : separator);
                    string value = reader[columnName] == null ? string.Empty : reader[columnName].ToString();
                    line = string.Format("{0}{1}{2}", fieldDelimiter, value, fieldDelimiter);
                    response.Write(line);
                    first = false;
                }
                response.Write("\n");
            }
            while (reader.Read());
        }
    }
}
And you need to have a function GetColumnNames:
List<string> GetColumnNames(IDataReader reader)
{
    List<string> columnNames = new List<string>();
    for (int i = 0; i < reader.FieldCount; i++)
    {
        columnNames.Add(reader.GetName(i));
    }
    return columnNames;
}

I agree that your best bet here would be to use a SqlDataReader. Something like this:
using (SqlConnection YourConnection = new SqlConnection(YourConnectionString))
using (SqlCommand YourCommand = new SqlCommand(myQuery, YourConnection))
{
    YourConnection.Open();
    using (SqlDataReader sdr = YourCommand.ExecuteReader())
    using (StreamWriter YourWriter = new StreamWriter(@"c:\testfile.txt"))
    {
        while (sdr.Read())
            YourWriter.WriteLine(sdr[0].ToString() + sdr[1].ToString() + ",");
    }
}
Mind you, in the while loop, you can write that line to the text file in any format you see fit with the column data from the SqlDataReader.
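For example, a hedged sketch that writes every column generically, comma-separated (no quoting or escaping, so adapt it to your data):

// Write all columns of each row, comma-separated.
while (sdr.Read())
{
    string[] fields = new string[sdr.FieldCount];
    for (int i = 0; i < sdr.FieldCount; i++)
    {
        fields[i] = sdr.IsDBNull(i) ? string.Empty : sdr.GetValue(i).ToString();
    }
    YourWriter.WriteLine(string.Join(",", fields));
}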

Keeping your original approach, here is a quick win:
Instead of using a string as a temporary buffer, use a StringBuilder. That lets you use the .Append(string) method for concatenations instead of the += operator.
The += operator is especially inefficient: strings are immutable, so each use allocates a new string, and if you place it in a loop that repeats (potentially) millions of times, performance suffers badly.
The .Append(string) method doesn't discard and reallocate the underlying buffer on every call, so it's much faster.
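As a minimal sketch, here is the row-building loop from the question rewritten with StringBuilder (same DataTable t as above):

// Build the CSV text in a StringBuilder instead of repeated string concatenation.
var sb = new StringBuilder();
for (int i = 0; i < t.Rows.Count; i++)
{
    for (int j = 0; j < t.Columns.Count; j++)
    {
        sb.Append(t.Rows[i][j]).Append(',');
    }
    sb.Append("\r\n");
}
string result = sb.ToString();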

Using the response object without a Response.Close() causes, at least in some instances, the HTML of the page writing out the data to be appended to the file. But if you use Response.Close(), the connection can be closed prematurely and cause an error while producing the file.
Using HttpApplication.CompleteRequest() is the recommended alternative; however, this appears to always cause the HTML to be written to the end of the file.
I have tried a stream in conjunction with the response object and have had success in the development environment. I have not tried it in production yet.

I used a DataReader to export data from the database to a .CSV file. In my project I read the data reader and create the .CSV file manually: in a loop I read the data reader and, for every row, append each cell value to a result string, using "," to separate columns and "\n" to separate rows. Finally I save the result string as result.csv.
I suggest this high-performance approach. I tested it and it quickly exported 600,000 rows as .CSV.
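A minimal sketch of the loop described above (no quoting or escaping; reader is assumed to be an open SqlDataReader):

// Build a CSV string from a data reader: "," between columns, "\n" between rows.
var sb = new StringBuilder();
while (reader.Read())
{
    for (int i = 0; i < reader.FieldCount; i++)
    {
        if (i > 0) sb.Append(',');
        sb.Append(reader.GetValue(i));
    }
    sb.Append('\n');
}
File.WriteAllText("result.csv", sb.ToString());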

I use:
private void SaveData(string path)
{
    DataTable tblResult = new DataTable();
    using (SqlCommand cm = new SqlCommand("select something", objConnect))
    {
        tblResult.Load(cm.ExecuteReader());
    }

    using (FileStream fs = new FileStream(path, FileMode.Create, FileAccess.Write))
    {
        // Note: BinaryFormatter is considered insecure and is obsolete in newer .NET versions.
        BinaryFormatter bin = new BinaryFormatter();
        bin.Serialize(fs, tblResult);
    }
}
Easy to use, and easy to load, with:
private DataTable LoadData(string path)
{
    DataTable t = new DataTable();
    using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
    {
        BinaryFormatter bin = new BinaryFormatter();
        t = (DataTable)bin.Deserialize(fs);
    }
    return t;
}
You can also use this method to save a DataSet.

Related

How to add double quotes to a string which contains a comma?

I successfully imported the following file into the database, but my import method removes double quotes during the saving process. However, I want to export this file as it is, i.e. add quotes to a string which contains the delimiter. How can I achieve this?
Here is my CSV file with headers and one record.
PTNAME,REGNO/ID,BLOOD GRP,WARD NAME,DOC NAME,XRAY,PATHO,MEDICATION,BLOOD GIVEN
Mr. GHULAVE VASANTRAO PANDURANG,SH1503/00847,,RECOVERY,SHELKE SAMEER,"X RAY PBH RT IT FEMUR FRACTURE POST OP XRAY -ACCEPTABLE WITH IMPLANT IN SITU 2D ECHO MILD CONC LVH GOOD LV SYSTOLIC FUN, ALTERED LV DIASTOLIC FUN.", HB-11.9gm% TLC-8700 PLT COUNT-195000 BSL-173 UREA -23 CREATININE -1.2 SR.ELECTROLYTES-WNR BLD GROUP-B + HIV-NEGATIVE HBsAG-NEGATIVE PT INR -15/15/1.0. ECG SINUS TACHYCARDIA ,IV TAXIMAX 1.5 GM 1-0-1 IV TRAMADOL DRIP 1-0-1 TAB NUSAID SP 1-0-1 TAB ARCOPAN D 1-0-1 CAP BONE C PLUS 1 -0-1 TAB ANXIT 0.5 MG 0-0-1 ANKLE TRACTION 3 KG RT LL ,NOT GIVEN
Here is my method of export:
public void DataExport(string SelectQuery, string fileName)
{
    try
    {
        DataTable dt = new DataTable();
        SqlDataAdapter da = new SqlDataAdapter(SelectQuery, con);
        da.Fill(dt);

        // Sets file path and print Headers
        // string filepath = txtreceive.Text + "\\" + fileName;
        string filepath = @"C:\Users\Priya\Desktop\R\z.csv";
        StreamWriter sw = new StreamWriter(filepath);
        int iColCount = dt.Columns.Count;

        // First we will write the headers if IsFirstRowColumnNames is true:
        for (int i = 0; i < iColCount; i++)
        {
            sw.Write(dt.Columns[i]);
            if (i < iColCount - 1)
            {
                sw.Write(',');
            }
        }
        sw.Write(sw.NewLine);

        foreach (DataRow dr in dt.Rows) // Now write all the rows.
        {
            for (int i = 0; i < iColCount; i++)
            {
                if (!Convert.IsDBNull(dr[i]))
                {
                    sw.Write(dr[i].ToString());
                }
                if (i < iColCount - 1)
                {
                    sw.Write(',');
                }
            }
            sw.Write(sw.NewLine);
        }
        sw.Close();
    }
    catch { }
}
To quote a value that contains the separator, wrap it in double quotes before writing:
if (myString.Contains(","))
{
    myWriter.Write("\"{0}\"", myString);
}
else
{
    myWriter.Write(myString);
}
The very simplest thing that you can do is replace the line sw.Write(dr[i].ToString()); with this:
var text = dr[i].ToString();
text = text.Contains(",") ? String.Format("\"{0}\"", text) : text;
sw.Write(text);
However, there are quite a few other issues with your code - most importantly you are opening a lot of disposable resources without disposing them properly.
I'd suggest a bit of a rewrite, like this:
public void DataExport(string SelectQuery, string fileName)
{
    using (var dt = new DataTable())
    {
        using (var da = new SqlDataAdapter(SelectQuery, con))
        {
            da.Fill(dt);

            var header = String.Join(
                ",",
                dt.Columns.Cast<DataColumn>().Select(dc => dc.ColumnName));

            var rows =
                from dr in dt.Rows.Cast<DataRow>()
                select String.Join(
                    ",",
                    from dc in dt.Columns.Cast<DataColumn>()
                    let t1 = Convert.IsDBNull(dr[dc]) ? "" : dr[dc].ToString()
                    let t2 = t1.Contains(",") ? String.Format("\"{0}\"", t1) : t1
                    select t2);

            using (var sw = new StreamWriter(fileName))
            {
                sw.WriteLine(header);
                foreach (var row in rows)
                {
                    sw.WriteLine(row);
                }
                sw.Close();
            }
        }
    }
}
I've also separated querying the data through the data adapter from writing the data out with the stream writer.
And, of course, I'm adding double-quotes to text that contains commas.
The only other thing I was concerned about was the fact that con is clearly a class-level variable and it is being left open. That's bad. Connections should be opened and closed each time they are used. You should probably consider making that change too.
You could also remove the stream write entirely by replacing that block with this:
File.WriteAllLines(fileName, new [] { header }.Concat(rows));
And, finally, wrapping your code in a try { ... } catch { } is just a bad practice. It's like saying "I'm writing some code that could fail, but I don't care and I don't want to be told if it does fail". You should only ever catch specific exceptions that you can actually deal with. In this code you should consider catching file exceptions like running out of hard drive space, or writing to a read-only file, etc.
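As an illustration, a hedged sketch of catching only the file-related exceptions mentioned above (the error reporting is a placeholder):

try
{
    File.WriteAllLines(fileName, new[] { header }.Concat(rows));
}
catch (IOException ex)
{
    // Disk full, file locked, and similar I/O failures we can report meaningfully.
    Console.Error.WriteLine("Could not write {0}: {1}", fileName, ex.Message);
}
catch (UnauthorizedAccessException ex)
{
    // Read-only file or insufficient permissions.
    Console.Error.WriteLine("Access denied to {0}: {1}", fileName, ex.Message);
}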

SqlDataReader change column names when exporting to csv

I have a query that gets report data via a SqlDataReader and sends the SqlDataReader to a method that exports the content out to a .CSV file; however, the column names are showing up in the .CSV file the way that they appear in the database which is not ideal.
I do not want to alter the query itself (changing the names to have spaces) because this query is called in another location where it maps to an object and spaces would not work. I would prefer not to create a duplicate query because maintenance could be problematic. I also do not want to modify the method that writes out the .CSV as this is a method that is globally used.
Can I modify the column names after I fill the data reader but before I send it to the .CSV method? If so, how?
If I can't do it this way, could I do it if it was a DataTable instead?
Here is the general flow:
public static SqlDataReader RunMasterCSV(Search search)
{
    SqlDataReader reader = null;
    using (Network network = new Network())
    {
        using (SqlCommand cmd = new SqlCommand("dbo.MasterReport"))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            //Parameters here...

            network.FillSqlReader(cmd, ref reader);

            // <-- Ideally would like to find a solution here -->

            return reader;
        }
    }
}
public FileInfo CSVFileWriter(SqlDataReader reader)
{
    DeleteOldFolders();
    FileInfo file = null;
    if (reader != null)
    {
        using (reader)
        {
            var WriteDirectory = GetExcelOutputDirectory();
            double folderToSaveInto = Math.Ceiling((double)DateTime.Now.Hour / Folder_Age_Limit.TotalHours);
            string uploadFolder = GetExcelOutputDirectory() + "\\" + DateTime.Now.ToString("ddMMyyyy") + "_" + folderToSaveInto.ToString();

            //Add directory for today if one does not exist
            if (!Directory.Exists(uploadFolder))
                Directory.CreateDirectory(uploadFolder);

            //Generate random GUID fileName
            file = new FileInfo(uploadFolder + "\\" + Guid.NewGuid().ToString() + ".csv");
            if (file.Exists)
                file.Delete();
            using (file.Create()) { /*kill the file stream immediately*/ }

            StringBuilder sb = new StringBuilder();
            if (reader.Read())
            {
                //write the column names
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    AppendValue(sb, reader.GetName(i), (i == reader.FieldCount - 1));
                }

                //write the first row's values
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    AppendValue(sb, reader[i] == DBNull.Value ? "" : reader[i].ToString(), (i == reader.FieldCount - 1));
                }

                int rowcounter = 1;
                while (reader.Read())
                {
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        AppendValue(sb, reader[i] == DBNull.Value ? "" : reader[i].ToString(), (i == reader.FieldCount - 1));
                    }
                    rowcounter++;
                    if (rowcounter == MaxRowChunk)
                    {
                        using (var sw = file.AppendText())
                        {
                            sw.Write(sb.ToString());
                        }
                        sb = new StringBuilder();
                        rowcounter = 0;
                    }
                }

                if (sb.Length > 0)
                {
                    //write the last bit
                    using (var sw = file.AppendText())
                    {
                        sw.Write(sb.ToString());
                    }
                    sb = new StringBuilder();
                }
            }
        }
    }
    return file;
}
I would try a refactoring of your CSVFileWriter.
First you should add a delegate declaration:
public delegate string onColumnRename(string columnName);
Then create an overload of your CSVFileWriter where you pass the delegate together with the reader
public FileInfo CSVFileWriter(SqlDataReader reader, onColumnRename renamer)
{
    // Move here all the code of the old CSVFileWriter
    .....
}
Move the code of the previous CSVFileWriter to the new method and, from the old one call the new one
public FileInfo CSVFileWriter(SqlDataReader reader)
{
    // Pass null for the delegate to the new version of CSVFileWriter....
    return this.CSVFileWriter(reader, null);
}
This will keep existing clients of the old method happy. For them nothing has changed.....
Inside the new version of CSVFileWriter you change the code that prepare the column names
for (int i = 0; i < reader.FieldCount; i++)
{
    string colName = (renamer != null ? renamer(reader.GetName(i))
                                      : reader.GetName(i));
    AppendValue(sb, colName, (i == reader.FieldCount - 1));
}
Now it is just a matter of creating the renamer function that translates your column names:
private string myColumnRenamer(string columnName)
{
    if (columnName == "yourNameWithoutSpaces")
        return "your Name with Spaces";
    else
        return columnName;
}
This could be optimized with a static dictionary to remove the list of ifs
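For instance, a hedged sketch of the dictionary-based variant (the mappings shown are placeholders):

// Static lookup from database column names to display names; falls back to the original name.
private static readonly Dictionary<string, string> columnDisplayNames =
    new Dictionary<string, string>
    {
        { "yourNameWithoutSpaces", "your Name with Spaces" }
        // ... more mappings ...
    };

private string myColumnRenamer(string columnName)
{
    string displayName;
    return columnDisplayNames.TryGetValue(columnName, out displayName) ? displayName : columnName;
}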
At this point you could call the new CSVFileWriter, passing your function:
FileInfo fi = CSVFileWriter(reader, myColumnRenamer);

How to export large SQL Server table into a CSV file using the FileHelpers library?

I'm looking to export a large SQL Server table into a CSV file using C# and the FileHelpers library.
I could consider C# and bcp as well, but I thought FileHelpers would be more flexible than bcp. Speed is not a special requirement.
OutOfMemoryException is thrown on the storage.ExtractRecords() when the below code is run (some less essential code has been omitted):
SqlServerStorage storage = new SqlServerStorage(typeof(Order));
storage.ServerName = "SqlServer";
storage.DatabaseName = "SqlDataBase";
storage.SelectSql = "select * from Orders";
storage.FillRecordCallback = new FillRecordHandler(FillRecordOrder);
Order[] output = null;
output = storage.ExtractRecords() as Order[];
When the below code is run, 'Timeout expired' is thrown on the link.ExtractToFile():
SqlServerStorage storage = new SqlServerStorage(typeof(Order));
string sqlConnectionString = "Server=SqlServer;Database=SqlDataBase;Trusted_Connection=True";
storage.ConnectionString = sqlConnectionString;
storage.SelectSql = "select * from Orders";
storage.FillRecordCallback = new FillRecordHandler(FillRecordOrder);
FileDataLink link = new FileDataLink(storage);
link.FileHelperEngine.HeaderText = headerLine;
link.ExtractToFile("file.csv");
The SQL query run takes more than the default 30 sec and therefore the timeout exception. Unfortunately, I can't find in the FileHelpers docs how to set the SQL Command timeout to a higher value.
I could consider looping an SQL select over small batches until the whole table gets exported, but that procedure would be too complicated.
Is there a straightforward method to use FileHelpers on large DB tables export?
Rei Sivan's answer is on the right track, as it will scale well with large files, because it avoids reading the entire table into memory. However, the code can be cleaned up.
shamp00's solution requires external libraries.
Here is a simpler table-to-CSV-file exporter that will scale well to large files, and does not require any external libraries:
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Linq;

public class TableDumper
{
    public void DumpTableToFile(SqlConnection connection, string tableName, string destinationFile)
    {
        using (var command = new SqlCommand("select * from " + tableName, connection))
        using (var reader = command.ExecuteReader())
        using (var outFile = File.CreateText(destinationFile))
        {
            string[] columnNames = GetColumnNames(reader).ToArray();
            int numFields = columnNames.Length;
            outFile.WriteLine(string.Join(",", columnNames));
            if (reader.HasRows)
            {
                while (reader.Read())
                {
                    string[] columnValues =
                        Enumerable.Range(0, numFields)
                                  .Select(i => reader.GetValue(i).ToString())
                                  .Select(field => string.Concat("\"", field.Replace("\"", "\"\""), "\""))
                                  .ToArray();
                    outFile.WriteLine(string.Join(",", columnValues));
                }
            }
        }
    }

    private IEnumerable<string> GetColumnNames(IDataReader reader)
    {
        foreach (DataRow row in reader.GetSchemaTable().Rows)
        {
            yield return (string)row["ColumnName"];
        }
    }
}
I wrote this code, and declare it CC0 (public domain).
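A minimal usage sketch (server, database, table, and file names are placeholders):

using (var connection = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
{
    connection.Open();
    new TableDumper().DumpTableToFile(connection, "Orders", "orders.csv");
}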
I combined the two code samples above into the code I use below. I use VS 2010.
// These are all the libs that I used
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Data.SqlClient;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using UsbLibrary;
using System.Configuration;
using System.Globalization;

// Copy this into a button click handler:
SqlConnection _connection = new SqlConnection();
SqlDataAdapter _dataAdapter = new SqlDataAdapter();
SqlCommand _command = new SqlCommand();
DataTable _dataTable = new DataTable();

// dbk is my database name; change it to your database name
_connection.ConnectionString = "Data Source=.;Initial Catalog=dbk;Integrated Security=True";
_connection.Open();

SaveFileDialog saveFileDialogCSV = new SaveFileDialog();
saveFileDialogCSV.InitialDirectory = Application.ExecutablePath.ToString();
saveFileDialogCSV.Filter = "CSV files (*.csv)|*.csv|All files (*.*)|*.*";
saveFileDialogCSV.FilterIndex = 1;
saveFileDialogCSV.RestoreDirectory = true;

string path_csv = "";
if (saveFileDialogCSV.ShowDialog() == DialogResult.OK)
{
    // Runs the export operation if the given filename is valid.
    path_csv = saveFileDialogCSV.FileName.ToString();
}

DumpTableToFile(_connection, "tbl_trmc", path_csv);
// End of code in button
//end of code in button|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
public void DumpTableToFile(SqlConnection connection, string tableName, string destinationFile)
{
    using (var command = new SqlCommand("select * from " + tableName, connection))
    using (var reader = command.ExecuteReader())
    using (var outFile = System.IO.File.CreateText(destinationFile))
    {
        string[] columnNames = GetColumnNames(reader).ToArray();
        int numFields = columnNames.Length;
        outFile.WriteLine(string.Join(",", columnNames));
        if (reader.HasRows)
        {
            while (reader.Read())
            {
                string[] columnValues =
                    Enumerable.Range(0, numFields)
                              .Select(i => reader.GetValue(i).ToString())
                              .Select(field => string.Concat("\"", field.Replace("\"", "\"\""), "\""))
                              .ToArray();
                outFile.WriteLine(string.Join(",", columnValues));
            }
        }
    }
}

private IEnumerable<string> GetColumnNames(IDataReader reader)
{
    foreach (DataRow row in reader.GetSchemaTable().Rows)
    {
        yield return (string)row["ColumnName"];
    }
}
Try this one:
private void exportToCSV()
{
    // Asks the filename with a SaveFileDialog control.
    SaveFileDialog saveFileDialogCSV = new SaveFileDialog();
    saveFileDialogCSV.InitialDirectory = Application.ExecutablePath.ToString();
    saveFileDialogCSV.Filter = "CSV files (*.csv)|*.csv|All files (*.*)|*.*";
    saveFileDialogCSV.FilterIndex = 1;
    saveFileDialogCSV.RestoreDirectory = true;

    if (saveFileDialogCSV.ShowDialog() == DialogResult.OK)
    {
        // Runs the export operation if the given filename is valid.
        exportToCSVfile(saveFileDialogCSV.FileName.ToString());
    }
}

/* Exports data to the CSV file. */
private void exportToCSVfile(string fileOut)
{
    // Connects to the database, and makes the select command.
    string sqlQuery = "select * from dbo." + this.lbxTables.SelectedItem.ToString();
    SqlCommand command = new SqlCommand(sqlQuery, objConnDB_Auto);

    // Creates a SqlDataReader instance to read data from the table.
    SqlDataReader dr = command.ExecuteReader();

    // Retrieves the schema of the table.
    DataTable dtSchema = dr.GetSchemaTable();

    // Creates the CSV file as a stream, using the given encoding.
    StreamWriter sw = new StreamWriter(fileOut, false, this.encodingCSV);

    string strRow; // represents a full row

    // Writes the column headers if the user previously asked that.
    // (columnNames() is a helper, defined elsewhere, that joins the header names.)
    if (this.chkFirstRowColumnNames.Checked)
    {
        sw.WriteLine(columnNames(dtSchema, this.separator));
    }

    // Reads the rows one by one from the SqlDataReader,
    // transfers them to a string with the given separator character and
    // writes it to the file.
    while (dr.Read())
    {
        strRow = "";
        for (int i = 0; i < dr.FieldCount; i++)
        {
            switch (Convert.ToString(dr.GetFieldType(i)))
            {
                case "System.Int16":
                    strRow += Convert.ToString(dr.GetInt16(i));
                    break;
                case "System.Int32":
                    strRow += Convert.ToString(dr.GetInt32(i));
                    break;
                case "System.Int64":
                    strRow += Convert.ToString(dr.GetInt64(i));
                    break;
                case "System.Decimal":
                    strRow += Convert.ToString(dr.GetDecimal(i));
                    break;
                case "System.Double":
                    strRow += Convert.ToString(dr.GetDouble(i));
                    break;
                case "System.Single":
                    strRow += Convert.ToString(dr.GetFloat(i));
                    break;
                case "System.Guid":
                    strRow += Convert.ToString(dr.GetGuid(i));
                    break;
                case "System.String":
                    strRow += dr.GetString(i);
                    break;
                case "System.Boolean":
                    strRow += Convert.ToString(dr.GetBoolean(i));
                    break;
                case "System.DateTime":
                    strRow += Convert.ToString(dr.GetDateTime(i));
                    break;
            }
            if (i < dr.FieldCount - 1)
            {
                strRow += this.separator;
            }
        }
        sw.WriteLine(strRow);
    }

    // Closes the text stream and the database connection.
    sw.Close();
    dr.Close();

    // Notifies the user.
    MessageBox.Show("ready");
}
Very appreciative of Jay Sullivan's answer -- it was very helpful for me.
Building on that, I observed that in his solution the string formatting of varbinary and datetime data types was not good -- varbinary fields would come out literally as "System.Byte[]" or something like that, while datetime fields would be formatted MM/dd/yyyy hh:mm:ss tt, which is not desirable for me.
Below is my hacked-together solution, which converts values to strings differently based on data type. It uses nested ternary operators, but it works!
Hope it is helpful for someone.
public static void DumpTableToFile(SqlConnection connection, Dictionary<string, string> cArgs)
{
    string query = "SELECT ";
    string z = "";
    if (cArgs.TryGetValue("top_count", out z))
    {
        query += string.Format("TOP {0} ", z);
    }
    query += string.Format("* FROM {0} (NOLOCK) ", cArgs["table"]);
    string lower_bound = "", upper_bound = "", column_name = "";
    if (cArgs.TryGetValue("lower_bound", out lower_bound) && cArgs.TryGetValue("column_name", out column_name))
    {
        query += string.Format("WHERE {0} >= {1} ", column_name, lower_bound);
        if (cArgs.TryGetValue("upper_bound", out upper_bound))
        {
            query += string.Format("AND {0} < {1} ", column_name, upper_bound);
        }
    }
    Console.WriteLine(query);
    Console.WriteLine("");

    using (var command = new SqlCommand(query, connection))
    using (var reader = command.ExecuteReader())
    using (var outFile = File.CreateText(cArgs["out_file"]))
    {
        string[] columnNames = GetColumnNames(reader).ToArray();
        int numFields = columnNames.Length;
        Console.WriteLine(string.Join(",", columnNames));
        Console.WriteLine("");
        if (reader.HasRows)
        {
            Type datetime_type = Type.GetType("System.DateTime");
            Type byte_arr_type = Type.GetType("System.Byte[]");
            string format = "yyyy-MM-dd HH:mm:ss.fff";
            int ii = 0;
            while (reader.Read())
            {
                ii += 1;
                string[] columnValues =
                    Enumerable.Range(0, numFields)
                              .Select(i => reader.GetValue(i).GetType() == datetime_type
                                  ? ((DateTime)reader.GetValue(i)).ToString(format)
                                  : (reader.GetValue(i).GetType() == byte_arr_type
                                      ? String.Concat(Array.ConvertAll((byte[])reader.GetValue(i), x => x.ToString("X2")))
                                      : reader.GetValue(i).ToString()))
                              //.Select(field => string.Concat("\"", field.Replace("\"", "\"\""), "\""))
                              .Select(field => field.Replace("\t", " "))
                              .ToArray();
                outFile.WriteLine(string.Join("\t", columnValues));
                if (ii % 100000 == 0)
                {
                    Console.WriteLine("row {0}", ii);
                }
            }
        }
    }
}

public static IEnumerable<string> GetColumnNames(IDataReader reader)
{
    foreach (DataRow row in reader.GetSchemaTable().Rows)
    {
        yield return (string)row["ColumnName"];
    }
}
FileHelpers has an async engine which is better suited for handling large files. Unfortunately, the FileDataLink class does not use it, so there's no easy way to use it with SqlStorage.
It's not very easy to modify the SQL timeout either. The easiest way would be to copy the code for SqlServerStorage to create your own alternative storage provider and provide replacements for ExecuteAndClose() and ExecuteAndLeaveOpen() which set the timeout on the IDbCommand. (SqlServerStorage is a sealed class, so you cannot just subclass it).
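The key change in such a replacement is just setting the ADO.NET command timeout before executing. A hedged sketch (the 600-second value is an arbitrary example, and selectSql stands in for the storage provider's SelectSql):

// Inside your copied ExecuteAndClose()/ExecuteAndLeaveOpen() replacement:
IDbCommand command = connection.CreateCommand();
command.CommandText = selectSql;  // the storage provider's SelectSql
command.CommandTimeout = 600;     // seconds; the default is 30
IDataReader reader = command.ExecuteReader();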
You might want to check out ReactiveETL which uses the FileHelpers async engine for handling files along with a rewrite of Ayende's RhinoETL using ReactiveExtensions to handle large datasets.

create JSON string from SqlDataReader

UPDATE
I figured it out. Check out my answer below.
I'm trying to create a JSON string representing a row from a database table to return in an HTTP response. It seems like Json.NET would be a good tool to utilize. However, I'm not sure how to build the JSON string while I'm reading from the database.
The problem is marked by the obnoxious comments /******** ********/
// connect to DB
theSqlConnection.Open(); // open the connection

SqlDataReader reader = sqlCommand.ExecuteReader();
if (reader.HasRows)
{
    while (reader.Read())
    {
        StringBuilder sb = new StringBuilder();
        StringWriter sw = new StringWriter(sb);
        using (JsonWriter jsonWriter = new JsonTextWriter(sw))
        {
            // read columns from the current row and build this JsonWriter
            jsonWriter.WriteStartObject();

            jsonWriter.WritePropertyName("FirstName");
            // I need to read the value from the database
            /******** I can't just say reader[i] to get the ith column. How would I loop here to get all columns? ********/
            jsonWriter.WriteValue(... ? ...);

            jsonWriter.WritePropertyName("LastName");
            jsonWriter.WriteValue(... ? ...);

            jsonWriter.WritePropertyName("Email");
            jsonWriter.WriteValue(... ? ...);

            // etc...
            jsonWriter.WriteEndObject();
        }
    }
}
The problem is that I don't know how to read each column from the row from the SqlReader such that I can call WriteValue and give it the correct information and attach it to the correct column name. So if a row looks like this...
| FirstName | LastName | Email |
... how would I create a JsonWriter for each such row such that it contains all column names of the row and the corresponding values in each column and then use that JsonWriter to build a JSON string that is ready for returning through an HTTP Response?
Let me know if I need to clarify anything.
My version:
This doesn't use the schema table, and it wraps the results in an array instead of using a writer per row.
SqlDataReader rdr = cmd.ExecuteReader();
StringBuilder sb = new StringBuilder();
StringWriter sw = new StringWriter(sb);
using (JsonWriter jsonWriter = new JsonTextWriter(sw))
{
    jsonWriter.WriteStartArray();
    while (rdr.Read())
    {
        jsonWriter.WriteStartObject();
        int fields = rdr.FieldCount;
        for (int i = 0; i < fields; i++)
        {
            jsonWriter.WritePropertyName(rdr.GetName(i));
            jsonWriter.WriteValue(rdr[i]);
        }
        jsonWriter.WriteEndObject();
    }
    jsonWriter.WriteEndArray();
}
EDITED FOR SPECIFIC EXAMPLE:
theSqlConnection.Open();
SqlDataReader reader = sqlCommand.ExecuteReader();
DataTable schemaTable = reader.GetSchemaTable();

// Note: each row of the schema table describes one column of the result set,
// so this writes out column metadata rather than the query's data rows.
foreach (DataRow row in schemaTable.Rows)
{
    StringBuilder sb = new StringBuilder();
    StringWriter sw = new StringWriter(sb);
    using (JsonWriter jsonWriter = new JsonTextWriter(sw))
    {
        jsonWriter.WriteStartObject();
        foreach (DataColumn column in schemaTable.Columns)
        {
            jsonWriter.WritePropertyName(column.ColumnName);
            jsonWriter.WriteValue(row[column]);
        }
        jsonWriter.WriteEndObject();
    }
}
theSqlConnection.Close();
Got it! Here's the C#...
// ... SQL connection and command set up, only querying 1 row from the table
StringBuilder sb = new StringBuilder();
StringWriter sw = new StringWriter(sb);
JsonWriter jsonWriter = new JsonTextWriter(sw);
try
{
    theSqlConnection.Open(); // open the connection

    // read the row from the table
    SqlDataReader reader = sqlCommand.ExecuteReader();
    reader.Read();

    int fieldcount = reader.FieldCount; // count how many columns are in the row
    object[] values = new object[fieldcount]; // storage for column values
    reader.GetValues(values); // extract the values in each column

    jsonWriter.WriteStartObject();
    for (int index = 0; index < fieldcount; index++)
    { // iterate through all columns
        jsonWriter.WritePropertyName(reader.GetName(index)); // column name
        jsonWriter.WriteValue(values[index]); // value in column
    }
    jsonWriter.WriteEndObject();

    reader.Close();
}
catch (SqlException sqlException)
{ // exception
    context.Response.ContentType = "text/plain";
    context.Response.Write("Connection Exception: ");
    context.Response.Write(sqlException.ToString() + "\n");
}
finally
{
    theSqlConnection.Close(); // close the connection
}
// END of method

// the above method returns sb and another uses it to return as HTTP Response...
StringBuilder theTicket = getInfo(context, ticketID);
context.Response.ContentType = "application/json";
context.Response.Write(theTicket);
... so the StringBuilder sb variable is the JSON object that represents the row I wanted to query. Here is the JavaScript...
$.ajax({
    type: 'GET',
    url: 'Preview.ashx',
    data: 'ticketID=' + ticketID,
    dataType: "json",
    success: function (data) {
        // data is the JSON object the server spits out
        // do stuff with the data
    }
});
Thanks to Scott for his answer which inspired me to come to my solution.
Hristo
I made the following method which converts any DataReader to JSON, but only for single-depth serialization. You pass the reader and the column names as a string array, for example:
String[] columns = {"CustomerID", "CustomerName", "CustomerDOB"};
Then call the method:
public static String json_encode(IDataReader reader, String[] columns)
{
    int length = columns.Length;
    String res = "[";
    bool firstRow = true;
    while (reader.Read())
    {
        // Separate row objects with commas so the output is valid JSON.
        if (!firstRow)
            res += ",";
        firstRow = false;
        res += "{";
        for (int i = 0; i < length; i++)
        {
            res += "\"" + columns[i] + "\":\"" + reader[columns[i]].ToString() + "\"";
            if (i < length - 1)
                res += ",";
        }
        res += "}";
    }
    res += "]";
    return res;
}

C# .NET read from .xls missing cell with content

I tried to read data from an .xls file using OleDb. The sheet has a column which has numbers in all the cells. When I execute the code, that particular column alone is missing every time. The code I used is:
OleDbConnection excel_connection = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + file_name + ";Extended Properties=Excel 8.0");
excel_connection.Open();
DataSet excel_data = new DataSet();
OleDbDataAdapter myCommand = new OleDbDataAdapter("SELECT * FROM [XX$]", excel_connection);
myCommand.Fill(excel_data);
excel_connection.Close();

string excel_columns;
foreach (DataRow excel_row in excel_data.Tables[0].Rows)
{
    // Join all cell values of the row.
    excel_columns = String.Join(",", excel_row.ItemArray);
    richTextBox1.AppendText(excel_columns);
}
Why is this happening?
I used to read Excel documents like you do, but I kept having problems with column formats, so I decided to switch to NPOI.
It is a good product and works pretty well. The implementation is very, very easy.
Here is some code:
using (FileStream Xlfile = new FileStream(MyFileName, FileMode.Open, FileAccess.Read))
{
    HSSFWorkbook XLBook = new HSSFWorkbook(Xlfile);
    NPOI.SS.UserModel.Sheet XLSheet = XLBook.GetSheetAt(0);

    NPOI.HSSF.UserModel.HSSFRow CurrentRow;
    NPOI.SS.UserModel.Cell CurrentCell;
    int iLoopRows = 0;
    int rowCounter = 0;

    IEnumerator RowEnum = XLSheet.GetRowEnumerator();
    while (RowEnum.MoveNext())
    {
        iLoopRows++;
        if (RowEnum.Current != null)
        {
            rowCounter++;
            CurrentRow = RowEnum.Current as NPOI.HSSF.UserModel.HSSFRow;
            for (Int32 iLoop = 0; iLoop < CurrentRow.Cells.Count; iLoop++)
            {
                CurrentCell = CurrentRow.Cells[iLoop];
                switch (CurrentCell.CellType)
                {
                    case NPOI.SS.UserModel.CellType.STRING:
                        // Reading STRING value:
                        // CurrentCell.StringCellValue;
                        break;
                    case NPOI.SS.UserModel.CellType.NUMERIC:
                        // Reading NUMERIC and DATE values:
                        // (CurrentCell.DateCellValue == null) ? "" : CurrentCell.DateCellValue.ToString();
                        break;
                    default:
                        break;
                }
            }
        }
    }

    Xlfile.Close();
}
