I'm getting content from a database and returning it for Ajax processing via JavaScript. Pretty simple stuff. The issue is that I can't seem to find a good way to loop through the data, and the MSDN documentation is obscenely poor for the OdbcDataReader methods.
using (OdbcCommand com = new OdbcCommand("SELECT * FROM pie_data WHERE Pie_ID = ?", con)) {
    if (Request.Form["reference_id"] == "") {
        returnError();
    } else {
        com.Parameters.AddWithValue("", Request.Form["reference_id"]);
        com.ExecuteNonQuery();
        using (OdbcDataReader reader = com.ExecuteReader()) {
            string finalstring = "";
            while (reader.Read()) {
                if (reader.HasRows) {
                    finalstring = reader.GetString(9) + ",";
                    for (int i = 0; i <= 8; i = i + 1) {
                        finalstring = finalstring + reader.GetValue(i).ToString() + ",";
                    }
                    finalstring = finalstring + "|";
                    reader.NextResult();
                }
            }
            if (finalstring != "") {
                finalstring = finalstring.Remove(finalstring.Length - 1, 1);
                Response.Write(finalstring);
            }
        }
        noredirect = 1;
    }
}
However, here is the sample output:
00001,0,Pie Johnson,piesaregreat@yum.com,,,10/7/2010 12:00:00 AM,Bakery,N/A,N/A,
As you can see, the second delimiter is not appearing at all, when it really should. Also, this query, when run in HeidiSQL, returns a good number of rows, not just this one result. Once I have the data passed to the JavaScript, I can figure it out from there, since I have much more experience with that and I've actually done this before via PHP.
I would use a DataTable and a DataAdapter:
String finalString;
var tblPieData = new DataTable();
using(var con = new OdbcConnection(connectionString))
using (OdbcDataAdapter da = new OdbcDataAdapter("SELECT * FROM pie_data WHERE Pie_ID = ?", con))
{
da.SelectCommand.Parameters.AddWithValue("Pie_ID", reference_id);
da.Fill(tblPieData);
var rowFields = tblPieData.AsEnumerable()
.Select(r => string.Join(",", r.ItemArray));
finalString = string.Join("|", rowFields);
}
You are replacing the value of finalstring in every iteration of the loop. This will cause only the values for a single row to be returned:
finalstring = reader.GetString(9) + ",";
as well as removing the last character once the loop finishes (the pipe, not the trailing comma, which is what I expect you want):
finalstring = finalstring.Remove(finalstring.Length -1, 1);
EDIT:
It also looks like your loop stops after the first record, because you call reader.NextResult() (which advances to the next result set, not the next row) inside your while (reader.Read()) loop.
This line:
finalstring = finalstring.Remove(finalstring.Length -1, 1);
is going to remove the last character, which is your pipe delimiter, so if there is only one record (which there appears to be) you shouldn't see one.
Unless I am missing something...
EDIT
Actually, if you are expecting more records, you are probably missing them because you start your loop with while (reader.Read()) and end each iteration with reader.NextResult();. NextResult() moves to the next result set rather than the next row, so the reader runs out of results after the first record.
EDIT 2
You are also overwriting finalstring on each iteration; you probably want to append to it (making it a StringBuilder would make the most sense for efficiency's sake, if you don't know how many records to expect).
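A rough sketch of that (untested; it assumes the same column layout as your query, appends each row instead of overwriting, and drops the NextResult() call):
var sb = new StringBuilder();   // requires using System.Text;
while (reader.Read())
{
    // Column 9 first, then columns 0-8, matching the original ordering.
    sb.Append(reader.GetString(9)).Append(",");
    for (int i = 0; i <= 8; i++)
    {
        sb.Append(reader.GetValue(i)).Append(",");
    }
    sb.Append("|");
}
string finalstring = sb.ToString().TrimEnd('|');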
Your looping structure with the reader appears to have some problems, not the least of which is that you reset finalstring inside your loop (giving you only a single result), combined with using both .Read() and .NextResult():
Suggested fixes...not tested, so all standard caveats apply :) :
Edited per Mark Averius' comment and OP confirmation of original intent:
using (OdbcDataReader reader = com.ExecuteReader()) {
    if (reader.HasRows)
    {
        string finalstring = "";
        while (reader.Read()) {
            finalstring = finalstring + reader.GetString(9) + ",";
            for (int i = 0; i <= 8; i++) {
                finalstring = finalstring + reader.GetValue(i).ToString() + ",";
            }
            finalstring = finalstring + "|";
        }
        if (finalstring != "") {
            finalstring = finalstring.Remove(finalstring.Length - 1, 1);
            Response.Write(finalstring);
        }
    }
}
I have the code
while (reader.Read())
{
if (reader[incrementer]!=DBNull.Value){
string playerToInform = reader.GetString(incrementer).ToString();
string informClientMessage = "ULG=" + clientIP + ","; //User Left Game
byte[] informClientsMessage = new byte[informClientMessage.Length];
informClientsMessage = Encoding.ASCII.GetBytes(informClientMessage);
playerEndPoint = new IPEndPoint(IPAddress.Parse(playerToInform), 8001);
clientSocket.SendTo(informClientsMessage, playerEndPoint);
}
incrementer++;
}
which after debugging my code I see contains 4 entries. However, only the first result is ever read from the reader. After the first iteration to find whether the returned result is null or not, the loop starts again and immediately finishes, even though there are three more rows to read.
Any ideas as to why this may be occurring would be appreciated.
Edit - this is the reader I used:
OleDbDataReader reader = dBConn.DataSelect("SELECT player1_IP, player2_IP, player3_IP, player4_IP FROM running_games WHERE game_name = '" + gameName + "'", updateGameList);
The indexer of DbDataReader (DataReader is something else), or of a database-specific subclass, returns the value of the specified column (by index or name), while DbDataReader.Read() moves to the next row.
If you want to apply the same logic to multiple columns you need to loop over the columns, and the rows:
while (db.Read()) {
    for (var colIdx = 0; colIdx < db.FieldCount; ++colIdx) {
        if (!db.IsDBNull(colIdx)) {
            string value = db.GetString(colIdx);
            // Process value
        }
    }
}
You're incrementing "incrementer" as if it were the row number, but a DataReader holds only one row per Read(), and the indexer refers to the field (column) number.
Use this:
while (reader.Read())
{
for(int colNum = 0; colNum < 4; colNum++)
{
if (reader[colNum]!=DBNull.Value)
{
string playerToInform = reader.GetString(colNum).ToString();
string informClientMessage = "ULG=" + clientIP + ","; //User Left Game
byte[] informClientsMessage = new byte[informClientMessage.Length];
informClientsMessage = Encoding.ASCII.GetBytes(informClientMessage);
playerEndPoint = new IPEndPoint(IPAddress.Parse(playerToInform), 8001);
clientSocket.SendTo(informClientsMessage, playerEndPoint);
}
}
}
Incrementer is unnecessary. reader.Read() advances to the next record and returns false if there are no more rows.
Check the documentation on MSDN.
I have 5 DataTables that need to be converted to TXT files. Instead of creating them separately, I thought I'd use a for loop. Here's my code:
StringBuilder sb = new StringBuilder();
for (int i = 1; i < 5; i++)
{
    DataTable dtFile1 = file1BLO.SelectFile1ForCSV();
    foreach (DataRow dr in dtFile1.Rows)
    {
        string[] fields = dr.ItemArray.Select(field => field.ToString()).ToArray();
        sb.AppendLine(string.Join("|", fields) + "|");
    }
    Response.ContentType = "application/text";
    Response.AddHeader("content-disposition", "attachment;filename=CAPRES-FILE1-"
        + DateTime.Now.ToString("yyyyMMdd-HHmmss") + ".txt");
    Response.Output.Write(sb);
    Response.Flush();
    Response.End();
    sb.Clear();
}
I want the iterator to be appended to the variable names and method names, like this: DataTable dtFile + i = file + i + BLO.SelectFile + i + ForCSV();
Thanks!
Requested Code for SelectFile1ForCSV()
public DataTable SelectFile1ForCSV()
{
return file1DAO.SelectFile1ForCSV();
}
EDIT: I am sorry, it seems that I haven't provided enough details, and my question has caused confusion. Edited now.
You cannot just append a number to a variable name at runtime to magically reference a new variable. What you should do instead is:
Define an interface:
public interface IFileBLO
{
DataTable SelectFileForCSV();
}
Have File1BLO, File2BLO etc all implement IFileBLO and fix the method names so that they are all SelectFileForCSV rather than SelectFile1ForCSV etc.
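For example, one of them might end up looking roughly like this (a sketch only, assuming the existing file1DAO field stays as it is):
public class File1BLO : IFileBLO
{
    // Renamed from SelectFile1ForCSV so every BLO exposes the same method name.
    public DataTable SelectFileForCSV()
    {
        return file1DAO.SelectFile1ForCSV();
    }
}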
Add a lookup to reference these objects:
var bloList = new IFileBLO[]
{
file1BLO, file2BLO, file3BLO, file4BLO, file5BLO
};
Finally, change your loop to:
for (int i = 0; i < 5; i++)
{
var dtFile = bloList[i].SelectFileForCSV();
foreach (var dr in dtFile.Rows)
{
...
There is not enough information in the question to know exactly what the problem is, so I'm just guessing here. Your problem is that you have five objects that all have a method, and these methods have different names. The methods all return the same thing, though, a DataTable, that can be used in a loop.
If that's the case, then take out of the loop whatever differs, so that only what is identical remains inside it. Something like this:
DataTable[] fiveTables =
{
    file1BLO.SelectFile1ForCSV(),
    file2BLO.SelectFile2ForCSV(),
    file3BLO.SelectFile3ForCSV(),
    file4BLO.SelectFile4ForCSV(),
    file5BLO.SelectFile5ForCSV()
};
for (int i = 0; i < 5; i++)
{
    // Use fiveTables[i] for the DataTable, and i + 1 for the file name
}
Use this:
Response.AddHeader("content-disposition", "attachment;filename=CAPRES-FILE"
+ i
+ "-"
+ DateTime.Now.ToString("yyyyMMdd-HHmmss") + ".txt");
This will create a filename containing the value of i.
I have the following function which truncates a SQL Server varchar column and adds it to a string:
public void Pp()
{
strSql = @""; // query
using (SqlConnection conn = new SqlConnection(gstr))
{
try
{
SqlDataAdapter da = new SqlDataAdapter(strSql, conn);
myDataSet = new DataSet();
da.Fill(myDataSet);
string specific = "";
string generic = "";
string strTemp = "";
foreach (DataRow r in myDataSet.Tables[0].Rows)
{
if (r["MessageText"].ToString().Length <= 65)
{
strTemp = r["MessageText"].ToString();
}
else
{
strTemp = TruncateLongString(r["MessageText"].ToString(), 65);
}
specific += "</b><span class='hoverText' title='" + r["MessageText"] + "'>" + strTemp + "...</span>";
strTemp = "";
}
lblMessage.Text = "<b>SPECIFIC MESSAGES:</b> <br />" + specific;
}
catch (Exception ce)
{
}
}
}
public string TruncateLongString(string str, int maxLength)
{
return str.Substring(0, maxLength);
}
If r["MessageText"] contains an apostrophe, it cuts off anything after it. (Full text: no way no way HERE we go again but this is not working. Is it? or Is it not? I can't be too sure though. Can someone please check.)
Here is an example of a live preview (the title is shown but gets cut off because of the apostrophe):
Here is the source (the apostrophe is shown in the purple box. Also the color coding gets out of whack due to the apostrophe, which means the code is not correct):
How can I ensure it doesn't break on escape characters, e.g. ', /, \?
You need to HTML-encode the value first.
Call this.Server.HtmlEncode( str ). This will also protect against other special characters like & and <.
That said, you're using single-quotes for attribute delimiters but HtmlEncode only encodes double-quotes, so you need to change your code to this:
specific += String.Format(CultureInfo.InvariantCulture,
    @"</b><span class=""hoverText"" title=""{0}"">{1}...</span>",
    this.Server.HtmlEncode(r["MessageText"].ToString()),
    strTemp);
You could also add .Replace("'", "\\'"); you will probably want to do the same with the double quote as well.
So basically I'm developing a networking program in C# and I'm trying to send a string from the server to the client using a StreamReader, but I'm having a very strange issue. When I use this code...
[Server Side]
foreach (DataRow row in StocksTable.Rows)
{
stocks += row["description"] + "," + row["buy"] + "," + row["sell"] + "\n";
}
[Client]
textBox3.Text = streamReader.ReadLine();
... it works, but it only returns the first row. When I replace the "\n" with ";", for example, so that everything is on one row, the client crashes.
I tried using an iterator to print all the rows, but that doesn't work either.
I know it sounds funny and there's probably some simple explanation, but I've been stuck on this for a while and I'm getting confused.
EDIT:
I tried iterating and this thing works:
for (int i = 0; i < 5; i++)
{
textBox3.Text += streamReader.ReadLine();
}
(5 is the number of rows in the string)
but this doesn't:
while (true)
{
string s = streamReader.ReadLine();
if (s != null)
{
textBox3.Text += s;
}
else
{
break;
}
}
I'm going to assume that stocks is a string. A string can contain many lines (a line being a substring delimited by a CR, LF, or CRLF). In this case it appears you are creating multiple lines in the string by using \n. Therefore:
textBox3.Text = streamReader.ReadLine();
will read the first line in the stream.
You should use StreamReader.ReadToEnd() in order to read all the lines. ReadLine will only read one line, stopping at a \n character.
If you iterate, you should check if more content is still available.
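For example, a minimal sketch (this assumes the server closes its end of the stream when it has finished sending; on a network stream, ReadToEnd() blocks until then):
// Reads everything the server wrote, including all of the "\n"-separated rows.
string allRows = streamReader.ReadToEnd();
textBox3.Text = allRows.Replace("\n", Environment.NewLine);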
I have a remote SQL connection in C# that needs to execute a query and save its results to the user's local hard disk. There is a fairly large amount of data this thing can return, so I need to think of an efficient way of storing it. I've read before that first putting the whole result into memory and then writing it is not a good idea, so if someone could help, that would be great!
I am currently storing the SQL result data in a DataTable, although I am thinking it might be better to do something in while (myReader.Read()) {...}
Below is the code that gets the results:
DataTable t = new DataTable();
string myQuery = QueryLoader.ReadQueryFromFileWithBdateEdate(@"Resources\qrs\qryssysblo.q", newdate, newdate);
using (SqlDataAdapter a = new SqlDataAdapter(myQuery, sqlconn.myConnection))
{
a.Fill(t);
}
var result = string.Empty;
for(int i = 0; i < t.Rows.Count; i++)
{
for (int j = 0; j < t.Columns.Count; j++)
{
result += t.Rows[i][j] + ",";
}
result += "\r\n";
}
So now I have this huge result string. And I have the datatable. There has to be a much better way of doing it?
Thanks.
You are on the right track yourself. Use a loop with while (myReader.Read()) {...} and write each record to the text file inside the loop. The .NET Framework and operating system will take care of flushing the buffers to disk in an efficient way.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = conn.CreateCommand())
{
    conn.Open();
    cmd.CommandText = QueryLoader.ReadQueryFromFileWithBdateEdate(
        @"Resources\qrs\qryssysblo.q", newdate, newdate);
    using (SqlDataReader reader = cmd.ExecuteReader())
    using (StreamWriter writer = new StreamWriter(@"c:\temp\file.txt"))
    {
        while (reader.Read())
        {
            // Using Name and Phone as example columns.
            writer.WriteLine("Name: {0}, Phone : {1}",
                reader["Name"], reader["Phone"]);
        }
    }
}
I came up with this, it's a better CSV writer than the other answers:
public static class DataReaderExtension
{
public static void ToCsv(this IDataReader dataReader, string fileName, bool includeHeaderAsFirstRow)
{
const string Separator = ",";
StreamWriter streamWriter = new StreamWriter(fileName);
StringBuilder sb = null;
if (includeHeaderAsFirstRow)
{
sb = new StringBuilder();
for (int index = 0; index < dataReader.FieldCount; index++)
{
if (dataReader.GetName(index) != null)
sb.Append(dataReader.GetName(index));
if (index < dataReader.FieldCount - 1)
sb.Append(Separator);
}
streamWriter.WriteLine(sb.ToString());
}
while (dataReader.Read())
{
sb = new StringBuilder();
for (int index = 0; index < dataReader.FieldCount; index++)
{
if (!dataReader.IsDBNull(index))
{
string value = dataReader.GetValue(index).ToString();
if (dataReader.GetFieldType(index) == typeof(String))
{
if (value.IndexOf("\"") >= 0)
value = value.Replace("\"", "\"\"");
if (value.IndexOf(Separator) >= 0)
value = "\"" + value + "\"";
}
sb.Append(value);
}
if (index < dataReader.FieldCount - 1)
sb.Append(Separator);
}
streamWriter.WriteLine(sb.ToString());
}
dataReader.Close();
streamWriter.Close();
}
}
usage: mydataReader.ToCsv("myfile.csv", true)
Rob Sedgwick's answer is more like it, but it can be improved and simplified. This is how I did it:
string separator = ";";
string fieldDelimiter = "";
bool useHeaders = true;
bool first;
string line;
// "response" below is the HttpResponse (or similar writer) the CSV is written to.
string connectionString = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
using (SqlConnection conn = new SqlConnection(connectionString))
{
using (SqlCommand cmd = conn.CreateCommand())
{
conn.Open();
string query = @"SELECT whatever";
cmd.CommandText = query;
using (SqlDataReader reader = cmd.ExecuteReader())
{
if (!reader.Read())
{
return;
}
List<string> columnNames = GetColumnNames(reader);
// Write headers if required
if (useHeaders)
{
first = true;
foreach (string columnName in columnNames)
{
response.Write(first ? string.Empty : separator);
line = string.Format("{0}{1}{2}", fieldDelimiter, columnName, fieldDelimiter);
response.Write(line);
first = false;
}
response.Write("\n");
}
// Write all records
do
{
first = true;
foreach (string columnName in columnNames)
{
response.Write(first ? string.Empty : separator);
string value = reader[columnName] == null ? string.Empty : reader[columnName].ToString();
line = string.Format("{0}{1}{2}", fieldDelimiter, value, fieldDelimiter);
response.Write(line);
first = false;
}
response.Write("\n");
}
while (reader.Read());
}
}
}
And you need to have a function GetColumnNames:
List<string> GetColumnNames(IDataReader reader)
{
List<string> columnNames = new List<string>();
for (int i = 0; i < reader.FieldCount; i++)
{
columnNames.Add(reader.GetName(i));
}
return columnNames;
}
I agree that your best bet here would be to use a SqlDataReader. Something like this:
StreamWriter YourWriter = new StreamWriter(@"c:\testfile.txt");
SqlCommand YourCommand = new SqlCommand();
SqlConnection YourConnection = new SqlConnection(YourConnectionString);
YourCommand.Connection = YourConnection;
YourCommand.CommandText = myQuery;
YourConnection.Open();
using (YourConnection)
{
using (SqlDataReader sdr = YourCommand.ExecuteReader())
using (YourWriter)
{
while (sdr.Read())
YourWriter.WriteLine(sdr[0].ToString() + sdr[1].ToString() + ",");
}
}
Mind you, in the while loop, you can write that line to the text file in any format you see fit with the column data from the SqlDataReader.
Keeping your original approach, here is a quick win:
Instead of using a String as a temporary buffer, use a StringBuilder. That will allow you to use the .Append(string) method for concatenations instead of the += operator.
The += operator is especially inefficient, because it allocates a brand new string each time; if you place it in a loop that is repeated (potentially) millions of times, performance will suffer.
The .Append(string) method writes into the builder's existing buffer instead of creating a new string object, so it's faster.
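Applied to the loop from the question, that could look something like this (a sketch only, untested):
var sb = new StringBuilder();   // requires using System.Text;
for (int i = 0; i < t.Rows.Count; i++)
{
    for (int j = 0; j < t.Columns.Count; j++)
    {
        // Append() reuses the builder's internal buffer instead of
        // allocating a new string on every concatenation.
        sb.Append(t.Rows[i][j]).Append(",");
    }
    sb.Append("\r\n");
}
string result = sb.ToString();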
Using the Response object without a Response.Close() causes, at least in some instances, the HTML of the page writing out the data to also be written to the file. If you use Response.Close(), the connection can be closed prematurely and cause an error while producing the file.
Using HttpApplication.CompleteRequest() is the recommended approach; however, this appears to always cause the HTML to be written to the end of the file.
I have tried a stream in conjunction with the Response object and have had success in the development environment. I have not tried it in production yet.
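To make it concrete, the CompleteRequest() pattern being discussed looks roughly like this (a sketch only; csvContent stands in for whatever you built from the reader, and results may vary with where in the page lifecycle it runs):
Response.Clear();
Response.ContentType = "text/csv";
Response.AddHeader("content-disposition", "attachment;filename=export.csv");
Response.Write(csvContent);   // csvContent: the data assembled from the reader
// Ends the request without the ThreadAbortException that Response.End() can throw,
// though as noted above the page HTML may still be appended in some setups.
HttpContext.Current.ApplicationInstance.CompleteRequest();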
I used a DataReader to export data from the database to a .CSV file. In my project I read the DataReader and create the .CSV file manually: in a loop I read the DataReader and, for every row, append each cell value to a result string, using "," to separate columns and "\n" to separate rows. Finally I save the result string as result.csv.
I suggest this as a high-performance approach; I tested it and it quickly exported 600,000 rows to .CSV.
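A rough sketch of that loop (the reader variable and the lack of quoting/escaping are assumptions; real CSV output needs quoting for values that contain commas or newlines):
var sb = new StringBuilder();                     // requires using System.Text;
while (reader.Read())
{
    for (int i = 0; i < reader.FieldCount; i++)
    {
        if (i > 0) sb.Append(",");                // "," separates columns
        sb.Append(reader.GetValue(i));            // note: no quoting/escaping here
    }
    sb.Append("\n");                              // "\n" separates rows
}
File.WriteAllText("result.csv", sb.ToString());   // requires using System.IO;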
I use:
private void SaveData(string path)
{
    DataTable tblResult = new DataTable();
    using (SqlCommand cm = new SqlCommand("select something", objConnect))
    {
        tblResult.Load(cm.ExecuteReader());
    }
    if (tblResult != null)
    {
        using (FileStream fs = new FileStream(path, FileMode.Create, FileAccess.Write))
        {
            BinaryFormatter bin = new BinaryFormatter();
            bin.Serialize(fs, tblResult);
        }
    }
}
Easy to use, and easy to load back, with:
private DataTable LoadData(string path)
{
DataTable t = new DataTable();
using(FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
{
BinaryFormatter bin = new BinaryFormatter();
t = (DataTable)bin.Deserialize(fs);
}
return t;
}
You can also use this method to save a DataSet.
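A sketch of the DataSet variant (assuming a DataSet named dsResult; the serialization call is the same):
private void SaveDataSet(DataSet dsResult, string path)
{
    using (FileStream fs = new FileStream(path, FileMode.Create, FileAccess.Write))
    {
        BinaryFormatter bin = new BinaryFormatter();
        bin.Serialize(fs, dsResult);   // DataSet is serializable in the same way as DataTable
    }
}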