Put SqlDataReader data into a DataTable while I process it - C#

I want to query a database and process the results. While I'm processing them, some of the rows will need to be inserted into another database. Since I can't run another query while an SqlDataReader is open (as far as I know), I was thinking about putting the data from the SqlDataReader into a DataTable while I process it. Is there a built-in way to do this, or is there another solution that accomplishes the same idea?

SqlDataReader reader = com.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(reader);
With the rest of the standard set-up and tear-down of SqlCommand objects, of course.

The easiest way would be to use a DataAdapter to fill a DataTable, then process the data and update the other database. Filling a DataSet does not tie up the connection once the fill is complete.
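A minimal sketch of that approach (the connection strings, table and column names below are placeholders, not from the question): fill a DataTable with a DataAdapter, then loop over the rows and insert into the second database with an ordinary parameterized SqlCommand.

// Sketch only; requires System.Data and System.Data.SqlClient.
var sourceTable = new DataTable();
using (var sourceConn = new SqlConnection(sourceConnectionString))
using (var adapter = new SqlDataAdapter("SELECT Id, Name FROM SourceTable", sourceConn))
{
    adapter.Fill(sourceTable);   // the adapter opens and closes the connection itself
}

// The reader used internally by the adapter is already closed,
// so a second command is free to run while we process the rows.
using (var targetConn = new SqlConnection(targetConnectionString))
using (var insert = new SqlCommand(
    "INSERT INTO TargetTable (Id, Name) VALUES (@Id, @Name)", targetConn))
{
    insert.Parameters.Add("@Id", SqlDbType.Int);
    insert.Parameters.Add("@Name", SqlDbType.NVarChar, 100);
    targetConn.Open();

    foreach (DataRow row in sourceTable.Rows)
    {
        // process the row here; insert it into the other database if needed
        insert.Parameters["@Id"].Value = row["Id"];
        insert.Parameters["@Name"].Value = row["Name"];
        insert.ExecuteNonQuery();
    }
}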

Related

How can I check if data in a DataSet is updated and catch the updated data? C#

I have a timer function that pulls data from a SQL database every N seconds (the exact interval doesn't actually matter). The algorithm below does not work, or I don't know how to make it work.
I put the data into a DataSet.
const string query = "SELECT id FROM SomeTable LIMIT 100";
await using var cmd = new MySqlCommand(query, connection);
var sqlAdapter = new MySqlDataAdapter(cmd);
sqlAdapter.TableMappings.Add("Table", "SomeTable");
sqlAdapter.Fill(_mapping);
_mapping.AcceptChanges();
where _mapping is the DataSet.
If this is the first filling of the DataSet, then I just fill it.
If this is not the first filling, then I try to update the data in it. I tried using sqlAdapter.Update(_mapping), but it doesn't update any data, or I can't get the updated data. I also tried filling it again and checking _mapping.HasChanges(DataRowState.Modified | DataRowState.Added),
but it reports no changes.
So how can I fill the DataSet and catch changed or added rows while getting data from the database?
Or maybe I need to use something else, like a DataTable?
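One possible approach, as a rough sketch only (it assumes the table has an integer id primary key and a second name column; both names and the snapshot-comparison idea are placeholders, not something from the original code): reload into a fresh DataTable on every tick and compare it against the previous snapshot by key.

// Sketch: detect added/changed rows by comparing snapshots; requires System.Linq.
private DataTable _previous;   // snapshot kept from the last timer tick

private void CheckForChanges(MySqlConnection connection)
{
    var current = new DataTable("SomeTable");
    using (var cmd = new MySqlCommand("SELECT id, name FROM SomeTable LIMIT 100", connection))
    using (var adapter = new MySqlDataAdapter(cmd))
    {
        adapter.Fill(current);
    }

    if (_previous != null)
    {
        // index the previous snapshot by id (assumed to be an INT primary key)
        var old = _previous.Rows.Cast<DataRow>().ToDictionary(r => (int)r["id"]);

        foreach (DataRow row in current.Rows)
        {
            if (!old.TryGetValue((int)row["id"], out var oldRow))
            {
                // this row was added since the last tick
            }
            else if (!Equals(oldRow["name"], row["name"]))
            {
                // this row was changed since the last tick
            }
        }
    }

    _previous = current;   // keep the new snapshot for the next tick
}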

Differences between SqlDataReader and DataSet for querying data from a table

I have a table with millions of rows of data, and I would like to know which is the better way to query my data - using .ExecuteReader() or using a DataSet.
Using SqlDataReader like this:
myReader = cmd.ExecuteReader();
and then fill a list with the results.
Or using a DataSet:
using (SqlDataAdapter da = new SqlDataAdapter(cmd))
{
da.Fill(ds);
}
Which is the better method?
The two objects are meant to be used in fundamentally different contexts.
A DataReader instance returned by ExecuteReader doesn't give you anything until you loop over it with the Read() method. It is a connected object that keeps a pointer to the current record on the backend database. You read the content of the record using the various GetXxx methods provided by the reader, or simply through the indexer. When you are done with the current record you move on to the next one by calling Read() again. There is no way to go back or jump ahead to record N + 100.
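For illustration, a minimal connected, forward-only read loop (the query and column names are placeholders):

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT Id, Name FROM MyTable WHERE Active = 1", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())                     // advances to the next record; false when done
        {
            int id = reader.GetInt32(0);          // typed accessor by ordinal
            string name = (string)reader["Name"]; // or the indexer by column name
            // process the current record here; you cannot come back to it later
        }
    }
}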
A DataSet, instead, is a disconnected object. It internally uses a DataReader to fill its local memory buffer with all the records returned by the command text. It is handy if you need to work on the returned data in random order, or to display or print it. But of course, waiting for millions of records to come back through the internal reader could be time consuming, and the memory consumption would probably kill your process before it finishes.
So, which is best? Neither: if you have millions of records in your table, you need an appropriate WHERE condition to reduce the number of records returned. That said, it depends on what you need to do with the returned data. To display it in a grid you could probably use a DataSet, while a DataReader is better if you need to process the records one by one.
The question is what you want to fill:
If you want to fill a DataSet/DataTable, use DataAdapter.Fill(ds).
If you want to fill a list/array, use a DataReader and a loop (see the sketch after this list).
The DataAdapter also uses a DataReader behind the scenes, but it loops over all records. With your own loop you can add logic to read only part of the result set.
"I have a table with millions of rows": you should almost never need to return that many records, so don't filter in memory but in the database.
Both are good methods, but if you use an SqlDataReader you have to close it. That is a must; otherwise you will not be able to execute any other query until the SqlDataReader is closed.

SqlCommand or SqlDataAdapter?

I'm creating something like a small cashier application that keeps records of clients, employees, services, sales, and appointments. I'm using Windows Forms, with DataGrids inside. I've created the database that I'm going to be using for the application. I want to know whether I should use SqlCommand with SqlDataReader, or SqlDataAdapter with DataSet instead. Which approach is better?
This depends heavily on the type of operation you want.
Here is my suggestion.
If you want to read data as fast as possible, go for SqlDataReader, but that comes at the cost of steps you have to handle yourself around the read:
Open the connection
Read the data
Close the connection. If you forget to close it, it will hurt performance.
Go for SqlDataAdapter
if you want to read data and use the benefits of the disconnected architecture of ADO.NET.
It will automatically open and close the connection for you.
It will also allow you to automatically push updates in the DataSet back to the database (via SqlCommandBuilder); a sketch of that follows this answer.
Use SqlCommand (which you also use when reading with an SqlDataReader) for inserts and updates.
It will give you better performance for inserts and updates.
If you are using .NET Framework 3.5 SP1 or later, I would suggest LINQ to SQL or Entity Framework; either would also serve your purpose.
Thanks.
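A minimal sketch of the SqlDataAdapter + SqlCommandBuilder round trip mentioned above (the Clients table and its columns are placeholders; the SELECT must return the table's primary key for the builder to work):

var clients = new DataTable();
using (var conn = new SqlConnection(connectionString))
using (var adapter = new SqlDataAdapter("SELECT Id, Name, Phone FROM Clients", conn))
using (var builder = new SqlCommandBuilder(adapter))   // generates INSERT/UPDATE/DELETE commands
{
    adapter.Fill(clients);          // opens and closes the connection by itself

    // edit the disconnected data, e.g. from a DataGrid bound to this table
    clients.Rows[0]["Phone"] = "555-0100";
    DataRow newRow = clients.NewRow();
    newRow["Name"] = "New Client";
    newRow["Phone"] = "555-0101";
    clients.Rows.Add(newRow);

    adapter.Update(clients);        // writes the changes back to the database
}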
SqlDataAdapter stores data on your client and updates the database as necessary, so it consumes more memory.
On the other hand, you don't need to stay connected to your database for insert/delete/update/select commands; it manages connections internally, so you don't have to worry about that.
All the good stuff from SqlDataAdapter comes at the cost of that extra memory consumption. It's usually used for systems that need multiple users connected to the database.
So I'd say if that's not your situation, go for SqlCommand and the connected model.
If you are just reading data and not doing updates/inserts/deletes, then SqlDataReader will be faster. You can also combine it with a DataSet. If you wrap the data access objects with using statements, the runtime will handle the connection cleanup logic for you.
A pattern I often use for synchronous access is something like this:
DataTable result = new DataTable();
using (SqlConnection conn = new SqlConnection(MyConnectionString))
{
    using (SqlCommand cmd = new SqlCommand(MyQueryText, conn))
    {
        // set CommandType, parameters and SqlDependency here if needed
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            result.Load(reader);
        }
    }
}
For updates/deletes/inserts, a SqlDataAdapter might be worth considering, but usually only if you already have your data in a DataSet. Otherwise, there are faster/better ways of doing things.
If you are already familiar with these core ADO.NET components (Command, Connection, DataAdapter), then I'd suggest the Entity Data Model or LINQ to SQL.
SqlDataAdapter is a helper class that implicitly uses SqlCommand, SqlConnection and SqlDataReader.
DataReader – The DataReader is a forward-only, read-only stream of data from the database. This makes the DataReader a very efficient means of retrieving data, as only one record is brought into memory at a time. The disadvantage: a connection object can only contain one DataReader at a time, so we must explicitly close the DataReader when we are done with it; this frees the connection for other uses. The data adapter objects will manage opening and closing a connection for the command to execute.
DataAdapter – Represents a set of SQL commands and a database connection that are used to fill the DataSet and update the data source. It serves as a bridge between a DataSet and a data source for retrieving and saving data. The DataAdapter provides this bridge through Fill, which changes the data in the DataSet to match the data in the data source, and Update, which changes the data in the data source to match the data in the DataSet. The DataAdapter also automatically opens and closes the connection as and when required.
SqlCommand is easier but not automated. SqlDataAdapter is less easy but automated.
*Automated means it manages the opening and closing of the connection, etc. automatically.
Both of them share the same data-access functionality.

C# OLEDBConnection to Excel

I am copying an Excel sheet into a DataTable like this:
DataTable dt = new DataTable();
OleDbCommand command = new OleDbCommand("Select * from [working sheet$]", oleDBConnection);
OleDbDataAdapter dataAdapter = new OleDbDataAdapter();
dataAdapter.SelectCommand = command;
dataAdapter.Fill(dt);
Is there a similar method where I can simply copy the DataTable back to an Excel sheet? The examples I keep finding copy cell by cell, but this can be noticeably slow with large data sets.
Thanks
You're looking for the DataAdapter.Update method, which applies any changes made in the DataTable back to the database (or spreadsheet, in this case).
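A rough sketch of that (the sheet must already exist with a header row; the Name and Amount columns are placeholders). Note that OLE DB cannot delete Excel rows, and OleDbCommandBuilder usually cannot generate the commands because a worksheet exposes no key columns, so the InsertCommand is written by hand here:

using (var conn = new OleDbConnection(oleDbConnectionString))
using (var adapter = new OleDbDataAdapter("SELECT * FROM [working sheet$]", conn))
{
    adapter.InsertCommand = new OleDbCommand(
        "INSERT INTO [working sheet$] ([Name], [Amount]) VALUES (?, ?)", conn);
    adapter.InsertCommand.Parameters.Add("@Name", OleDbType.VarWChar, 255, "Name");
    adapter.InsertCommand.Parameters.Add("@Amount", OleDbType.Double, 0, "Amount");

    // dt is the DataTable from the question; Update writes rows whose RowState is Added,
    // so rows appended with dt.Rows.Add(...) after the original Fill end up in the sheet.
    adapter.Update(dt);
}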
I don't really know much about OLE DB with Excel, but since you mentioned a database, I assume this runs on a server? Microsoft actually does not recommend running Excel on a server.
I would use OpenXML for tasks like this. It's a bit more complicated, but it's safe and stable.

SQLiteDataAdapter does not fill the specified DataTable

I'm attempting to pull some data from a SQLite database so that I can populate a GridView in my GUI: here is the code that returns the DataTable:
DataTable table = new DataTable();
SQLiteDataAdapter adapter = new SQLiteDataAdapter(this.command.CommandText, this.connection);
adapter.Fill(table);
return table;
For some reason after calling adapter.Fill, the DataTable is still not populated with anything. So far I've verified that the command text is correct and that the connection contains the correct connection string. Both are used successfully in other parts of the application. No exceptions seem to be thrown... Is there any place else I should be looking for trouble? Am I using the API incorrectly?
Thanks!
This looks like correct usage.
One thing to check: after the fill, you say the DataTable is not populated. Were you just checking Rows.Count? What about the columns? If the Fill creates columns to match your SELECT statement but there aren't any rows, then you know the code is working and either there's a problem with your query or you're not hitting the same database you think you are.
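For instance, a quick diagnostic along those lines (GetTable is a hypothetical wrapper around the code from the question):

DataTable table = GetTable();
Console.WriteLine($"Columns: {table.Columns.Count}, Rows: {table.Rows.Count}");
// Columns > 0, Rows == 0  -> the query ran but matched nothing: check the WHERE clause
//                            or the database file the connection string points to
// Columns == 0            -> the SELECT produced no schema at all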
Is there a constructor that lets you just pass in the command directly?
