I'm creating something like a small cashier application that keeps records of clients, employees, services, sales, and appointments. I'm using Windows Forms, and within that, DataGrids. I've created the database that the application will use. I want to know whether I should use SqlCommand with SqlDataReader, or SqlDataAdapter with DataSet. Which approach is better?
This depends heavily on the type of operation you want. Here is my suggestion.
If you want to read data as fast as possible, go for SqlDataReader, but that comes at the cost of steps you must take care of yourself, during and after the read:
Open the connection
Read the data
Close the connection. If you forget to close it, it will hurt performance.
Go for SqlDataAdapter if you want fast reads plus the benefit of the disconnected architecture of ADO.NET. It opens and closes the connection automatically, and it also lets you push updates made in the DataSet back to the database automatically (via SqlCommandBuilder); a quick sketch of that pattern follows below.
Use SqlCommand (which also comes into play when you read with a SqlDataReader) for inserts and updates. It will give you better performance for inserts and updates.
If you are using .NET Framework 3.5 SP1 or later, I would suggest that LINQ to SQL or Entity Framework could also solve your purpose.
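Here is a minimal sketch of that Fill/Update round trip, assuming a hypothetical Clients table with a Name column and a connectionString variable:

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Clients", conn))
using (SqlCommandBuilder builder = new SqlCommandBuilder(adapter))
{
    DataTable clients = new DataTable();
    adapter.Fill(clients);                 // opens and closes the connection for you

    clients.Rows[0]["Name"] = "New Name";  // edit in memory (assumes at least one row)

    adapter.Update(clients);               // SqlCommandBuilder-generated commands push the changes back
}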
Thanks.
SqlDataAdapter stores data on your client and updates the database as necessary, so it consumes more memory. On the other hand, you don't need to stay connected to your database for insert/delete/update/select commands: it manages connections internally, so you don't have to worry about that.
All the good stuff from SqlDataAdapter comes at the cost of higher memory consumption. It's usually used in systems where multiple users are connected to the database.
So I'd say if that's not your situation, go for SqlCommand and the connected model.
If you are just reading data and not doing updates/inserts/deletes, then SqlDataReader will be faster. You can also combine it with a DataSet. If you wrap the data access objects with using statements, the runtime will handle the connection cleanup logic for you.
A pattern I often use for synchronous access is something like this:
DataTable result = new DataTable();
using (SqlConnection conn = new SqlConnection(MyConnectionString))
{
    using (SqlCommand cmd = new SqlCommand(MyQueryText, conn))
    {
        // set CommandType, parameters and SqlDependency here if needed
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            result.Load(reader);
        }
    }
}
For updates/deletes/inserts, a SqlDataAdapter might be worth considering, but usually only if you already have your data in a DataSet. Otherwise, there are faster/better ways of doing things.
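For example, for a single-row insert, a parameterized SqlCommand with ExecuteNonQuery avoids the DataSet round trip entirely. This is just a sketch, with a made-up Clients table and the MyConnectionString placeholder reused from above:

using (SqlConnection conn = new SqlConnection(MyConnectionString))
using (SqlCommand cmd = new SqlCommand(
    "INSERT INTO Clients (Name, Phone) VALUES (@Name, @Phone)", conn))
{
    cmd.Parameters.AddWithValue("@Name", "Jane Doe");
    cmd.Parameters.AddWithValue("@Phone", "555-0100");
    conn.Open();
    int rowsAffected = cmd.ExecuteNonQuery(); // 1 if the insert succeeded
}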
If you are already comfortable with the core ADO.NET components (Command, Connection, DataAdapter), then I'd suggest the Entity Data Model or LINQ to SQL.
SqlDataAdapter is a helper class which implicitly uses SqlCommand, SqlConnection and SqlDataReader.
DataReader – The DataReader is a forward-only, read-only stream of data from the database. This makes the DataReader a very efficient means of retrieving data, as only one record is brought into memory at a time. The disadvantage: a connection object can only serve one DataReader at a time, so we must explicitly close the DataReader when we are done with it. This frees the connection for other uses. (The data adapter objects, by contrast, manage opening and closing the connection for the command to execute.)
DataAdapter – Represents a set of SQL commands and a database connection that are used to fill the DataSet and update the data source. It serves as a bridge between a DataSet and a data source for retrieving and saving data. The DataAdapter provides this bridge through Fill, which changes the data in the DataSet to match the data in the data source, and Update, which changes the data in the data source to match the data in the DataSet. The DataAdapter also automatically opens and closes the connection as and when required.
SqlCommand is easier but not automated. SqlDataAdapter is less easy but automated.
*Automated means it manages opening and closing the connection, etc. automatically.
Both of them share the same core functionality over the data.
Related
I have to fetch a large number of different records (1,000,000) from my DB to build a report from this data. My DB is on a remote system. I have a different SQL statement for each report. These SQL statements are sent to a service. The service fills a DataSet and returns it to my application. I can then bind the DataSet to my reports.
The problem is that DataSets with this number of records have an enormous memory consumption. I mean, if I load the data, memory rises to 1 GB for a single load.
Is there an alternative way to load the data without this memory consumption?
I already use an ORM (NHibernate), but the problem is that I don't know in advance what data will be loaded; there are hundreds of reports with different SQL statements that can change, so I cannot create hundreds of classes to map to...
Edit:
Here is my example code that I am using:
DataSet dataSet = new DataSet();
try
{
    using (FbConnection connection = new FbConnection(strConnString))
    {
        connection.Open();
        using (FbCommand cmd =
            new FbCommand(
                "SELECT * FROM CUSTOMERS;",
                connection))
        {
            FbDataAdapter fbd = new FbDataAdapter(cmd);
            fbd.Fill(dataSet);
            // This is what the default ADO.NET provider can do..
            //SqlCommand command = new SqlCommand(queryString, connection);
            //System.Xml.XmlReader reader = command.ExecuteXmlReader();
        }
    }
}
catch (Exception ex)
{
    // note: an empty catch like this swallows failures; at least log ex
}
The question you should ask is: How much DATA is it and what is the memory OVERHEAD. If the OVERHEAD is large, you need to find a better data structure. If the DATA itself is too large for memory, you need to explore ways to only bring part of it into memory at a time.
In either case, using NHibernate for reporting on such large volumes is dubious - it can be useful to use existing mapped classes to construct queries, but you must include a projection to a simple unmapped DTO class or to an object[] or similar, to avoid NHibernate instantiating mapped classes for all the results - the latter would be bad for both performance and memory consumption.
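As a rough sketch of what such a projection looks like (assuming an open ISession named session and a mapped Customer class, both hypothetical here): selecting individual properties in HQL makes NHibernate return object[] rows instead of hydrating full entities.

IList<object[]> rows = session
    .CreateQuery("select c.Id, c.Name, c.Balance from Customer c")
    .List<object[]>();

foreach (object[] row in rows)
{
    // row[0] = Id, row[1] = Name, row[2] = Balance; write to the report here
}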
Oh, and did you mean that you have a web service that returns a DataSet? That is considered bad style in general, because the DataSet class is Microsoft-specific, and for various other reasons (http://www.hanselman.com/blog/ReturningDataSetsFromWebServicesIsTheSpawnOfSatanAndRepresentsAllThatIsTrulyEvilInTheWorld.aspx).
I have a feature where data from the UI needs to be inserted into two database schemas in Oracle. There are nearly 10 tables to which data needs to be added. Is there any way to maintain data integrity (consistency of data in both schemas), either by adding all the data to both schemas or by deleting the newly added data from all tables of both schemas if something fails? I have not started coding yet; I just want to know whether there is a good approach for implementing this.
You should use a transaction if you want to control the outcome of all the inserts. There is more than one way to solve this, but you could simply create an OracleTransaction instance by calling BeginTransaction, execute your code, and then at the end either commit or roll back. The other method uses a .NET System.Transactions.TransactionScope (which is better, and sketched further below) but requires that you install additional Oracle components. The latter is useful if you are writing to multiple databases (Oracle and SQL Server) and want to maintain integrity.
OracleConnection _cn = new OracleConnection(connectionString);
_cn.Open(); // the connection must be open before starting the transaction
OracleTransaction _trn = _cn.BeginTransaction();
// command 1
OracleCommand _cmd = new OracleCommand("INSERT ...", _cn);
_cmd.Transaction = _trn;
// set parameters, call _cmd.ExecuteNonQuery(), then commands 2, 3, etc.
if (!error) {
    _trn.Commit();
} else {
    _trn.Rollback();
}
This is not complete code obviously, just to give you an idea of the structure.
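For the TransactionScope variant, the shape would be roughly this (the two connection strings are placeholders; connections opened inside the scope enlist in the ambient transaction automatically):

using (TransactionScope scope = new TransactionScope())
{
    using (OracleConnection cn1 = new OracleConnection(connStringSchema1))
    {
        cn1.Open();
        // run the inserts against the first schema here
    }
    using (OracleConnection cn2 = new OracleConnection(connStringSchema2))
    {
        cn2.Open();
        // run the inserts against the second schema here
    }
    scope.Complete(); // without this call, everything rolls back on Dispose
}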
I have a table with millions of rows of data, and I would like to know the best way to query my data: using .ExecuteReader() or using a DataSet.
Using SqlDataReader like this:
myReader = cmd.ExecuteReader();
and then filling a list with the results,
Or using DataSet
using (SqlDataAdapter da = new SqlDataAdapter(cmd))
{
    da.Fill(ds);
}
Which is the better method?
The two objects are meant to be used in fundamentally different contexts.
A DataReader instance returned by ExecuteReader doesn't give you anything until you loop over it using the Read() method. It is a connected object that holds a pointer to the current record on the backend database. You read the content of the record using the various GetXxx methods provided by the reader, or simply using the indexer. When you are done with the current record, you move on to the next one with Read(). There is no way to go back or jump ahead to record N + 100.
A DataSet, instead, is a disconnected object. Internally it uses a DataReader to fill its local memory buffer with all the records returned by the command text query. It is handy if you need to work randomly on the returned data, show it on screen, or print it. But of course, waiting for millions of records to be returned by the internal reader could be time consuming, and the local memory consumption would probably kill your process before the end.
So, which is the best? Neither. If you have millions of records in your table, you need to put an appropriate WHERE condition in place to reduce the number of records returned. That said, it depends on what you need to do with the returned data. To display it in a grid, you could probably use a DataSet. A DataReader is better if you need to execute operations on the records one by one, as in the sketch below.
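To make the connected model concrete, a typical forward-only loop looks something like this (reusing cmd from the question; the column names are illustrative):

using (SqlDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())                     // advances to the next record; no going back
    {
        int id = reader.GetInt32(0);          // typed access by ordinal
        string name = (string)reader["Name"]; // or access by the indexer
        // process the record, then let the loop throw it away
    }
}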
The question is what you want to fill
If you want to fill a DataSet/DataTable use DataAdapter.Fill(ds)
If you want to fill a list/array use a DataReader and a loop
The DataAdapter also uses a DataReader behind the scenes, but it loops over all the records. You could add different logic to loop over only part of the result set.
"I have a table with millions of rows": you should almost never need to return that many records, so don't filter in memory, filter in the database.
Both are good methods, but if you use a SqlDataReader you must close it. Otherwise you will not be able to execute any other query on that connection until the reader is closed.
Basically I have a website that I am working on where there will be over 8 listboxes filled with information from databases. I currently use SqlDataSource because of its ease of use, and it is currently databound to the listboxes.
Does SqlDataSource leave the connection open the whole time? From a website architecture standpoint, I want to eliminate any unnecessary continuously open connections, for security reasons as well as performance reasons.
Directly in answer to your question: No. The SqlDataSource control ensures that the connection is closed as soon as the operation it is required to perform has been completed.
I used to use SQLDataAdapter + SQLCommand, but now I mostly use
using (SqlDataReader rdr = <YourSQLCommandVariable>.ExecuteReader())
{
    <YourDataTableVariable>.Load(rdr);
}
The reason being, I was unsure what the data adapter did on top of the data reader to allow it to do batches of updates, reads and deletes. If you think about it, it would be extremely difficult to write a class like the data adapter that can do all that without introducing any overhead. The overhead may not be significant, but unless I'm reading multiple tables out of one query into a DataSet object, I don't take the risk of using it.
All that being said, I doubt any overhead in these operations is worth even considering if you cache all of the resulting data on the local machine. In other words, the biggest improvement you can make to your SQL queries is not to make them at all when the data is unlikely to change over some time frame. If the data is updated once a day, cache it for 24 hours or less. Caching can be done via the Session, if it is end-user-dependent, or via the HttpContext.Current.Cache object, as sketched below.
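For the application-wide case, a sketch of the HttpContext.Current.Cache approach (the cache key, the 24-hour window and the LoadListboxDataFromDatabase helper are all made up for illustration):

DataTable lookupData = HttpContext.Current.Cache["ListboxData"] as DataTable;
if (lookupData == null)
{
    lookupData = LoadListboxDataFromDatabase();   // hypothetical DAL call
    HttpContext.Current.Cache.Insert(
        "ListboxData",
        lookupData,
        null,                                     // no cache dependency
        DateTime.UtcNow.AddHours(24),             // absolute expiration
        System.Web.Caching.Cache.NoSlidingExpiration);
}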
It sounds like you might want some tier separation in your application. The Web project is ideally ignorant of the database; ideally there is some middle-tier assembly that handles communicating with the database. Then from your .aspx.cs or Controller, depending on whether or not you're using MVC, you would make 8 calls to the middle tier (one for each listbox, assuming they have distinct information). The middle tier would return something like List<MyObject>, which you would then bind to the listbox.
My typical pattern for data access looks like this
using (SqlConnection conn = new SqlConnection("conn string"))
{
    conn.Open();
    SqlCommand command = new SqlCommand()
    {
        CommandText = "command text",
        Connection = conn,
        CommandType = CommandType.StoredProcedure // could be a non-stored proc, but would recommend a stored proc assuming SQL Server
    };
    command.Parameters.Add(new SqlParameter("MyParam", "param1"));
    command.Parameters.Add(new SqlParameter("MyParam2", "param2"));
    using (IDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // magic here
        }
    }
    conn.Close();
}
I'm just wondering what things I have to consider when using a DataReader versus a DataAdapter for fetching data from the database, and what the difference is between the two, other than the DataReader needing an open connection while the DataAdapter does not... In our projects we use DataReader in ALL our DAL; we never use DataAdapter. So I'm wondering in what scenario it would be better to use the DataAdapter + DataTable combo rather than a DataReader. Thanks in advance.
DataReader: This is best used when you just want to fetch data in read-only mode, populate your business entity, and close the reader. It is really fast.
Say, for example, you have a Customer class and you want a fully initialized object with all the customer properties filled in (Name, Address, etc.).
You would use a DataReader here: just populate the entity and close the reader (see the sketch below).
You cannot do updates with a DataReader.
DataAdapter: You can read and update data with DataAdapters, but reading is slower than with a DataReader.
You can update data with a DataAdapter, whereas with a reader you can't.
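A minimal sketch of that entity-population pattern (the Customer class and the cmd variable are assumed for illustration):

List<Customer> customers = new List<Customer>();
using (SqlDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        customers.Add(new Customer
        {
            Name = reader.GetString(reader.GetOrdinal("Name")),
            Address = reader.GetString(reader.GetOrdinal("Address"))
        });
    }
} // the reader, and its hold on the connection, are released here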
I almost always favor the DataReader when doing ADO.NET stuff as well; the reason being, it does not force you to store the data on the client any longer than you must.
That's also somewhat the answer to when to use a DataAdapter with a DataSet/DataTable: when you want to store the data on the client, perhaps to work with it somehow - iterating back and forth through it, or operating on it as a set - as opposed to simply outputting the values into a grid, where the Reader, IMO, is a better option.
DataReader allows you to process each record and throw it away, which is good when you want to process a lot of data records that have no relation to each other. For example, you might use a DataReader when you want to calculate some complex statistic from every record in the database, or to save a lot of data records into a local file.
DataAdapter is something else: it lets you keep data records in memory. That allows you to build a GUI for browsing and editing data, etc. It is more general, but it will not work well with a large data set.
You only want to use DataAdapters when you use DataSets.
An adapter has two main methods, Fill() and Update(), to read a DataSet from and write it to the database.
Note that Fill() will open a connection, use a DataReader to get all records, and then close the connection.
Without Datasets and DataTables you don't have a use for DataAdapters.
So the real question is: What kind of storage classes do you want to use in your DAL? DataSets are viable and simple but it's an aging technology (no longer improved).
Maybe you should look around for an ORM (Object Relational Mapping) library. But that will replace your DataReader/Adapter question with a much more complicated choice.
I never use DataReader.
Since I strongly layer my application - my DAL is responsible for talking to the database and my BLL is responsible for building objects - there's no way for the BLL to close a DataReader when it's done. Instead, the BLL requests a DataSet/DataTable from the DAL, which the DAL fulfills. It does this by performing a Fill (to TomTom's point: look at the stack trace and, yes, you will see a DataReader in there). The BLL then does what it likes with the result set.
what things i have to consider when using DataReader and DataAdapter
DataReader: A good low-level interface. Pretty much the ONLY interface - if you load data into higher-up structures, the actual load is always done using a DataReader.
DataAdapter / DataSet: stuff not used by people who like structured programs and nice code, and who do not just happen to be writing a reporting application. Use an ORM instead - NHibernate (good), Linq2SQL (bad), Entity Framework (bad), or one of the other better abstractions.
I guess this question is just about pros and cons, leaving the code aside.
* DataReader is much faster than DataAdapter at fetching data, but you have to understand exactly what disconnected mode is.
* DataReader (connected mode) and DataAdapter (disconnected mode) are used in the same scenarios, but sometimes disconnected mode is better, for instance when you are working away from your data source.
* Disconnected mode comes with a rich API: DataAdapter, DataView, DataTable and DataSet. The powerful thing is that you simply provide your DataAdapter with SELECT, INSERT, UPDATE and DELETE commands, attach your data from a single table or multiple tables, and fill with one line of code, Adapter.Fill(DataTable) or Adapter.Fill(DataSet); updating works the same way with Adapter.Update(DataTable).
* Updating hierarchical data in disconnected mode is far better than working in connected mode, which takes extra code and extra logic to maintain. In disconnected mode you have the ability to update only inserted rows, updated rows or deleted rows, and the update operation is wrapped inside a .NET transaction:
Adapter.Update(DataTable.Select("", "", DataViewRowState.Added))
* In disconnected mode you can get the version of every single row in your data; besides that, you can get the changes to your data with DataTable.GetChanges().
* Disconnected mode provides strongly typed DataSets, so you get your data definition schema and the relations; you can get parent and child rows.
* Disconnected mode provides a method for getting rows by primary key, as well as getting rows matching specific criteria: DataTable.Select("FilterExpression", "SortOrder", DataViewRowState).
* You can do calculations in a DataTable without disturbing your server, like select ProductID, ProductName, Price, Quantity, Price*Quantity as Total; you can easily add a column with a specific expression (Price*Quantity).
* You can run aggregations over your fetched DataTable: DataTable.Compute("Sum(Price)", "Price > 250") (see the sketch after this list).
* In disconnected mode you have the CommandBuilder, which creates the SQL commands for you, but it only works with a single table.
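A small self-contained sketch of the computed-column and Compute points above (the product data is made up):

DataTable products = new DataTable("Products");
products.Columns.Add("ProductName", typeof(string));
products.Columns.Add("Price", typeof(decimal));
products.Columns.Add("Quantity", typeof(int));

// expression column, evaluated client-side with no extra work for the server
products.Columns.Add("Total", typeof(decimal), "Price * Quantity");

products.Rows.Add("Widget", 300m, 2);
products.Rows.Add("Gadget", 120m, 5);

// aggregate with a filter, entirely in memory
object sum = products.Compute("Sum(Price)", "Price > 250"); // 300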