Basically I have a website that I am working on where there will be over 8 listboxes filled with information from databases. I currently use SqlDataSource because of its ease of use, and it is databound to the listboxes.
Does SqlDataSource leave the connection open the whole time? From an architectural standpoint, I want to eliminate any unnecessary continuously open connections, for security as well as performance reasons.
Directly in answer to your question: No. The SqlDataSource control ensures that the connection is closed as soon as the operation it is required to perform has been completed.
I used to use SqlDataAdapter + SqlCommand, but now I mostly use
using (SqlDataReader rdr = <YourSqlCommandVariable>.ExecuteReader())
{
    <YourDataTableVariable>.Load(rdr);
}
The reason being that I was unsure what the data adapter did on top of the data reader to allow it to do batches of updates, reads and deletes. If you think about it, it would be extremely difficult to write a class like the data adapter that can do all that without introducing any overhead. The overhead may not be significant, but unless I'm reading multiple tables out of a query into a DataSet object, I don't risk incurring it.
All that being said, I doubt any overhead on these operations is worth even considering if you cache all of the resulting data on the local machine. In other words, the biggest improvement you can make to your SQL queries is to not make them if the data is not likely to change over some time frame. If the data is updated once a day, cache it for 24 hours or less. Caching can be done either via the Session if it is end-user-dependent or via the HttpContext.Current.Cache object.
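For illustration, a minimal caching sketch assuming a 24-hour expiry; the cache key and the LoadListBox1DataFromDb helper are made-up names, not from the answer:
// A sketch only: cache the listbox data so repeated page loads skip the query.
DataTable data = HttpContext.Current.Cache["ListBox1Data"] as DataTable;
if (data == null)
{
    data = LoadListBox1DataFromDb();   // hypothetical method running your query
    HttpContext.Current.Cache.Insert(
        "ListBox1Data",
        data,
        null,                                          // no cache dependency
        DateTime.UtcNow.AddHours(24),                  // absolute expiration
        System.Web.Caching.Cache.NoSlidingExpiration); // no sliding expiration
}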
It sounds like you might want some tier separation in your application. The Web project is ideally ignorant of the database. Ideally there is some middle-tier assembly that handles communicating with the database. Then from your .aspx.cs or Controller, depending on whether or not you're using MVC, you would make 8 calls to the middle tier (one for each listbox, assuming they have distinct information). The middle tier would return something like List<MyObject>, which you would then bind to the listbox (a rough sketch of this shape follows the data-access pattern below).
My typical pattern for data access looks like this:
using (SqlConnection conn = new SqlConnection("conn string"))
{
    conn.Open();
    SqlCommand command = new SqlCommand()
    {
        CommandText = "command text",
        Connection = conn,
        CommandType = CommandType.StoredProcedure // could be a non-stored proc, but I'd recommend a stored proc assuming SQL Server
    };
    command.Parameters.Add(new SqlParameter("@MyParam", "param1"));
    command.Parameters.Add(new SqlParameter("@MyParam2", "param2"));
    using (IDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            //magic here
        }
    }
}
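For illustration, here is a rough sketch of the middle-tier call and listbox binding mentioned above; MyObject, DataService, GetListBoxItems, and the bound property names are made-up names, not from the original answer:
// Middle-tier assembly (a sketch; names are assumptions)
public class MyObject
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class DataService
{
    public static List<MyObject> GetListBoxItems(string listName)
    {
        var results = new List<MyObject>();
        // ...run the data-access pattern shown above and populate results...
        return results;
    }
}

// In the .aspx.cs (or Controller): one call per listbox
ListBox1.DataSource = DataService.GetListBoxItems("listbox1");
ListBox1.DataTextField = "Name";   // property of MyObject shown to the user
ListBox1.DataValueField = "Id";    // property of MyObject used as the value
ListBox1.DataBind();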
I have a function like the one below that I use to return data for a GridView (id: dgMenu) shown to end users based on their role. Note that I am not allowed to apply pagination to the GridView; all items must be seen on one page.
protected DataTable MenuForUserRole(string userRole)
{
    DataTable dtMenus = new DataTable();
    string connectionString = constr;
    try
    {
        using (SqlConnection cnn = new SqlConnection(connectionString))
        {
            cnn.Open();
            string query = @"Select mycolumn1, mycolumn2, mycolumn3, mycolumn4, mycolumn5
                             From mytable
                             Where mykey = (select thekey from anothertable where role = @role)
                             order by myOrderColumn;";
            SqlCommand oCmd = new SqlCommand(query, cnn);
            oCmd.Parameters.AddWithValue("@role", userRole);
            using (SqlDataAdapter a = new SqlDataAdapter(oCmd))
            {
                a.Fill(dtMenus);
            }
        }
    }
    catch (Exception ex)
    {
        throw;
    }
    return dtMenus;
}
Usage:
dgMenu.DataSource = MenuForUserRole(ddlUserRoles.SelectedItem.Value.ToString());
dgMenu.DataBind();
My issue is performance-related: some of the GridViews returned have more than 1000 items, so it takes 5-6 seconds to load the complete GridView for those users, which is unacceptable. When I searched online, I couldn't find more efficient code to load a GridView from a SQL Server database. Any help or advice that might increase the load speed when there is a high amount of data in the GridView would be appreciated.
Used -> Visual Studio 2017 & SQL Server 2017
The most efficient way would be to realize that it is a bad idea.
1000 records is too much for any user to deal with; it is 1-2 orders of magnitude too much. There is no human on this planet who could work with that much data at once. This data needs to be filtered, grouped or paginated much more before it comes in front of a user.
And those are all operations you should not be doing after the query; they should be done in the query itself. Retrieving data you do not want and filtering it later just adds a ton of network load, race conditions and database locks, and it is probably slower anyway (DBMSs are really good at their job!). Worse, with ASP.NET and its shared memory, it can quickly lead to memory issues.
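For illustration, a rough sketch of pushing paging into the query itself, assuming SQL Server 2012+ for OFFSET/FETCH; the @Offset and @PageSize parameters are additions for this sketch, not something from the question:
// A sketch only: page in the query so only one page of rows ever leaves SQL Server.
string query = @"Select mycolumn1, mycolumn2, mycolumn3, mycolumn4, mycolumn5
                 From mytable
                 Where mykey = (select thekey from anothertable where role = @role)
                 order by myOrderColumn
                 OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY;";

DataTable page = new DataTable();
using (SqlConnection cnn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(query, cnn))
{
    cmd.Parameters.AddWithValue("@role", userRole);
    cmd.Parameters.AddWithValue("@Offset", 0);      // first page
    cmd.Parameters.AddWithValue("@PageSize", 50);   // rows per page
    using (SqlDataAdapter da = new SqlDataAdapter(cmd))
    {
        da.Fill(page);   // Fill opens and closes the connection itself
    }
}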
Profile your code. Understand where most of the time is spent. It could be SQL Server, it could be transmission over the network, it could be binding the data to the control(s). If it is SQL Server, we would need to see your schema to tell you how performance could be improved. For example, do you have an index on mykey? By the way, don't call it a key; a key is something that uniquely identifies the record, which is obviously not the case here.
Use reporting (e.g. Reporting Services) and create a link to export the data into an Excel spreadsheet.
I have to get a large number of records (1,000,000) from my DB to build a report from this data. My DB is on a remote system. I have different SQL statements for each report. These SQL statements are sent to the service; the service fills a DataSet and returns it to my application. Then I can bind the DataSet to my reports.
The problem is that DataSets with this number of records have enormous memory consumption. If I load the data, memory usage rises to 1 GB for a single load.
Is there an alternative way to load the data without this memory consumption?
I already use an ORM (NHibernate), but the problem is that I don't know in advance what data will be loaded; there are hundreds of reports with different SQL statements that can change, so I cannot create hundreds of classes to map...
Edit:
Here is my example code that I am using:
DataSet dataSet = new DataSet();
try
{
    using (FbConnection connection = new FbConnection(strConnString))
    {
        connection.Open();
        using (FbCommand cmd = new FbCommand("SELECT * FROM CUSTOMERS;", connection))
        using (FbDataAdapter fbd = new FbDataAdapter(cmd))
        {
            fbd.Fill(dataSet);
            // This is what the default ADO.NET provider can do..
            //SqlCommand command = new SqlCommand(queryString, connection);
            //System.Xml.XmlReader reader = command.ExecuteXmlReader();
        }
    }
}
catch (Exception)
{
    // swallowing exceptions here hides failures; rethrow (or at least log) instead
    throw;
}
The question you should ask is: how much DATA is it, and what is the memory OVERHEAD? If the OVERHEAD is large, you need to find a better data structure. If the DATA itself is too large for memory, you need to explore ways to only bring part of it into memory at a time.
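As a rough sketch of the second option, using the Firebird provider from the question to stream rows with a data reader instead of filling a DataSet; WriteReportRow is a hypothetical callback, not an existing API:
// A sketch only: only one record is held in memory at a time.
using (FbConnection connection = new FbConnection(strConnString))
{
    connection.Open();
    using (FbCommand cmd = new FbCommand("SELECT * FROM CUSTOMERS;", connection))
    using (FbDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            WriteReportRow(reader);   // push each row straight into the report/export
        }
    }
}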
In either case, using NHibernate for reporting on such large volumes is dubious - it can be useful to use existing mapped classes to construct queries, but you must include a projection to a simple unmapped DTO class or to an object[] or similar, to avoid NHibernate instantiating mapped classes for all the results - the latter would be bad for both performance and memory consumption.
Oh, and did you mean that you have a web service that returns a DataSet? That is considered bad style in general, because the DataSet class is Microsoft-specific, and for various other reasons (http://www.hanselman.com/blog/ReturningDataSetsFromWebServicesIsTheSpawnOfSatanAndRepresentsAllThatIsTrulyEvilInTheWorld.aspx).
I have a piece of functionality where data from the UI needs to be inserted into two database schemas in Oracle. There are nearly 10 tables where data needs to be added. Is there any way to maintain data integrity (consistency of data in both schemas), by either adding all data to both schemas or deleting the newly added data from all tables of both schemas? I have not started coding; I just wanted to know whether there is a good approach to implement this.
You should be using a transaction if you want to control the result of all the inserts. There is more than one way to solve this, but you could probably just obtain an OracleTransaction from the connection's BeginTransaction, execute your code, then at the end commit or roll back. The other method uses a .NET System.Transactions.TransactionScope (which is better) but requires that you install further Oracle components. The latter is useful if you are writing to multiple databases (Oracle and SQL Server) and want to maintain integrity.
OracleConnection _cn = new OracleConnection("connection string");
_cn.Open();
OracleTransaction _trn = _cn.BeginTransaction();
// command 1
OracleCommand _cmd = new OracleCommand("command text", _cn);
_cmd.Transaction = _trn;
// command 2, 3, etc.
if (!error) {
    _trn.Commit();
} else {
    _trn.Rollback();
}
This is not complete code obviously, just to give you an idea of the structure.
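For the System.Transactions approach mentioned above, here is a rough sketch; whether the Oracle provider enlists in the TransactionScope (and whether it escalates to a distributed transaction) depends on the provider and the installed components, so treat that as an assumption:
// A sketch only (System.Transactions namespace).
using (TransactionScope scope = new TransactionScope())
{
    using (OracleConnection cn1 = new OracleConnection("connection string for schema 1"))
    {
        cn1.Open();
        // execute OracleCommands against the first schema here
    }

    using (OracleConnection cn2 = new OracleConnection("connection string for schema 2"))
    {
        cn2.Open();
        // execute OracleCommands against the second schema here
    }

    scope.Complete();   // commit; if this is never reached, everything rolls back
}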
I'm creating something like a small cashier application that keeps records for clients, employees, services, sales, and appointments. I'm using Windows Forms, and within that, DataGrids. I've created the database that I'm going to be using for the application. I want to know if I should use SqlCommand-SqlDataReader or SqlDataAdapter-DataSet instead. Which approach is better?
This depends highly upon the type of operation you want.
Following is my suggestion.
If you want to read data faster, go for SqlDataReader, but that comes at the cost of work you need to take care of yourself during and after the read:
Open the connection.
Read the data.
Close the connection. If you forget to close it, it will hit performance.
Go for SqlDataAdapter if you want to read data and take advantage of the disconnected architecture of ADO.NET.
It will automatically open and close the connection.
It will also allow you to automatically push updates in the DataSet back to the database (via SqlCommandBuilder), as sketched below.
Use SqlCommand (you also need it when reading data with SqlDataReader) for insert and update.
This will give you better performance for insert and update.
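A minimal sketch of that disconnected SqlDataAdapter pattern; the table and column names here are made up, not from the question:
// A sketch only: the adapter opens/closes the connection itself, and SqlCommandBuilder
// generates the INSERT/UPDATE/DELETE commands needed to push DataTable changes back.
using (SqlConnection conn = new SqlConnection("conn string"))
using (SqlDataAdapter adapter = new SqlDataAdapter("SELECT Id, Name FROM Customers", conn))
using (SqlCommandBuilder builder = new SqlCommandBuilder(adapter))
{
    DataTable customers = new DataTable();
    adapter.Fill(customers);                   // opens and closes the connection for you

    customers.Rows[0]["Name"] = "New name";    // edit locally, disconnected

    adapter.Update(customers);                 // uses the builder's generated commands
}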
If you are using .NET Framework 3.5 SP1 or later, I would suggest that LINQ to SQL or Entity Framework would also solve your purpose.
Thanks.
SqlDataAdapter stores data on your client and updates the database as necessary, so it consumes more memory. On the other hand, you wouldn't need to stay connected to your database for insert/delete/update/select commands; it manages connections internally, so you wouldn't have to worry about that.
All the good stuff from SqlDataAdapter comes at the cost of more memory consumption. It's usually used for systems that need multiple users connected to the database.
So I'd say if that's not your situation, go for SqlCommand and the connected model.
If you are just reading data and not doing updates/inserts/deletes, then SqlDataReader will be faster. You can also combine it with a DataSet. If you wrap the data access objects with using statements, the runtime will handle the connection cleanup logic for you.
A pattern I often use for synchronous access is something like this:
DataTable result = new DataTable();
using (SqlConnection conn = new SqlConnection(MyConnectionString))
{
    using (SqlCommand cmd = new SqlCommand(MyQueryText, conn))
    {
        // set CommandType, parameters and SqlDependency here if needed
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            result.Load(reader);
        }
    }
}
For updates/deletes/inserts, a SqlDataAdapter might be worth considering, but usually only if you already have your data in a DataSet. Otherwise, there are faster/better ways of doing things.
If you are aware of these core ADO.NET components (Command, Connection, DataAdapter), then I'd suggest the Entity Data Model or LINQ to SQL.
SqlDataAdapter is a helper class which implicitly uses SqlCommand, SqlConnection and SqlDataReader.
DataReader – The datareader is a forward-only, readonly stream of data from the database. This makes the datareader a very efficient means for retrieving data, as only one record is brought into memory at a time. The disadvantage: a connection object can only contain one datareader at a time, so we must explicitly close the datareader when we are done with it. This will free the connection for other uses. The data adapter objects will manage opening and closing a connection for the command to execute.
DataAdapter – Represents a set of SQL commands and a database connection that are used to fill the DataSet and update the data source. It serves as a bridge between a DataSet and a data source for retrieving and saving data. The DataAdapter provides this bridge by mapping Fill, which changes the data in the DataSet to match the data in the data source, and Update, which changes the data in the data source to match the data in the DataSet. By using it, the DataAdapter also automatically opens and closes the connection as and when required.
SqlCommand is easier but not automated. SqlDataAdapter is less easy but automated.
*Automated means it manages the opening and closing of the connection, etc. automatically.
Both of them share the same functionality for working with data.
I would like to make sure when using SqlCommand that I am using best practices, particularly with regards to security.
Considerations that I am not sure about:
Is it ok to manually build the string by appending? If not, how should I do it?
What classes should I be looking at using instead?
If your first question is talking about building SQL by including the values directly, that's almost certainly not okay. It opens you up to SQL injection attacks, as well as issues with conversions (e.g. having to get the right date/time format).
Instead, you should use a parameterized query, and set the values in the parameters. See the docs for SqlCommand.Parameters for an example.
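For illustration, a minimal parameterized-query sketch; the table, column and @dob parameter here are made-up examples, not from the answer:
// A sketch only: the value is passed as a parameter, never concatenated into the SQL text.
DateTime dob = new DateTime(1990, 1, 1);
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand command = new SqlCommand(
    "SELECT Name FROM Customers WHERE DateOfBirth < @dob", conn))
{
    command.Parameters.Add("@dob", SqlDbType.DateTime).Value = dob;
    conn.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // read values with reader.GetString(0), etc.
        }
    }
}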
Out of interest, do you have a particular reason for using SQL directly instead of using one of the many ORMs around? (LLBL, Entity Framework, NHibernate, LINQ to SQL, SubSonic, Massive, SimpleData, Dapper...)
I would say use of parameters is one of the most important aspects for security. This will prevent SQL injection into your database. The following SqlCommand is an example of how I would construct one (in VB.NET, apologies - no C# knowledge - yet ;))
Dim cmd As New SqlCommand("sp_StoredProcedure", Conn)
cmd.CommandType = CommandType.StoredProcedure
cmd.Parameters.Add("@ID", SqlDbType.Int).Value = myID
cmd.ExecuteNonQuery()
And an example of an inline SqlCommand:
Dim cmd As New SqlCommand("SELECT Name, Message FROM [Table] WHERE ID = @ID", Conn)
cmd.CommandType = CommandType.Text
cmd.Parameters.Add("@ID", SqlDbType.Int).Value = myID
Dim reader As SqlDataReader = cmd.ExecuteReader()
My advice: be lazy. Writing voluminous code is a good way to make brain-dead errors (wrong data type, null checks, missing Dispose(), etc), and it has zero performance advantage over many of the helper tools.
Personally, I'm a big fan of dapper (but I'm somewhat biased), which makes things easy:
int customerId = ...
var orders = connection.Query<Order>(
    @"select * from Orders where CustomerId = @customerId",
    new { customerId });
This will do the parameterisation and materialisation for you without pain, and it is stupidly fast.
For other scenarios, and in particular when you want to use OO techniques to update the system, an ORM such as EF or L2S will save you work (and give you better type-checking via LINQ).
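For example, here is a rough LINQ sketch; MyDataContext and its Orders mapping are assumptions for illustration, not something from this answer:
// A sketch only: the ORM generates parameterised SQL for you.
using (var db = new MyDataContext())          // hypothetical LINQ to SQL / EF context
{
    var orders = db.Orders                    // hypothetical mapped table
                   .Where(o => o.CustomerId == customerId)
                   .ToList();
}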
I think this is the best solution
try
{
    using (SqlConnection cn = new SqlConnection(strCn))
    using (SqlCommand cmd = new SqlCommand("select * from xxxx", cn))
    {
        cn.Open();
        //do something
    }
}
catch (Exception)
{
    // the using blocks close the connection even when an exception is thrown
    throw;
}
It depends: when I do a POC or personal small projects, I normally build my strings manually, but when I'm on a project for work we have a template for DB usage and connections that I must use and can't reveal here, obviously.
But I think it's OK to build strings manually for basic operations in a small project or POC.
Edit: Like Jon said though, you should always use something like SqlCommand.Parameters when building your own command. Sorry if I wasn't clear.
If you're concerned about security, I recommend creating your queries as stored procedures inside the SQL Server database instead (so you don't have to worry about SQL injection), and then from the front-end code just call the stored procedure, add the parameters, and do it that way.
This article should help you out with setting this up.
You should always apply the principle of least privilege to database connections by communicating with the database through stored procedures.
Stored Procs
using System.Data;
using System.Data.SqlClient;
using (SqlConnection connection = new SqlConnection(connectionString))
{
    DataSet userDataset = new DataSet();
    SqlDataAdapter myCommand = new SqlDataAdapter("LoginStoredProcedure", connection);
    myCommand.SelectCommand.CommandType = CommandType.StoredProcedure;
    myCommand.SelectCommand.Parameters.Add("@au_id", SqlDbType.VarChar, 11);
    myCommand.SelectCommand.Parameters["@au_id"].Value = SSN.Text;
    myCommand.Fill(userDataset);
}
The credentials for the connection string to the database:
A. Integrated Security for corporate/intranet
B. Keep it encrypted in the registry or in a file when using hosting providers.
Always be on the lookout for attackers trying to submit cross-site scripting through your forms.
Always check the input for SQL injection.