Best Practices for SQL Statements/Connections in Get() Request - C#

For simple lookups, I need to perform some SQL statements against a DB2 machine. I'm not able to use an ORM at the moment. I have a working example in the code below; however, I'm wondering if it can be optimized further, as this essentially creates a connection on each request, and that just seems like bad programming.
Is there a way I can optimize this Get() request to leave a connection open? Nesting using statements seems dirty, as well. How should I handle the fact that Get() really wants to return a User object no matter what, even on error? Can I put this connection in the start of the program so that I can use it over and over again? What are some of the best practices for this?
public class UsersController : ApiController
{
    String constr = WebConfigurationManager.ConnectionStrings["DB2Connection"].ConnectionString;

    public User Get([FromUri] User cst)
    {
        if (cst == null)
        {
            throw new HttpResponseException(HttpStatusCode.NotFound);
        }
        else
        {
            using (OdbcConnection DB2Conn = new OdbcConnection(constr))
            {
                DB2Conn.Open();
                using (OdbcCommand com = new OdbcCommand(
                    // Generic SQL statement
                    "SELECT * FROM [TABLE] WHERE customerNumber = ?", DB2Conn))
                {
                    com.Parameters.AddWithValue("@var", cst.customerNumber);
                    using (OdbcDataReader reader = com.ExecuteReader())
                    {
                        try
                        {
                            while (reader.Read())
                            {
                                cst.name = (string)reader["name"];
                                return cst;
                            }
                        }
                        catch
                        {
                            throw;
                        }
                    }
                }
            }
            return cst;
        }
    }
}
I found a great question that doesn't really have detailed answers; I feel like similar solutions exist for both of these questions...

And that just seems like bad programming.
Why do you think that?
The underlying system should be maintaining connections in a connection pool for you. Creating a connection should already be heavily optimized.
From a logical perspective, what you're doing now is exactly what you want to be doing. Create the connection, use it, and dispose of it immediately. This allows other threads/processes/etc. to use it from the connection pool now that you're done with it.
This also avoids the myriad of problems which arise from manually maintaining your open connections outside of the code that uses them.
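To make the pooling point concrete, here is a rough sketch (hedged: with ODBC, pooling is handled by the ODBC Driver Manager and must be enabled there, unlike SqlClient where it's on by default):
using (var conn = new OdbcConnection(constr))
{
    conn.Open();   // first use: the driver may create a new physical connection
    // ... run a query ...
}   // Dispose() returns the physical connection to the pool

using (var conn = new OdbcConnection(constr))
{
    conn.Open();   // typically reuses the pooled physical connection,
                   // skipping the expensive handshake, so this is cheap
    // ... run another query ...
}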
Is there a way I can optimize this Get() request to leave a connection open?
Have you measured an actual performance problem? If not, there's nothing to optimize.
And there's a very good chance that hanging on to open connections in a static context in your web application is going to have drastic performance implications.
In short... You're already doing this correctly. (Well, except for that unnecessary try/catch. You can remove that.)
Edit: If you're just looking to improve the readability of the code (which itself is a matter of personal preference), this seems readable to me:
public User Get([FromUri] User cst)
{
    if (cst == null)
        throw new HttpResponseException(HttpStatusCode.NotFound);

    using (var DB2Conn = new OdbcConnection(constr))
    using (var com = new OdbcCommand("SELECT * FROM [TABLE] WHERE customerNumber = ?", DB2Conn))
    {
        com.Parameters.AddWithValue("@var", cst.customerNumber);
        DB2Conn.Open();
        using (OdbcDataReader reader = com.ExecuteReader())
            while (reader.Read())
            {
                cst.name = (string)reader["name"];
                return cst;
            }
    }
    return cst;
}
Note that you can further improve it by re-addressing the logic of that SQL query. Since you're fetching one value from one record, you don't need to loop over a data reader. Just fetch a single scalar value and return it. Note that this is free-hand and untested, but it might look something like this:
public User Get([FromUri] User cst)
{
    if (cst == null)
        throw new HttpResponseException(HttpStatusCode.NotFound);

    using (var DB2Conn = new OdbcConnection(constr))
    using (var com = new OdbcCommand("SELECT name FROM [TABLE] WHERE customerNumber = ? FETCH FIRST 1 ROWS ONLY", DB2Conn))
    {
        com.Parameters.AddWithValue("@var", cst.customerNumber);
        DB2Conn.Open();
        cst.name = (string)com.ExecuteScalar();
    }
    return cst;
}
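One caveat worth hedging on: ExecuteScalar() returns null when the query matches no rows, and DBNull.Value when the column itself is NULL, so the cast above can silently produce a null name or throw an InvalidCastException. A minimal guard, reusing the names from the snippet above, might look like this:
object result = com.ExecuteScalar();
if (result == null || result == DBNull.Value)
    throw new HttpResponseException(HttpStatusCode.NotFound); // unknown customer
cst.name = (string)result;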

@David's answer addresses your actual questions perfectly, but here are some other observations that may make your code a little more palatable to you:
remove the try/catch block - all you're doing is re-throwing the exception, which is what will happen if you don't use a try/catch at all. Don't catch the exception unless you can do something about it. (I see now that @David's answer addresses that - either it was added after I read it or I missed it - my apologies for the overlap, but it's worth reinforcing)
Change your query to just pull name and use ExecuteScalar instead of ExecuteReader. You are taking the name value from the first record and exiting the while loop. ExecuteScalar returns the value from the first column in the first record, so you can eliminate the while loop and the using there.

Related

Strange SaveChanges behavior in Entity Framework and SQL Server

I have some code (you can check the project on GitHub); the error is in the UploadController method GetExtensionId.
Database diagram:
Code (in this controller I send files to upload):
[HttpPost]
public ActionResult UploadFiles(HttpPostedFileBase[] files, int? folderid, string description)
{
    foreach (HttpPostedFileBase file in files)
    {
        if (file != null)
        {
            string fileName = Path.GetFileNameWithoutExtension(file.FileName);
            string fileExt = Path.GetExtension(file.FileName)?.Remove(0, 1);
            int? extensionid = GetExtensionId(fileExt);
            if (CheckFileExist(fileName, fileExt, folderid))
            {
                fileName = fileName + $" ({DateTime.Now.ToString("dd-MM-yy HH:mm:ss")})";
            }
            File dbFile = new File();
            dbFile.folderid = folderid;
            dbFile.displayname = fileName;
            dbFile.file_extensionid = extensionid;
            dbFile.file_content = GetFileBytes(file);
            dbFile.description = description;
            db.Files.Add(dbFile);
        }
    }
    db.SaveChanges();
    return RedirectToAction("Partial_UnknownErrorToast", "Toast");
}
I want to create the Extension in the database if it does not exist yet, and I do that with GetExtensionId:
private static object locker = new object();

private int? GetExtensionId(string name)
{
    int? result = null;
    lock (locker)
    {
        var extItem = db.FileExtensions.FirstOrDefault(m => m.displayname == name);
        if (extItem != null) return extItem.file_extensionid;
        var fileExtension = new FileExtension()
        {
            displayname = name
        };
        db.FileExtensions.Add(fileExtension);
        db.SaveChanges();
        result = fileExtension.file_extensionid;
    }
    return result;
}
In the SQL Server database I have a unique constraint on the displayname column of FileExtension.
The problem starts only if I upload a few files with the same extension and that extension does not exist in the database yet.
If I remove the lock, GetExtensionId throws an exception about the unique constraint.
Maybe, for some reason, the next iteration of the foreach cycle calls GetExtensionId without waiting? I don't know.
But my code works fine only if I use the lock.
If you know why this happens, please explain.
This sounds like a simple concurrency race condition. Imagine two requests come in at once; they both check FirstOrDefault, which correctly says "nope" for both. Then they both try to insert; one wins, one fails because the DB has changed. While EF manages transactions around SaveChanges, that transaction doesn't start from when you initially queried the data.
The lock appears to work by preventing them from getting into the lookup code at the same time, but this is not a reliable solution in general, as it only works inside a single process, let alone a single node.
So: a few options here:
your code could detect the unique-constraint violation exception and recheck from the start (FirstOrDefault etc.), which keeps things simple in the success case (which is going to be the majority of the time) and not horribly expensive in the failure case (just an exception and an extra DB hit) - pragmatic enough; see the sketch after this list
you could move the "select if exists, insert if it doesn't" into a single operation inside the database inside a transaction (ideally serializable isolation level, and/or using the UPDLOCK hint) - this requires writing TSQL yourself, rather than relying on EF, but minimises round trips and avoids writing "detect failure and compensate" code
you could perform the selects and possible inserts inside a transaction via EF - complicated and messy, frankly: don't do this (and it would again need to be serializable isolation level, but now the serializable transaction spans multiple round trips, which can start to impact locking, if at scale)
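To illustrate the first option, here is a minimal sketch, assuming EF6 on SQL Server (where unique-constraint violations surface as SqlException numbers 2627 or 2601 inside a DbUpdateException) and C# 6 exception filters; the names mirror the question's code:
private int? GetExtensionId(string name)
{
    var extItem = db.FileExtensions.FirstOrDefault(m => m.displayname == name);
    if (extItem != null) return extItem.file_extensionid;
    try
    {
        var fileExtension = new FileExtension { displayname = name };
        db.FileExtensions.Add(fileExtension);
        db.SaveChanges();
        return fileExtension.file_extensionid;
    }
    catch (DbUpdateException ex) when (
        (ex.GetBaseException() as SqlException)?.Number == 2627 ||
        (ex.GetBaseException() as SqlException)?.Number == 2601)
    {
        // Another request inserted the same extension first: re-read it.
        return db.FileExtensions.First(m => m.displayname == name).file_extensionid;
    }
}
Note that a fuller version would also detach the failed FileExtension entity from the context, since it remains tracked after the failed SaveChanges.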

Stacked using statements vs separate using statements

While refactoring code I stumbled upon some stacked using statements (I'm talking about 10 to 15-ish).
using (X x = new X())
using (Y y = new Y())
using (Z z = new Z())
using (...)
{
    List<X> listX = x.GetParameterListByID(pID);
    List<Y> listY = y.GetParameterListByID(pID);
    ...
    ...
    // other (business) calls/code, like 20-30 lines, that don't need the using instances
}
A class example would be something like:
public class X : IDisposable
{
    public List<XParameterInfo> GetParameterListByID(int? pID)
    {
        const string query = "SELECT name,value FROM parameters WHERE id=@ID";
        // query code here
        return queryResult;
    }
}
And the first thing I thought of, knowing that a using is basically a try{} finally{ x.Dispose(); }, was that the connections would be kept open/active until the code within the using block is finished, while each is only needed to fill one list.
The above is what I assume, correct me if I'm wrong.
Considering what I said is correct, would it be better (performance-wise, but mostly good-practice-wise) to write something like
List<X> listX;
List<Y> listY;
List<Z> listZ;
...
// for the example here I wrote out 3, but let's talk 10 or more
using (X x = new X())
{
    listX = x.GetParameterListByID(pID);
}
using (Y y = new Y())
{
    listY = y.GetParameterListByID(pID);
}
using (Z z = new Z())
{
    listZ = z.GetParameterListByID(pID);
}
...
// other calls but now outside the using statements
or is it negligible, other than the fact that stacked using statements take away the nested code look?
would it be better (performance wise)
That depends on the types that are used in the usings; you can't make a general statement.
Performance is not the only factor: open resources can block other processes or cause memory issues. On the other side, readability is increased with the more compact first version. So you have to decide if there is an issue at all; if not, why bother? Then choose the most readable and maintainable code.
But you could refactor your code, use a method to encapsulate the usings:
// in class X:
public static List<X> GetParameterListByID(int pID)
{
    using (X x = new X())
    {
        return x.GetParameterListByID(pID);
    }
}
// and same in other classes
Now the calling code has no usings:
List<X> listX = X.GetParameterListByID(pID); // static method called
List<Y> listY = Y.GetParameterListByID(pID); // static method called
In addition to what Tim said, many DB drivers use "connection pooling" to increase the performance of connections by reusing old ones. So when Dispose is called on a connection, it doesn't really close (except after certain conditions are met); instead it is kept idle for subsequent calls.
https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/connection-pooling
Why don't you use one connection for all the queries? I think this would be a more elegant solution.
void Main()
{
    List<X> listX;
    List<Y> listY;
    List<Z> listZ;

    using (SqlConnection conn = new SqlConnection())
    {
        conn.Open();
        listX = new X().GetParameterListByID(conn, pID);
        listY = new Y().GetParameterListByID(conn, pID);
        listZ = new Z().GetParameterListByID(conn, pID);
    }
}
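For this to compile, each class's GetParameterListByID would need an overload that accepts the shared, already-open connection; a hypothetical sketch (the XParameterInfo property names here are assumptions):
public List<XParameterInfo> GetParameterListByID(SqlConnection conn, int? pID)
{
    const string query = "SELECT name, value FROM parameters WHERE id = @ID";
    using (var cmd = new SqlCommand(query, conn)) // the caller owns conn
    {
        cmd.Parameters.AddWithValue("@ID", (object)pID ?? DBNull.Value);
        var result = new List<XParameterInfo>();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                result.Add(new XParameterInfo
                {
                    Name = reader.GetString(0),   // assumed property
                    Value = reader.GetString(1)   // assumed property
                });
        }
        return result;
    }
}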

Opening and Closing OleDbConnection during Data Processing - is this good form?

Is this a good way to process data? I don't like the idea of copying the open/close connection code all over the place. In essence, is this good form/style?
Data Processing Method:
public int Process(Func<Product, OleDbConnection, int> func, Product data)
{
    var oleConnect = new OleDbConnection { ConnectionString = @"stuff" };
    oleConnect.Open();
    oleConnect.ChangeDatabase("InventoryManager");
    var ret = func(data, oleConnect);
    oleConnect.Close();
    return ret;
}
Typical method used by the Func (Update, Delete, and Select are the others to pass):
public int Insert(Product data, OleDbConnection oleConnect)
{
    var oleCommand = new OleDbCommand("pInsProduct", oleConnect) { CommandType = CommandType.StoredProcedure };
    oleCommand.Parameters.Add(new OleDbParameter("@ProductId", data.ProductID));
    oleCommand.Parameters.Add(new OleDbParameter("@ProductName", data.ProductName));
    return oleCommand.ExecuteNonQuery();
}
The usage code ends up more or less written as:
Process(Insert, data);
Process(Update, data);
EDIT:
I thought up the following alternative method; which is the better implementation? (usings aside)
(OpenConnection more or less equals the Process method above)
int Insert(Product data)
{
    using (OleDbConnection oleConnect = OpenConnection())
    {
        // do stuff
        oleConnect.Close(); // maybe redundant with the using statement?
    }
}
So, you should be making sure to wrap your connections in using statements to ensure that they get closed and disposed of properly, and you should do the same for commands. In the end, it is fine to open and close connections like that, as you typically won't pay a penalty thanks to connection pooling; but you still want to re-use connections as much as you can, so do so whenever possible, as long as you make sure you close and clean up when done.
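On the Close() question in the edit: Dispose() closes the connection, so the explicit Close() inside the using block is indeed redundant. A sketch of Process rewritten with a using block, keeping the question's placeholder connection string:
public int Process(Func<Product, OleDbConnection, int> func, Product data)
{
    using (var oleConnect = new OleDbConnection { ConnectionString = @"stuff" })
    {
        oleConnect.Open();
        oleConnect.ChangeDatabase("InventoryManager");
        return func(data, oleConnect); // Dispose() closes the connection, even on exceptions
    }
}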

SqlDataReader does not work - does not read data

I have a SqlDataReader, but it never enters Read().
When I debug it, execution skips over the while (readerOne.Read()) loop; it never enters the loop even though there is data.
public static List<Pers_Synthese> Get_ListeSynthese_all(string codeClient, DateTime DateDeb, DateTime DateFin)
{
    try
    {
        using (var connectionWrapper = new Connexion())
        {
            var connectedConnection = connectionWrapper.GetConnected();
            string sql_Syntax = Outils.LoadFileToString(Path.Combine(appDir, @"SQL\Get_ListeSynthese_All.sql"));
            SqlCommand comm_Command = new SqlCommand(sql_Syntax, connectionWrapper.conn);
            comm_Command.Parameters.AddWithValue("@codeClioent", codeClient);
            comm_Command.Parameters.AddWithValue("@DateDeb", DateDeb);
            comm_Command.Parameters.AddWithValue("@DateFin", DateFin);
            List<Pers_Synthese> oListSynthese = new List<Pers_Synthese>();
            SqlDataReader readerOne = comm_Command.ExecuteReader();
            while (readerOne.Read())
            {
                Pers_Synthese oSyntehse = new Pers_Synthese();
                oSyntehse.CodeTrf = readerOne["CODE_TARIF"].ToString();
                oSyntehse.NoLV = readerOne["NOID"].ToString();
                oSyntehse.PrxUnitaire = readerOne["PRIX_UNITAIRE"].ToString();
                oSyntehse.ZoneId = readerOne["LE_ZONE"].ToString();
                oSyntehse.LeZone = readerOne["LIB_ZONE"].ToString();
                oSyntehse.LeDept = readerOne["DEPT"].ToString();
                oSyntehse.LeUnite = readerOne["ENLEV_UNITE"].ToString();
                oSyntehse.LePoids = Convert.ToInt32(readerOne["POID"]);
                //oSyntehse.LePoidsCorr = Convert.ToInt32(readerOne["POID_CORR"]);
                oSyntehse.LeColis = readerOne["NBR_COLIS"].ToString();
                oSyntehse.LeCr = readerOne["NBR_CREMB"].ToString();
                oSyntehse.SumMontantCR = readerOne["ENLEV_CREMB"].ToString();
                oSyntehse.LeVd = readerOne["NBR_DECL"].ToString();
                oSyntehse.SumMontantVD = readerOne["ENLEV_DECL"].ToString();
                oSyntehse.LePrixHT = readerOne["PRIX_HT"].ToString();
                oSyntehse.LePrixTTC = readerOne["PRIX_TTC"].ToString();
                oSyntehse.TrDeb = readerOne["TR_DEB"].ToString();
                oSyntehse.TrFin = readerOne["TR_FIN"].ToString();
                oListSynthese.Add(oSyntehse);
            }
            readerOne.Close();
            readerOne.Dispose();
            return oListSynthese;
        }
    }
    catch (Exception excThrown)
    {
        throw new Exception(excThrown.Message);
    }
}
When I debug it with SQL Server Profiler, it shows the data... that means the data is not empty, but it never enters this loop:
while (readerOne.Read())
{
By the way, my connection class:
class Connexion : IDisposable
{
    public SqlConnection conn;

    public SqlConnection GetConnected()
    {
        try
        {
            string strConnectionString = Properties.Settings.Default.Soft8Exp_ClientConnStr;
            conn = new SqlConnection(strConnectionString);
        }
        catch (Exception excThrown)
        {
            conn = null;
            throw new Exception(excThrown.InnerException.Message, excThrown);
        }
        // Open and return the current connection
        if (conn.State == ConnectionState.Closed) conn.Open();
        return conn;
    }

    public Boolean IsConnected
    {
        get { return (conn != null) && (conn.State != ConnectionState.Closed) && (conn.State != ConnectionState.Broken); }
    }

    public void CloseConnection()
    {
        // Release the connection if it exists
        if (IsConnected)
        {
            conn.Close();
            conn = null;
        }
    }

    public void Dispose()
    {
        CloseConnection();
    }
}
and my SQL Statement:
exec sp_executesql N'SELECT CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE as LIB_ZONE,
SUBSTRING(CP_DEST,1,2) as DEPT,T_UNITE.LIBELLE as ENLEV_UNITE,
count(NOID)as NOID,
SUM(CASE WHEN POID_CORR IS NOT NULL THEN POID_CORR ELSE POID END) as POID,sum(NBR_COLIS)as NBR_COLIS,COUNT(NULLIF(ENLEV_CREMB,0))as NBR_CREMB, sum(ENLEV_CREMB)as ENLEV_CREMB,COUNT(NULLIF(ENLEV_DECL,0))as NBR_DECL,sum(ENLEV_DECL)as ENLEV_DECL,sum(PRIX_HT)as PRIX_HT,sum(PRIX_TTC)as PRIX_TTC, sum (POID_CORR)as POID_CORR
FROM LETTRE_VOIT_FINAL
LEFT JOIN T_TARIF_ZONE ON LETTRE_VOIT_FINAL.LE_ZONE = T_TARIF_ZONE.NO_ID
LEFT JOIN T_UNITE ON LETTRE_VOIT_FINAL.ENLEV_UNITE = T_UNITE.NO_ID
where code_client = @codeClioent
and DATE_CLOTUR_REEL BETWEEN @DateDeb AND @DateFin
and STATUT_LV = 2
group by CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE,SUBSTRING(CP_DEST,1,2),T_UNITE.LIBELLE
order by LE_ZONE,PRIX_UNITAIRE
',N'@codeClioent nvarchar(8),@DateDeb datetime,@DateFin datetime',@codeClioent=N'17501613',@DateDeb='2013-06-05 00:00:00',@DateFin='2013-06-05 23:59:00'
It returns the data in SQL Profiler:
my real query :
SELECT CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE as LIB_ZONE,
SUBSTRING(CP_DEST,1,2) as DEPT,T_UNITE.LIBELLE as ENLEV_UNITE,
count(NOID)as NOID,
SUM(CASE WHEN POID_CORR IS NOT NULL THEN POID_CORR ELSE POID END) as POID,sum(NBR_COLIS)as NBR_COLIS,COUNT(NULLIF(ENLEV_CREMB,0))as NBR_CREMB, sum(ENLEV_CREMB)as ENLEV_CREMB,COUNT(NULLIF(ENLEV_DECL,0))as NBR_DECL,sum(ENLEV_DECL)as ENLEV_DECL,sum(PRIX_HT)as PRIX_HT,sum(PRIX_TTC)as PRIX_TTC, sum (POID_CORR)as POID_CORR
FROM LETTRE_VOIT_FINAL
LEFT JOIN T_TARIF_ZONE ON LETTRE_VOIT_FINAL.LE_ZONE = T_TARIF_ZONE.NO_ID
LEFT JOIN T_UNITE ON LETTRE_VOIT_FINAL.ENLEV_UNITE = T_UNITE.NO_ID
where code_client = @codeClioent
and DATE_CLOTUR_REEL BETWEEN @DateDeb AND @DateFin
and STATUT_LV = 2
group by
CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE,SUBSTRING(CP_DEST,1,2),T_UNITE.LIBELLE
order by LE_ZONE,PRIX_UNITAIRE
It is strange... when the range is
DATE_CLOTUR_REEL BETWEEN '2013-06-05 00:00:00' AND '2013-06-05 23:59:00'
it returns nothing, but with
DATE_CLOTUR_REEL BETWEEN '2013-06-01 00:00:00' AND '2013-06-05 23:59:00'
it works.
This is the way it should be done; you are not doing the connection.Open(). Also set up the connection string.
private static void ReadOrderData(string connectionString)
{
    string queryString =
        "SELECT OrderID, CustomerID FROM dbo.Orders;";
    using (SqlConnection connection =
               new SqlConnection(connectionString))
    {
        SqlCommand command =
            new SqlCommand(queryString, connection);
        connection.Open();
        SqlDataReader reader = command.ExecuteReader();
        // Call Read before accessing data.
        while (reader.Read())
        {
            ReadSingleRow((IDataRecord)reader);
        }
        // Call Close when done reading.
        reader.Close();
    }
}
The example above of how to do it comes from MSDN (the Microsoft website).
NOTICE:
SqlCommand command =
new SqlCommand(queryString, connection);
connection.Open();
SqlDataReader reader = command.ExecuteReader();
Create the SqlCommand, then open the connection. You are doing it the other way around: you open the connection and then create the command.
I also don't see where you set the query string; I just see that you add the parameters. Are you missing it?
This is perhaps not the answer you're looking for, but your code sample exhibits a number of bad coding practices that are easy to fall into due to ADO.NET's poor API design. Rather than manually do all this sql-to-.net conversion, you should use a library that does this for you.
It's easier to avoid bugs when you're not using a bug-prone API.
I recommend PetaPoco - it's easier to use than your current code, and has virtually no overhead (and given your example, is probably faster). There are many other alternatives, however.
Issues with your code sample:
Improperly disposed objects: you aren't disposing SqlCommand and SqlDataReader properly. You possibly aren't disposing connections either (but that depends on Connexion internals).
Using .ToString rather than type-safe casts. You should never extract data from an SqlDataReader like that because it undermines the whole point of the type system, and it's slow to boot. (PetaPoco or something similar will help a lot here)
You're discarding stack traces on error due to the (pointless) try-catch. That just makes your code less readable and harder to debug. Don't catch unless you have to.
Keeping your query away from the code - your code is tightly coupled to the query, and this separation just makes it hard to keep them in sync. Also, loading from the filesystem each and every time you query is slow and opens up unnecessary filesystem-related failure modes such as locking, max path lengths, and permissions. This is probably the source of your bug - your query probably doesn't do what you think it does.
With PetaPoco or something similar, your entire function would look something like this:
public static List<Pers_Synthese> Get_ListeSynthese_all(
    string codeClient, DateTime DateDeb, DateTime DateFin)
{
    var db = new PetaPoco.Database("Soft8Exp_ClientConnStr");
    // You should probably not be storing a query in a file.
    // To be clear: your query should not be wrapped in exec sp_executesql;
    // ADO.NET will do that for you.
    string sql_Syntax = Outils.LoadFileToString(
        Path.Combine(appDir, @"SQL\Get_ListeSynthese_All.sql"));
    // You'll need to rename Pers_Synthese's properties to match the db,
    // or vice versa, or you can annotate the properties with the column names.
    return db.Fetch<Pers_Synthese>(sql_Syntax, new {
        codeClioent = codeClient, // I suspect this is a typo
        DateDeb,
        DateFin
    });
}
And in that much shorter, readable, faster form, you'll hopefully find whatever bug you have much faster.
Alternatives:
PetaPoco
Dapper (fewer features, but Stack Overflow uses it!)
OrmLite (of ServiceStack fame)
Massive (older, uses dynamic which is a feature that can cause bad habits - I don't recommend this unless you really know what you're doing)
You could use heavier, more invasive ORMs like Entity Framework and NHibernate, but these require quite a bit more learning, they're much slower, and they impose a particular workflow on you, which I don't think makes them the best choice in your case.
when I debug it with SQL Profiler it shows the data... that means the data is not empty, but it never enters into this loop.
It's the other way round: if it never enters into this loop, then it means "the data is empty", i.e. the query returns no rows.
The bug is in your code, not SqlDataReader: you possibly have the wrong values in your parameters, or maybe the query you read from the file isn't what you think it is. Get the debugger out and inspect the query text and parameters.
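For instance, a quick way to see what is actually being sent (a hedged sketch using the names from the question's code and C# 6 string interpolation; Debug is System.Diagnostics.Debug):
Debug.WriteLine(comm_Command.CommandText);
foreach (SqlParameter p in comm_Command.Parameters)
    Debug.WriteLine($"{p.ParameterName} = {p.Value} ({p.SqlDbType})");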

Very slow foreach loop

I am working on an existing application. This application reads data from a huge file and then, after doing some calculations, it stores the data in another table.
But the loop doing this (see below) is taking a really long time. Since the file sometimes contains 1,000s of records, the entire process takes days.
Can I replace this foreach loop with something else? I tried using Parallel.ForEach and it did help. I am new to this, so I would appreciate your help.
foreach (record someRecord in someReport.r)
{
    try
    {
        using (var command = new SqlCommand("[procname]", sqlConn))
        {
            command.CommandTimeout = 0;
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add(…);
            IAsyncResult result = command.BeginExecuteReader();
            while (!result.IsCompleted)
            {
                System.Threading.Thread.Sleep(10);
            }
            command.EndExecuteReader(result);
        }
    }
    catch (Exception e)
    {
        …
    }
}
After reviewing the answers, I removed the async calls and edited the code as below. But this did not improve performance.
using (command = new SqlCommand("[sp]", sqlConn))
{
    command.CommandTimeout = 0;
    command.CommandType = CommandType.StoredProcedure;
    foreach (record someRecord in someReport.r)
    {
        command.Parameters.Clear();
        command.Parameters.Add(....);
        command.Prepare();
        using (dr = command.ExecuteReader())
        {
            while (dr.Read())
            {
                if ()
                {
                }
                else if ()
                {
                }
            }
        }
    }
}
Instead of hitting the SQL connection so many times, have you considered extracting the whole set of data from SQL Server and processing it via a DataSet?
Edit: I decided to further explain what I meant.
You can do the following, in pseudo code:
Use a SELECT * to get all the information from the database and store it in a list of the class, or a dictionary.
Do your foreach (record someRecord in someReport) and do the condition matching as usual.
Step 1: Ditch the attempt at async. It isn't implemented properly and you're blocking anyway, so just execute the procedure and see if that helps.
Step 2: Move the SqlCommand outside of the loop and reuse it for each iteration. That way you don't incur the cost of creating and destroying it for every item in your loop.
Warning: Make sure you reset/clear/remove parameters you don't need from the previous iteration. We did something like this with optional parameters and had 'bleed-thru' from the previous iteration because we didn't clean up parameters we didn't need!
Your biggest problem is that you're looping over this:
IAsyncResult result = command.BeginExecuteReader();
while (!result.IsCompleted)
{
    System.Threading.Thread.Sleep(10);
}
command.EndExecuteReader(result);
The entire idea of the asynchronous model is that the calling thread (the one doing this loop) should be spinning up ALL of the asynchronous tasks using the Begin method before starting to work with the results with the End method. If you are using Thread.Sleep() within your main calling thread to wait for an asynchronous operation to complete (as you are here), you're doing it wrong, and what ends up happening is that each command, one at a time, is being called and then waited for before the next one starts.
Instead, try something like this:
public void BeginExecutingCommands(Report someReport)
{
    foreach (record someRecord in someReport.r)
    {
        var command = new SqlCommand("[procname]", sqlConn);
        command.CommandTimeout = 0;
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.Add(…);
        command.BeginExecuteReader(ReaderExecuted,
            new object[] { command, someReport, someRecord });
    }
}
void ReaderExecuted(IAsyncResult result)
{
    var state = (object[])result.AsyncState;
    var command = state[0] as SqlCommand;
    var someReport = state[1] as Report;
    var someRecord = state[2] as Record;
    try
    {
        using (SqlDataReader reader = command.EndExecuteReader(result))
        {
            // work with reader, command, someReport and someRecord to do what you need.
        }
    }
    catch (Exception ex)
    {
        // handle exceptions that occurred during the async operation here
    }
}
On the other end of a SQL write is a (single) disk, so you can rarely write faster in parallel; in fact, parallelism often slows a load down due to index fragmentation. If you can, sort the data by the primary (clustered) key prior to loading. For a big load, even disable the other keys, load the data, then rebuild the keys.
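As a minimal sketch of that sorting advice, assuming the records expose the clustered-key column as a (hypothetical) Id property:
// Sort by the clustered key so writes arrive in index order
// instead of fragmenting pages as they land.
var ordered = someReport.r.OrderBy(rec => rec.Id).ToList();
foreach (var someRecord in ordered)
{
    // ... execute the insert procedure as before ...
}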
I'm not really sure what you were doing in the async version, but for sure it was not doing what you expected, as it was waiting on itself.
try
{
    using (var command = new SqlCommand("[procname]", sqlConn))
    {
        command.CommandTimeout = 0;
        command.CommandType = CommandType.StoredProcedure;
        foreach (record someRecord in someReport.r)
        {
            command.Parameters.Clear();
            command.Parameters.Add(…);
            using (var rdr = command.ExecuteReader())
            {
                while (rdr.Read())
                {
                    …
                }
            }
        }
    }
}
catch (…)
{
    …
}
As we were talking about in the comments, storing this data in memory and working with it there may be a more efficient approach.
So one easy way to do that is to start with Entity Framework. Entity Framework will automatically generate the classes for you based on your database schema. Then you can import a stored procedure which holds your SELECT statement. The reason I suggest importing a stored proc into EF is that this approach is generally more efficient than doing your queries in LINQ against EF.
Then run the stored proc and store the data in a List like this...
var data = db.MyStoredProc().ToList();
Then you can do anything you want with that data. Or as I mentioned, if you're doing a lot of lookups on primary keys then use ToDictionary() something like this...
var data = db.MyStoredProc().ToDictionary(k => k.MyPrimaryKey);
Either way, you'll be working with your data in memory at this point.
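For example, per-record lookups against that dictionary then stay in memory (hedged: MyStoredProc_Result is the kind of result type EF generates for an imported proc, and the Id key name is hypothetical):
MyStoredProc_Result row;
if (data.TryGetValue(someRecord.Id, out row))
{
    // use row's fields for the calculation: no database round trip per record
}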
It seems that executing your SQL command puts locks on some required resources, and that's what forced you to use the async methods (my guess).
If the database is not in use, try exclusive access to it. Even then, if there are internal transactions due to data-model complexity, consider consulting the database designer.
