High Memory Usage with Oracle.ManagedDataAccess.Core - C#

I need a little help here.
I'm facing a problem now with Oracle.ManagedDataAccess.Core.
I have a class written to centralize Oracle queries (Clases.Oracle()), which works perfectly, but with one query the memory usage rises to about 1 GB. That by itself is not a real problem, considering that the result set has about 260,000 rows in the worst case. The real problem is that the memory is never freed, and if I execute that query again it rises to 2 GB, which has been the upper limit so far.
I've tried adding GC.Collect() and GC.WaitForPendingFinalizers() with no results.
My command execution function in Clases.Oracle() is:
private DataTable ExecuteReader(string package, ref OracleParameter[] parametros, string owner)
{
    var dt = new DataTable();
    using (var cn = new OracleConnection(_connection_string))
    {
        using var cmd = cn.CreateCommand();
        try
        {
            cn.Open();
            cmd.CommandText = $"{owner}.{package}";
            cmd.CommandType = CommandType.StoredProcedure;
            foreach (var par in parametros)
            {
                cmd.Parameters.Add(par);
            }
            using var rdr = cmd.ExecuteReader(CommandBehavior.CloseConnection);
            dt.Load(rdr);
        }
        catch (Exception ex)
        {
            throw new Exceptions.OracleException(ex.Message);
        }
        finally
        {
            cn?.Close();
            cmd?.Dispose();
            cn?.Dispose();
        }
    }
    return dt;
}
I'm using using blocks, so the objects are being disposed.
And I'm calling the connection with this function:
public List<AuditoriaUsuarios> ObtieneAuditoriaUsuarios(long incluyeCargoRol = 0)
{
    var ora = new Clases.Oracle();
    var param = new OracleParameter[]
    {
        ora.AddInParameter("PIN_INCLUYECARGOROL", OracleDbType.Decimal, incluyeCargoRol),
        ora.AddOutCursor("CUR_OUT"),
        ora.AddOutParameter("PON_CODE", OracleDbType.Decimal),
        ora.AddOutParameter("POV_ERROR", OracleDbType.Varchar2)
    };
    var result = ora.ExecuteReader<AuditoriaUsuarios>($"{_PCK}.p_AUDIT_USUARIOS", ref param);
    if (ora.HayError(param))
    {
        throw new Exceptions.OracleException(ora.CodigoError, ora.MensajeError);
    }
    //GC.Collect();
    //GC.WaitForPendingFinalizers();
    return result;
}
Clases.Oracle() doesn't need to be IDisposable, because everything it holds is either a disposable object that is already being disposed, or one of two strings: the connection string and the name of the database owner.
This is a memory usage snapshot from Visual Studio. You can see an Oracle-related object (OracleInternal.Common.ZoneValue) using a lot of memory long after ExecuteReader finished and the results were returned.
I don't know if I'm doing something wrong.
Edit:
I forgot to mention: this is an ASP.NET Core x64 Web API, using .NET Core 3.1 and C#, built with Visual Studio 2019 Enterprise.
Edit2:
I know it's dirty, but adding this to ObtieneAuditoriaUsuarios made things a little better. (In this case I don't care about CPU usage, because this data extraction is supposed to run only a few times a week and is not part of everyday operation.)
GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
GC.WaitForPendingFinalizers();
GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
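(Side note: a DataTable holding ~260,000 rows almost certainly lives on the large object heap, which .NET does not compact by default, so freed memory can stay reserved in the process. If you are going to force collections anyway, a one-off LOH compaction may help; this is plain .NET Core API, nothing Oracle-specific:)

using System.Runtime;

// Ask the next blocking collection to also compact the large object heap.
// Without this, dead LOH segments are swept but the address space is not
// compacted, so the working set can stay high after the DataTable dies.
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced, blocking: true);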
Edit3:
I sent 8 simultaneous requests, and in some tests the memory usage rose to 3 GB. But it only takes one request with a filter that returns fewer than 100 rows for the memory usage to drop below 1 GB.
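One way to keep the peak lower, sketched here under assumptions (MapRow is a hypothetical row-to-POCO mapper, not existing code, and the parameters are the same ones Clases.Oracle() already builds), is to stream the ref cursor into a typed list with a larger FetchSize instead of buffering everything in a DataTable:

// Sketch only: stream rows instead of materializing a DataTable.
// _connection_string, owner and package mirror the fields used by Clases.Oracle().
using var cn = new OracleConnection(_connection_string);
using var cmd = cn.CreateCommand();
cn.Open();
cmd.CommandText = $"{owner}.{package}";
cmd.CommandType = CommandType.StoredProcedure;
cmd.FetchSize = 1024 * 1024; // bytes per round trip; the default is 128 KB
// add the same OracleParameters (including the out ref cursor) as before

var list = new List<AuditoriaUsuarios>();
using (var rdr = cmd.ExecuteReader())
{
    while (rdr.Read())
    {
        list.Add(MapRow(rdr)); // hypothetical mapper: build the POCO row by row
    }
}

A List of plain objects avoids DataTable's per-row bookkeeping, and the rows become collectible as soon as the caller drops the list.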

Related

Simultaneous requests slow down asp net core API

TL;DR: I have an ASP.NET Core 5.0 API hosted at AWS. It makes a large call to an MSSQL db to return ~1-4k rows of data. A single request is fine, taking ~500ms, but when multiple requests come in at about the same time (4-5), each request slows to ~2000ms per call. What's going on?
There's not much more to state than what I have above. I open a connection to our DB then initialize a SqlCommand.
using (var connection = new SqlConnection(dbConnection))
connection.Open();
using (SqlCommand command = new SqlCommand(strSQLCommand))
I've tried both filling a DataTable with SqlDataAdapter and using a SqlDataReader to fill a custom object; I get similar slowdowns either way. As stated above, the query returns ~1-4k rows of data of varying types, and Postman says the returned JSON is about 1.95 MB after decompression. The slowdown only occurs when multiple requests come in around the same time. I don't know if it's having trouble with multiple connections to the db, or if it's about the size of the data and available memory. Paging isn't an option; the request needs to return that much data.
This all occurs within an HttpGet function:
[HttpGet]
[Route("Foo")]
[Consumes("application/json")]
[EnableCors("DefaultPolicy")]
public IActionResult Foo([FromHeader] FooRequest request)
{
    ///stuff
    DataTable dt = new DataTable();
    using (var connection = new SqlConnection(_dataDBConnection))
    {
        timer.Start();
        connection.Open();
        using (SqlCommand command = new SqlCommand(
            @"SELECT foo.name, bar.first, bar.second, bar.third, bar.fourth
              FROM dbo.foo with(nolock)
              JOIN dbo.bar with(nolock) ON bar.name = foo.name
              WHERE bar.date = @date", connection))
        {
            command.Parameters.AddWithValue("@date", request.Date.ToString("yyyyMMdd"));
            using (SqlDataAdapter adapter = new SqlDataAdapter(command))
            {
                adapter.Fill(dt);
            }
        }
        timer.Stop();
        long elapsed = timer.ElapsedMilliseconds;
    }
    ///Parse the data from datatable into a List<object> and return
    ///I've also used a DataReader to put the data directly into the List<object> but experienced the same slowdown.
    ///response is a class containing an array of objects that returns all the data from the SQL request
    return new JsonResult(response);
}
Any insights would be appreciated!
--EDIT AFTER ADDITIONAL TESTING--
[HttpGet]
[Route("Foo")]
[Consumes("application/json")]
[EnableCors("DefaultPolicy")]
public IActionResult Foo([FromHeader] FooRequest request)
{
    ///stuff
    using (var connection = new SqlConnection(_dataDBConnection))
    {
        connection.Open();
        ///This runs significantly faster
        using (SqlCommand command = new SqlCommand(@"dbo.spGetFoo", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@date", request.date.ToString("yyyyMMdd"));
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    ///Add data to list to be returned
                }
            }
        }
    }
    ///Parse the data from datatable into a List<object> and return
    ///I've also used a DataReader to put the data directly into the List<object> but experienced the same slowdown.
    ///response is a class containing an array of objects that returns all the data from the SQL request
    return new JsonResult(response);
}
--FINAL EDIT PLEASE READ--
People seem to be getting caught up on the DataAdapter and Fill portion instead of reading the full post. So I'll include a final example here that exhibits the same issue described above.
[HttpGet]
[Route("Foo")]
[Consumes("application/json")]
[EnableCors("DefaultPolicy")]
public async Task<IActionResult> Foo([FromHeader] FooRequest request)
{
    ///stuff
    using (var connection = new SqlConnection(_dataDBConnection))
    {
        await connection.OpenAsync();
        ///This runs significantly faster
        using (SqlCommand command = new SqlCommand(@"dbo.spGetFoo", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@date", request.date.ToString("yyyyMMdd"));
            using (SqlDataReader reader = await command.ExecuteReaderAsync())
            {
                while (await reader.ReadAsync())
                {
                    ///Add data to list to be returned
                }
            }
        }
    }
    ///Parse the data from datatable into a List<object> and return
    ///response is a class containing an array of objects that returns all the data from the SQL request
    return new JsonResult(response);
}
First thing to note: your action method is not asynchronous. Second thing to note: using adapters to fill DataSets is something I hadn't seen for years now. Use Dapper! Finally, that call to the adapter's Fill() method is almost certainly synchronous. Move to Dapper and use asynchronous calls to maximize your ASP.NET throughput.
I think your idea is correct; it shouldn't be a database problem.
Session is one suspect. If you use ASP.NET Core Session in your application, requests are queued and processed one by one, so the last request can be held in the queue while the previous requests are being processed.
Another suspect is bits of MVC running in your pipeline, which can bring in Session without asking you.
In addition, another possible reason is that all threads in the ASP.NET Core thread pool are busy. In that case, a new thread has to be created to process a new request, which takes additional time.
This is just my idea; other causes are possible. I hope it helps.
The reason this is slow is that the method is not async. This means threads are blocked. Since ASP.NET has a limited thread pool, it will be exhausted after a while, and then additional requests have to queue, which makes the system slow. All of this should be fixed by using the async/await pattern.
Since SqlDataAdapter does not provide any async methods, it may be easier to use a technology that does, e.g. EF Core. Otherwise you could start a new task for adapter.Fill; however, that is not a clean way of doing it.
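For illustration, a minimal sketch of the async Dapper approach suggested above (FooRow is a hypothetical DTO matching the stored procedure's columns; Dapper opens a closed connection itself and closes it afterwards):

// Sketch, assuming "using Dapper;" and the same _dataDBConnection string.
[HttpGet]
[Route("Foo")]
public async Task<IActionResult> Foo([FromHeader] FooRequest request)
{
    using var connection = new SqlConnection(_dataDBConnection);
    var rows = await connection.QueryAsync<FooRow>(
        "dbo.spGetFoo",
        new { date = request.Date.ToString("yyyyMMdd") },
        commandType: CommandType.StoredProcedure);
    return new JsonResult(rows);
}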

C# Parallel.For and Oracle database access - memory exception

I am working on some code that accesses an Oracle database inside a Parallel.For loop. The loop runs for several minutes and then fails with the error:
"Attempted to read or write protected memory. This is often an
indication that other memory is corrupt."
There is no inner exception. Inside my Parallel.For loop, I am creating and opening the database connection as local objects. My code looks like this:
static void CheckSinglePath(Path p)
{
    string sqlBase = "select * from table where hour = #HOUR#";
    Parallel.For(1, 24, i =>
    {
        DBManager localdbm = new DBManager();
        string sql = sqlBase;
        sql = sql.Replace("#HOUR#", i.ToString());
        OracleDataReader reader = localdbm.GetData(sql);
        if (reader.Read())
        {
            //do some stuff
        }
        reader.Close();
    });
}
class DBManager
{
    OracleConnection conn;
    OracleCommand cmd;

    public DBManager()
    {
        string connStr = "blahblahblah;Connection Timeout=600;";
        conn = new OracleConnection(connStr);
        conn.Open();
        cmd = conn.CreateCommand();
    }

    public OracleDataReader GetData(string sql)
    {
        cmd.CommandText = sql;
        return cmd.ExecuteReader(); //EXCEPTION HERE!
    }
}
What am I doing wrong? How can I create 24 parallel Oracle connections to process the data? I'm guessing there is some sort of race condition or memory leak going on here that I don't fully understand, because it seems to come from inside the OracleConnection object. Is the database connection not thread-safe? I tried changing the connection string to use a connection pool, and that didn't change anything.
Memory problems are almost always caused by incorrect resource usage: you are not properly releasing your connections when the loop exits.
You need to implement the IDisposable interface on DBManager, and then rewrite your code with the using keyword, like this:
// dispose the connection after the command finishes
using (var localdbm = new DBManager())
{
    var sql = sqlBase;
    sql = sql.Replace("#HOUR#", i.ToString());
    using (var reader = localdbm.GetData(sql))
    {
        if (reader.Read())
        {
            //do some stuff
        }
        // no need to close the reader explicitly:
        // it is disposed at the end of the using block
    }
}
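A minimal sketch of that IDisposable implementation, reusing the fields from the question (the disposal order is the only assumption added here):

class DBManager : IDisposable
{
    private readonly OracleConnection conn;
    private readonly OracleCommand cmd;

    public DBManager()
    {
        conn = new OracleConnection("blahblahblah;Connection Timeout=600;");
        conn.Open();
        cmd = conn.CreateCommand();
    }

    public OracleDataReader GetData(string sql)
    {
        cmd.CommandText = sql;
        return cmd.ExecuteReader();
    }

    public void Dispose()
    {
        // Dispose the command first, then the connection that owns it.
        cmd?.Dispose();
        conn?.Dispose();
    }
}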

SqlDataReader does not work - does not read data

I have a SqlDataReader, but it never enters Read().
When I debug it, execution skips past the while (readerOne.Read()) loop; it never enters the loop even though there is data.
public static List<Pers_Synthese> Get_ListeSynthese_all(string codeClient, DateTime DateDeb, DateTime DateFin)
{
    try
    {
        using (var connectionWrapper = new Connexion())
        {
            var connectedConnection = connectionWrapper.GetConnected();
            string sql_Syntax = Outils.LoadFileToString(Path.Combine(appDir, @"SQL\Get_ListeSynthese_All.sql"));
            SqlCommand comm_Command = new SqlCommand(sql_Syntax, connectionWrapper.conn);
            comm_Command.Parameters.AddWithValue("@codeClioent", codeClient);
            comm_Command.Parameters.AddWithValue("@DateDeb", DateDeb);
            comm_Command.Parameters.AddWithValue("@DateFin", DateFin);
            List<Pers_Synthese> oListSynthese = new List<Pers_Synthese>();
            SqlDataReader readerOne = comm_Command.ExecuteReader();
            while (readerOne.Read())
            {
                Pers_Synthese oSyntehse = new Pers_Synthese();
                oSyntehse.CodeTrf = readerOne["CODE_TARIF"].ToString();
                oSyntehse.NoLV = readerOne["NOID"].ToString();
                oSyntehse.PrxUnitaire = readerOne["PRIX_UNITAIRE"].ToString();
                oSyntehse.ZoneId = readerOne["LE_ZONE"].ToString();
                oSyntehse.LeZone = readerOne["LIB_ZONE"].ToString();
                oSyntehse.LeDept = readerOne["DEPT"].ToString();
                oSyntehse.LeUnite = readerOne["ENLEV_UNITE"].ToString();
                oSyntehse.LePoids = Convert.ToInt32(readerOne["POID"]);
                //oSyntehse.LePoidsCorr = Convert.ToInt32(readerOne["POID_CORR"]);
                oSyntehse.LeColis = readerOne["NBR_COLIS"].ToString();
                oSyntehse.LeCr = readerOne["NBR_CREMB"].ToString();
                oSyntehse.SumMontantCR = readerOne["ENLEV_CREMB"].ToString();
                oSyntehse.LeVd = readerOne["NBR_DECL"].ToString();
                oSyntehse.SumMontantVD = readerOne["ENLEV_DECL"].ToString();
                oSyntehse.LePrixHT = readerOne["PRIX_HT"].ToString();
                oSyntehse.LePrixTTC = readerOne["PRIX_TTC"].ToString();
                oSyntehse.TrDeb = readerOne["TR_DEB"].ToString();
                oSyntehse.TrFin = readerOne["TR_FIN"].ToString();
                oListSynthese.Add(oSyntehse);
            }
            readerOne.Close();
            readerOne.Dispose();
            return oListSynthese;
        }
    }
    catch (Exception excThrown)
    {
        throw new Exception(excThrown.Message);
    }
}
When I trace it with SQL Server Profiler, the query returns data, which means the result set is not empty, but execution never enters this loop:
while (readerOne.Read())
{
By the way, my connection class:
class Connexion : IDisposable
{
    public SqlConnection conn;

    public SqlConnection GetConnected()
    {
        try
        {
            string strConnectionString = Properties.Settings.Default.Soft8Exp_ClientConnStr;
            conn = new SqlConnection(strConnectionString);
        }
        catch (Exception excThrown)
        {
            conn = null;
            throw new Exception(excThrown.InnerException.Message, excThrown);
        }
        // Open and return the current connection
        if (conn.State == ConnectionState.Closed) conn.Open();
        return conn;
    }

    public Boolean IsConnected
    {
        get { return (conn != null) && (conn.State != ConnectionState.Closed) && (conn.State != ConnectionState.Broken); }
    }

    public void CloseConnection()
    {
        // Release the connection if it exists
        if (IsConnected)
        {
            conn.Close();
            conn = null;
        }
    }

    public void Dispose()
    {
        CloseConnection();
    }
}
And my SQL statement:
exec sp_executesql N'SELECT CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE as LIB_ZONE,
SUBSTRING(CP_DEST,1,2) as DEPT,T_UNITE.LIBELLE as ENLEV_UNITE,
count(NOID)as NOID,
SUM(CASE WHEN POID_CORR IS NOT NULL THEN POID_CORR ELSE POID END) as POID,sum(NBR_COLIS)as NBR_COLIS,COUNT(NULLIF(ENLEV_CREMB,0))as NBR_CREMB, sum(ENLEV_CREMB)as ENLEV_CREMB,COUNT(NULLIF(ENLEV_DECL,0))as NBR_DECL,sum(ENLEV_DECL)as ENLEV_DECL,sum(PRIX_HT)as PRIX_HT,sum(PRIX_TTC)as PRIX_TTC, sum (POID_CORR)as POID_CORR
FROM LETTRE_VOIT_FINAL
LEFT JOIN T_TARIF_ZONE ON LETTRE_VOIT_FINAL.LE_ZONE = T_TARIF_ZONE.NO_ID
LEFT JOIN T_UNITE ON LETTRE_VOIT_FINAL.ENLEV_UNITE = T_UNITE.NO_ID
where code_client = @codeClioent
and DATE_CLOTUR_REEL BETWEEN @DateDeb AND @DateFin
and STATUT_LV = 2
group by CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE,SUBSTRING(CP_DEST,1,2),T_UNITE.LIBELLE
order by LE_ZONE,PRIX_UNITAIRE
',N'@codeClioent nvarchar(8),@DateDeb datetime,@DateFin datetime',@codeClioent=N'17501613',@DateDeb='2013-06-05 00:00:00',@DateFin='2013-06-05 23:59:00'
It returns the data in SQL Profiler.
My real query:
SELECT CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE as LIB_ZONE,
SUBSTRING(CP_DEST,1,2) as DEPT,T_UNITE.LIBELLE as ENLEV_UNITE,
count(NOID)as NOID,
SUM(CASE WHEN POID_CORR IS NOT NULL THEN POID_CORR ELSE POID END) as POID,sum(NBR_COLIS)as NBR_COLIS,COUNT(NULLIF(ENLEV_CREMB,0))as NBR_CREMB, sum(ENLEV_CREMB)as ENLEV_CREMB,COUNT(NULLIF(ENLEV_DECL,0))as NBR_DECL,sum(ENLEV_DECL)as ENLEV_DECL,sum(PRIX_HT)as PRIX_HT,sum(PRIX_TTC)as PRIX_TTC, sum (POID_CORR)as POID_CORR
FROM LETTRE_VOIT_FINAL
LEFT JOIN T_TARIF_ZONE ON LETTRE_VOIT_FINAL.LE_ZONE = T_TARIF_ZONE.NO_ID
LEFT JOIN T_UNITE ON LETTRE_VOIT_FINAL.ENLEV_UNITE = T_UNITE.NO_ID
where code_client = @codeClioent
and DATE_CLOTUR_REEL BETWEEN @DateDeb AND @DateFin
and STATUT_LV = 2
group by
CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE,SUBSTRING(CP_DEST,1,2),T_UNITE.LIBELLE
order by LE_ZONE,PRIX_UNITAIRE
It is strange: with
DATE_CLOTUR_REEL BETWEEN '2013-06-05 00:00:00' AND '2013-06-05 23:59:00'
it returns nothing, but with
DATE_CLOTUR_REEL BETWEEN '2013-06-01 00:00:00' AND '2013-06-05 23:59:00'
it works.
This is the way it should be done. You are not calling connection.Open(). Also, set up the connection string.
private static void ReadOrderData(string connectionString)
{
    string queryString =
        "SELECT OrderID, CustomerID FROM dbo.Orders;";
    using (SqlConnection connection =
        new SqlConnection(connectionString))
    {
        SqlCommand command =
            new SqlCommand(queryString, connection);
        connection.Open();
        SqlDataReader reader = command.ExecuteReader();
        // Call Read before accessing data.
        while (reader.Read())
        {
            ReadSingleRow((IDataRecord)reader);
        }
        // Call Close when done reading.
        reader.Close();
    }
}
The perfect example of how to do it comes from MSDN - the Microsoft website.
NOTICE:
SqlCommand command =
    new SqlCommand(queryString, connection);
connection.Open();
SqlDataReader reader = command.ExecuteReader();
Create the SqlCommand, then open the connection.
You are doing it the other way around: you open the connection and then create the command.
I also don't see where you set the query string; I only see that you add the parameters. Are you missing it?
This is perhaps not the answer you're looking for, but your code sample exhibits a number of bad coding practices that are easy to fall into due to ADO.NET's poor API design. Rather than manually do all this sql-to-.net conversion, you should use a library that does this for you.
It's easier to avoid bugs when you're not using a bug-prone API.
I recommend PetaPoco - it's easier to use than your current code, and has virtually no overhead (and given your example, is probably faster). There are many other alternatives, however.
Issues with your code sample:
Improperly disposed objects: you aren't disposing SqlCommand and SqlDataReader properly. You possibly aren't disposing connections either (but that depends on Connexion internals).
Using .ToString rather than type-safe casts. You should never extract data from an SqlDataReader like that because it undermines the whole point of the type system, and it's slow to boot. (PetaPoco or something similar will help a lot here)
You're discarding stack traces on error due to the (pointless) try-catch. That just makes your code less readable and harder to debug. Don't catch unless you have to.
Keeping your query away from the code - your code is tightly coupled to the query, and this separation just makes it hard to keep them in sync. Also, loading from the filesystem each and every time you query is slow and opens up unnecessary filesystem-related failure modes such as locking, max path lengths, and permissions. This is probably the source of your bug - your query probably doesn't do what you think it does.
With PetaPoco or something similar, your entire function would look something like this:
public static List<Pers_Synthese> Get_ListeSynthese_all(
    string codeClient, DateTime DateDeb, DateTime DateFin)
{
    var db = new PetaPoco.Database("Soft8Exp_ClientConnStr");
    //you should probably not be storing a query in a file.
    //To be clear: your query should not be wrapped in exec sp_executesql;
    //ADO.NET will do that for you.
    string sql_Syntax = Outils.LoadFileToString(
        Path.Combine(appDir, @"SQL\Get_ListeSynthese_All.sql"));
    //You'll need to rename Pers_Synthese's properties to match the db,
    //or vice versa, or you can annotate the properties with the column names.
    return db.Fetch<Pers_Synthese>(sql_Syntax, new {
        codeClioent = codeClient, //I suspect this is a typo
        DateDeb,
        DateFin
    });
}
And in that much shorter, readable, faster form, you'll hopefully find whatever bug you have much faster.
Alternatives:
PetaPoco
Dapper (fewer features, but stackoverflow uses it!)
OrmLite (of ServiceStack fame)
Massive (older, uses dynamic which is a feature that can cause bad habits - I don't recommend this unless you really know what you're doing)
You could use heavier, more invasive ORMs like Entity Framework and NHibernate, but these require quite a bit more learning, they're much slower, and they impose a particular workflow on you, which I don't think makes them the best choice in your case.
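As an aside on the type-safety point above, here is a rough sketch of typed extraction with plain ADO.NET, if you stay off an ORM (the column types are guesses from the field names, not verified against the schema):

// Sketch: typed access instead of calling .ToString() on everything.
int poidOrdinal = readerOne.GetOrdinal("POID");
var oSyntehse = new Pers_Synthese
{
    CodeTrf = readerOne.GetString(readerOne.GetOrdinal("CODE_TARIF")),
    LePoids = readerOne.IsDBNull(poidOrdinal) ? 0 : readerOne.GetInt32(poidOrdinal)
    // ...and so on for the remaining columns
};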
"When I debug it with SQL Profiler it shows the data... that means the data is not empty, but it never enters this loop."
It's the other way round: if execution never enters the loop, then the data is empty, i.e. the query returns no rows.
The bug is in your code, not in SqlDataReader: you may have the wrong values in your parameters, or the query you read from the file isn't what you think it is. Get the debugger out and inspect the query text and parameters.
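For instance, a quick way to inspect what is actually being sent (comm_Command is the command from the question; writing to Debug output is just one option):

using System.Diagnostics;

// Dump the final command text and every parameter value before executing.
Debug.WriteLine(comm_Command.CommandText);
foreach (SqlParameter p in comm_Command.Parameters)
{
    Debug.WriteLine($"{p.ParameterName} = {p.Value} ({p.SqlDbType})");
}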

Very slow foreach loop

I am working on an existing application. This application reads data from a huge file and then, after doing some calculations, it stores the data in another table.
But the loop doing this (see below) is taking a really long time. Since the file sometimes contains thousands of records, the entire process takes days.
Can I replace this foreach loop with something else? I tried using Parallel.ForEach and it did help. I am new to this, so I will appreciate your help.
foreach (record someRecord in someReport.r)
{
    try
    {
        using (var command = new SqlCommand("[procname]", sqlConn))
        {
            command.CommandTimeout = 0;
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add(…);
            IAsyncResult result = command.BeginExecuteReader();
            while (!result.IsCompleted)
            {
                System.Threading.Thread.Sleep(10);
            }
            command.EndExecuteReader(result);
        }
    }
    catch (Exception e)
    {
        …
    }
}
After reviewing the answers, I removed the async calls and edited the code as below. But this did not improve performance.
using (var command = new SqlCommand("[sp]", sqlConn))
{
    command.CommandTimeout = 0;
    command.CommandType = CommandType.StoredProcedure;
    foreach (record someRecord in someReport.r)
    {
        command.Parameters.Clear();
        command.Parameters.Add(....);
        command.Prepare();
        using (var dr = command.ExecuteReader())
        {
            while (dr.Read())
            {
                if ()
                {
                }
                else if ()
                {
                }
            }
        }
    }
}
Instead of hitting the SQL connection so many times in the loop, have you considered extracting the whole data set from SQL Server and processing it via a DataSet or an in-memory collection?
Edit: I decided to further explain what I meant. You can do the following, in pseudocode:
Use a SELECT * to get all the information from the database and store it in a list of the class, or in a dictionary.
Do your foreach (record someRecord in someReport) and do the condition matching as usual. A sketch of this follows.
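A rough sketch of that idea (the table name, key column, and Row type are placeholders, not real schema; the point is a single up-front query followed by in-memory matching):

// Sketch: one query up front, then match in memory.
var lookup = new Dictionary<string, Row>();
using (var command = new SqlCommand("SELECT * FROM dbo.SomeTable", sqlConn))
using (var reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        var row = new Row { Key = reader.GetString(0) /* map other columns */ };
        lookup[row.Key] = row;
    }
}

foreach (record someRecord in someReport.r)
{
    // Condition matching against the dictionary: no database round trip.
    if (lookup.TryGetValue(someRecord.Key, out var match))
    {
        // do some stuff with match
    }
}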
Step 1: Ditch the attempt at async. It isn't implemented properly and you're blocking anyway. Just execute the procedure and see if that helps.
Step 2: Move the SqlCommand outside of the loop and reuse it for each iteration. That way you don't incur the cost of creating and destroying it for every item in your loop.
Warning: Make sure you reset/clear/remove parameters you don't need from the previous iteration. We did something like this with optional parameters and had 'bleed-thru' from the previous iteration because we didn't clean up parameters we didn't need!
Your biggest problem is that you're looping over this:
IAsyncResult result = command.BeginExecuteReader();
while (!result.IsCompleted)
{
    System.Threading.Thread.Sleep(10);
}
command.EndExecuteReader(result);
The entire idea of the asynchronous model is that the calling thread (the one doing this loop) should spin up ALL of the asynchronous tasks using the Begin method before starting to work with the results via the End method. If you use Thread.Sleep() within your main calling thread to wait for an asynchronous operation to complete (as you do here), you're doing it wrong, because each command ends up being executed and waited for, one at a time, before the next one starts.
Instead, try something like this:
public void BeginExecutingCommands(Report someReport)
{
    foreach (record someRecord in someReport.r)
    {
        var command = new SqlCommand("[procname]", sqlConn);
        command.CommandTimeout = 0;
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.Add(…);
        command.BeginExecuteReader(ReaderExecuted,
            new object[] { command, someReport, someRecord });
    }
}

void ReaderExecuted(IAsyncResult result)
{
    var state = (object[])result.AsyncState;
    var command = state[0] as SqlCommand;
    var someReport = state[1] as Report;
    var someRecord = state[2] as Record;
    try
    {
        using (SqlDataReader reader = command.EndExecuteReader(result))
        {
            // work with reader, command, someReport and someRecord to do what you need.
        }
    }
    catch (Exception ex)
    {
        // handle exceptions that occurred during the async operation here
    }
}
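(On modern .NET the same idea is usually written with async/await instead of the Begin/End pattern; a hedged sketch follows. Note this version awaits each call in turn, so to keep the fan-out behaviour of the callback version you would start all the tasks first and then await Task.WhenAll.)

// Sketch: await-based equivalent of the callback pattern above.
async Task ExecuteCommandAsync(Record someRecord)
{
    using (var command = new SqlCommand("[procname]", sqlConn))
    {
        command.CommandTimeout = 0;
        command.CommandType = CommandType.StoredProcedure;
        // add parameters for someRecord here
        using (var reader = await command.ExecuteReaderAsync())
        {
            while (await reader.ReadAsync())
            {
                // work with reader and someRecord
            }
        }
    }
}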
On the SQL side, at the other end of a write is a (single) disk. You can rarely write faster in parallel; in fact, parallel writes often slow things down due to index fragmentation. If you can, sort the data by the primary (clustered) key prior to loading. For a big load, consider even disabling the other keys, loading the data, then rebuilding the keys.
I'm not really sure what you are doing in the async code, but for sure it was not doing what you expected, as it was waiting on itself.
try
{
    using (var command = new SqlCommand("[procname]", sqlConn))
    {
        command.CommandTimeout = 0;
        command.CommandType = CommandType.StoredProcedure;
        foreach (record someRecord in someReport.r)
        {
            command.Parameters.Clear();
            command.Parameters.Add(…);
            using (var rdr = command.ExecuteReader())
            {
                while (rdr.Read())
                {
                    …
                }
            }
        }
    }
}
catch (…)
{
    …
}
As we were talking about in the comments, storing this data in memory and working with it there may be a more efficient approach.
So one easy way to do that is to start with Entity Framework. Entity Framework will automatically generate the classes for you based on your database schema. Then you can import a stored procedure which holds your SELECT statement. The reason I suggest importing a stored proc into EF is that this approach is generally more efficient than doing your queries in LINQ against EF.
Then run the stored proc and store the data in a List like this...
var data = db.MyStoredProc().ToList();
Then you can do anything you want with that data. Or as I mentioned, if you're doing a lot of lookups on primary keys then use ToDictionary() something like this...
var data = db.MyStoredProc().ToDictionary(k => k.MyPrimaryKey);
Either way, you'll be working with your data in memory at this point.
It seems that executing your SQL command puts a lock on some required resources, and that is what forced you to use async methods (my guess).
If the database is not otherwise in use, try getting exclusive access to it. Even then, there may be internal transactions due to data-model complexity; consider consulting the database designer.

Prepared statements and the built-in connection pool in .NET

I have a long-running service with several threads calling the following method hundreds of times per second:
void TheMethod()
{
    using (var c = new SqlConnection("..."))
    {
        c.Open();
        var ret1 = PrepareAndExecuteStatement1(c, args1);
        // some code
        var ret2 = PrepareAndExecuteStatement2(c, args2);
        // more code
    }
}
PrepareAndExecuteStatement is something like this:
// pseudocode: the * stands for 1 or 2
object PrepareAndExecuteStatement*(SqlConnection c, args)
{
    var cmd = new SqlCommand("query", c);
    cmd.Parameters.Add("@param", type);
    cmd.Prepare();
    cmd.Parameters["@param"].Value = args;
    return cmd.execute().read().etc();
}
I want to reuse the prepared statements, preparing once per connection and executing them until the connection breaks. I hope this will improve performance.
Can I use the built-in connection pool to achieve this? Ideally, every time a new connection is made, all statements would be automatically prepared, and I need access to the SqlCommand objects of these statements.
I suggest taking a slightly modified approach: close your connection immediately after use. You can certainly re-use your SqlConnection.
The work being done at // some code may take a long time. Are you interacting with other network or disk resources, or spending any amount of time on calculations? Could you ever, in the future, need to do so? Perhaps the intervals between executing statements are, or could become, so long that you'd want to reopen the connection. Regardless, the connection should be opened late and closed early.
using (var c = new SqlConnection("..."))
{
    c.Open();
    PrepareAndExecuteStatement1(c, args);
    c.Close();
    // some code
    c.Open();
    PrepareAndExecuteStatement2(c, args);
    c.Close();
    // more code
}
See "Open Late, Close Early" in MSDN Magazine by John Papa.
Obviously we've now got a bunch of code duplication here. Consider refactoring your Prepare...() methods to perform the opening and closing operations.
Perhaps you'd consider something like this:
using (var c = new SqlConnection("..."))
{
    var cmd1 = PrepareAndCreateCommand(c, args);
    // some code
    var cmd2 = PrepareAndCreateCommand(c, args);
    c.Open();
    cmd1.ExecuteNonQuery();
    cmd2.ExecuteNonQuery();
    c.Close();
    // more code
}
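PrepareAndCreateCommand isn't defined in the snippet; a minimal sketch of what it might look like ("query" and "@param" are placeholders, not a real statement):

static SqlCommand PrepareAndCreateCommand(SqlConnection c, object args)
{
    var cmd = new SqlCommand("query", c);
    cmd.Parameters.AddWithValue("@param", args);
    // Note: SqlCommand.Prepare() needs an open connection, so if you want
    // the statement prepared here, open the connection before calling this.
    return cmd;
}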
