Timeout error when result of a transaction is used as parameter - c#

I have the following code that uses SqlTransaction:
string connectionString = ConfigurationManager.AppSettings["ConnectionString"];
using (SqlConnection connection = new SqlConnection(connectionString))
{
connection.Open();
SqlTransaction transaction = connection.BeginTransaction();
int logID = HelperClass.InsertException(connection, 1, DateTime.Now, "Y", "test", "test", 1, 1, transaction);
LogSearch logSearch = new LogSearch();
logSearch.LogID = logID;
Collection<Log> logs = HelperClass.GetLogs(logSearch, connectionString);
}
This code is throwing the following exception.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
However, if I pass a hard-coded value for LogID, there is no exception.
QUESTION
Why does the exception occur when I pass logID (the result from InsertException())?
Please explain why there is NO exception when I pass a hard-coded value as LogID.
Note: InsertException() uses a connection with SqlTransaction whereas GetLogs() uses a new connection without any transaction
UPDATED QUESTION
The Business Layer code does not use transactions. I need to call the Business Layer methods from my unit-test code shown above (for integration testing). How can we apply a transaction to the UT code (for integration testing) even though the Business Layer does not use transactions? From @jbl's answer, it seems it is not possible to use a transaction in unit testing at all. How can we apply a transaction to the UT code?
CODE
public static class HelperClass
{
public static Collection<Log> GetLogs(LogSearch logSearch, string connectionString)
{
Collection<Log> logs = new Collection<Log>();
using (SqlConnection connection = new SqlConnection(connectionString))
{
connection.Open();
string commandText = "SELECT * FROM Application_EX WHERE application_ex_id = @application_ex_id";
using (SqlCommand command = new SqlCommand(commandText, connection))
{
command.CommandType = System.Data.CommandType.Text;
//Parameter value setting
command.Parameters.AddWithValue("@application_ex_id", logSearch.LogID);
using (SqlDataReader reader = command.ExecuteReader())
{
if (reader.HasRows)
{
while (reader.Read())
{
}
}
}
}
}
return logs;
}
public static Int16 InsertException(SqlConnection connection, Int16 applicationID, DateTime createdDate, string isInternalLocationIndicator, string exceptionDescription, string operation, Int16 severityLevelNumber, Int16 exceptionTypeCode, SqlTransaction transaction)
{
Int16 newLogID = 0;
string commandText = @"INSERT INTO Application_Ex
VALUES (@severity_level_nbr, @appl_service_id, @ex_internal_appl_ind,
@application_ex_txt, @ex_location_txt, @create_ts, @application_ex_code);
SELECT SCOPE_IDENTITY() AS [LogIdentity];";
using (SqlCommand command = new SqlCommand(commandText, connection, transaction))
{
command.CommandType = System.Data.CommandType.Text;
command.Parameters.AddWithValue("@severity_level_nbr", severityLevelNumber);
command.Parameters.AddWithValue("@appl_service_id", applicationID);
command.Parameters.AddWithValue("@ex_internal_appl_ind", isInternalLocationIndicator);
command.Parameters.AddWithValue("@application_ex_txt", exceptionDescription);
command.Parameters.AddWithValue("@ex_location_txt", operation);
command.Parameters.AddWithValue("@create_ts", createdDate);
command.Parameters.AddWithValue("@application_ex_code", exceptionTypeCode);
newLogID = Convert.ToInt16(command.ExecuteScalar(), CultureInfo.InvariantCulture);
}
return newLogID;
}
}

ANSWERS
When a new record is inserted inside a transaction, the record is not committed until you commit the transaction yourself. Until that time the new record is LOCKED, which means that any query that touches it will block until a timeout occurs. In your case, the call to GetLogs is executed while the transaction is still open, and that method searches for the newly inserted record ID. Since the row is locked, your GetLogs call waits until the timeout elapses, which results in a timeout exception.
In the case of a hard-coded value, the call to GetLogs searches for an existing, already-committed record with that ID. Since you are searching on the primary key, SQL Server can seek directly to the row instead of scanning the table. The existing row is therefore found and returned, all in a separate transaction, since the two transactions do not overlap in the data they touch.
Suppose GetLogs searched the table on another, non-PK column (say, application_message). Then the whole table would have to be read to find the row (or rows) with the corresponding application_message value. That query would always touch the newly inserted, locked row, and then even a hard-coded application_message value would give you a timeout exception. I add this just to clarify the locking, and when SQL Server does or does not need to touch the locked row(s).
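As an illustration of the explanation above (this is a sketch, not the original code): if the test did not need to keep the change pending, committing the transaction before calling GetLogs would release the lock and the timeout would disappear.
using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    SqlTransaction transaction = connection.BeginTransaction();
    int logID = HelperClass.InsertException(connection, 1, DateTime.Now, "Y", "test", "test", 1, 1, transaction);
    transaction.Commit(); // the new row is no longer locked, so a second connection can read it
    LogSearch logSearch = new LogSearch();
    logSearch.LogID = logID;
    Collection<Log> logs = HelperClass.GetLogs(logSearch, connectionString);
}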
Hope this helps.

I guess that's because HelperClass.GetLogs(logSearch, connectionString) instantiates a new connection outside the scope of your transaction.
You may, as you prefer, either:
have your helper class accept the connection object holding the transaction instead of a connection string (see the sketch below)
or replace "SELECT * FROM Application_EX WHERE application_ex_id = @application_ex_id" with "SELECT * FROM Application_EX with (nolock) WHERE application_ex_id = @application_ex_id"
note that the second option can sometimes return incorrect values, and would not return the values you are currently inserting in your transaction
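A minimal sketch of the first option, assuming an overload of GetLogs is acceptable (the overload itself is not part of the original code):
public static Collection<Log> GetLogs(LogSearch logSearch, SqlConnection connection, SqlTransaction transaction)
{
    Collection<Log> logs = new Collection<Log>();
    string commandText = "SELECT * FROM Application_EX WHERE application_ex_id = @application_ex_id";
    // Enlisting in the caller's transaction lets the query see the uncommitted insert without blocking.
    using (SqlCommand command = new SqlCommand(commandText, connection, transaction))
    {
        command.Parameters.AddWithValue("@application_ex_id", logSearch.LogID);
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // map the row into a Log instance and add it to logs
            }
        }
    }
    return logs;
}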
Hope this will help

Related

Testing SQL methods that don't use transactions?

Suppose I have a database method that looks like this:
public void insertRow(SqlConnection c)
{
using (var cmd = new SqlCommand("insert into myTable values(@dt)", c))
{
cmd.Parameters.Add(new SqlParameter("@dt", SqlDbType.DateTime)).Value = DateTime.Now;
cmd.ExecuteNonQuery();
}
}
Now suppose I want to test this method. So I write a test case that attempts to wrap this method inside a transaction, so that I can roll back the change after testing the result of the insertion:
public void testInsertRow()
{
SqlConnection c = new SqlConnection("connection.string.here");
c.Open();
SqlTransaction trans = c.BeginTransaction();
insertRow(c);
// do something here to evaluate what happened, e.g. query the DB
trans.Rollback();
}
This however fails to work, because:
ExecuteNonQuery requires the command to have a transaction when the connection assigned to the command is in a pending local transaction. The Transaction property of the command has not been initialized.
Is there a way to accomplish this, without having to rewrite every single database method to accept a transaction, and then rewrite every single call to pass null into the method for the transaction?
For example, this would work:
public void insertRow(SqlConnection c, SqlTransaction t)
{
using (var cmd = new SqlCommand("insert into myTable values(@dt)", c))
{
if (t != null) cmd.Transaction = t;
cmd.Parameters.Add(new SqlParameter("@dt", SqlDbType.DateTime)).Value = DateTime.Now;
cmd.ExecuteNonQuery();
}
c.Close();
}
But then, I have to either rewrite each and every call to the database method to include that null parameter, or write override signatures for each and every database method that automatically pass in a null, e.g.
public void insertRow(SqlConnection c) { insertRow(c, null); }
What's the best way to allow transaction-based testing of database calls?
You can use a TransactionScope to enlist the connections in a transaction automatically:
public void testInsertRow()
{
using(TransactionScope scope = new TransactionScope(TransactionScopeOption.RequiresNew))
{
SqlConnection c = new SqlConnection("connection.string.here");
c.Open();
insertRow(c);
// do something here to evaluate what happened, e.g. query the DB
//do not call scope.Complete() so we get a rollback.
}
}
Now this will cause tests to block each other if you have multiple parallel tests running. If your database is set up to support it, you could use snapshot isolation so updates from concurrent tests won't lock each other out.
public void testInsertRow()
{
using(TransactionScope scope = new TransactionScope(TransactionScopeOption.RequiresNew,
new TransactionOptions { IsolationLevel = IsolationLevel.Snapshot }))
{
SqlConnection c = new SqlConnection("connection.string.here");
c.Open();
insertRow(c);
// do something here to evaluate what happened, e.g. query the DB
//do not call scope.Complete() so we get a rollback.
}
}
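Note that snapshot isolation has to be enabled on the database beforehand. A one-time setup sketch, assuming the test database is named MyTestDb (a placeholder):
// Run once against the test database; "MyTestDb" is a placeholder name.
using (var connection = new SqlConnection("connection.string.here"))
using (var command = new SqlCommand("ALTER DATABASE MyTestDb SET ALLOW_SNAPSHOT_ISOLATION ON", connection))
{
    connection.Open();
    command.ExecuteNonQuery();
}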

Where to close SqlDataReader object and SqlConnection object?

I call a function which returns a SqlDataReader object to the calling statement. I am confused about where I should close the SqlDataReader object and the SqlConnection object: in the function, or after calling it?
This is the function call:
SqlDataReader dr2= RetrieveSearcher();
pid = dr2[0].ToString();
This is the function:
protected SqlDataReader RetrieveSearcher()
{
String Q = "select price from tb3 where pid='12'";
cn = new SqlConnection("data source=.\\sqlexpress; integrated security=true; initial catalog=singh");
cn.Open();
cmd = new SqlCommand(Q,cn);
dr1 = cmd.ExecuteReader();
dr1.Read();
return dr1;
}
Always use parameterized queries to avoid sql injection attacks and increase performance (most db servers can reuse execution plans with proper queries)
Never leave a connection open any longer than necessary!
Do not share db connections! Create it, use it, destroy it.
Wrap everything that implements IDisposable in a using block like Connections, Commands, DataReaders, etc. This ensures no resources remain open even in the event of an exception.
Use correct types in your db schema and read back those types; do not blanket-convert everything to/from string! For example, price seems like it should really be a decimal or numeric value and not a string, so do not store it as a string and do not read it back as one.
Retrieve the connection strings by name from the app.config or web.config (depending on the application type), do not hard code the strings into your connections or anywhere else.
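For example, a sketch of that last point, assuming a connection string named "MyDb" is defined in the config file (the name is a placeholder):
// "MyDb" is a placeholder; use the name defined under <connectionStrings> in app.config/web.config.
string connectionString = ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;
using (var cn = new SqlConnection(connectionString))
{
    cn.Open();
    // use the connection, then let the using block dispose it
}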
About your logic
Change your method to return a custom type, i.e. a piece of data. This ensures proper SoC (separation of concerns). Do not return a DataReader! This abstracts the whole database call away from the caller, which is what you should strive for.
protected SomeType RetrieveSearcherData(string pid)
{
const string Q = "SELECT price FROM tb3 WHERE pid = @pid";
using(var cn=new SqlConnection())
using(var cmd=new SqlCommand(Q,cn))
{
// I do not know what pid is, but use the correct type here as well and specify it using SqlDbType
cmd.Parameters.Add(new SqlParameter("@pid", SqlDbType.VarChar, 100) { Value = pid });
cn.Open();
using(var dr1= cmd.ExecuteReader())
{
if(dr1.Read())
{
var result = dr1.GetDecimal(0);
// read something and return it either in raw format or in some object (use a custom type)
}
else
return null; // return something else that indicates nothing was found
}
}
}
Do you really want to open a connection each time you call into this function? Having one thread deal with multiple connections is a sure fire way to get deadlocks.
If you still want to do #1, I'd recommend having your RetrieveSearcher return the data it needs in a List<T> or heck, just return a DataTable and deal with that. That way the function can close the connection that it opened.
If you still REALLY want to return a SqlDataReader then you need to make sure that you can close the connection that you opened. SqlDataReader doesn't expose a SqlConnection directly, so you can't directly close the connection after you leave the RetrieveSearcher method. However, you can do this:
dr1 = cmd.ExecuteReader(CommandBehavior.CloseConnection);
That will close the connection when the reader is closed. So, then you can do:
using (SqlDataReader dr2 = RetrieveSearcher()) {
pid=dr2[0].ToString();
}
I'm assuming of course that you REALLY need more than just one string. :) If you REALLY only need one string, you should just return the string and call cmd.ExecuteScalar().
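For example, a sketch of the ExecuteScalar variant, assuming only the price of a single row is needed (connectionString is assumed to come from configuration, e.g. a class field):
protected string RetrievePrice(string pid)
{
    const string query = "SELECT price FROM tb3 WHERE pid = @pid";
    using (var cn = new SqlConnection(connectionString)) // connectionString assumed to be read from config
    using (var cmd = new SqlCommand(query, cn))
    {
        cmd.Parameters.AddWithValue("@pid", pid);
        cn.Open();
        object result = cmd.ExecuteScalar(); // first column of the first row, or null when there are no rows
        return result == null ? null : result.ToString();
    }
}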

Properly implement concurrency control in ASP.NET WebAPI + SQL

I'm writing a very simple web application that serves as an endpoint for uploading money transactions from customers and saving them in SQL Server DB. It accepts requests with just 2 params: userid: 'xxx', balancechange: -19.99. If the user ID exists in the app database, then the balance is changed; if not - a new row is created for this ID.
The difficult part in all this is that the number of requests is enormous, and I have to implement the app in such a way that it works as fast as possible and resolves concurrency issues (if 2 requests for the same ID arrive simultaneously).
The app is an ASP.NET MVC Web API. I chose to use plain old ADO.NET for speed, and this is what I currently have:
private static readonly object syncLock = new object();
public void UpdateBalance(string userId, decimal balance)
{
lock (syncLock)
{
using (var sqlConnection = new SqlConnection(this.connectionString))
{
sqlConnection.Open();
var command = new SqlCommand($"SELECT COUNT(*) FROM Users WHERE Id = '{userId}'", sqlConnection);
if ((int)command.ExecuteScalar() == 0)
{
command = new SqlCommand($"INSERT INTO Users (Id, Balance) VALUES ('{userId}', 0)", sqlConnection);
command.ExecuteNonQuery();
}
command = new SqlCommand($"UPDATE Users SET Balance = Balance + {balance} WHERE Id = {userId}", sqlConnection);
command.ExecuteNonQuery();
}
}
}
Called from a controller like this:
[HttpPost]
public IHttpActionResult UpdateBalance(string id, decimal balanceChange)
{
UpdateBalance(id, balanceChange);
return Ok();
}
The thing I'm concerned with is the concurrency control using lock (syncLock). This would slow the app down under high load and doesn't allow multiple instances of the app to be deployed on different servers. What are the ways to properly implement concurrency control here?
Note: I'd like to use a fast and DB-independent way of implementing concurrency control, as the current storage mechanism (SQL Server) can change in the future.
First, DB-independent code:
For this, you will want to look at DbProviderFactory. It lets you pass a provider invariant name (MySql.Data.MySqlClient, System.Data.SqlClient) and then interact with your DB through the abstract classes (DbConnection, DbCommand).
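For example, a minimal sketch of resolving a factory by its provider invariant name (the provider must be registered in machine.config or app.config, and connectionString is assumed to come from configuration):
// DbProviderFactory, DbConnection and DbCommand live in System.Data.Common.
DbProviderFactory factory = DbProviderFactories.GetFactory("System.Data.SqlClient");
using (DbConnection conn = factory.CreateConnection())
{
    conn.ConnectionString = connectionString; // assumed to be read from configuration
    conn.Open();
    using (DbCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = "SELECT COUNT(*) FROM Users";
        object userCount = cmd.ExecuteScalar();
    }
}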
Second, using transactions and parameterized queries:
When you are working with a database, you ALWAYS want your queries to be parameterized. If you use String.Format() or any other type of string concatenation, you open your query up to injection.
Transactions ensure all-or-nothing behavior for your queries, and they can also lock down the table so that only queries within the transaction can access it. Transactions have two operations: Commit, which saves the changes (if any) to the DB, and Rollback, which discards any changes.
The following will assume that you already have an instance of DbProviderFactory in a class variable _factory.
public void UpdateBalance(string userId, decimal balanceChange)
{
//since we might need to execute two queries, we will create the parameters once
List<DbParameter> parameters = new List<DbParameter>();
DbParameter userParam = _factory.CreateParameter();
userParam.ParameterName = "@userId";
userParam.DbType = System.Data.DbType.String;
userParam.Value = userId;
parameters.Add(userParam);
DbParameter balanceChangeParam = _factory.CreateParameter();
balanceChangeParam.ParameterName = "@balanceChange";
balanceChangeParam.DbType = System.Data.DbType.Decimal;
balanceChangeParam.Value = balanceChange;
parameters.Add(balanceChangeParam);
//Improvement: if you implement a method to clone a DbParameter, you can
//create the above list in class construction instead of function invocation
//then clone the objects for the function.
using (DbConnection conn = _factory.CreateConnection()){
conn.Open(); //Need to open the connection before you start the transaction
DbTransaction trans = conn.BeginTransaction(System.Data.IsolationLevel.Serializable);
//IsolationLevel.Serializable locks the entire table down until the
//transaction is commited or rolled back.
try {
int changedRowCount = 0;
//We can use the fact that ExecuteNonQuery will return the number
//of affected rows, and if there are no affected rows, a
//record does not exist for the userId.
using (DbCommand cmd = conn.CreateCommand()){
cmd.Transaction = trans; //Need to set the transaction on the command
cmd.CommandText = "UPDATE Users SET Balance = Balance + @balanceChange WHERE Id = @userId";
cmd.Parameters.AddRange(parameters.ToArray());
changedRowCount = cmd.ExecuteNonQuery();
}
if(changedRowCount == 0){
//If no record was affected in the previous query, insert a record
using (DbCommand cmd = conn.CreateCommand()){
cmd.Transaction = trans; //Need to set the transaction on the command
cmd.CommandText = "INSERT INTO Users (Id, Balance) VALUES (@userId, @balanceChange)";
cmd.Parameters.AddRange(parameters.ToArray());
cmd.ExecuteNonQuery();
}
}
trans.Commit(); //This will persist the data to the DB.
}
catch (Exception e){
trans.Rollback(); //This will cause the data NOT to be saved to the DB.
//This is the default action if Commit is not called.
throw e;
}
finally {
trans.Dispose(); //Need to call dispose
}
//Improvement: you can remove the try-catch-finally block by wrapping
//the conn.BeginTransaction() line in a using block. I used a try-catch here
//so that you can more easily see what is happening with the transaction.
}
}
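As the improvement comment at the end of the method suggests, the try/catch/finally can be replaced with a using block around the transaction. A minimal sketch of that variant (not the original answer's code; parameters is the list built above, and connectionString is assumed to come from configuration):
// Disposing an uncommitted DbTransaction rolls it back, so the catch/finally bookkeeping disappears.
using (DbConnection conn = _factory.CreateConnection())
{
    conn.ConnectionString = connectionString; // assumed to be read from configuration
    conn.Open();
    using (DbTransaction trans = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
    using (DbCommand cmd = conn.CreateCommand())
    {
        cmd.Transaction = trans;
        cmd.CommandText = "UPDATE Users SET Balance = Balance + @balanceChange WHERE Id = @userId";
        cmd.Parameters.AddRange(parameters.ToArray());
        if (cmd.ExecuteNonQuery() == 0)
        {
            // no existing record: insert one instead, reusing the same parameters
            cmd.CommandText = "INSERT INTO Users (Id, Balance) VALUES (@userId, @balanceChange)";
            cmd.ExecuteNonQuery();
        }
        trans.Commit(); // without this call, disposing the transaction rolls everything back
    }
}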

Unit testing with manual transactions and layered transactions

Due to a few restrictions I can't use Entity Framework and thus need to use SQL connections, commands and transactions manually.
While writing unit tests for the methods calling these data layer operations I stumbled upon a few problems.
For the unit tests I NEED to run them inside a transaction, as most of the operations change data by their nature, and running them outside a transaction would change the whole base data. Thus I need to put a transaction around them (with no commit fired at the end).
Now I have 2 different variants of how these BL methods work.
A few have transactions themselves inside them, while others have no transactions at all. Both of these variants cause problems.
Layered transaction: here I get errors that the DTC cancelled the distributed transaction due to timeouts (although the timeout is set to 15 minutes and it runs for only 2 minutes).
Only 1 transaction: here I get an error about the state of the transaction when I reach the "new SqlCommand" line in the called method.
My question here is what can I do to correct this and get unit testing with manual normal and layered Transactions working?
Unit testing method example:
using (SqlConnection connection = new SqlConnection(Properties.Settings.Default.ConnectionString))
{
connection.Open();
using (SqlTransaction transaction = connection.BeginTransaction())
{
MyBLMethod();
}
}
Example of a transaction-using method (very simplified):
using (SqlConnection connection = new SqlConnection(Properties.Settings.Default.ConnectionString))
{
connection.Open();
using (SqlTransaction transaction = connection.BeginTransaction())
{
SqlCommand command = new SqlCommand();
command.Connection = connection;
command.Transaction = transaction;
command.CommandTimeout = 900; // Wait 15 minutes before a timeout
command.CommandText = "INSERT ......";
command.ExecuteNonQuery();
// Following commands
....
transaction.Commit();
}
}
Example of a non-transaction-using method:
using (SqlConnection connection = new SqlConnection(Properties.Settings.Default.ConnectionString))
{
connection.Open();
SqlCommand command = new SqlCommand();
command.Connection = connection;
command.CommandTimeout = 900; // Wait 15 minutes before a timeout
command.CommandText = "INSERT ......";
command.ExecuteNonQuery();
}
On the face of it, you have a few options, depending upon what you want to test and your ability to spend money / change your code base.
At the moment, you're effectively writing integration tests. If the database isn't available then your tests will fail. This means the tests can be slow, but on the plus side, if they pass you can be pretty confident that your code can hit the database correctly.
If you don’t mind hitting the database, then the minimum impact to changing your code / spending money would be for you to allow the transactions to complete and verify them in the database. You can either do this by taking database snapshots and resetting the database each test run, or by having a dedicated test database and writing your tests in such a way that they can safely hit the database over and over again and then verified. So for example, you can insert a record with an incremented id, update the record, and then verify that it can be read. You may have more unwinding to do if there are errors, but if you’re not modifying the data access code or the database structure that often then this shouldn’t be too much of an issue.
If you’re able to spend some money and you want to actually turn your tests into unit tests, so that they don’t hit the database, then you should consider looking into TypeMock. It’s a very powerful mocking framework that can do some pretty scary stuff. I believe it using the profiling API to intercept calls, rather than using the approach used by frameworks like Moq. There's an example of using Typemock to mock a SQLConnection here.
If you don’t have money to spend / you’re able to change your code and don’t mind continuing to rely on the database then you need to look at some way to share your database connection between your test code and your dataaccess methods. Two approaches that spring to mind are to either inject the connection information into the class, or make it available by injecting a factory that gives access to the connection information (in which case you can inject a mock of the factory during testing that returns the connection you want).
If you go with the above approach, rather than directly injecting SqlConnection, consider injecting a wrapper class that is also responsible for the transaction. Something like:
public class MySqlWrapper : IDisposable {
public SqlConnection Connection { get; set; }
public SqlTransaction Transaction { get; set; }
int _transactionCount = 0;
public void BeginTransaction() {
_transactionCount++;
if (_transactionCount == 1) {
Transaction = Connection.BeginTransaction();
}
}
public void CommitTransaction() {
_transactionCount--;
if (_transactionCount == 0) {
Transaction.Commit();
Transaction = null;
}
if (_transactionCount < 0) {
throw new InvalidOperationException("Commit without Begin");
}
}
public void Rollback() {
_transactionCount = 0;
Transaction.Rollback();
Transaction = null;
}
public void Dispose() {
if (null != Transaction) {
Transaction.Dispose();
Transaction = null;
}
Connection.Dispose();
}
}
This will stop nested transactions from being created + committed.
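A short usage sketch, assuming the business-layer method is changed to accept the wrapper instead of a raw connection (that change is an assumption, not shown in the original):
using (var wrapper = new MySqlWrapper { Connection = new SqlConnection(Properties.Settings.Default.ConnectionString) })
{
    wrapper.Connection.Open();
    wrapper.BeginTransaction();  // outer transaction owned by the test
    MyBLMethod(wrapper);         // nested Begin/Commit calls inside only adjust the counter
    wrapper.Rollback();          // nothing is persisted to the database
}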
If you’re more willing to restructure your code, then you might want to wrap your dataaccess code in a more mockable way. So, for example you could push your core database access functionality into another class. Depending on what you’re doing you’ll need to expand on it, however you might end up with something like this:
public interface IMyQuery {
string GetCommand();
}
public class MyInsert : IMyQuery{
public string GetCommand() {
return "INSERT ...";
}
}
class DBNonQueryRunner {
public void RunQuery(IMyQuery query) {
using (SqlConnection connection = new SqlConnection(Properties.Settings.Default.ConnectionString)) {
connection.Open();
using (SqlTransaction transaction = connection.BeginTransaction()) {
SqlCommand command = new SqlCommand();
command.Connection = connection;
command.Transaction = transaction;
command.CommandTimeout = 900; // Wait 15 minutes before a timeout
command.CommandText = query.GetCommand();
command.ExecuteNonQuery();
transaction.Commit();
}
}
}
}
This allows you to unit test more of your logic, like the command-generation code, without having to worry about hitting the database, and you can test your core data access code (the runner) against the database once, rather than for every command you want to run against it. I would still write integration tests for all data access code, but I'd only tend to run them whilst actually working on that section of code (to ensure column names etc. have been specified correctly).
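For example, a sketch of a unit test over just the command-generation code (the [Test] attribute and Assert call assume NUnit; any test framework works the same way):
[Test]
public void MyInsert_BuildsAnInsertStatement()
{
    IMyQuery query = new MyInsert();
    string commandText = query.GetCommand();
    // No database needed: we only verify the generated SQL text.
    Assert.IsTrue(commandText.StartsWith("INSERT"));
}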

SqlDataReader does not work - does not read data

I have a SqlDataReader, but it never enters Read().
When I debug it, it skips past the while (readerOne.Read()) loop. It never enters this loop even though there is data.
public static List<Pers_Synthese> Get_ListeSynthese_all(string codeClient, DateTime DateDeb, DateTime DateFin)
{
try
{
using (var connectionWrapper = new Connexion())
{
var connectedConnection = connectionWrapper.GetConnected();
string sql_Syntax = Outils.LoadFileToString(Path.Combine(appDir, @"SQL\Get_ListeSynthese_All.sql"));
SqlCommand comm_Command = new SqlCommand(sql_Syntax, connectionWrapper.conn);
comm_Command.Parameters.AddWithValue("@codeClioent", codeClient);
comm_Command.Parameters.AddWithValue("@DateDeb", DateDeb);
comm_Command.Parameters.AddWithValue("@DateFin", DateFin);
List<Pers_Synthese> oListSynthese = new List<Pers_Synthese>();
SqlDataReader readerOne = comm_Command.ExecuteReader();
while (readerOne.Read())
{
Pers_Synthese oSyntehse = new Pers_Synthese();
oSyntehse.CodeTrf = readerOne["CODE_TARIF"].ToString();
oSyntehse.NoLV = readerOne["NOID"].ToString();
oSyntehse.PrxUnitaire = readerOne["PRIX_UNITAIRE"].ToString();
oSyntehse.ZoneId = readerOne["LE_ZONE"].ToString();
oSyntehse.LeZone = readerOne["LIB_ZONE"].ToString();
oSyntehse.LeDept = readerOne["DEPT"].ToString();
oSyntehse.LeUnite = readerOne["ENLEV_UNITE"].ToString();
oSyntehse.LePoids = Convert.ToInt32(readerOne["POID"]);
//oSyntehse.LePoidsCorr = Convert.ToInt32(readerOne["POID_CORR"]);
oSyntehse.LeColis = readerOne["NBR_COLIS"].ToString();
oSyntehse.LeCr = readerOne["NBR_CREMB"].ToString();
oSyntehse.SumMontantCR = readerOne["ENLEV_CREMB"].ToString();
oSyntehse.LeVd = readerOne["NBR_DECL"].ToString();
oSyntehse.SumMontantVD = readerOne["ENLEV_DECL"].ToString();
oSyntehse.LePrixHT = readerOne["PRIX_HT"].ToString();
oSyntehse.LePrixTTC = readerOne["PRIX_TTC"].ToString();
oSyntehse.TrDeb = readerOne["TR_DEB"].ToString();
oSyntehse.TrFin = readerOne["TR_FIN"].ToString();
oListSynthese.Add(oSyntehse);
}
readerOne.Close();
readerOne.Dispose();
return oListSynthese;
}
}
catch (Exception excThrown)
{
throw new Exception(excThrown.Message);
}
}
When I debug it with SQL Server Profiler it shows the data, which means the data is not empty, but it never enters this loop.
while (readerOne.Read())
{
By the way, my connection class:
class Connexion : IDisposable
{
public SqlConnection conn;
public SqlConnection GetConnected()
{
try
{
string strConnectionString = Properties.Settings.Default.Soft8Exp_ClientConnStr;
conn = new SqlConnection(strConnectionString);
}
catch (Exception excThrown)
{
conn = null;
throw new Exception(excThrown.InnerException.Message, excThrown);
}
// Ouverture et restitution de la connexion en cours
if (conn.State == ConnectionState.Closed) conn.Open();
return conn;
}
public Boolean IsConnected
{
get { return (conn != null) && (conn.State != ConnectionState.Closed) && (conn.State != ConnectionState.Broken); }
}
public void CloseConnection()
{
// Libération de la connexion si elle existe
if (IsConnected)
{
conn.Close();
conn = null;
}
}
public void Dispose()
{
CloseConnection();
}
}
and my SQL Statement:
exec sp_executesql N'SELECT CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE as LIB_ZONE,
SUBSTRING(CP_DEST,1,2) as DEPT,T_UNITE.LIBELLE as ENLEV_UNITE,
count(NOID)as NOID,
SUM(CASE WHEN POID_CORR IS NOT NULL THEN POID_CORR ELSE POID END) as POID,sum(NBR_COLIS)as NBR_COLIS,COUNT(NULLIF(ENLEV_CREMB,0))as NBR_CREMB, sum(ENLEV_CREMB)as ENLEV_CREMB,COUNT(NULLIF(ENLEV_DECL,0))as NBR_DECL,sum(ENLEV_DECL)as ENLEV_DECL,sum(PRIX_HT)as PRIX_HT,sum(PRIX_TTC)as PRIX_TTC, sum (POID_CORR)as POID_CORR
FROM LETTRE_VOIT_FINAL
LEFT JOIN T_TARIF_ZONE ON LETTRE_VOIT_FINAL.LE_ZONE = T_TARIF_ZONE.NO_ID
LEFT JOIN T_UNITE ON LETTRE_VOIT_FINAL.ENLEV_UNITE = T_UNITE.NO_ID
where code_client = @codeClioent
and DATE_CLOTUR_REEL BETWEEN @DateDeb AND @DateFin
and STATUT_LV = 2
group by CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE,SUBSTRING(CP_DEST,1,2),T_UNITE.LIBELLE
order by LE_ZONE,PRIX_UNITAIRE
',N'@codeClioent nvarchar(8),@DateDeb datetime,@DateFin datetime',@codeClioent=N'17501613',@DateDeb='2013-06-05 00:00:00',@DateFin='2013-06-05 23:59:00'
It returns the data in SQL Profiler.
My real query:
SELECT CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE as LIB_ZONE,
SUBSTRING(CP_DEST,1,2) as DEPT,T_UNITE.LIBELLE as ENLEV_UNITE,
count(NOID)as NOID,
SUM(CASE WHEN POID_CORR IS NOT NULL THEN POID_CORR ELSE POID END) as POID,sum(NBR_COLIS)as NBR_COLIS,COUNT(NULLIF(ENLEV_CREMB,0))as NBR_CREMB, sum(ENLEV_CREMB)as ENLEV_CREMB,COUNT(NULLIF(ENLEV_DECL,0))as NBR_DECL,sum(ENLEV_DECL)as ENLEV_DECL,sum(PRIX_HT)as PRIX_HT,sum(PRIX_TTC)as PRIX_TTC, sum (POID_CORR)as POID_CORR
FROM LETTRE_VOIT_FINAL
LEFT JOIN T_TARIF_ZONE ON LETTRE_VOIT_FINAL.LE_ZONE = T_TARIF_ZONE.NO_ID
LEFT JOIN T_UNITE ON LETTRE_VOIT_FINAL.ENLEV_UNITE = T_UNITE.NO_ID
where code_client = @codeClioent
and DATE_CLOTUR_REEL BETWEEN @DateDeb AND @DateFin
and STATUT_LV = 2
group by
CODE_TARIF,PRIX_UNITAIRE,TR_DEB,TR_FIN,LE_ZONE,T_TARIF_ZONE.LIBELLE,SUBSTRING(CP_DEST,1,2),T_UNITE.LIBELLE
order by LE_ZONE,PRIX_UNITAIRE
It is strange: when the date range is
DATE_CLOTUR_REEL BETWEEN '2013-06-05 00:00:00' AND '2013-06-05 23:59:00'
it returns nothing, but with
DATE_CLOTUR_REEL BETWEEN '2013-06-01 00:00:00' AND '2013-06-05 23:59:00'
it works.
This is the way it should be. You are not doing the connection.Open()
Also set up the connection string.
private static void ReadOrderData(string connectionString)
{
string queryString =
"SELECT OrderID, CustomerID FROM dbo.Orders;";
using (SqlConnection connection =
new SqlConnection(connectionString))
{
SqlCommand command =
new SqlCommand(queryString, connection);
connection.Open();
SqlDataReader reader = command.ExecuteReader();
// Call Read before accessing data.
while (reader.Read())
{
ReadSingleRow((IDataRecord)reader);
}
// Call Close when done reading.
reader.Close();
}
}
This canonical example of how to do it comes from MSDN (the Microsoft website).
NOTICE:
SqlCommand command =
new SqlCommand(queryString, connection);
connection.Open();
SqlDataReader reader = command.ExecuteReader();
Create the SqlCommand
then open the connection
You are doing it the other way around: you open it and then create the command.
I also don't see where you set the query string; I just see that you add the parameters. Are you missing it?
This is perhaps not the answer you're looking for, but your code sample exhibits a number of bad coding practices that are easy to fall into due to ADO.NET's poor API design. Rather than manually doing all this SQL-to-.NET conversion, you should use a library that does it for you.
It's easier to avoid bugs when you're not using a bug-prone API.
I recommend PetaPoco - it's easier to use than your current code, and has virtually no overhead (and given your example, is probably faster). There are many other alternatives, however.
Issues with your code sample:
Improperly disposed objects: you aren't disposing SqlCommand and SqlDataReader properly. You possibly aren't disposing connections either (but that depends on Connexion internals).
Using .ToString rather than type-safe casts. You should never extract data from an SqlDataReader like that because it undermines the whole point of the type system, and it's slow to boot. (PetaPoco or something similar will help a lot here)
You're discarding stack traces on error due to the (pointless) try-catch. That just makes your code less readable and harder to debug. Don't catch unless you have to.
Keeping your query away from the code - your code is tightly coupled to the query, and this separation just makes it hard to keep them in sync. Also, loading from the filesystem each and every time you query is slow and opens up unnecessary filesystem-related failure modes such as locking, max path lengths, and permissions. This is probably the source of your bug - your query probably doesn't do what you think it does.
With PetaPoco or something similar, your entire function would look something like this:
public static List<Pers_Synthese> Get_ListeSynthese_all(
string codeClient, DateTime DateDeb, DateTime DateFin) {
var db = new PetaPoco.Database("Soft8Exp_ClientConnStr");
//you should probably not be storing a query in a file.
//To be clear: your query should not be wrapped in exec sp_executesql,
//ADO.NET will do that for you.
string sql_Syntax = Outils.LoadFileToString(
Path.Combine(appDir, @"SQL\Get_ListeSynthese_All.sql"));
//You'll need to rename Pers_Synthese's properties to match the db,
// or vice versa, or you can annotate the properties with the column names.
return db.Fetch<Pers_Synthese>(sql_Syntax, new {
codeClioent = codeClient, //I suspect this is a typo
DateDeb,
DateFin
});
}
And in that much shorter, readable, faster form, you'll hopefully find whatever bug you have much faster.
Alternatives:
PetaPoco
Dapper (fewer features, but stackoverflow uses it!)
OrmLite (of ServiceStack fame)
Massive (older, uses dynamic which is a feature that can cause bad habits - I don't recommend this unless you really know what you're doing)
You could use heavier, more invasive ORM's like the Entity framework and NHibernate, but these require quite a bit more learning, and they're much slower, and they impose a particular workflow on you which I don't think makes them the best choice in your case.
When I debug it with SQL Server Profiler it shows the data, which means the data is not empty, but it never enters this loop.
It's the other way round: if it never enters into this loop, then it means "the data is empty", i.e. the query returns no rows.
The bug is in your code, not SqlReader: you possibly have the wrong values in your parameters, or maybe the query you read from a file isn't what you think it is. Get the debugger out and inspect the query text and parameters.
