Multiple stored proc calls in .NET under one transaction - C#

I have a .NET web page in which two stored procs are called.
Both of these have BEGIN-COMMIT transactions in SQL Server. Also, I am calling the second proc multiple times depending on some if conditions.
I want to wrap this whole process in a single transaction. I have looked around and found that the SqlTransaction and TransactionScope classes in C# will help me in this situation.
But I have never used them (I have always used transactions in SQL Server), so I do not know whether transactions in .NET will have problems with both of my stored procs having their own BEGIN-COMMIT transactions in SQL Server.
If they do conflict, is there a way to get them to work under a single transaction?

Yes, it is possible to call (e.g. existing or legacy) Stored Procs from .Net which use manual BEGIN TRAN, COMMIT TRAN / ROLLBACKs under a .Net TransactionScope, or if you manage the transaction from a SqlTransaction. (Although to state the obvious, if you can avoid using multiple transaction technologies, do so).
For the 'happy case' scenario, what will happen is that @@TRANCOUNT will be increased when the sproc transactions are started (just as per nested transactions in SQL Server). Transactions are only committed when @@TRANCOUNT hits zero after the outermost commit on the connection, i.e. inner commits simply decrease @@TRANCOUNT. Note however that the same is not true for ROLLBACKs - unless you are using SAVEPOINTs, any rollback will roll back the entire transaction. You'll need to be very careful of matching up @@TRANCOUNTs.
You sound a bit undecided about TransactionScope vs SqlTransaction. TransactionScope is more versatile, in that it can span both single phase and distributed transactions (using DTC). However, if you only need to coordinate the transaction on a single connection, same database, then SqlTransaction would also be fine.
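To make that concrete, here is a minimal sketch of the TransactionScope approach for the original question; the proc names, connection string and id list are placeholders, and it needs a reference to System.Transactions:

using System.Data;
using System.Data.SqlClient;
using System.Transactions;

using (var scope = new TransactionScope())
using (var conn = new SqlConnection(connectionString))
{
    conn.Open(); // the connection enlists in the ambient transaction

    using (var cmd1 = new SqlCommand("dbo.FirstProc", conn)) // hypothetical proc name
    {
        cmd1.CommandType = CommandType.StoredProcedure;
        cmd1.ExecuteNonQuery();
    }

    // the second proc is called repeatedly, as per the if-conditions in the question
    foreach (var id in idsToProcess) // hypothetical list of ids
    {
        using (var cmd2 = new SqlCommand("dbo.SecondProc", conn))
        {
            cmd2.CommandType = CommandType.StoredProcedure;
            cmd2.Parameters.AddWithValue("@Id", id);
            cmd2.ExecuteNonQuery();
        }
    }

    scope.Complete(); // if this line is not reached, everything rolls back when the scope is disposed
}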

Related

How to re-execute all commands in a rolled back SqlTransaction instance?

I execute several SQL transactions from my ASP.NET application. One transaction executes several stored procedures one after another. The transactions are quite long-running.
When two instances of my application (i.e. two different processes) are connecting to one database, I sometimes run into a SQL deadlock. In that case SQL Server automatically rolls back the transaction from one of the processes.
When this happens, I want to execute all commands (stored procedures) in the SqlTransaction which was rolled back.
How can I do this in C#?
Is there an easy way to re-execute all the commands in a SQL transaction that were executed before the transaction was rolled back by SQL Server?
Or do I have to write my own logic to "remember" (and execute them once again whenever required) all the stored procedures I had executed?
Actually, it's bad practice to do this: as a dev you need to find out what caused the rollback and fix that. Even so, I would say declare an int flag and increment it after each stored procedure execution; when something goes wrong, in the catch section where you wrote the rollback, just call the stored procedures again according to the flag.
Again, that's the wrong practice to follow.
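If you do go down the re-execute route, the usual pattern is to treat the whole transaction as the unit of retry and re-run it when SQL Server reports deadlock error 1205. A rough sketch, where RunAllProcs is a hypothetical method that opens the connection, begins the transaction, executes every stored procedure and commits:

using System.Data.SqlClient;

const int DeadlockErrorNumber = 1205; // "chosen as deadlock victim" error
const int MaxAttempts = 3;

for (int attempt = 1; attempt <= MaxAttempts; attempt++)
{
    try
    {
        RunAllProcs(); // hypothetical: open connection, BeginTransaction, run all procs, Commit
        break;         // success - stop retrying
    }
    catch (SqlException ex) when (ex.Number == DeadlockErrorNumber && attempt < MaxAttempts)
    {
        // this process was the deadlock victim and its transaction was rolled back,
        // so it is safe to run the whole batch again from the start
    }
}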

Are there any disadvantages of placing explicit transactions in stored procedures in SQL Server 2008?

I have to implement transactions in my code. I have the following options:
TransactionScope from C# code: I will use this if I have some logical code in the transaction along with my database calls. The transaction rolls back and locks are released if a command timeout occurs.
Explicit transactions in the SP: in case of a command timeout, transactions remain open and locks are not released.
Has anyone faced similar issues? Please suggest. Also, tell me whether SET XACT_ABORT ON will help in the second case.
If your stored procedure is not an "interface procedure" (called by a third party outside your code), I recommend using TransactionScope or creating a SqlTransaction in code for your procedure call. Handling transactions in code is much easier. You may read this article: http://www.code-magazine.com/Article.aspx?quickid=0305111 - as you can see, if you start using transactions in stored procedures things may get unnecessarily complicated.
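For reference, a bare-bones sketch of the 'transaction in code' option using SqlTransaction on a single connection; the proc name and connection string are placeholders:

using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlTransaction tran = conn.BeginTransaction())
    {
        try
        {
            using (var cmd = new SqlCommand("dbo.MyProc", conn, tran)) // hypothetical proc
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
            }
            tran.Commit();
        }
        catch
        {
            tran.Rollback(); // explicit rollback from the client so a command timeout does not leave locks behind
            throw;
        }
    }
}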
I always use SET XACT_ABORT ON
Server side transactions don't require MSDTC and have less overhead
TRY/CATCH makes the stored procs more reliable
See my answer here for a safe stored proc template which will deal with nested transactions too: Nested stored procedures containing TRY CATCH ROLLBACK pattern?
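The SET XACT_ABORT ON above is meant to go inside the proc itself; if you are calling an existing proc that doesn't set it, you can also switch it on for the session from the calling code. A small sketch, with a made-up proc name:

using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // any run-time error will now abort and roll back the whole server-side transaction
    using (var setCmd = new SqlCommand("SET XACT_ABORT ON;", conn))
    {
        setCmd.ExecuteNonQuery();
    }

    using (var cmd = new SqlCommand("dbo.SomeLegacyProc", conn)) // hypothetical proc name
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.ExecuteNonQuery();
    }
}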

Sql Transaction - SQL Server or C#?

Am I right in saying that, from a performance perspective, SQL transactions are far better inside a stored procedure than in code?
At the moment I use most of my transactions in stored procs but sometimes I use code for more complex routines - which obviously I keep to a minimum as much as possible.
It's just that there was a complex routine that required so many "variables" that writing the SQL transaction in C# was far easier than doing it in SQL Server. It's a fine line between code readability and performance.
Any ideas?
The performance varies; a SqlTransaction can have less overhead than a TransactionScope, especially if the TransactionScope decides it needs to get entangled with DTC. But I wouldn't expect a vast difference between SqlTransaction and a BEGIN TRAN, except for the extra round trip. However, TransactionScope is still fast, and is the most convenient option for encapsulating multiple operations in a transaction, as the ambient transaction does not need to be manually associated with the command each time.
Perhaps a better (and more significant) factor is the isolation level. TransactionScope defaults to the highest (Serializable). Lower isolation levels allow for less blocking (but at the risk of non-repeatable reads, etc.). IIRC a TSQL transaction defaults to one of the lower levels. But the isolation level can be tweaked for all 3 options.
I'm for TransactionScope. As per Marc, use a factory method on your TransactionScope to drop the isolation level to READ COMMITTED for most common usages I can think of.
Note that you can use both SQL transactions AND TransactionScope - the SQL BEGIN TRAN / COMMIT TRAN will have little effect on the TransactionScope (other than incrementing / decrementing @@TRANCOUNT) - this way, if you do need to call the same SQL sproc elsewhere, e.g. from an ad hoc query, you will still get the benefit of a transaction.
The benefit of TransactionScope IMO is that it will manage DTC for you if you DO need to do 2 phase commit (e.g. Multiple databases, Queues or other XA transactions). And with SQL 2005 and later, it works with the Lightweight Transaction Manager, so DTC won't be required e.g. if all accesses are to the one database, one connection at a time.
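The factory method mentioned a couple of paragraphs up is usually just a thin wrapper around the TransactionScope constructor that takes TransactionOptions - roughly something like this (the method name is made up):

using System.Transactions;

public static TransactionScope CreateReadCommittedScope()
{
    var options = new TransactionOptions
    {
        IsolationLevel = IsolationLevel.ReadCommitted, // instead of the Serializable default
        Timeout = TransactionManager.DefaultTimeout
    };
    return new TransactionScope(TransactionScopeOption.Required, options);
}

// usage:
using (var scope = CreateReadCommittedScope())
{
    // ... ADO.NET / EF work on this thread enlists in the ambient transaction ...
    scope.Complete();
}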
Transaction management in code (read: C#) is an option, and is the approach to follow for managing transactions over multiple data sources or systems. For managing a transaction against a single database, management at the server end will always be simpler. But if you think the code might need to cater to a scenario where multiple data sources get added to the transaction, keep the transactions at the code level.
I believe that the stored procedure will have better performance.
If you mean you write SQL transactions in C# and use ADO.NET or similar to execute them, then they are probably less efficient, because SQL Server will cache the query plan for a stored procedure (which is also now the case for Entity Framework, though I don't think it's still as quick as a proc). So really you should probably be doing it the other way round: complex procedures in SQL to get the caching benefits (if only it were that simple...)
It depends on the application.
But I will say that in most cases it is best to have the logic in the database. Another big advantage of having business logic in the database is that it stays the same even if you later add a WinForms version alongside the web one, or whatever.
But if you're talking about the CLR in SQL Server, the negative with this is that it becomes much more difficult for a DBA to find errors or reason about performance.

Sql client transactions from code vs database-controlled transactions

I've always done transactions from within stored procedures but now I need to wrap a bunch of "dynamic" statements executed from code against sp_executesql in a transaction.
Specifically I need the READ UNCOMMITTED isolation level for these in some cases (I know what that does, and yes, that's what I need). This is SQL Server 2008.
My question is this: if I use the BeginTransaction() method of my SqlConnection instance with the isolation level set to IsolationLevel.ReadUncommitted, will that have the same effect as if I executed a stored proc that has the READ UNCOMMITTED statement?
Yes, it will.
The SqlConnection uses the SQL native client, and a call to BeginTransaction causes exactly this to be sent to the server:
SET TRANSACTION ISOLATION LEVEL <WHATEVER>; BEGIN TRANSACTION;
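In client code that looks roughly like the following; the SQL text passed to sp_executesql is just a placeholder:

using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlTransaction tran = conn.BeginTransaction(IsolationLevel.ReadUncommitted))
    {
        using (var cmd = new SqlCommand("sp_executesql", conn, tran))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@stmt", "SELECT COUNT(*) FROM dbo.SomeTable;"); // placeholder statement
            var result = cmd.ExecuteScalar();
        }
        tran.Commit();
    }
}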

Is it possible to combine a transaction scope with committing a transaction inside a stored procedure?

We have a test that runs within a transaction scope. We dispose of the transaction scope at the end to avoid changing the database.
This works fine in most cases.
However, when we use Entity Framework to execute a stored procedure that contains a transaction which is committed inside the stored procedure, we get the following error:
"Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.\r\n "
Is it possible to combine a transaction scope with committing a transaction inside a stored procedure?
While you may or may not be able to solve this particular problem, I'd suggest that avoiding it entirely might be a better option. As you've seen, depending on a transaction to guarantee that your database is in a particular state doesn't always work. Also, because you are using multiple connections to the DB, you've automatically promoted any transactions that do occur to distributed transactions -- a subtle distinction, perhaps, but it changes the nature of the test. You may end up writing code to overcome the particular limitations of distributed transactions that wouldn't otherwise have been needed.
A better strategy would be - for unit tests, anyway - to mock out the database dependency, using in-memory mock or fake objects in place of the database. I've done something similar for LINQ to SQL (see my blog entry on the subject). For integration tests, I think you are better off using a test instance and writing set-up code that reinitializes the state of the DB to known values before each test, rather than introducing an extra transaction to clean things up. That way, if your clean-up code fails in a test, it won't affect other tests being run.
I use the following code inside an SP to handle contexts where a transaction may or may not currently be in force:
DECLARE @InTran int
SET @InTran = @@TRANCOUNT
IF @InTran = 0 BEGIN TRANSACTION
/* Stuff happens */
IF @InTran = 0 AND @@TRANCOUNT > 0 COMMIT TRANSACTION
The only thing I'm not sure of is whether @@TRANCOUNT reflects a transaction from a TransactionScope; it's worth a shot, though.
