Using Entity Framework Transactions Correctly for Isolation - C#

I'm using Entity Framework 6.0 and SQL Server 2016 for my ASP.NET website. I recently found a concurrency problem in one of my functions. The function processes unpaid orders, and sometimes it is executed multiple times for the same key at the same time (because multiple users access it together).
Here is what it looks like.
public void PaidOrder(string paymentCode)
{
    using (MyEntities db = new MyEntities())
    {
        using (DbContextTransaction trans = db.Database.BeginTransaction())
        {
            try
            {
                Order_Payment_Code payment = db.Order_Payment_Code.Where(item => item.PaymentCode == paymentCode).FirstOrDefault();
                if (payment.Status == PaymentStatus.NotPaid)
                {
                    //This scope can be executed multiple times
                    payment.Status = PaymentStatus.Paid;
                    db.Entry(payment).State = EntityState.Modified;
                    db.SaveChanges();
                    //Continue processing the order
                    trans.Commit();
                }
            }
            catch (Exception ex)
            {
                trans.Rollback();
            }
        }
    }
}
What I don't understand is why the scope inside my if statement can be executed multiple times even though it is inside a transaction. Isn't a transaction supposed to isolate the data? Or is my understanding of transactions wrong? If so, what is the correct way to make the scope inside my if statement execute only once?

A simple and reliable way to serialize an EF SQL Server transaction is to use an Application Lock.
Add this method to your DbContext:
public void GetAppLock(string lockName)
{
    var sql = "exec sp_getapplock @lockName, 'exclusive';";
    var pLockName = new SqlParameter("@lockName", SqlDbType.NVarChar, 255);
    pLockName.Value = lockName;
    this.Database.ExecuteSqlCommand(sql, pLockName);
}
And call it just after you start your transaction.
public void PaidOrder(string paymentCode)
{
    using (MyEntities db = new MyEntities())
    {
        using (DbContextTransaction trans = db.Database.BeginTransaction())
        {
            db.GetAppLock("PaidOrder");
            Order_Payment_Code payment = db.Order_Payment_Code.Where(item => item.PaymentCode == paymentCode).FirstOrDefault();
            if (payment.Status == PaymentStatus.NotPaid)
            {
                //This scope can now only be entered by one caller at a time
                payment.Status = PaymentStatus.Paid;
                db.Entry(payment).State = EntityState.Modified;
                db.SaveChanges();
                //Continue processing the order
            }
            trans.Commit();
        }
    }
}
Then only one instance of that transaction can run at a time, even if you have multiple front-end servers. It is like a mutex that works across all the clients that access the same database.
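If lock contention can be long, it may also be worth checking the return value of sp_getapplock, which the method above ignores. Below is a hedged sketch of such a variant; the timeout value and the exception type are my assumptions, not part of the original answer. Because it is called inside BeginTransaction, the lock owner defaults to the transaction and the lock is released automatically on commit or rollback.

```csharp
// Variant of GetAppLock that passes an explicit wait timeout and checks the
// return code. sp_getapplock returns 0 (granted immediately) or 1 (granted
// after waiting) on success, and a negative value on failure
// (-1 = timeout, -3 = deadlock victim).
public void GetAppLock(string lockName, int timeoutMs = 30000)
{
    var pLockName = new SqlParameter("@lockName", SqlDbType.NVarChar, 255) { Value = lockName };
    var pTimeout = new SqlParameter("@timeoutMs", SqlDbType.Int) { Value = timeoutMs };
    var pResult = new SqlParameter("@result", SqlDbType.Int) { Direction = ParameterDirection.Output };

    this.Database.ExecuteSqlCommand(
        "exec @result = sp_getapplock @Resource = @lockName, @LockMode = 'Exclusive', @LockTimeout = @timeoutMs;",
        pResult, pLockName, pTimeout);

    if ((int)pResult.Value < 0)
        throw new InvalidOperationException(
            "Could not acquire app lock '" + lockName + "' (sp_getapplock returned " + pResult.Value + ").");
}
```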

Related

How to implement transaction in Entity Framework 6 with Web Api in Asp.Net?

Is there any way to implement a transaction in Entity Framework 6 with Web API in Asp.Net?
I am inserting into 10 tables separately using Web API calls from a single ASP.NET web form. I am seeking ideas or technical feasibility suggestions using Entity Framework and Web API.
Keep in mind:
In Entity Framework, the SaveChanges() method internally creates a transaction and wraps all INSERT, UPDATE and DELETE operations in it. Multiple SaveChanges() calls create separate transactions, perform their CRUD operations, and then commit each transaction individually.
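To make that concrete, here is a minimal sketch (the context and entity names are hypothetical) of the failure mode this implies when no explicit transaction is used:

```csharp
// Each SaveChanges() below runs in its own implicit transaction:
// if the second call fails, the first insert has already been committed.
using (var context = new SomeDbContext())
{
    context.Orders.Add(order);
    context.SaveChanges();        // transaction #1, committed here

    context.AuditEntries.Add(auditEntry);
    context.SaveChanges();        // transaction #2, independent of the first
}
```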
If you really want to perform a single transaction across several saves, it's easy:
using (var context = new SomeDbContext())
{
    using (DbContextTransaction transaction = context.Database.BeginTransaction())
    {
        //do stuff
        context.SaveChanges();
        // multiple saves
        context.SaveChanges();
        transaction.Commit(); // this is one transaction
    }
}
The sample code snippet below gives some clarity on a Create or Update transaction spanning multiple tables. The first table's ID column is the foreign key for the other child tables, so if an exception occurs while inserting into a child table, the parent table record is rolled back as well. Thus every table included in the transaction is rolled back together.
public bool CreateOrUpdateEmployee(Common common)
{
    bool IstransactionComplete = false;
    EmployeeEntities DbContext = new EmployeeEntities();
    using (var transaction = DbContext.Database.BeginTransaction())
    {
        try
        {
            if (common.Mode == Modes.CREATE) //Modes - User defined Enum
            {
                DbContext = CreateFinanceEmployees(common, DbContext); //DbContext.SaveChanges() inside this method.
                DbContext = CreateManufacturingEmployee(common, DbContext); //DbContext.SaveChanges() inside this method.
                DbContext = CreateLogisticsEmployee(common, DbContext); //DbContext.SaveChanges() inside this method.
            }
            else
            {
                DbContext = UpdateFinanceEmployees(common, DbContext); //DbContext.SaveChanges() inside this method.
                DbContext = UpdateManufacturingEmployee(common, DbContext); //DbContext.SaveChanges() inside this method.
                DbContext = UpdateLogisticsEmployee(common, DbContext); //DbContext.SaveChanges() inside this method.
            }
            transaction.Commit();
            IstransactionComplete = true;
        }
        catch (Exception ex)
        {
            transaction.Rollback();
            IstransactionComplete = false;
        }
        finally
        {
            transaction.Dispose();
        }
    }
    return IstransactionComplete;
}
Yes.
You can use the transaction API provided by the EF context to create a transaction scope. At the end of the scope you can commit or roll back.
You can do something like this:
class WebController
{
    public Response RestMethod()
    {
        var context = GetYourDbContext();
        using (var dbContextTransaction = context.Database.BeginTransaction())
        {
            try
            {
                // do something with the DB
                context.Database.ExecuteSqlCommand( /* sql command */ );
                // save changes
                context.SaveChanges();
                // commit transaction
                dbContextTransaction.Commit();
            }
            catch (Exception)
            {
                // Roll back in case of an error
                dbContextTransaction.Rollback();
            }
        }
    }
}

Use an external System.Data.Common.DbTransaction with Entity Framework 5

I'm stuck on a problem that will happen to anyone mixing ADO.NET and Entity Framework contexts.
I've got a big procedure that handles and saves data into the database in multiple ways, such as ADO.NET data adapters and direct CRUD commands against the DB. The whole procedure is wrapped by two using() blocks that create and release a DbConnection/DbTransaction, and a try/catch block to commit or roll back the transaction. Unfortunately, in the middle of this routine, I have to call a saving procedure implemented using Entity Framework. This leads me to a problem:
According to the official documentation, Entity Framework 5 allows me to pass a connection with an associated transaction (and it seems to work: in debug mode, when I call SaveChanges() I don't get any timeout exception due to deadlocks, whereas if I pass a new connection I do). But unfortunately, after SaveChanges() runs, the connection is closed and the associated transaction committed, even if I set the 'contextOwnsConnection' flag!
As far as I know, if I migrate from EF5 to EF6, things should work (am I right?). Unfortunately I can't, because the project I'm working on is very large, involves a lot of dependencies, and migrating would take a large amount of time.
How can I make it work with EF5? Is there any trick or pattern to achieve the desired result? Am I right about the behavior of EF6? Is the EF6 migration worth it?
Here is a simple example of what my code looks like.
For privacy reasons I can't post the original code, so imagine a situation like this with a lot more complexity:
using (DbConnection conn = DBProvider.CreateConnection())
{
    //Open the created connection
    conn.Open();
    //Create a new transaction
    using (DbTransaction tr = DBProvider.CreateTransaction())
    {
        //Begin a new transaction
        tr.Begin();
        bool saveOk;
        try
        {
            //Update customers by using a data adapter
            dataAdapterCustomers.InsertCommand.Transaction = tr;
            dataAdapterCustomers.UpdateCommand.Transaction = tr;
            dataAdapterCustomers.DeleteCommand.Transaction = tr;
            dataAdapterCustomers.Update();
            //Update stock items by using a data adapter
            stockAdapterCustomers.InsertCommand.Transaction = tr;
            stockAdapterCustomers.UpdateCommand.Transaction = tr;
            stockAdapterCustomers.DeleteCommand.Transaction = tr;
            stockAdapterCustomers.Update();
            //...Much other DB access here...
            //Update stock quantity by using a simple DbCommand
            quantityUpdateCmd.Transaction = tr;
            quantityUpdateCmd.ExecuteNonQuery();
            //Update stock statistics by using a simple DbCommand
            updateStockStatsCmd.Transaction = tr;
            updateStockStatsCmd.ExecuteNonQuery();
            //...Much other DB access here...
            //HERE:
            //Create a new activity and save it using EF.
            //I use a UnitOfWork and pass it my connection and 'false' as the contextOwnsConnection parameter
            //(it'll be used by the DbContext contained in my unit of work)
            using (ActivityUoW uow = new ActivityUoW(conn, false))
            {
                Activity act = new Activity();
                act.Name = "Saving activity";
                act.Description = "Done by user";
                act.Date = DateTime.Now;
                uow.Activities.Add(act);
                uow.SaveChanges();
            }
            //Based on the activity result, launch a stored procedure that does other complex things.
            //UNFORTUNATELY THE CONNECTION HAS BEEN CLOSED AND THE TRANSACTION COMMITTED, SO THE FOLLOWING INSTRUCTION WILL FAIL.
            launchActivityUpdateSpCmd.Transaction = tr;
            launchActivityUpdateSpCmd.ExecuteNonQuery();
            //...Much other DB access here...
            //Data saved correctly
            saveOk = true;
        }
        catch (Exception ex)
        {
            //There was an error during the save
            saveOk = false;
        }
        //Commit or roll back the transaction according to the save result
        if (saveOk)
            tr.Commit();
        else
            tr.Rollback();
    }
}
I didn't quite follow your question(s) and wasn't sure whether your issue is about how to handle transactions or about the EF5-to-EF6 migration. That said, you have an interesting mixture of data access code.
Regarding transactions - I would look into using TransactionScope, which is part of the System.Transactions namespace.
For example:
try
{
    using (var scope = new TransactionScope())
    {
        using (var conn = new SqlConnection("your connection string"))
        {
            conn.Open();
            // your EF and ADO.NET code
        }
        scope.Complete();
    }
}
catch (TransactionAbortedException ex)
{
}
catch (ApplicationException ex)
{
}

Dapper + Oracle + TransactionScope = Transaction has aborted

I've looked around but couldn't find an appropriate solution (or one that is satisfactory to me) for an issue we are having.
I use Dapper and the ODP.NET 12 managed driver. The problem does not occur when TransactionScope is not used.
When performing commands under a transaction scope, a TransactionAbortedException is thrown with the message "Transaction has aborted".
Observed behaviors:
1) The TransactionAbortedException is thrown if and only if the transaction is completed and the TransactionScope is disposed; the exception is thrown during dispose.
2) Despite the exception, the transaction actually works! After Complete() is invoked, the changes are committed to the database.
Below is the code snippet.
// Conn string: "Data Source=OraDB;Persist Security Info=True;User ID=userxxx;Password=passwordxxx;" providerName="Oracle.ManagedDataAccess.Client"
// Note: GetDbFactory().Create() returns a DbConnection object
using (TransactionScope scope = new TransactionScope(TransactionScopeOption.RequiresNew, new TransactionOptions { IsolationLevel = System.Transactions.IsolationLevel.ReadCommitted }))
using (var dbConn = GetDbFactory().Create())
{
    foreach (MyDTO dto in dtoList)
    {
        var tableDAO = new TableDAO(dbConn);
        MyEntity entity = new MyEntity()
        {
            Field1 = dto.Field1,
            Field2 = dto.Field2
        };
        tableDAO.AddOrUpdate(entity);
    }
    // Commit changes
    scope.Complete();
}
// This method is in the DAO class
public void AddOrUpdate(MyEntity entity)
{
    // Verify arguments
    entity.AsArgumentThrowExceptionIfNull("entity");
    // build params
    OracleDynamicParameters parameters = new OracleDynamicParameters();
    parameters.Add("P_FIELD1", entity.Field1);
    parameters.Add("P_FIELD2", entity.Field2);
    // execute SP
    dbConnection.Execute("PRC_MY_ENTITY_ADDORUPDATE", parameters, commandType: CommandType.StoredProcedure);
} //-- end AddOrUpdate()
==================================================================
UPDATE (09-Apr-15)
I have changed my approach and use the following pattern for Oracle for now. Our code deals with connections to both Oracle and SQL Server, so I'd much prefer a consistent coding pattern, but until a solution is found for Oracle + TransactionScope, we'll use the pattern below for Oracle command execution:
using (var dbConnection = dbConnFactory.Create())
{
    // Open db connection
    dbConnection.Open();
    using (var trans = dbConnection.BeginTransaction())
    {
        bool isSuccess = false;
        try
        {
            // Perform DB operations here
            trans.Commit();
            isSuccess = true;
        }
        finally
        {
            if (!isSuccess) trans.Rollback();
        }
    }
}
First, the exception is documented as possible:
A call to the Dispose method marks the end of the transaction scope. Exceptions that occur after calling this method may not affect the transaction.
This is in the documentation on the TransactionScope class (https://msdn.microsoft.com/en-us/library/system.transactions.transactionscope%28v=vs.110%29.aspx).
One other thing I noticed is that Complete() on the transaction scope is being called before the connection is closed. I'd change this to close the connection first and then complete the transaction:
using (TransactionScope scope = new TransactionScope(TransactionScopeOption.RequiresNew, new TransactionOptions { IsolationLevel = System.Transactions.IsolationLevel.ReadCommitted }))
{
    using (var dbConn = GetDbFactory().Create())
    {
        foreach (MyDTO dto in dtoList)
        {
            var tableDAO = new TableDAO(dbConn);
            MyEntity entity = new MyEntity()
            {
                Field1 = dto.Field1,
                Field2 = dto.Field2
            };
            tableDAO.AddOrUpdate(entity);
        }
    }
    // Commit changes
    scope.Complete();
}
I don't see anything wrong with that code. However, if that loop runs long enough, the transaction is going to time out. You'll then get the exception in question the next time you do an operation against the database. I would try increasing the timeout - Timeout is a property on the TransactionOptions struct.
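For example, here is a sketch of raising that timeout; the 10-minute value is an arbitrary assumption, and note the effective maximum is still capped by the machine-wide maxTimeout setting in machine.config.

```csharp
// TransactionOptions.Timeout raises the per-scope timeout above the
// default; machine.config's maxTimeout still caps the effective value.
var options = new TransactionOptions
{
    IsolationLevel = System.Transactions.IsolationLevel.ReadCommitted,
    Timeout = TimeSpan.FromMinutes(10)
};

using (var scope = new TransactionScope(TransactionScopeOption.RequiresNew, options))
{
    // ... the long-running loop of database operations ...
    scope.Complete();
}
```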
I am resorting to using BeginTransaction() as the final approach (refer to the update in my original post). I have read more about why TransactionScope was failing:
1) ODP.NET promotes to a distributed transaction, even when using a single DB connection, when connecting to Oracle 10g and below (source). Lo and behold, the database I'm connecting to is indeed 10g.
2) You need the Oracle MTS Service installed. This I didn't have set up on my dev machine.

Bulk insert with EF

I need to insert some objects (about 4 million) into the database using C# and EF (using .NET 3.5). My method that adds the objects is called in a for loop:
private DBModelContainer AddToContext(DBModelContainer db, tblMyTable item, int count)
{
    db.AddTottblMyTable(item);
    if ((count % 10000 == 0) || (count == this.toGenerate))
    {
        try
        {
            db.SaveChanges();
        }
        catch (Exception e)
        {
            Console.WriteLine(e.StackTrace);
        }
    }
    return db;
}
How do I detach the added objects (of type tblMyTable) from the context object? I don't need them later, and when more than 300,000 objects are added, the execution time between saves (db.SaveChanges()) increases considerably.
Regards
Entity Framework may not be the best tool for this type of operation. You may be better off with plain ADO.NET or some stored procedures. But if you have to use it, here are a number of suggestions:
Keep the active context graph small by using a new context for each unit of work
Turn off AutoDetectChangesEnabled - context.Configuration.AutoDetectChangesEnabled = false;
Batching: in your loop, call SaveChanges periodically
EDIT
using (var db = new DBModelContainer())
{
    db.tblMyTable.MergeOption = MergeOption.NoTracking;
    // Narrow the scope of your db context
    db.AddTottblMyTable(item);
    db.SaveChanges();
}
Keeping a long-running db context is not advisable, so consider refactoring your Add method so it does not keep reusing the same context.
See Rick Strahl's post on bulk inserts for more details
AFAIK EF does not directly support bulk insert, so doing such a thing manually will be tedious.
Consider trying EntityFramework.BulkInsert:
using (var ctx = GetContext())
{
    using (var transactionScope = new TransactionScope())
    {
        // some stuff in dbcontext
        ctx.BulkInsert(entities);
        ctx.SaveChanges();
        transactionScope.Complete();
    }
}
You may try the Unit of Work pattern and not save the context (SaveChanges) on every record insert, but save it once at the end.
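A minimal sketch of that suggestion (the context and entity set names are placeholders): add everything first, then call SaveChanges once, so EF wraps the whole batch in a single implicit transaction.

```csharp
using (var db = new SomeDbContext())
{
    foreach (var record in records)
    {
        db.MyRecords.Add(record); // accumulate in memory, no DB round-trip yet
    }
    db.SaveChanges(); // one call, one implicit transaction for all inserts
}
```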

SqlBulkCopy Multiple Tables Insert under single Transaction OR Bulk Insert Operation between Entity Framework and Classic Ado.net

I have two tables which need to be inserted when my application run.
Let's say that I have tables as follows:
tbl_FirstTable and tbl_SecondTable
My problem is data volume:
I need to insert over 10,000 rows into tbl_FirstTable and over 500,000 rows into tbl_SecondTable.
So firstly, I used Entity Framework as follows.
public bool Save_tbl_FirstTable_Vs_tbl_SecondTable(List<tbl_FirstTable> List_tbl_FirstTable, List<tbl_SecondTable> List_tbl_SecondTable)
{
    bool IsSuccessSave = false;
    try
    {
        using (DummyDBClass_ObjectContext _DummyDBClass_ObjectContext = new DummyDBClass_ObjectContext())
        {
            foreach (tbl_FirstTable _tbl_FirstTable in List_tbl_FirstTable)
            {
                _DummyDBClass_ObjectContext.tbl_FirstTable.InsertOnSubmit(_tbl_FirstTable);
            }
            foreach (tbl_SecondTable _tbl_SecondTable in List_tbl_SecondTable)
            {
                _DummyDBClass_ObjectContext.tbl_SecondTable.InsertOnSubmit(_tbl_SecondTable);
            }
            _DummyDBClass_ObjectContext.SubmitChanges();
            IsSuccessSave = true;
        }
    }
    catch (Exception ex)
    {
        Log4NetWrapper.WriteError(string.Format("{0} : {1} : Exception={2}",
            this.GetType().FullName,
            (new StackTrace(new StackFrame(0))).GetFrame(0).GetMethod().Name.ToString(),
            ex.Message.ToString()));
        if (ex.InnerException != null)
        {
            Log4NetWrapper.WriteError(string.Format("{0} : {1} : InnerException Exception={2}",
                this.GetType().FullName,
                (new StackTrace(new StackFrame(0))).GetFrame(0).GetMethod().Name.ToString(),
                ex.InnerException.Message.ToString()));
        }
    }
    return IsSuccessSave;
}
That is where I hit a timeout exception.
I thought that exception would be solved if I used the code below:
DummyDBClass_ObjectContext.CommandTimeout = 1800; // 30 minutes
So I used it. It solved the timeout, but then I hit another error: an OutOfMemoryException.
So I searched for solutions and, fortunately, found the articles below:
Problem with Bulk insert using Entity Framework
Using Transactions with SqlBulkCopy
Performing a Bulk Copy Operation in a Transaction
Following those articles, I changed my code from Entity Framework to classic ADO.NET:
public bool Save_tbl_FirstTable_Vs_tbl_SecondTable(DataTable DT_tbl_FirstTable, DataTable DT_tbl_SecondTable)
{
    bool IsSuccessSave = false;
    SqlTransaction transaction = null;
    try
    {
        using (DummyDBClass_ObjectContext _DummyDBClass_ObjectContext = new DummyDBClass_ObjectContext())
        {
            var connectionString = ((EntityConnection)_DummyDBClass_ObjectContext.Connection).StoreConnection.ConnectionString;
            using (SqlConnection connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (transaction = connection.BeginTransaction())
                {
                    using (SqlBulkCopy bulkCopy_tbl_FirstTable = new SqlBulkCopy(connection, SqlBulkCopyOptions.KeepIdentity, transaction))
                    {
                        bulkCopy_tbl_FirstTable.BatchSize = 5000;
                        bulkCopy_tbl_FirstTable.DestinationTableName = "dbo.tbl_FirstTable";
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("ID", "ID");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("UploadFileID", "UploadFileID");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("Active", "Active");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("CreatedUserID", "CreatedUserID");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("CreatedDate", "CreatedDate");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("UpdatedUserID", "UpdatedUserID");
                        bulkCopy_tbl_FirstTable.ColumnMappings.Add("UpdatedDate", "UpdatedDate");
                        bulkCopy_tbl_FirstTable.WriteToServer(DT_tbl_FirstTable);
                    }
                    using (SqlBulkCopy bulkCopy_tbl_SecondTable = new SqlBulkCopy(connection, SqlBulkCopyOptions.KeepIdentity, transaction))
                    {
                        bulkCopy_tbl_SecondTable.BatchSize = 5000;
                        bulkCopy_tbl_SecondTable.DestinationTableName = "dbo.tbl_SecondTable";
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("ID", "ID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("UploadFileDetailID", "UploadFileDetailID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("CompaignFieldMasterID", "CompaignFieldMasterID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("Value", "Value");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("Active", "Active");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("CreatedUserID", "CreatedUserID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("CreatedDate", "CreatedDate");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("UpdatedUserID", "UpdatedUserID");
                        bulkCopy_tbl_SecondTable.ColumnMappings.Add("UpdatedDate", "UpdatedDate");
                        bulkCopy_tbl_SecondTable.WriteToServer(DT_tbl_SecondTable);
                    }
                    transaction.Commit();
                    IsSuccessSave = true;
                }
                connection.Close();
            }
        }
    }
    catch (Exception ex)
    {
        if (transaction != null)
            transaction.Rollback();
        Log4NetWrapper.WriteError(string.Format("{0} : {1} : Exception={2}",
            this.GetType().FullName,
            (new StackTrace(new StackFrame(0))).GetFrame(0).GetMethod().Name.ToString(),
            ex.Message.ToString()));
        if (ex.InnerException != null)
        {
            Log4NetWrapper.WriteError(string.Format("{0} : {1} : InnerException Exception={2}",
                this.GetType().FullName,
                (new StackTrace(new StackFrame(0))).GetFrame(0).GetMethod().Name.ToString(),
                ex.InnerException.Message.ToString()));
        }
    }
    return IsSuccessSave;
}
Finally, it performs the insert process in less than 15 seconds for over 500,000 rows.
There are two reasons why I post this scenario:
I would like to share what I found out.
As I am not perfect, I still need more suggestions from you.
So, every better solution will be appreciated.
1) Use EF 6.x, which has much better performance than EF 5.x
Here are more suggestions (from Bulk insert with EF):
2) Keep the active context graph small by using a new context for each unit of work
3) Turn off AutoDetectChangesEnabled - context.Configuration.AutoDetectChangesEnabled = false;
4) Batching: in your loop, call SaveChanges periodically
I use the paid Entity Framework extension from ZZZ Projects, which is developer friendly because of its fluent API (extension methods, functional approach). This is not an advertisement; I have used it in business projects for several years and it is great. If you want to use something for free and you have an Oracle database, the Oracle Managed Data Access package Oracle.ManagedDataAccess.Core has an implementation of bulk operations.
Bulk operations are not really what ORMs are meant for. For bulk insert operations, I send XML to a stored procedure, where I shred it and bulk insert/update or merge from there.
So even when I use an ORM, I create a domain library that is not EF (or NHibernate) dependent, so I have a "safety valve" to bypass the ORM in certain situations.
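A hedged sketch of that XML-to-stored-procedure approach: the procedure name, parameter name, XML shape, and Item DTO are all hypothetical, and the procedure is assumed to shred the XML server-side (e.g. with nodes()/value()) and MERGE into the target table.

```csharp
public static void BulkUpsertViaXml(string connectionString, IEnumerable<Item> items)
{
    // Serialize the rows as an XML fragment the stored procedure can shred.
    var xml = new XElement("Items",
        items.Select(i => new XElement("Item",
            new XAttribute("Id", i.Id),
            new XAttribute("Name", i.Name)))).ToString();

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("usp_BulkUpsertItems", conn)) // hypothetical proc
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@itemsXml", SqlDbType.Xml).Value = xml;
        conn.Open();
        cmd.ExecuteNonQuery(); // one round-trip for the whole batch
    }
}
```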
You should look at using System.Data.SqlClient.SqlBulkCopy for this. Here's the documentation: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx, and of course there are plenty of tutorials online.
In case we want EF to bulk insert records, I would suggest the points below to improve performance:
Call SaveChanges() after, for example, 100 records, then dispose the context and create a new one.
Disable change detection.
Example:
using (TransactionScope scope = new TransactionScope())
{
    MyDbContext context = null;
    try
    {
        context = new MyDbContext();
        context.Configuration.AutoDetectChangesEnabled = false;
        int count = 0;
        foreach (var entityToInsert in someCollectionOfEntitiesToInsert)
        {
            ++count;
            context = AddToContext(context, entityToInsert, count, 100, true);
        }
        context.SaveChanges();
    }
    finally
    {
        if (context != null)
            context.Dispose();
    }
    scope.Complete();
}

private MyDbContext AddToContext(MyDbContext context,
    Entity entity, int count, int commitCount, bool recreateContext)
{
    context.Set<Entity>().Add(entity);
    if (count % commitCount == 0)
    {
        context.SaveChanges();
        if (recreateContext)
        {
            context.Dispose();
            context = new MyDbContext();
            context.Configuration.AutoDetectChangesEnabled = false;
        }
    }
    return context;
}
For performance it is important to call SaveChanges() after "many" records ("many" being around 100 or 1000). It also improves performance to dispose the context after SaveChanges and create a new one.
This clears the context of all entities; SaveChanges doesn't do that - the entities remain attached to the context in state Unchanged. It is the growing size of attached entities in the context that slows down insertion step by step.
So it is helpful to clear it after some time.
Setting AutoDetectChangesEnabled = false; on the DbContext also has a big additional performance effect: Why is inserting entities in EF 4.1 so slow compared to ObjectContext?.
The combination below increases speed well enough in EF:
context.Configuration.AutoDetectChangesEnabled = false;
context.Configuration.ValidateOnSaveEnabled = false;
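One way to apply both flags (a sketch; MyDbContext and MyEntity are placeholder names) is to turn them off just for the bulk step and restore them afterwards, in case the context lives longer than the bulk insert:

```csharp
using (var context = new MyDbContext())
{
    context.Configuration.AutoDetectChangesEnabled = false;
    context.Configuration.ValidateOnSaveEnabled = false;
    try
    {
        foreach (var entity in entitiesToInsert)
            context.Set<MyEntity>().Add(entity);
        context.SaveChanges();
    }
    finally
    {
        // Restore defaults in case the context is reused for tracked work.
        context.Configuration.AutoDetectChangesEnabled = true;
        context.Configuration.ValidateOnSaveEnabled = true;
    }
}
```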
