How to dispose/rollback a BeginTransaction() with several SaveChanges() - c#

I have a 3-layer structure in my solution, where the two lower layers are the logic layer and the database layer. The database layer contains the CRUD methods for each entity in my database, and the logic layer has a CRUD service for each entity's database CRUD method (EntityService.Add/Get/Update/Delete -> EntityRepository.Add/Get/Update/Delete).
So here I am trying to write a big save method in the services that saves several entities. The method calls an Add method in the logic layer for each entity, which in turn calls the database Add method. I might add that every database CRUD method ends with SaveChanges(). Of course I want the save method to be rolled back if anything goes wrong, so our database doesn't end up with orphaned rows, which is why I have wrapped the save method in BeginTransaction().
It may be worth mentioning that after every CRUD call we check whether anything has gone wrong, and if it has we return with a return code and so on. These returns happen before transaction.Commit().
The save method is structured like this: (Logic Layer)
public void SaveMethod(...)
{
    using (var context = _unitOfWork.EntityRepository.GetContext()) // Gets the DbContext
    {
        using (var transaction = context.Database.BeginTransaction())
        {
            try
            {
                Service.Entity1.AddEntity1();
                Service.Entity2.AddEntity2();
                //And so on...
                transaction.Commit();
            }
            catch
            {
                //Exception handling...
            }
        }
    }
}
The method that provides the context for the save method's transaction (database layer):
public AniPlanContext GetContext()
{
    var dbBuilder = new DbContextOptionsBuilder<AniPlanContext>();
    var dbConn = _configuration.GetConnectionString("dbContextConnectionstring"); // Gets the db connection string
    var options = dbBuilder.UseSqlServer(dbConn).Options;
    return new AniPlanContext(options);
}
What an AddEntity method looks like (logic layer):
public Entity AddClinic(...)
{
    try
    {
        //Validation...
        Entity entity = _unitOfWork.EntityRepository.Add(
            new Entity
            {
                //Set the attributes...
            });
        return entity;
    }
    catch (Exception ex)
    {
        //Exception handling...
        throw;
    }
}
What a database Add method looks like (database layer):
public TEntity Add(TEntity entity)
{
    if (entity == null)
    {
        throw new ArgumentNullException($"{nameof(Add)} entity must not be null");
    }
    try
    {
        _context.Add(entity);
        _context.SaveChanges();
        return entity;
    }
    catch (Exception ex)
    {
        throw new Exception($"{nameof(entity)} could not be saved", ex);
    }
}
Anyhow, my issue is that the entities whose AddEntity() calls have already run get committed even if the save method crashes or fails in any way. Can you help me understand why this happens and how I can fix it?
I have tried TransactionScope and it works, but from reading several blog posts it sounds like BeginTransaction() is the safer and more reliable way to go. From what I have understood, both of these should be disposed and rolled back if execution never reaches transaction.Commit() or scope.Complete(), is that correct?
So to summarize: I'd like to use BeginTransaction() to save several entities to the database, but also to roll back / dispose the transaction if anything goes wrong.

As @PanagiotisKanavos and @Tomas Chabada agreed, the problem is not the transaction itself but my solution structure.
This link provides (for me at least) a new way of looking at the UnitOfWork / Repository structure. Apparently they are not needed on top of EF Core, which is completely mind-blowing. It is worth reading if you ever run into the same issue as above. It might not be the transaction that is wrong, but how we use EF Core, the DbContext and their transactions.
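In other words, the most likely explanation given the code above is that GetContext() news up a brand-new DbContext, so the transaction is opened on a different context than the one the repositories call SaveChanges() on, and each of those SaveChanges() calls commits on its own. A minimal sketch of the idea, assuming one shared, injected DbContext and Add methods that no longer call SaveChanges() themselves (the names _context, Entity1 and Entity2 are placeholders, not the original code):
public void SaveMethod(...)
{
    // One DbContext instance shared by the service and all repositories for this unit of work.
    using (var transaction = _context.Database.BeginTransaction())
    {
        try
        {
            // The Add methods only stage changes; none of them call SaveChanges() on their own.
            _context.Set<Entity1>().Add(new Entity1 { /* set the attributes... */ });
            _context.Set<Entity2>().Add(new Entity2 { /* set the attributes... */ });

            // One SaveChanges() inside the transaction persists everything together.
            _context.SaveChanges();
            transaction.Commit();
        }
        catch
        {
            // Rollback (or simply disposing without Commit) undoes all staged work.
            transaction.Rollback();
            throw;
        }
    }
}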

Related

How much can I put into one transaction?

This builds on "Should I always use transactions in nhibernate (even for simple reads and writes)?".
The gist is, we should always use a transaction, even for a simple get like this:
using (ITransaction transaction = session.BeginTransaction())
{
    var printerJob2 = (PrinterJob) session.Get(typeof (PrinterJob), id);
    transaction.Commit();
    return printerJob2;
}
Let's consider this code:
User user = session.Get<User>(userId);
if (user == null) return UnAuthorizedResult();
Order order = session.Get<Order>(orderId);
if (order == null) return BadRequestResult();
session.Delete<Order>(order);
transaction.Commit();
return OkResult();
I am assuming that I am not meant to create a new transaction for every DB access, as that becomes quite messy:
User user;
using (ITransaction transaction = session.BeginTransaction())
{
    user = session.Get<User>(userId);
    transaction.Commit();
}
if (user == null) return UnAuthorizedResult();

Order order;
using (ITransaction transaction = session.BeginTransaction())
{
    order = session.Get<Order>(orderId);
    transaction.Commit();
}
if (order == null) return BadRequestResult();

using (ITransaction transaction = session.BeginTransaction())
{
    session.Delete<Order>(order);
    transaction.Commit();
    return OkResult();
}
I am assuming it's all supposed to go into one transaction.
Are there limits to this?
How much can I put into one transaction before it "breaks down" in one way or another?
You use transactions when you want a bunch of updates to all succeed or all fail. Generally the industry is moving away from that as a modus operandi and towards operations that are tolerant of failures, that pick up where they left off when retried (rather than inserting duplicate records - look up idempotency), and that are generally more forgiving of a world full of latency, transient and fickle network reliability, disparate systems and the inability to establish unifying transactions (look up sagas). But in essence, you use a transaction when you want a set of operations to behave as one hit that either worked out or didn't.
There's no point making a transaction for a read, and there's nearly no point using one for a single update (unless you want to be able to undo that update later). But if you had to collect a whole load of data about a person and insert it into 17 different tables, you might use a transaction to ensure it all worked or all failed, so you don't have partial data lying around. You don't need to worry about unpicking 9 of the 17 inserts if there was a failure; you just roll back the transaction and it's as if the inserts never happened, though autoincrement numbers might remain bumped up by one.
Are there limits to how much data you can ball up in a transaction? Technically yes, but I think it unlikely you'd breach them with an enterprise DB if you're adhering to the next rule of thumb, which is...
Realistically, you probably want to keep the transaction sets as small as possible, to leave someone the minimum headache of sorting things out when it goes wrong. Don't save every new customer you acquire for a whole year in the same transaction and then decide to commit it just because it's Christmas; ten thousand inserts thwarted by one name that's too long, or a server crash on Christmas Eve, isn't ideal.
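To make the all-or-nothing point concrete, here is a minimal NHibernate-style sketch of a multi-table insert in one transaction (the session variable and entity names are illustrative, not from the question):
using (ITransaction transaction = session.BeginTransaction())
{
    try
    {
        // Several related inserts that must succeed or fail together.
        session.Save(person);
        session.Save(address);
        session.Save(paymentDetails);

        transaction.Commit(); // all three rows become visible at once
    }
    catch
    {
        transaction.Rollback(); // none of the rows are kept
        throw;
    }
}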
So that code from your example is kind of a mess, littered with transactions opening and closing within a given controller method. The code in your example is basically the same as using no transactions at all, because each step commits itself. I'm assuming an MVC app because I see an OkResult near the end of your code.
Transactions are atomic units of work. If you've got 3 steps and one fails, everything should be rolled back to the last known state before the transaction started. In a web scenario this is usually the request. You'll find a lot of information if you google "session per request" nhibernate. With that said, I do a couple of things to ensure I'm adhering to this pattern.
In global.asax I have these helper methods
public static ISession CurrentSession
{
    get { return (ISession) HttpContext.Current.Items[Sessionkey]; }
    private set { HttpContext.Current.Items[Sessionkey] = value; }
}

protected void Application_BeginRequest() { CurrentSession = SessionFactory.OpenSession(); }

protected void Application_EndRequest()
{
    if (CurrentSession != null)
        CurrentSession.Dispose();
}
Then I also have this attribute, which I can use at the method or controller level to ensure each controller action is transactionally sound:
[AttributeUsage(AttributeTargets.Method | AttributeTargets.Class)]
public class TransactionAttribute : ActionFilterAttribute
{
    private ITransaction Transaction { get; set; }

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        Transaction = MvcApplication.CurrentSession.BeginTransaction(System.Data.IsolationLevel.ReadCommitted);
    }

    public override void OnActionExecuted(ActionExecutedContext filterContext)
    {
        if (!Transaction.IsActive)
            return;

        if (filterContext.Exception == null)
        {
            Transaction.Commit();
            return;
        }

        Transaction.Rollback();
    }
}
And now my transaction code is fairly well contained and doesn't have to be littered all over my controller methods.

How to implement transaction in Entity Framework 6 with Web Api in Asp.Net?

Is there any way to implement a transaction in Entity Framework 6 with Web API in ASP.NET?
I am inserting into 10 tables separately using Web API calls from a single ASP.NET web form. I am seeking ideas or technical feasibility suggestions using Entity Framework and Web API.
Keep in mind:
In Entity Framework, the SaveChanges() method internally creates a transaction and wraps all INSERT, UPDATE and DELETE operations under it. Multiple SaveChanges() calls create separate transactions, perform their CRUD operations and then commit each transaction.
If you really want to use a single transaction spanning multiple saves, it's really easy:
using (var context = new SomeDbContext())
{
    using (DbContextTransaction transaction = context.Database.BeginTransaction())
    {
        //do stuff
        context.SaveChanges();
        // multiple saves
        context.SaveChanges();
        transaction.Commit(); // this is one transaction
    }
}
Below is a sample code snippet that should clarify a Create or Update transaction spanning multiple tables. The first table's ID column is the foreign key for the other child tables, so if there is an exception during a child table insertion, the parent table record also gets rolled back. Thus everything included in the transaction is rolled back successfully.
public bool CreateOrUpdateEmployee(Common common)
{
    bool IstransactionComplete = false;
    EmployeeEntities DbContext = new EmployeeEntities();

    using (var transaction = DbContext.Database.BeginTransaction())
    {
        try
        {
            if (common.Mode == Modes.CREATE) //Modes - user-defined enum
            {
                DbContext = CreateFinanceEmployees(common, DbContext);      //DbContext.SaveChanges() inside this method.
                DbContext = CreateManufacturingEmployee(common, DbContext); //DbContext.SaveChanges() inside this method.
                DbContext = CreateLogisticsEmployee(common, DbContext);     //DbContext.SaveChanges() inside this method.
            }
            else
            {
                DbContext = UpdateFinanceEmployees(common, DbContext);      //DbContext.SaveChanges() inside this method.
                DbContext = UpdateManufacturingEmployee(common, DbContext); //DbContext.SaveChanges() inside this method.
                DbContext = UpdateLogisticsEmployee(common, DbContext);     //DbContext.SaveChanges() inside this method.
            }
            transaction.Commit();
            IstransactionComplete = true;
        }
        catch (Exception ex)
        {
            transaction.Rollback();
            IstransactionComplete = false;
        }
        finally
        {
            transaction.Dispose();
        }
    }
    return IstransactionComplete;
}
Yes.
You can use the context provided by EF to create a transaction scope. At the end of the scope you can commit or rollback.
You can do something like this:
class WebController
{
    public Response restMethod()
    {
        var context = getYourDbContext();
        using (var dbContextTransaction = context.Database.BeginTransaction())
        {
            try
            {
                // do something with the DB
                context.Database.ExecuteSqlCommand( /* sql command */ );
                // save changes
                context.SaveChanges();
                // commit transaction
                dbContextTransaction.Commit();
            }
            catch (Exception)
            {
                // Rollback in case of an error
                dbContextTransaction.Rollback();
            }
        }
    }
}

Entity framework object not updating after being passed around

I'm working on setting up a new MVC payment site with a dependency-injected database connection in a separate project, and experimenting with some new things as I do. Currently, I'm trying to load an existing transaction from the database, authorize the card payment, and then save the result back to the database. Simple and straightforward, but when I call SaveChanges(), nothing gets saved, and I've run out of things to try.
The database interaction for this is handled by a CheckoutDataProvider:
public class CheckoutDataProvider : ICheckoutDataProvider
{
    private readonly CheckoutEntities _context;

    public CheckoutDataProvider(CheckoutEntities _context)
    {
        this._context = _context;
    }

    public ITransaction GetTransactionDetails(Guid transactionId)
    {
        var trans = _context.Transactions.FirstOrDefault(x => x.CheckoutTransactionId == transactionId);
        return trans; // It's OK if trans == null, because the caller will expect that.
    }

    public void AddAuthorization(ITransaction transaction, IAuthorizationHistory history)
    {
        try
        {
            var trans = (Transaction)transaction;
            var hist = (AuthorizationHistory)history;
            trans.AuthorizationHistories.Add(hist);
            _context.SaveChanges();
        }
        catch (DbEntityValidationException ex)
        {
            throw new InvalidDataException(ex.EntityValidationErrors.First().ValidationErrors.First().ErrorMessage, ex);
        }
    }
}
Transaction and AuthorizationHistory are EF objects and correspond directly to the database tables. CheckoutEntities is the context, as injected by Ninject.
GetTransactionDetails() works flawlessly. I give it the transactionId, I get the object, and then I use that data to run the card and generate the AuthorizationHistory class. Then I call AddAuthorization() to attach it to the transaction and save it to the database. But both the new AuthorizationHistory object and any changes to the original Transaction fail to save.
I can tell from inspecting the _context object that it's not aware of any changes, and if I make changes within GetTransactionDetails() (before it gets returned as an interface) they will persist. So it looks like a problem with the casting (which makes me feel icky anyway, so I'd love to find out that that's the problem).
Am I missing something obvious? Is there something missing to get this to work? Is there a way to avoid the casting in AddAuthorization()?
Probably you are not sharing the same DbContext between GetTransactionDetails and AddAuthorization. For this reason Entity Framework is not able to track the changes.
Scope the lifetime of the DbContext to the web request. You can do this in Ninject with .InRequestScope() (https://github.com/ninject/ninject/wiki/Object-Scopes); with this option the same DbContext will be used for the whole web request.
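A minimal sketch of what such a binding might look like, assuming the Ninject.Web.Common package is installed (DataModule is a hypothetical module name; CheckoutEntities and CheckoutDataProvider are the types from the question):
using Ninject.Modules;
using Ninject.Web.Common; // provides InRequestScope()

public class DataModule : NinjectModule
{
    public override void Load()
    {
        // One CheckoutEntities instance per HTTP request, shared by everything
        // resolved during that request (including CheckoutDataProvider).
        Bind<CheckoutEntities>()
            .ToSelf()
            .InRequestScope();

        Bind<ICheckoutDataProvider>()
            .To<CheckoutDataProvider>()
            .InRequestScope();
    }
}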

Bulk insert with EF

I need to insert some objects (about 4 million) into the database using C# and EF (on .NET 3.5). My method that adds the objects is called in a for loop:
private DBModelContainer AddToContext(DBModelContainer db, tblMyTable item, int count)
{
    db.AddTottblMyTable(item);
    if ((count % 10000 == 0) || (count == this.toGenerate))
    {
        try
        {
            db.SaveChanges();
        }
        catch (Exception e)
        {
            Console.WriteLine(e.StackTrace);
        }
    }
    return db;
}
How can I detach the added objects (of type tblMyTable) from the context object? I don't need them for later use, and when more than 300,000 objects are added, the execution time between db.SaveChanges() calls increases considerably.
Regards
Entity Framework may not be the best tool for this type of operation. You may be better off with plain ADO.NET or some stored procedures. But if you have to use it, here are a number of suggestions:
Keep the active context graph small by using a new context for each unit of work
Turn off automatic change detection - context.Configuration.AutoDetectChangesEnabled = false;
Batching: in your loop, call SaveChanges periodically
EDIT
using (var db = new DBModelContainer())
{
    db.tblMyTable.MergeOption = MergeOption.NoTracking;
    // Narrow the scope of your db context
    db.AddTottblMyTable(item);
    db.SaveChanges();
}
Keeping a long-running db context is not advisable, so consider refactoring your Add method so it does not keep reusing the same context.
See Rick Strahl's post on bulk inserts for more details
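A rough sketch of the batching idea, using a fresh context per chunk and one SaveChanges() per chunk (the batch size, the items collection and the reuse of DBModelContainer/AddTottblMyTable are illustrative assumptions, not code from the question):
const int batchSize = 10000;

for (int offset = 0; offset < items.Count; offset += batchSize)
{
    // A fresh, short-lived context per batch keeps the change-tracking graph small.
    using (var db = new DBModelContainer())
    {
        foreach (var item in items.Skip(offset).Take(batchSize))
        {
            db.AddTottblMyTable(item);
        }
        db.SaveChanges(); // one save per batch instead of per item
    }
}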
AFAIK EF does not directly support bulk insert, so doing such a thing manually will be tedious.
Consider trying EntityFramework.BulkInsert:
using (var ctx = GetContext())
{
    using (var transactionScope = new TransactionScope())
    {
        // some stuff in dbcontext
        ctx.BulkInsert(entities);
        ctx.SaveChanges();
        transactionScope.Complete();
    }
}
You may try a Unit of Work approach and not save the context (SaveChanges) on every record insert, but save it once at the end.

How to save combined (new+modified) detached entities in Entity Framework?

What is the proper and fast way to save combined new and modified detached POCO entities?
I was thinking about these methods:
private void Method_2(IList<Entity> entities) //detached entities
{
    //This method uses a SELECT to check whether the entity exists
    using (var context = new ModelContainer())
    {
        foreach (Entity entity in entities)
        {
            var foundEntity = context.CreateObjectSet<Entity>().SingleOrDefault(t => t.Id == entity.Id);
            if (foundEntity != null) //It is a modified entity
            {
                context.Detach(foundEntity); //Remove it from the ObjectStateManager
                context.AttachTo("EntitySet", entity); //Attach our entity
                context.ObjectStateManager.ChangeObjectState(entity, EntityState.Modified); //We know it exists
            }
            else //It is a new entity
            {
                context.CreateObjectSet<Entity>().AddObject(entity);
            }
        }
        context.SaveChanges();
    }
}
private void Method_1(IList<Entity> entities) //detached entities
{
    //This method doesn't select anything from the DB, but I have to call SaveChanges after each object
    using (var context = new ModelContainer())
    {
        foreach (Entity entity in entities)
        {
            try
            {
                context.AttachTo("EntitySet", entity);
                context.ObjectStateManager.ChangeObjectState(entity, EntityState.Modified);
                context.SaveChanges();
            }
            catch (OptimisticConcurrencyException)
            {
                context.ObjectStateManager.ChangeObjectState(entity, EntityState.Added);
                context.SaveChanges();
            }
        }
    }
}
When you are working in detached environment you have to know which entity was added and which is modified - it is your responsibility to keep this information and provide it to ObjectContext.
Well, I agree with this statement. If you find yourself in a situation where you need EF code like this, something is definitely wrong with your decision: I have chosen the wrong tool for this job.
When you are working in detached environment you have to know which entity was added and which is modified - it is your responsibility to keep this information and provide it to ObjectContext.
The very easy way is:
foreach (var entity in entities)
{
    if (entity.Id == 0) // 0 = default value: means new entity
    {
        // Add object
    }
    else
    {
        // Attach object and set state to modified
    }
}
The example requires that you have some db auto-generated primary key (Id).
Your Method 2 is possible with some modifications. There is no need to detach the entity after you load it; instead use ApplyCurrentValues. The approach of loading the entity first is very useful when you decide to work with object graphs instead of single entities, but in the case of an object graph you have to do the synchronization manually. ApplyCurrentValues works only for scalar (non-navigation) properties. You can also try to further optimize your method by loading the needed entities in a single round trip to the database instead of loading them one by one.
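A sketch of Method 2 rewritten along those lines, assuming the same ModelContainer and "EntitySet" names as above (it only covers scalar properties, as noted):
private void Method_2_ApplyCurrentValues(IList<Entity> entities) //detached entities
{
    using (var context = new ModelContainer())
    {
        foreach (Entity entity in entities)
        {
            var foundEntity = context.CreateObjectSet<Entity>().SingleOrDefault(t => t.Id == entity.Id);
            if (foundEntity != null)
            {
                // The loaded entity stays attached; copy the detached entity's
                // scalar values onto it and let change tracking handle the rest.
                context.ApplyCurrentValues("EntitySet", entity);
            }
            else
            {
                context.CreateObjectSet<Entity>().AddObject(entity);
            }
        }
        context.SaveChanges();
    }
}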
Your Method 1 is a terrible solution. Using exceptions raised by the database server to control program flow is a bad approach.
I agree with @Ladislav - Method_1 is a bad approach. Let the database raise exceptions which are caught by EF - don't try to swallow those exceptions yourself.
You're on the right track with Method 1.
Here is how I do it, as I also have a detached context (POCOs, no change tracking, ASP.NET MVC).
BLL interface (note I have TPT in my model, hence the generics; "Post" is abstract):
void Add(Post post);
void Update<TPost>(TPost post) where TPost : Post, new();
The new() constraint is crucial - you'll see why shortly.
I won't show how I do "Add", because it's as simple as you think - AddObject(entity).
The "Update" is the tricky part:
public class GenericRepository<T> : IRepository<T> where T : class
{
    public void Update<T2>(T2 entity) where T2 : class, new()
    {
        var stub = new T2(); // create stub, now you see why we need the new() constraint

        object entityKey = null;
        // ..snip code to get entity key via attribute on all domain entities
        // once we have the key, set it on the stub.

        // check if entity is already attached..
        ObjectStateEntry entry;
        bool attach;
        if (CurrentContext.ObjectStateManager.TryGetObjectStateEntry(CurrentContext.CreateEntityKey(CurrentContext.GetEntityName<T>(), stub), out entry))
        {
            // Re-attach if necessary.
            attach = entry.State == EntityState.Detached;
        }
        else
        {
            // Attach for first time.
            attach = true;
        }

        if (attach)
            CurrentEntitySet.Attach(stub as T);

        // Update model (override stub values attached to the graph).
        CurrentContext.ApplyCurrentValues(CurrentContext.GetEntityName<T>(), entity);
    }
}
And that works for me.
As for the entity key, I have used attributes on my domain classes. An alternative (which I'm about to move to) is to have all my domain entities implement an interface that specifies a property called "EntityKey", and then use that interface in my constraints. Basically, I needed a dynamic way to create stub entities in a generic repository.
I personally don't like the idea of "check the ID; if it's > 0 then it's an update". Because I'm working with ASP.NET MVC, if I (or another developer) forget to bind the ID to the view, it won't be passed through, so even though it may be an update, it will be added because the ID == 0.
I like to be explicit about the operations. This way, I can perform separate validation logic for Add and Update.
Perhaps take a look at Self-Tracking POCO entities. IMHO they are perfect for any scenario that requires the entity to be separated from the context; they take care of all the plumbing code for you.
