Is Detach necessary when updating a record? - C#

I am working with VS2013, .NET4.5, EF6 and SQL 2008.
I have noticed that one of the developers on our team changed all the update statements from
if (!journey.WillAnswer.Any())
{
    db.WillAnswer.Add(willAnswer);
}
else
{
    db.WillAnswer.Attach(willAnswer);
    db.Entry(willAnswer).State = EntityState.Modified;
}
db.SaveChanges();
to
if (!journey.WillAnswer.Any())
{
    db.WillAnswer.Add(willAnswer);
}
else
{
    var will = db.WillAnswer.Single(x => x.Id == willAnswer.Id);
    db.Entry(will).State = EntityState.Detached; // DbContext has no Detach method; detaching is done via the entry's state
    db.WillAnswer.Attach(willAnswer);
    db.Entry(willAnswer).State = EntityState.Modified;
}
db.SaveChanges();
Is detach necessary? Is it good practice?

That is certainly bad practice, because now there are two WillAnswer objects floating around in the app that have the same key. One of them is a detached "dummy" and will simply lose any writes made to it. This is error-prone and confusing.
Maybe your app is not using EF the way it is supposed to be used. Normally there is not a lot of attaching and detaching at all. EF objects are meant to be "live" objects representing the database state. They are not DTOs that you create at will and throw away. In particular, avoid generic repo methods such as MyRepository.AddOrUpdate that take a temporary entity.
If you want to write to an entity, obtain it and write. Do not create a new entity for updating.
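A minimal sketch of that load-then-write pattern (the context type, Text property, and variable names here are illustrative assumptions, not taken from the question's model):

```csharp
using (var db = new MyDbContext()) // hypothetical context type
{
    // Obtain the tracked entity and write to it; no Attach or Detach needed.
    var answer = db.WillAnswer.Single(x => x.Id == id);
    answer.Text = updatedText; // assumed property; the change tracker records this write
    db.SaveChanges();          // issues an UPDATE for the modified columns only
}
```

Because the entity came from the context, EF's change tracker already knows its original values, so no state manipulation is required.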


Problem with EF Core updating nested entities when using automapper

I am maintaining an application which uses EF Core to persist data to a SQL database.
I am trying to implement a new feature which requires me to retrieve an object from the database (let's pretend it's an order), manipulate it and some of the order lines which are attached to it, and save it back into the database. This wouldn't be a problem, but I have inherited some of this code, so I need to try to stick to the existing way of doing things.
The basic process for data access is :
UI -> API -> Service -> Repository -> DataContext
The methods in the repo follow this pattern (though I have simplified it for the purposes of this question):
public Order GetOrder(int id)
{
    return _context.Orders.Include(o => o.OrderLines).FirstOrDefault(x => x.Id == id);
}
The service is where business logic and mapping to DTOs are applied; this is what the GetOrder method looks like:
public OrderDto GetOrder(int id)
{
    var ord = _repo.GetOrder(id);
    return _mapper.Map<OrderDto>(ord);
}
So, to retrieve and manipulate an order, my code would look something like this:
public void ManipulateAnOrder()
{
    // Get the order DTO from the service
    var order = _service.GetOrder(3);
    // Manipulate the order
    order.UpdatedBy = "Daneel Olivaw";
    order.OrderLines.ForEach(ol => ol.UpdatedBy = "Daneel Olivaw");
    _service.SaveOrder(order);
}
And the method in the service which allows this to be saved back to the DB would look something like this:
public void SaveOrder(OrderDto order)
{
    // Get the original item from the database
    var original = _repo.GetOrder(order.Id);
    // Merge the original and the new DTO together
    _mapper.Map(order, original);
    _repo.Save(original);
}
Finally, the repository's Save method looks like this:
public void Save(Order order)
{
    _context.Update(order);
    _context.SaveChanges();
}
The problem I am encountering is that this method of mapping the entities from the context into DTOs and back again causes the nested objects (in this instance the OrderLines) to be changed (or recreated) by AutoMapper in such a way that EF no longer recognises them as the entities it has just given us.
This results in errors when updating, along the lines of:
InvalidOperationException: The instance of ProductLine cannot be tracked because another instance with the same key value for {'Id'} is already being tracked.
Now, to me, it's not that there is ANOTHER instance of the object being tracked; it's the same one. But I understand that the mapping process has broken that link, and EF can no longer determine that they are the same object.
So, I have been looking for ways to rectify this. Two approaches have jumped out at me as promising:
the answer mentioned here EF & Automapper. Update nested collections
Automapper.Collection
AutoMapper.Collection seems to be the better route, but I can't find a good working example of it in use, and the implementation that I have done doesn't seem to work.
So, I'm looking for advice from anyone who has either used AutoMapper.Collection successfully or has any suggestions on how best to approach this.
Edit: I have knocked up a quick console app as an example. Note that when I say quick, I mean horrible: there is no DI or anything like that, and I have done away with the repositories and services to keep it simple.
I have also left in a commented-out mapper profile which does work but isn't ideal. You will see what I mean when you look at it.
Repo is here https://github.com/DavidDBD/AutomapperExample
OK, after examining every scenario, and given that I did what you're trying to do in my previous project and it worked out of the box:
Updating your Entity Framework Core NuGet packages to the latest stable version (3.1.8) solved the issue without modifying your code.
AutoMapper in fact "has broken that link", and the mapped entities you are trying to save are a set of new objects, not previously tracked by your DbContext. If the mapped entities were the same objects, you wouldn't have gotten this error.
In fact, it has nothing to do with AutoMapper and the mapping process, but with how the DbContext is being used and how the entity states are being managed.
In your ManipulateAnOrder method after getting the mapped entities -
var order = _service.GetOrder(3);
your DbContext instance is still alive and at the repository layer it is tracking the entities you just retrieved, while you are modifying the mapped entities -
order.UpdatedBy = "Daneel Olivaw";
order.OrderLines.ForEach(ol=>ol.UpdatedBy = "Daneel Olivaw");
Then, when you are trying to save the modified entities -
_service.SaveOrder(order);
these mapped entities reach the repository layer, and the DbContext tries to add them to its tracking list, but finds that it already has entities of the same type with the same Ids in the list (the previously fetched ones). EF can track only one instance of a specific type with a specific key. Hence the error message.
One way to solve this is, when fetching the Order, to tell EF not to track it at the repository layer -
public Order GetOrder(int id, bool tracking = true) // optional parameter
{
    if (!tracking)
    {
        return _context.Orders.Include(o => o.OrderLines).AsNoTracking().FirstOrDefault(x => x.Id == id);
    }
    return _context.Orders.Include(o => o.OrderLines).FirstOrDefault(x => x.Id == id);
}
(or you can add a separate method for handling no-tracking calls) and then at your service layer -
var order = _repo.GetOrder(id, false); // for this operation tracking is false

Is there a way to prevent Entity Framework from EVER reading entries where a property has a specific value?

I have a piece of software that has been in the works for a while. Today our client decided we should NOT delete any data but instead hide it. To do this, I plan to add an "IsDeleted" property to all tables and change all deletion methods to set this property to "true" instead.
The problem is, I have 1000 times more reads than deletions. If I have a User and try to read all Comments of that User through an entity relation, I have to either add a "Where(x => !x.IsDeleted)" to every single read like this or, if it is possible, opt ALL data that has IsDeleted set to true out of being read.
Is the latter possible in any way? If not, is there an alternative to writing "Where(x => !x.IsDeleted)" a thousand times?
I've looked at this problem in the past, and rolling your own solution is much more difficult than you'd initially think, mostly because it's really hard to change how Include statements load related entities (EF doesn't really allow you to filter them).
But there is a library that can do it for you.
Filtering the read results
It can be done quite easily using the EntityFramework.DynamicFilters library. (I am not in any way affiliated with the devs; I just really like their library.)
The main readme actually has an example that fits your use case:
modelBuilder.Filter("IsDeleted", (ISoftDelete d) => d.IsDeleted, false);
Essentially, it will only return results Where(d => !d.IsDeleted), which is exactly what you'd want. This filter is applied to all direct fetches and include statements, which means that those soft deleted entities are essentially non-existing as far as your domain is concerned.
This does assume that your entities all derive from a shared root which has the delete flag, which is something I'd advise you to do anyway.
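For illustration, a minimal version of that shared root could look like this. The ISoftDelete name follows the filter example above; the Comment entity and MyContext class are made-up placeholders, and the filter registration is based on the library's documented OnModelCreating usage:

```csharp
public interface ISoftDelete
{
    bool IsDeleted { get; set; }
}

public class Comment : ISoftDelete // example entity implementing the shared root
{
    public int Id { get; set; }
    public string Text { get; set; }
    public bool IsDeleted { get; set; }
}

public class MyContext : DbContext
{
    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Global filter from EntityFramework.DynamicFilters:
        // hide every row whose IsDeleted flag is true
        modelBuilder.Filter("IsDeleted", (ISoftDelete d) => d.IsDeleted, false);
        base.OnModelCreating(modelBuilder);
    }
}
```

Any entity that implements the interface is then filtered automatically in both direct queries and Includes.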
Soft-deleting the entities
It's also possible to convert hard deletes into soft deletes in your database context itself, which means that you don't need to rewrite your delete code to instead update the entity (which can be a cumbersome rewrite, and it's always possible that someone forgets it here and there).
You can override the SaveChanges (and SaveChangesAsync) behavior in your context class. This allows you to find all the entities that are going to be deleted, and gives you the option to convert this into an update statement while also raising the IsDeleted flag.
It also ensures that no one can forget to soft delete. Your developers can simply hard delete the entities (when handling the code), and the context will convert it for them.
public class MyContext : DbContext
{
    public override int SaveChanges()
    {
        ConvertHardDeleteToSoftDelete();
        return base.SaveChanges();
    }

    public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = default)
    {
        ConvertHardDeleteToSoftDelete();
        return await base.SaveChangesAsync(cancellationToken);
    }

    private void ConvertHardDeleteToSoftDelete()
    {
        var deletedEntries = ChangeTracker
            .Entries<ISoftDelete>()
            .Where(entry => entry.State == EntityState.Deleted)
            .ToList();

        foreach (var entry in deletedEntries)
        {
            // Downgrade the hard delete to an update and raise the flag
            entry.State = EntityState.Modified;
            entry.Entity.IsDeleted = true;
        }
    }
}
Combined with the dynamic filter suggestion above, this means that such a soft-deleted entity will not appear again in your application, but it will still exist in the database.

Entity Framework updating many to many

I am trying to find the right way to update a many-to-many relationship, but I am running into some issues.
The application is an ASP.NET MVC app with Simple Injector (set up per context).
I have an entity People which has an IEnumerable<Team>, and also an entity Team which has an IEnumerable<People>.
The People entity has some other fields like Description, Email, etc., and in its View there are some checkboxes so the user can choose the Teams.
I searched the net for the best approach to updating a many-to-many relationship, and all I found was deleting everything in the junction table and then adding the Teams again.
Below is what I am trying to do, but I am getting a "PK already exists" error. I know it is happening because I first load the People entity with the Find method (to remove the list of Teams inside a foreach), and afterwards I try to Attach (which is when the error happens) the modified object to set its State to Modified.
public override void Modify(People obj)
{
    var ppl = SearchById(obj.Id);
    if (ppl.Teams.Count > 0)
    {
        foreach (var team in ppl.Teams.ToList())
        {
            ppl.Teams.Remove(team);
        }
    }
    var entry = lpcContext.Entry(obj);
    if (lpcContext.Entry(obj).State == EntityState.Detached)
        dbSet.Attach(obj);
    entry.State = EntityState.Modified;
}
To clear a few things up: I am using the Unit of Work pattern, so I call SaveChanges later.
Is there any other approach, or do I have to remove the Teams one by one, call SaveChanges, and after that update the object and call SaveChanges again?
Unfortunately, working with detached entities isn't that straightforward in EF (yet). Attach() in EF will work for connected entities only. That means if you load an object from the DB and pass it on to a view (or page in ASP.NET), then when you read the object back from that view/page, EF will no longer be tracking it. If you now try to use Attach(), you will get an error that the key already exists in the DbContext. To work around this, you need to find the entry and make changes to the entity using SetValues(). Something like this:
public virtual void Update(T entity)
{
    DbEntityEntry dbEntityEntry = DbContext.Entry(entity);
    if (dbEntityEntry.State == EntityState.Detached)
    {
        // Assuming "Id" is the key column
        var pkey = typeof(T).GetProperty("Id").GetValue(entity);
        var set = DbContext.Set<T>();
        T attachedEntity = set.Find(pkey);
        if (attachedEntity != null)
        {
            var attachedEntry = DbContext.Entry(attachedEntity);
            attachedEntry.CurrentValues.SetValues(entity);
        }
    }
}
Please note that this will ignore any nested objects. Hence, you should make a DB trip and compare the object returned from the DB to find out whether you should invoke Add, Update or Delete on each child object. This is the best workaround I could find when working with disconnected objects in EF. I guess NHibernate doesn't have this bug. Last I read about this, Microsoft was going to work on it after EF 6.x, so we'll have to wait, I guess. Please go through the article below to understand the issue (and possible solutions) at length:
http://blog.maskalik.com/entity-framework/2013/12/23/entity-framework-updating-database-from-detached-objects/
To address your specific scenario: you should make a DB hit and find out whether any new Teams were selected or an existing Team was dropped, then call Add or Remove as appropriate by comparing the Teams collection of the People object returned by the DB with the People object returned from the view/page. To update the People object itself, you can use the Update() method given above.
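A rough sketch of that comparison, assuming Teams are matched by Id and using the question's lpcContext (the dbPeople and input variable names are illustrative):

```csharp
// dbPeople: tracked entity loaded from the DB; input: People built from the view/page
var selectedIds = input.Teams.Select(t => t.Id).ToList();

// Drop teams that were unchecked in the view
foreach (var team in dbPeople.Teams.Where(t => !selectedIds.Contains(t.Id)).ToList())
{
    dbPeople.Teams.Remove(team);
}

// Add newly checked teams, loading the existing Team rows so EF links
// them rather than inserting duplicates
var currentIds = dbPeople.Teams.Select(t => t.Id).ToList();
foreach (var id in selectedIds.Except(currentIds))
{
    dbPeople.Teams.Add(lpcContext.Teams.Find(id));
}
// SaveChanges is deferred to the Unit of Work, as in the question
```

Because dbPeople is tracked, EF translates the collection changes into inserts and deletes on the junction table on SaveChanges.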

EF 4.1 DBContext and Navigation Properties

We've been using EF STEs for a while, but our application has grown quite a bit, and we decided to use the new 4.1 DbContext so we can "evolve" a separate business layer on top of our data layer without having to use different types for it.
In the elementary evaluation for the DbContext way of doing things, I am facing a little problem.
I am used to query and preload required related data like:
return context.Orders.Include("Detail").SingleOrDefault(ord => ord.ID == ID);
And then send the returned object to the UI for modification, and when returned from the UI save the changes to the database.
From what I read so far, doing the "change saving" in DbContext is easily done using code like this:
context.Entry(order).State = EntityState.Modified;
The problem with this code is that it actually marks all properties in the object as modified, a thing that's not allowed for some properties in my model (a business rule).
I resorted to the following solution, which seems to require a lot of code for a relatively small requirement! (BTW, changing a modified property's state back to Unchanged is not supported.)
context.Orders.Attach(order);
DbEntityEntry<Order> ordEntity = context.Entry(order);
string[] arr =
{
    ordEntity.Property(ord => ord.ID).Name,
    ordEntity.Property(ord => ord.ClientID).Name,
};
foreach (string prop in ordEntity.OriginalValues.PropertyNames)
{
    if (!arr.Contains(prop))
    {
        ordEntity.Property(prop).IsModified = true;
    }
}
context.SaveChanges();
The problem I am facing with this code is that the Attach statement throws an exception saying there is some sort of conflict in the navigation properties of the attached object, even though no changes were made at all! (I am saving the object exactly as it was retrieved from the database.)
The error message is something like:
"Conflicting changes to the role 'Detail' of the relationship 'OrdersDatamodel.FK_Order_Detail' have been detected."
The questions are:
Is there a more "elegant" way for preventing the modification of certain object properties?
Does anybody know what's going on with the exception raised when attaching the object to the context?
Thanks.
From what I read so far, doing the "change saving" in DbContext is easily done using code like this:
context.Entry(order).State = EntityState.Modified;
You rarely need to set the state explicitly. When you modify properties, assuming they are virtual, the state will automatically change to Modified without you having to set it. Otherwise, DetectChanges will pick the changes up during your call to SaveChanges.
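So for a tracked entity, a sketch as simple as this is usually enough (the Quantity property and variable names are illustrative assumptions):

```csharp
// Load the entity through the context so it is tracked
var order = context.Orders.Include("Detail").Single(ord => ord.ID == id);
order.Quantity = newQuantity; // only this property is marked Modified
context.SaveChanges();        // no explicit Entry(...).State assignment required
```

Only the changed columns end up in the generated UPDATE, which sidesteps the "everything marked modified" problem from the question.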

Entity Framework 4 Delete call not working

I'm trying to delete a record from my table using EF, and nothing happens. The code executes with no errors, but the record just sticks around in the DB. I have nearly identical code elsewhere that works.
using (var DB = new PTNWebConfigurationModel.PTNWebConfigurationEntities())
{
    var account = DB.Accounts.Where(a => a.LoginID == loginID).FirstOrDefault();
    // Load an existing session for the Account, otherwise create a new one.
    if (!LoadExistingSession(ip, DB, account))
    {
        CreateNewSession(ip, DB, account);
    }
    AccountsSession sessionsToDelete = DB.AccountsSessions.Where(a => a.RemoteIP == ip && a.AccountID == 1).FirstOrDefault();
    if (sessionsToDelete != null)
    {
        DB.DeleteObject(sessionsToDelete);
        DB.SaveChanges();
    }
}
I've also tried it with these options:
DB.DeleteObject(sessionsToDelete);
DB.SaveChanges(System.Data.Objects.SaveOptions.DetectChangesBeforeSave);
DB.AcceptAllChanges();
I also started with no using block and just an instantiated entity object, but that didn't work either.
I love issues like this.
Most of the time that I have seen an ORM fail to execute a command (without raising an error), it was due to changes in the underlying data tables that aren't reflected in the generated classes for those tables.
You might refresh your schema / class model and try again.
Carrying on from Chris's answer, the other thing I've found in the past is that if you are doing EF database-first modelling, you may have created the foreign keys linking two associated tables, but the key may not have been set to cascade on delete in the relationship. This can cause EF to be a little difficult. So check any FKs in your DB. Hope that helps.
