Hopefully the following code snippet is clear enough to explain the problem. _db is an instance of DbContext.
// this scenario is only for learning purposes
Author a = _db.Authors.Find(1831);
int id = a.AuthorId;
a = null;
Author stub = new Author { AuthorId = id };
_db.Remove(stub);
_db.SaveChanges();
The code above produces
'The instance of entity type 'Author' cannot be tracked because another instance with the same key value for {'AuthorId'} is already being tracked. When attaching existing entities, ensure that only one entity instance with a given key value is attached. Consider using 'DbContextOptionsBuilder.EnableSensitiveDataLogging' to see the conflicting key values.'
Question
How to free a from being tracked?
There are a variety of ways to do this, but I find this the simplest since you're trying to detach a specific entity.
_db.Entry(a).State = EntityState.Detached;
As a plus, it doesn't require changing any other code, including the way you fetched the entity itself.
This one line makes the intent very clear. It also allows the following workflow:
Fetch an entity, or list of entities
Do various types of work
Detach an entity
I dislike the idea of changing existing queries on a DbContext when all I want to do is detach something.
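Applied to the snippet in the question, a minimal sketch (same names as above) might look like this:
Author a = _db.Authors.Find(1831);
int id = a.AuthorId;

// Stop tracking the loaded instance so the stub with the same key can be attached.
_db.Entry(a).State = EntityState.Detached;

Author stub = new Author { AuthorId = id };
_db.Remove(stub);
_db.SaveChanges();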
You can try AsNoTracking().
_db.Authors.AsNoTracking().Where(a => a.AuthorId == 1831).FirstOrDefault();
You should use the AsNoTracking option when querying for a.
For example, the query below gets all blog items, but if you try to delete them you should not get any error.
var blogs = context.Blogs
.AsNoTracking()
.ToList();
You can set AsNoTracking either on the query, or on the context object as shown below:
context.ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;
var blogs = context.Blogs.ToList();
This should help you resolve the issue.
Refer to this MSDN page for more details.
Related
Some 2+ years ago I asked this question, which was kindly solved by Steve Py.
I am having a similar but different problem now when mapping with sub-objects. I have had this issue a few times and worked around it, but facing doing so again, I can't help thinking there must be a more elegant solution. I am coding a membership system in Blazor WASM and want to update membership details via a web API. All very normal.
I have a library function to update the membership:
public async Task<MembershipLTDTO> UpdateMembershipAsync(APDbContext context, MembershipLTDTO sentmembership)
{
Membership? foundmembership = context.Memberships.Where(x =>x.Id == sentmembership.Id)
.Include(x => x.MembershipTypes)
.FirstOrDefault();
if (foundmembership == null)
{
return new MembershipLTDTO { Status = new InfoBool(false, "Error: Membership not found", InfoBool.ReasonCode.Not_Found) };
}
try
{
_mapper.Map(sentmembership, foundmembership, typeof(MembershipLTDTO), typeof(Membership));
//context.Entry(foundmembership).State = EntityState.Modified; <-This was a 'try-out'
context.Memberships.Update(foundmembership);
await context.SaveChangesAsync();
sentmembership.Status = new InfoBool(true, "Membership successfully updated");
return sentmembership;
}
catch (Exception ex)
{
return new MembershipLTDTO { Status = new InfoBool(false, $"{ex.Message}", InfoBool.ReasonCode.Not_Found) };
}
}
The Membership object is an EF DB entity and references a many-to-many list of MembershipTypes:
public class Membership
{
[Key]
public int Id { get; set; }
...more stuff...
public List<MembershipType>? MembershipTypes { get; set; } // The user's membership can be of several types, e.g. Employee + Director + etc.
}
The MembershipLTDTO is a lightweight DTO with a few heavy objects removed.
Executing the code, I get an EF exception:
The instance of entity type 'MembershipType' cannot be tracked because another instance with the same key value for {'Id'} is already being tracked. When attaching existing entities, ensure that only one entity instance with a given key value is attached.
I think (from the previous question I asked some time ago) that I understand what is happening. Previously, I have worked around this by having a separate function that would, in this case, update the membership types, then strip them out of the 'found' and 'sent' objects to let Mapper do the rest.
In my mapping profile I have the mappings defined as follows for these object types:
CreateMap<Membership, MembershipLTDTO>();
CreateMap<MembershipLTDTO, Membership>();
CreateMap<MembershipTypeDTO, MembershipType>();
CreateMap<MembershipType, MembershipTypeDTO>();
As I was about to go and do that very thing again, I was wondering if I am missing a trick with my use of Mapper, or Entity Framework that would allow it to happen more seamlessly?
A couple of things come to mind. The first is that the call to context.Memberships.Update(foundmembership); isn't required here, so long as you haven't disabled tracking in the DbContext. Calling SaveChanges will build an UPDATE SQL statement for whatever values change (if any), whereas Update will attempt to overwrite the entity (or entities).
The issue you are likely encountering is common when dealing with references, and I would recommend a different approach because of it. To outline this, let's look at Membership Types. These would typically be a known list that we want to associate with new and existing memberships. We're not ever going to expect to create a new membership type as part of an operation that creates or updates a membership, just add or remove associations to existing memberships.
The problem with using AutoMapper for this arises when we want to associate another membership type via the passed-in DTO. Say we have existing data where a membership is associated with Membership Type #1, and we want to add Membership Type #2. We load the original entity to copy values across, eager loading membership types, so we get the membership and Type #1; so far so good. However, when we call Mapper.Map() it sees a Membership Type #2 in the DTO, so it will add a new entity with ID #2 into our loaded Membership's Types collection. From here, one of three things can happen:
1) The DbContext was already tracking an instance with ID #2 and will complain when Update tries to associate another entity reference with ID #2.
2) The DbContext isn't tracking an instance and attempts to add #2 as a new entity.
2.1) The database is set up with an identity column, and the new membership type gets inserted with the next available ID (e.g. #16).
2.2) The database is not set up with an identity column, and SaveChanges raises a duplicate constraint error.
The issue here is that Automapper doesn't have knowledge that any new Membership Type should be retrieved from the DbContext.
AutoMapper's Map method can be used to update child collections, though it should only be used for references that are actual children of the top-level entity. For instance, if you have a Customer and a collection of Contacts, then when updating the customer you want to update, add, or remove contact detail records, because those child records are owned by, and explicitly associated with, their customer. AutoMapper can add to or remove from that collection, and update existing items. For references like many-to-many/many-to-one we cannot rely on that, since we want to associate existing entities, not add or remove them.
In this case, the recommendation would be to tell Automapper to ignore the Membership Types collection, then handle these afterwards.
_mapper.Map(sentmembership, foundmembership, typeof(MembershipLTDTO), typeof(Membership));
var membershipTypeIds = sentmembership.MembershipTypes.Select(x => x.MembershipTypeId).ToList();
var existingMembershipTypeIds = foundmembership.MembershipTypes.Select(x => x.MembershipTypeId).ToList();
var idsToAdd = membershipTypeIds.Except(existingMembershipTypeIds).ToList();
var idsToRemove = existingMembershipTypeIds.Except(membershipTypeIds).ToList();
if(idsToRemove.Any())
{
var membershipTypesToRemove = foundmembership.MembershipTypes.Where(x => idsToRemove.Contains(x.MembershipTypeId)).ToList();
foreach (var membershipType in membershipTypesToRemove)
foundmembership.MembershipTypes.Remove(membershipType);
}
if(idsToAdd.Any())
{
var membershipTypesToAdd = context.MembershipTypes.Where(x => idsToAdd.Contains(x.MembershipTypeId)).ToList();
foundmembership.MembershipTypes.AddRange(membershipTypesToAdd); // if declared as List, otherwise foreach and add them.
}
context.SaveChanges();
For items being removed, we find those entities in the loaded data state and remove them from the collection. For new items being added, we go to the context, fetch them all, and add them to the loaded data state's collection.
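For completeness, the "ignore the collection" part would live in the mapping profile; a sketch, assuming the CreateMap calls shown in the question:
CreateMap<MembershipLTDTO, Membership>()
    .ForMember(dest => dest.MembershipTypes, opt => opt.Ignore()); // associations are synced manually above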
Notwithstanding that I marked Steve Py's solution as the answer, because it is a solution that works (though not as 'elegant' as I would have liked), I was pointed in another direction by the comment from Lucian Bargaoanu which, though a little cryptic, I found after some digging could be made to work.
To do this I had to add 'AutoMapper.Collection' and 'AutoMapper.Collection.EntityFrameworkCore' to my solution. There was a bit of jiggery-pokery around setting it up, as the example here didn't match my setup. I used this in my Program.cs:
// Auto Mapper Configurations
var mappingConfig = new MapperConfiguration(mc =>
{
mc.AddProfile(new MappingProfile());
mc.AddCollectionMappers();
});
I also had to modify my mapping profile for the object-to-DTO mapping, to this:
//Membership Types
CreateMap<MembershipTypeDTO, MembershipType>().EqualityComparison((mtdto, mt) => mtdto.Id == mt.Id);
This tells AutoMapper which fields to use for the equality comparison.
I took out the context.Memberships.Update as recommended by Steve Py and it works.
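With the collection mappers registered and the equality comparison in place, the update method can stay close to the original; an untested sketch using the same names as my original function:
Membership? foundmembership = context.Memberships
    .Where(x => x.Id == sentmembership.Id)
    .Include(x => x.MembershipTypes)
    .FirstOrDefault();
// AutoMapper.Collection now matches MembershipTypes by Id instead of re-adding them.
_mapper.Map(sentmembership, foundmembership, typeof(MembershipLTDTO), typeof(Membership));
await context.SaveChangesAsync();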
Posted on behalf of the question asker
The instance of entity type 'AssegnazioneLotto' cannot be tracked because another instance with the same key value for {'Id_AssegnazioneLotto'} is already being tracked. When attaching existing entities, ensure that only one entity instance with a given key value is attached. Consider using 'DbContextOptionsBuilder.EnableSensitiveDataLogging' to see the conflicting key values.
I encounter this error when we fetch data from a table and update it. I solved it by calling a view which calls the table.
Why does this happen?
How can I solve it without creating additional views?
The simplest answer: don't pass entities around outside of the scope they were read in. Pass view models (POCO objects rather than entities) and fetch entities on update to copy the expected values across.
The complex answer is that when updating entity references (all entity references, including child collections and many-to-one references), you need to check whether the DbContext is tracking a matching reference, and either replace the references with the tracked entities or tell the DbContext to dump the tracked reference before attaching.
For example, consider an update method that accepts a detached or deserialized "entity". This works sometimes, but craps out other times:
public void UpdateOrder(Order order)
{
context.Update(order);
// OR
context.Attach(order);
context.Entry(order).State = EntityState.Modified;
context.SaveChanges();
}
It looks simple and clean, but it craps out when the DbContext instance is already tracking a matching Order instance. When it is, you get that exception.
The safety check:
public void UpdateOrder(Order order)
{
var existingOrder = context.Orders.Local.SingleOrDefault(o => o.OrderId == order.OrderId);
if (existingOrder != null)
context.Entry(existingOrder).State = EntityState.Detached;
context.Update(order);
// OR
context.Attach(order);
context.Entry(order).State = EntityState.Modified;
context.SaveChanges();
}
That example checks the local tracking cache for a matching order and dumps any tracked instance. The key here is using .Local on the DbSet to search the local tracking cache rather than hitting the DB.
Where this gets more complex is where Order contains other entity references like OrderLines, or a reference to a Customer, etc. When dealing with detached entities you need to check over the entire object graph for tracked references.
public void UpdateOrder(Order order)
{
var existingOrder = context.Orders.Local.SingleOrDefault(o => o.OrderId == order.OrderId);
if (existingOrder != null)
context.Entry(existingOrder).State = EntityState.Detached;
var customer = context.Customers.Local.SingleOrDefault(c => c.CustomerId == order.Customer.CustomerId);
if (customer != null)
order.Customer = customer; // Replace our Customer reference with the tracked one.
else
context.Attach(order.Customer);
context.Update(order);
// OR
context.Attach(order);
context.Entry(order).State = EntityState.Modified;
context.SaveChanges();
}
As you can see, this starts to get complex and cumbersome pretty quickly, as you need to check every reference. Hence, it's simpler to avoid passing detached or serialized entities around. Using a view model offers many benefits for performance and for simplifying issues like this. Coupling this with AutoMapper, or a similar mapper that supports projection, can make operations with view models very simple:
Selecting Orders:
var orders = context.Orders.Where(/* suitable conditions */)
.ProjectTo<OrderViewModel>(_mapperConfig)
.ToList();
Where _mapperConfig is an AutoMapper configuration that tells AutoMapper how to convert an Order into an OrderViewModel. This can follow conventions or optionally contain mapping rules to build a flattened view model for an Order and its related details. ProjectTo works with EF's IQueryable to build an SQL SELECT statement across the entity graph that returns only the data needed to populate the view model. This is far more efficient than using Map, which would require all related entities to be eager loaded.
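A rough sketch of what _mapperConfig might contain (the maps shown are illustrative, not from the original post):
var _mapperConfig = new MapperConfiguration(cfg =>
{
    // Convention-based flattening of Order into the view model used for display.
    cfg.CreateMap<Order, OrderViewModel>();
    // Used later when copying edited values back onto the tracked entity.
    cfg.CreateMap<UpdateOrderViewModel, Order>();
});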
When updating:
public void UpdateOrder(UpdateOrderViewModel orderVM)
{
var order = context.Orders.Single(o => o.OrderId == orderVM.OrderId);
if (orderVM.RowVersion != order.RowVersion)
throw new StaleDataException(); // placeholder to handle the situation where the data has changed since our view got the order details.
var mapper = _mapperConfig.CreateMapper();
mapper.Map(orderVM, order);
context.SaveChanges();
}
orderVM could be the OrderViewModel returned earlier, but typically I would recommend packaging just the fields that can be updated into a dedicated view model. The "magic" is in the AutoMapper configuration, which governs what fields get copied from the view model back into the entity. It can include child data such as OrderLines, in which case you would want to ensure those child entities are eager loaded with .Include in your DB fetch. AutoMapper's Map method in this case is the variant that copies mapped values from a source to a destination, so values are copied directly into the tracked entity instance. EF will build an SQL UPDATE statement based on what values actually change rather than overwriting the entire record.
You can also use the same technique with detached entities to avoid your issue. The benefit of using AutoMapper is that you can configure which values can be legally copied over from the deserialized/detached entity provided into the real data:
public void UpdateOrder(Order updatedOrder)
{
var order = context.Orders.Single(o => o.OrderId == updatedOrder.OrderId);
if (updatedOrder.RowVersion != order.RowVersion)
throw new StaleDataException(); // placeholder to handle the situation where the data has changed since our view got the order details.
var mapper = _mapperConfig.CreateMapper();
mapper.Map(updatedOrder, order);
context.SaveChanges();
}
This ensures we only change what is allowed to change, and avoids the whole crapshoot of tracked references. In our mapper configuration we literally have an entry like:
cfg.CreateMap<Order, Order>(...)
which will hold explicit rules to ignore copying across fields and related entities we don't want copied across on an Update.
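As a hedged example, such a self-map might look like the following (the ignored members are illustrative; the point is that the configuration decides what may be copied):
cfg.CreateMap<Order, Order>()
    .ForMember(d => d.Customer, opt => opt.Ignore())    // keep the tracked Customer reference
    .ForMember(d => d.OrderLines, opt => opt.Ignore())  // associations handled separately, if at all
    .ForMember(d => d.RowVersion, opt => opt.Ignore()); // concurrency token stays server-controlled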
The downside of doing this is the overhead of sending entire entities and potentially their related entities across the wire back and forth, plus to be "safe" from tampering, a lot more effort needs to go into the mapper configuration or copying across allowed values explicitly.
I had the same issue with EF Core and Blazor Server. Switching the scope in the service collection to "Transient" and using an IServiceScopeFactory for the queries/updates did the trick. You'll see below that I'm using the Blazor-style dependency injection, but constructor injection works the same way for an IServiceScopeFactory.
[Inject]
IServiceScopeFactory _serviceScopeFactory { get; set; }
private async Task UpdateItem(GridCommandEventArgs args)
{
var utilityItem = (EntityModelSample)args.Item;
using (var scope1 = _serviceScopeFactory.CreateScope())
{
var dbContext = scope1.ServiceProvider.GetService<SampleDbContext>();
dbContext.Update(utilityItem);
await dbContext.SaveChangesAsync();
}
LoadData();
}
In the startup code:
builder.Services.AddDbContext<InternalUtilitiesDbContext>(option => option.UseSqlServer(connectionString), ServiceLifetime.Transient);
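For the constructor-injection variant mentioned above, a minimal sketch (the class name is illustrative):
public class SampleDataService
{
    private readonly IServiceScopeFactory _serviceScopeFactory;

    // IServiceScopeFactory is resolved from the container just like with [Inject].
    public SampleDataService(IServiceScopeFactory serviceScopeFactory)
    {
        _serviceScopeFactory = serviceScopeFactory;
    }
}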
This code fixes the problem:
builder.Services.AddDbContext<InternalUtilitiesDbContext>(option => option.UseSqlServer(connectionString), ServiceLifetime.Transient);
The key is ServiceLifetime.Transient.
How would you upsert without a select? The upsert receives a collection of entities via a method that contains DTOs which may not exist in the database, so you can NOT use AttachRange, for example.
One way, theoretically, is to load the existing data partially with a select like dbContext.People.Where(x => /* x exists in the requested collection */).Select(x => new Person { Id = x.Id, State = x.State }).ToList(), which loads only part of the entity and not the heavy parts. But if you then update one of the returned items, it will not be updated, because the new Person projection is not being tracked, and you also cannot say dbContext.Entry<Person>(person).State = EntityState.Modified, because it will throw an error telling you that EF Core is already tracking it.
So what to do?
One way would be to detach all of them from the ChangeTracker and then do the state change, and it will do the update, but not just on one field, even if you say dbContext.Entry<Person>(person).Property(x => x.State).IsModified = true. It will overwrite every field that you haven't read from the database with its default value and make a mess in the database.
The other way would be to read the ChangeTracker entries and update them, but that also overwrites everything, because it considers everything to be changed.
So technically I don't know how EF Core can create the following SQL,
update People set state = 'Approved' where state != 'Approved'
without updating anything else, or without loading the person completely first.
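For what it's worth, here are two sketches that come close in EF Core 3.1, assuming a Person entity with Id and State as in the discussion (personId is a placeholder; none of this is from the original post):
// 1) Per-entity: attach a stub and mark only the State property as modified,
//    so the UPDATE touches a single column for that key.
var stub = new Person { Id = personId, State = "Approved" };
dbContext.Attach(stub);                                         // tracked as Unchanged
dbContext.Entry(stub).Property(p => p.State).IsModified = true; // only State is written
dbContext.SaveChanges();

// 2) Set-based: EF Core 3.1 has no bulk-update API, so a WHERE-based statement
//    has to go through raw SQL.
dbContext.Database.ExecuteSqlRaw(
    "UPDATE People SET State = {0} WHERE State <> {0}", "Approved");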
The reason for not loading your data is that you may want to update, say, 14,000 records, and those records are really heavy to load because they contain byte[] columns with images stored in them, for example.
By the way, the lack of friendly documentation on EF Core is a disaster compared to Laravel. It recently cost us a huge amount of data.
Also, examples like the code below will NOT work for us, because they update one field of an entity they know exists in the database, whereas we are trying to upsert a collection in which some of the DTOs may not exist in the database.
try
{
using (var db = new dbContext())
{
// Create new stub with correct id and attach to context.
var entity = new myEntity { PageID = pageid };
db.Pages.Attach(entity);
// Now the entity is being tracked by EF, update required properties.
entity.Title = "new title";
entity.Url = "new-url";
// EF knows only to update the properties specified above.
db.SaveChanges();
}
}
catch (DataException)
{
// process exception
}
Edit: the EF Core version used is 3.1.9.
Fantastic, I found the solution (you also need to take care with your unit tests).
Entity Framework is actually working fine; it was just a lack of experience, which I'm documenting here in case anyone else runs into the same issue.
Consider that we have a Person entity with a profile picture saved as a blob on it. If you do something like the following for, let's say, 20k people, the query is slow even when you have the correct indexes on your table.
You want to do this query to update these entities based on a request.
var entityIdsToUpdate = request.PeopleDtos.Select(p => p.Id);
var people = dbContext.People.Where(x => entityIdsToUpdate.Contains(x.Id)).ToList();
This is fine and it works perfectly; you get the People collection and can then update them based on the given data.
In these kinds of updates you normally do not need to update the images; even if you do, you need to increase the timeout property on your client, but in our case we did not need to update the images.
So the above code changes to this:
var entityIdsToUpdate = request.PeopleDtos.Select(p => p.Id);
var people = dbContext.People
.Select(p => new Person {
Id = p.Id,
Firstname = p.Firstname,
Lastname = p.Lastname,
//But no images to load
})
.Where(p => entityIdsToUpdate.Contains(p.Id)).ToList();
But with this approach, Entity Framework will lose track of your entities.
So you need to attach them like this, and I will also tell you how NOT to attach them.
This is the correct way for a collection:
dbContext.People.AttachRange(people); //These are the people you've already queried
Now DO NOT do the following. You may be tempted to, because the first approach gives you an error from Entity Framework saying the entity is already being tracked; trust it, because it already is. I will explain after the code.
//Do not do this
foreach(var entry in dbContext.ChangeTracker.Entries())
{
entry.State = EntityState.Detached;
}
//and then on updating a record you may write the following to attach it back
dbContext.Entry(Person).State = EntityState.Modified;
The above code causes Entity Framework to stop following changes on the entities, and with the last line you tell it that literally everything, edited or not, has changed, which will cause you to LOSE your unedited properties, like the image.
Note: here is what you can do by mistake that messes up even the correct approach.
Since you are not loading the whole entity, you may assume it is still fine to assign values to the unloaded properties, even if the value is no different from the one in the database. This causes Entity Framework to assume that something has changed, and if you set a ModifiedOn on your records it will be changed for no good reason.
And now about testing:
While testing, you may fetch something from the database, create a DTO from it, and pass the DTO, with the same dbContext, to your system under test. The Attach method will then throw an error saying the entity is already being tracked, because of that fetch in your test method. The best approach is to create a new dbContext for each process and dispose of it when you are done.
By the way, in testing it may also happen that you update an entity with the same dbContext and after the test want to fetch it from the database. Take note that what comes back is the instance cached by Entity Framework, and if you originally fetched it only partially, e.g. with a Select(x => ...), some fields will come back as null or default values.
In this case you should do DbContext.Entry(YOUR_ENTRY).Reload().
This is a fairly complete answer; it may not be directly related to the question, but if you don't pay attention to the things mentioned above, they can cause a disaster.
How do I tell the Find method of a DBContext that it should eagerly load the navigation property/entity?
I have the following code which deletes the association to a related Secondary entity:
Person primary = db.People.Find(Id);
if (primary == null)
return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
// This line is required to load the related entity
db.Entry(primary).Reference("Secondary").Load();
primary.Secondary = null;
db.SaveChanges();
I had to add the line db.Entry(primary).Reference("Secondary").Load(); to get it working. I understand this is because the entity framework is using lazy loading. Can I override this in the Find method though and get rid of the extra line by using an Eager version of the Find method?
Eager loading is done by the Include method:
db.People.Include(p => p.Secondary)
Which can be followed by FirstOrDefault(p => p.Id == id), for example.
Find is a DbSet method that first tries to find the requested entity in the context's cache. Only when it's not found there is the entity fetched from the database.
Because of this special behavior (of Find), Include and Find can't be mixed. It would be hard to define what to do when the root entity is found in the cache, but the entities to be included aren't (or only partly). Should they be fetched from the database? That would imply that Find + Include would always have to query the database for the included entities, because it can't rely on the local cache to be complete. That would defeat the purpose of Find. Or should find only include entities from the local cache if the root entity is in the local cache? That would make the result of the method ambiguous.
In ASP.NET MVC (or Web API) action methods, Find will hardly ever be useful, because most of the time a new context is created and entities are fetched from the database only once. In other words, there's nothing in the cache to be returned. You may want to use the method for its succinctness, but the effect, regarding database roundtrips, is the same as FirstOrDefault().
To answer your original question, here is how to use the Find method with explicit loading (from here):
using (var context = new BloggingContext())
{
var post = context.Posts.Find(2);
// Load the blog related to a given post.
context.Entry(post).Reference(p => p.Blog).Load();
// Load the blog related to a given post using a string.
context.Entry(post).Reference("Blog").Load();
var blog = context.Blogs.Find(1);
// Load the posts related to a given blog.
context.Entry(blog).Collection(p => p.Posts).Load();
// Load the posts related to a given blog
// using a string to specify the relationship.
context.Entry(blog).Collection("Posts").Load();
}
To add to Gert Arnold's post: if you use dependency injection where you load a DbContext once per Scoped Lifestyle, which essentially is (from here):
Only one instance will be created by the container per web request.
Use this lifestyle in ASP.NET Web Forms and ASP.NET MVC applications.
Then the DbContext sticks around for a while.
Or you can use
var result = db.Person.Include(c=>c.Secondary).FirstOrDefault(entity=>entity.Id == Id);
Add using System.Data.Entity; to get the lambda overload of Include; otherwise you can just use the string "Secondary", like so: .Include("Secondary").
Just use Include method:
var result = db.Person.Include(c=>c.Secondary).FirstOrDefault(entity=>entity.Id == Id);
Reference:
http://www.entityframeworktutorial.net/EntityFramework4.3/eager-loading-with-dbcontext.aspx
Just change
Person primary = db.People.Find(Id);
to a query that eager loads the reference, for example:
Person primary = db.People.Include(p => p.Secondary).FirstOrDefault(p => p.Id == Id);
I recently upgraded our solution from EF4.1 to EF6. Our previous create method added the detached object to the context, saved changes, then requeried the object based on the new id. We requery the object because we do not use lazy loading, and use includes/selects to get navigational properties - which are added by taking the IDbSet and returning an IQueryable. In short, I cannot just return the existing in-memory copy - because it would not be fully loaded up. I have tried detaching the in-memory copy. I cannot flush the dbcontext (because I am working with other objects in the same context).
Anyway, my big issue is that in the EF6 branch, when I requery that object it returns null - not finding the one I just created. The record is persisted to the database - I can see it. If I create a new dbContext and query for that record, I can find it.
dbContext.Set<TEntity>().Add(model);
dbContext.SaveChanges();
var newCopy = dbContext.Set<TEntity>().SingleOrDefault(p => p.Id == model.Id);
In this instance, newCopy is always null. But if I get a new dbContext - I can get the object fine.
Anyway, I need to solve this issue. Any ideas?
I actually discovered what the issue was. In EF4 it really doesn't matter whether the mappings are HasRequired or HasOptional. However, if you use HasRequired on a nullable integer field (legacy db), EF6 performs an inner join and will not select the record, resulting in a null return. Since I am not using lazy loading but eager loading, if I have an Include on the field mapped as described above, it returns null.
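A sketch of the kind of mapping change involved, with illustrative entity and property names (EF6 fluent API):
// A nullable FK mapped with HasRequired makes EF6 generate an INNER JOIN for the
// include, dropping rows where the FK is null; HasOptional produces a LEFT JOIN.
modelBuilder.Entity<Invoice>()
    .HasOptional(i => i.Customer)      // was: HasRequired(i => i.Customer)
    .WithMany()
    .HasForeignKey(i => i.CustomerId); // nullable int? in the legacy schema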
Thanks everyone for their help - it was appreciated!
Does the following work:
dbContext.Set<TEntity>().Add(model);
dbContext.SaveChanges();
var newCopy = dbContext.Set<TEntity>().Find(model.Id);
I won't claim to know the ins and outs of it, but I am using that in the same situation and it works.
You could also check that the model's navigation properties are not virtual, as EF6 treats virtual navigation properties as lazy loading (you may have already done this).
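For the last point, a sketch of what a non-virtual navigation property looks like (illustrative names):
public class OrderRecord
{
    public int Id { get; set; }

    // Non-virtual: EF6 cannot create a lazy-loading proxy for this navigation,
    // so it is only populated via eager (Include) or explicit loading.
    public CustomerRecord Customer { get; set; }
}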