Update DbSet without SaveChanges - C#

I am trying to achieve the following:
Add a new entity to the DbContext's data set.
Do not execute SaveChanges yet, because if any error occurs after the insert, the new entry should not be written to the database.
However, I later need to check whether the newly added entry is in the Customers collection.
This is what I have now:
public virtual DbSet<Customer> Customers { get; set; }
_context.Customers.Add(customer);
// NO _context.SaveChanges();
// ...
var existingUser = _context.Customers
    .FirstOrDefault(x => x.CodeCustomer == customer.CodeCustomer);
// existingUser is null, but it should be a valid, populated object
If I execute SaveChanges, existingUser is no longer null, but the data is saved to the database.
_context.Customers.Add(customer);
_context.SaveChanges();
// ...
var existingUser = _context.Customers
    .FirstOrDefault(x => x.CodeCustomer == customer.CodeCustomer);
// existingUser now contains the object
Is there any possible solution to 'really' add the customer to the Customers collection without SaveChanges()?
If so, please let me know.

You can check the Local property. From the docs:
This property returns an ObservableCollection that contains all Unchanged, Modified, and Added objects that are currently tracked by the context for the given DbSet.
For example:
var existingUser = _context.Customers
    .Local
    .FirstOrDefault(x => x.CodeCustomer == customer.CodeCustomer);
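If the entry could also already exist in the database from an earlier save, a small sketch (using the same Customers set) is to check the pending Local entries first and only fall back to a database query when nothing is found:
var existingUser = _context.Customers.Local
    .FirstOrDefault(x => x.CodeCustomer == customer.CodeCustomer)
    ?? _context.Customers
        .FirstOrDefault(x => x.CodeCustomer == customer.CodeCustomer);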

It's impossible to add to the collection without SaveChanges(). _context.Customers.Add(customer) means that you mark this object to be added to the database. Under the hood the EF context works like a transaction: you can perform plenty of operations, but only SaveChanges() saves them to the database.
_context.Customers.FirstOrDefault requests data from the database.
I hope this helps.
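If the row really must be queryable from the database mid-operation, one hedged alternative is an explicit transaction: SaveChanges makes the row visible to queries on that connection, and the transaction can still be rolled back if a later step fails. A sketch:
using (var transaction = _context.Database.BeginTransaction())
{
    _context.Customers.Add(customer);
    _context.SaveChanges(); // the row is now visible to queries within this transaction
    var existingUser = _context.Customers
        .FirstOrDefault(x => x.CodeCustomer == customer.CodeCustomer);
    // ... more work; if anything fails before Commit, disposing the
    // transaction rolls the insert back.
    transaction.Commit();
}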

Related

EF Core not tracking the correct property in a unit of work scenario

This question is an abstraction of the problem I am facing; consider the below.
I have entities with the following properties:
public class Entity
{
    public int Id { get; set; }
    public int Status { get; set; }
}
Status can be 0 (New) or 1 (Deleted).
I have a service method, called through the service layer, which changes the entity's Status from 0 to 1. The repository involved does not call context.SaveChanges(); instead this is handled farther up the chain in middleware.
This looks like:
var entity = await Context.Entities.FirstOrDefaultAsync(x => x.Id == id);
entity.SetDeleted();
Where the SetDeleted method changes the property from 0 to 1.
Then, in another service method (in the same unit of work) this is read to check if there is an entity which isn't deleted. This is where the problem occurs.
The code looks like:
var entities = context.Entities.Any(x => x.Status != 1);
You would expect the entity not to be matched, but unfortunately it is. Strangely enough, when you execute ToList() and then run the same filter over the result, the entity is not included. This means that if you have the below:
var entities = await Context.Entities.Where(x => x.Status != 1).ToListAsync();
var secondFilter = entities.Where(x => x.Status != 1).ToList();
entities will contain the entity, while secondFilter will not, despite both applying the same rule.
Has anyone had this problem, does anyone know a solution?
This is by-design behavior of EF.
When you have a tracked entity, and query the same entity from the database, the tracked entity is never overwritten.
After
var entity = await Context.Entities.FirstOrDefaultAsync(x => x.Id == id);
entity.SetDeleted();
You have one tracked entity, with a modified Status.
This
var entities = await Context.Entities.Where(x => x.Status != 1).ToListAsync();
runs a SQL query and finds the same entity, because its Status is still 0 in the database. But when EF tries to add the entity to the change tracker, it discovers that a modified version is already being tracked, so it discards the data read from the database and puts the existing, modified entity into the entities collection. Then this
var secondFilter = entities.Where(x => x.Status != 1).ToList();
returns nothing because entity has Status=1.
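If the check is meant to honour the in-memory change, one hedged workaround consistent with this behaviour is to materialize the query and evaluate the predicate in memory, where the tracked values win (only reasonable for small result sets, since the rows are loaded first):
// Client-side evaluation: the tracked entity's modified Status (1) is used,
// so the soft-deleted entity no longer satisfies the filter.
var anyNotDeleted = (await Context.Entities.ToListAsync())
    .Any(x => x.Status != 1);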

Updating nested list without AsNoTracking

Simplified model:
Profile {Guid Id, string Name, List<Property> Properties}
Property {Guid Id, string Name, List<Type> Types}
Type {Guid Id, string Key, string Value}
DbContext:
{
public DbSet<Profile> Profiles { get; set; }
}
I didn't include Properties and Types in the DbContext so I used the ModelBuilder:
modelBuilder.Entity<Property>().HasMany<Type>();
In the Update service:
public async Task<Response> Update([FromBody] Profile profile)
{
    var entity = await _context.Profiles
        .Include(x => x.Properties)
        .ThenInclude(x => x.Types)
        .FirstOrDefaultAsync(x => x.Id == profile.Id);
    foreach (var prop in profile.Properties)
    {
        var existingProp = entity.Properties.SingleOrDefault(a => a.Id == prop.Id);
        // Update
        if (existingProp != null)
        {
            var entry = _context.Entry(existingProp);
            entry.State = EntityState.Modified;
            existingProp.ChargeFrom(prop); // maps the new values to the db entity
            _context.SaveChanges();
        }
    }
}
But the above code throws this exception at SaveChanges:
The instance of entity type 'Type' cannot be tracked because another
instance with the same key value for {'Id'} is already being tracked.
When attaching existing entities, ensure that only one entity instance
with a given key value is attached. Consider using
'DbContextOptionsBuilder.EnableSensitiveDataLogging' to see the
conflicting key values.
I marked the Types entity AsNoTracking:
.ThenInclude(x => x.Types).AsNoTracking()
and the problem is solved, but I don't know why this exception is thrown. Another thread mentions that the DbContext might be used by another process and might be registered as a singleton, but in my case it is registered as scoped.
I marked the Types entity AsNoTracking:
.ThenInclude(x => x.Types).AsNoTracking()
and the problem is solved, but I don't know why this exception is thrown
The reason for the error will be because this line:
existingProp.ChargeFrom(prop); // maps the new values to the db entity
... will attempt to copy the untracked Types from prop into existingProp. Using AsNoTracking removes the exception, but it will most likely result in duplicated data on SaveChanges where Type is set up with an identity key, or in duplicate-row exceptions otherwise. If you received no exception, I would check the Types collection to see whether duplicate rows are appearing there.
When copying data across from an untracked entity to a tracked entity, you want to ensure that only values, not references, are copied across. If you copy an untracked reference across, EF will treat it as a new entity by default. Even if you force its state over to Modified, the DbContext could already be tracking an entity with that ID.
If Property.Types is a collection of references, such as an association to a lookup, and these associations can be added and removed, then to apply changes you need to load the associated types from the database and use them to remove associations that are no longer valid and add ones that aren't currently associated.
For example: given a Property (PropertyA) with Types (Type1) and (Type2), if we edit it to have (Type1) and (Type3), we need to fetch Type1 and Type3 from the DbContext (tracked), then compare against the tracked PropertyA to determine that Type2 should be removed and Type3 added:
var entity = await _context.Profiles
    .Include(x => x.Properties)
    .ThenInclude(x => x.Types)
    .SingleAsync(x => x.Id == profile.Id);
// Get the IDs for all Types we want to associate... In the above example this would
// ask for Type1 and Type3 if there is only the one property. We get a Distinct list because
// multiple properties might reference the same TypeId(s).
var existingTypeIds = profile.Properties
    .SelectMany(x => x.Types.Select(t => t.Id))
    .Distinct()
    .ToList();
// Load references to all Types that will be needed. Where new types are associated, these will be referenced.
// The context only exposes DbSet<Profile>, so the Type set is resolved via Set<Type>().
var existingTypes = _context.Set<Type>()
    .Where(x => existingTypeIds.Contains(x.Id))
    .ToList();
foreach (var prop in profile.Properties)
{
    var existingProp = entity.Properties.SingleOrDefault(x => x.Id == prop.Id);
    if (existingProp == null)
        continue;
    var updatedTypeIds = prop.Types.Select(x => x.Id).ToList();
    var currentTypeIds = existingProp.Types.Select(x => x.Id).ToList();
    var addedTypeIds = updatedTypeIds.Except(currentTypeIds).ToList();
    var removedTypeIds = currentTypeIds.Except(updatedTypeIds).ToList();
    var addedTypes = existingTypes
        .Where(x => addedTypeIds.Contains(x.Id))
        .ToList();
    var removedTypes = existingProp.Types
        .Where(x => removedTypeIds.Contains(x.Id))
        .ToList();
    foreach (var removedType in removedTypes)
        existingProp.Types.Remove(removedType);
    foreach (var addedType in addedTypes)
        existingProp.Types.Add(addedType);
}
If instead the type is a child row with properties that can themselves be updated, then those values should be copied across from the updated data onto the existing data state. This adds a considerable amount of work, though tools like AutoMapper can be configured to help. You still need to manage cases where Types can be added, removed, or have their contents changed. That applies to Properties as well, since your example only handles the case where a property is updated, not where one is added or removed.
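For the value-copy case, a small sketch (based on the Type shape given in the question, with Key and Value columns, and living inside the same loop over profile.Properties) would update matching child rows in place rather than attaching the untracked instances:
foreach (var updatedType in prop.Types)
{
    var existingType = existingProp.Types.SingleOrDefault(t => t.Id == updatedType.Id);
    if (existingType == null)
        continue; // additions/removals are handled separately, as above
    // Copy values only; never attach the untracked updatedType instance.
    existingType.Key = updatedType.Key;
    existingType.Value = updatedType.Value;
}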
Ultimately it can be beneficial to structure update scenarios to be as atomic as possible, so that rather than one update making changes to an entire object graph of entities, properties, and types, you have one update for just the entity values, one for the property values, and one for a single type update. The same applies to adding a property, adding a type, removing a property, and removing a type. While it may look like more code to break operations up like this, it keeps each of them simple and straightforward rather than one big, complex method trying to compare the before and after to figure out what to add, remove, and update. Bugs hide in complex code, not simple methods. :)
When editing an object graph you should also avoid calling SaveChanges more than once. Instead of calling it inside the loop over properties, call it once when the loop has completed. The reason is that something like an exception on one of the properties would otherwise result in an incomplete/invalid data state being persisted. If you have 4 properties in the object being saved and the 3rd one fails with an exception for any reason, the first two will be updated while the last two will not persist. Generally an update operation should follow an "all or nothing" approach to persistence.
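In the question's Update method that would mean, roughly, moving the save out of the loop (a sketch, reusing the question's ChargeFrom mapping):
foreach (var prop in profile.Properties)
{
    var existingProp = entity.Properties.SingleOrDefault(a => a.Id == prop.Id);
    if (existingProp == null)
        continue;
    existingProp.ChargeFrom(prop); // copy values only, as described above
}
// One save for the whole graph: either everything persists or nothing does.
await _context.SaveChangesAsync();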
Hopefully that helps explain the behavior you are seeing and gives you something to consider moving forward.

Is there a way to trigger a stored procedure at the end of all the inserts/deletes on an entity

My application uses Entity Framework, and there is one table called t_details which holds multiple records. When the application's data gets saved, the records for this entity are deleted and re-added.
I also have a SQL database trigger on this table that executes whenever an insert/delete happens. Since Entity Framework deletes and inserts each entry one by one, the trigger fires for every single delete or insert and performance becomes slow.
Is there a way I can fire the trigger just once after all the deletes and once at the end of all the inserts?
The trigger is currently created as below:
CREATE TRIGGER test.trigger_Data
ON test.t_details
AFTER INSERT, DELETE, UPDATE
AS
BEGIN
    EXEC [test].[spu_call_details] -- this in turn will call some views/tables etc.
END
I don't think you can do what you want without modifying the application. In SQL Server, triggers fire per statement, and EF issues a separate statement for each row it deletes or inserts.
Here are some options:
Change the application to do all the deletes in a single statement; the trigger will then be executed once for that statement (see the sketch after these options).
Change the application to indicate the end of a batch -- perhaps with an update on some table recording the last batch time -- and use that in your trigger.
You could play games with an event queue: instead of running the code immediately, schedule it to run in, say, one minute, and don't run it if another delete has occurred in that time frame.
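For the first option, a rough sketch (assuming EF6 and a hypothetical parent_id column that groups the rows being re-saved) issues the delete as one statement, so the AFTER DELETE trigger fires once:
// One DELETE statement for the whole group, so the trigger runs once rather than per row.
// parent_id / parentId are placeholders for whatever key identifies the records being re-saved.
context.Database.ExecuteSqlCommand(
    "DELETE FROM test.t_details WHERE parent_id = @p0", parentId);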
If you are only concerned with reacting when records are added/updated/deleted by your application, then you can leverage bounded contexts and act on the applicable DbContext's SaveChanges:
Start by creating a separate DbContext that will service the stored procedure call, or interact with the desired entities, to record the changes. Then take a reference to this DbContext in the application DbContext where you want to track the entity updates.
So as an example, a LoggingDbContext and an AppDbContext:
public class LoggingDbContext : DbContext
{
    // ...
}
public class AppDbContext : DbContext
{
    private readonly LoggingDbContext _loggingDbContext = null;
    public AppDbContext(string connectionString, LoggingDbContext loggingDbContext)
        : base(connectionString)
    {
        _loggingDbContext = loggingDbContext ?? throw new ArgumentNullException("loggingDbContext");
    }
    public override int SaveChanges()
    {
        // Do your logging call...
        var log = new Log { Message = "SaveChanges Called!" };
        _loggingDbContext.Logs.Add(log);
        _loggingDbContext.SaveChanges();
        return base.SaveChanges();
    }
}
That SaveChanges override can call your stored procedure or whatever else is necessary. If you want to differentiate between when entities are added, removed, etc., you can use the ChangeTracker to inspect the changes for particular change types or specific entities:
var updatedCustomers = ChangeTracker.Entries()
    .Where(x => x.State == EntityState.Modified)
    .Select(x => x.Entity)
    .OfType<Customer>();
var insertedCustomers = ChangeTracker.Entries()
    .Where(x => x.State == EntityState.Added)
    .Select(x => x.Entity)
    .OfType<Customer>();
var deletedCustomers = ChangeTracker.Entries()
    .Where(x => x.State == EntityState.Deleted)
    .Select(x => x.Entity)
    .OfType<Customer>();
From there you could inspect the entities with .ToList() or .Select() for particular details, or just do a .Any() or .Count() to assess whether a particular entity type was added/updated/removed etc.
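Relating that back to the original question, here is a hedged sketch (assuming EF6 and a hypothetical Detail entity mapped to test.t_details) of a SaveChanges override that runs the procedure once per batch, after the rows have been persisted, instead of relying on the per-row trigger:
public override int SaveChanges()
{
    // Did this unit of work add or delete any detail rows?
    var detailsChanged = ChangeTracker.Entries<Detail>()
        .Any(x => x.State == EntityState.Added || x.State == EntityState.Deleted);
    var result = base.SaveChanges();
    // Run the expensive procedure once per batch rather than once per row.
    if (detailsChanged)
        Database.ExecuteSqlCommand("EXEC [test].[spu_call_details]");
    return result;
}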
I use this pattern in my DbContexts for models that need LastModifiedBy user/date tracking. I have a CurrentUserLocator for resolving the currently logged-in user for the session, and a base class for "editable" entities which contains the CreatedBy/CreatedAt/LastModifiedBy/LastModifiedAt fields. This way, as my DbContext saves details, the context's SaveChanges looks like this:
// These DbContexts do not allow for hard deletes.
if (ChangeTracker.Entries().Any(x => x.State == EntityState.Deleted))
    throw new ApplicationException("Hard deletes are not supported.");
var currentUser = Users.Single(x => x.UserId == CurrentUserLocator.CurrentUserId);
var updatedEntities = ChangeTracker.Entries()
    .Where(x => x.State == EntityState.Modified)
    .Select(x => x.Entity)
    .Cast<EditableEntityBase>();
foreach (var entity in updatedEntities)
{
    entity.LastModifiedBy = currentUser;
    entity.LastModifiedAt = DateTime.Now;
}
var insertedEntities = ChangeTracker.Entries()
    .Where(x => x.State == EntityState.Added)
    .Select(x => x.Entity)
    .Cast<EditableEntityBase>();
foreach (var entity in insertedEntities)
{
    entity.CreatedBy = entity.LastModifiedBy = currentUser;
    entity.CreatedAt = entity.LastModifiedAt = DateTime.Now;
}
return base.SaveChanges();
The context also tracks non-editable entities (lookups etc.) so this also helps guard against situations where somehow a non-editable entity gets inserted or updated. Since my system uses soft-deletes in this example it also guards against hard-deleting entities. This approach can be used to leverage a separate DbContext to call a Stored Procedure or otherwise modify other entities etc. without "tripping" on those change tracking hooks in SaveChanges.

AutoMapper.Collections.EntityFramework

Asking the same question differently!
It seems clear I need to elaborate on this question, because I have had no viable responses.
Based on this AutoMapper registration code:
Mapper.Initialize(cfg =>
{
    cfg.AddCollectionMappers();
    cfg.SetGeneratePropertyMaps<GenerateEntityFrameworkPrimaryKeyPropertyMaps<DbContext>>();
});
AutoMapper adds support for "updating" DbSet collections using this line:
Mapper.Map<List<DTO>, List<Entity>>(dtoCollection, entityCollection);
Saving changes through an open context should result in updating the database:
using (var context = factory.CreateContext())
{
    Mapper.Map<List<DTO>, List<Entity>>(dtoCollection, await context.dbSet.ToListAsync());
    await context.SaveChangesAsync();
}
This does nothing!
So, back to my original question: calling the mapper with the DTOs and the current state of the entity collection returns an updated entity collection, based on the comparison mapping created here:
cfg.SetGeneratePropertyMaps<GenerateEntityFrameworkPrimaryKeyPropertyMaps<DbContext>>();
which produces the entity collection here:
var entities = Mapper.Map<List<DTO>, List<Entity>>(dtoCollection, await context.dbSet.ToListAsync());
Am I supposed to iterate the new collection and update EF manually using it? It's not clear what I am supposed to do at this point. Is this what I am supposed to do with the resulting collection?
// map dto's to entities
var entities = Mapper.Map(collection, await dbSet.ToListAsync());
// add new records
var toAdd = entities.Where(e => e.Id == 0);
dbSet.AddRange(toAdd);
// delete old records
var toDelete = entities.Where(entity => collection.All(e => e.Id > 0 && e.Id != entity.Id));
dbSet.RemoveRange(toDelete);
// update existing records
var toUpdate = entities.Where(entity => collection.All(i => i.Id > 0 && i.Id == entity.Id)).ToList();
foreach (var entity in toUpdate)
{
context.Entry(entity).State = EntityState.Modified;
}
await context.SaveChangesAsync();
This is my original question. If so, it seems redundant, so I feel like I am missing something.
I would appreciate some useful feedback. Please help!
Thanks
EF DbSets are not collections. Basically they represent a database table and provide query and DML operations for it.
Looks like you want to synchronize the whole table with the DTO list. You can do that by loading the whole table locally using the Load or LoadAsync methods and then mapping the DTO collection to the entity set's DbSet.Local property. The difference from your attempts is that the Local property is not a simple list, but an observable collection directly bound to the context's local store and change tracker, so any modification (Add, Remove) will be applied to the database.
Something like this:
await dbSet.LoadAsync();
Mapper.Map(dtoCollection, dbSet.Local);
await context.SaveChangesAsync();
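Putting that together with the factory usage from the question, a sketch (reusing the question's dbSet property and dtoCollection) could look like this; AddCollectionMappers matches DTOs to tracked entities by primary key, so additions, updates, and removals in dtoCollection are reflected in Local and then persisted:
using (var context = factory.CreateContext())
{
    await context.dbSet.LoadAsync();                 // populate Local with the current table contents
    Mapper.Map(dtoCollection, context.dbSet.Local);  // apply adds/updates/removes to Local
    await context.SaveChangesAsync();                // persist the differences
}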

Can I only retrieve an element of an ObjectSet if I call SaveChanges?

I have the following code:
context.MyEntities.AddObject(new MyEntity() { Id = 1 });
var entity = context.MyEntities.Where(e => e.Id == 1).FirstOrDefault();
I haven't called SaveChanges yet, and I'm trying to retrieve the object that I added, but entity is null. Why?
ObjectContext entity sets will only return entities that have permanent EntityKeys; newly inserted entities only have a temporary entity key until SaveChanges() is called. For details, please see the Entity Keys and Added Objects section in Working with Entity Keys (Entity Framework).
If you want to get the entities before they are saved to the database, please consider using the context.ObjectStateManager.GetObjectStateEntries(…) method. With this method we can get all the Added/Modified/Deleted entity state entries and their entities before SaveChanges() is called.
Solved!
With Randolf's answer, I reached the following solution:
context.MyEntities.AddObject(new MyEntity() { Id = 1 });
var entity = this.context
    .ObjectStateManager
    .GetObjectStateEntries(System.Data.EntityState.Added)
    .Select(e => e.Entity)
    .OfType<MyEntity>()
    .Where(e => e.Id == 1)
    .FirstOrDefault();
