I've been using Entity Framework for a couple of years, and I have a little problem now.
I add an entity to my table with
Entities.dbContext.MyTable.Add(obj1);
and so far, so good.
Then I'd like to make a query on MyTable, like
Entities.dbContext.MyTable.Where(.....)
The code above will query MyTable in the db.
Is there a way to also query the just-added value (obj1), before SaveChanges? How?
UPDATE
Why do I need this? Because, for each new element I add, I need to edit some values in the previous and the next record (there is a datetime field in this table).
UPDATE2
Let's say I have to add a lot of objects, but I call SaveChanges only after the last item is added. Every time I add a new item, I read its datetime field and search the database for the previous and the next record, then edit a field on each of them. Here is the problem: if I insert another item whose next record is, for example, "Obj1", I have to find and edit it, but I can't find it because I haven't saved my changes yet. Is it clearer now?
You should be able to get your added entities out of the dbContext via the change tracker like this:
var addedEntities = dbContext.ChangeTracker.Entries()
.Where(x => x.State == EntityState.Added && x.Entity is MyTable)
.Select(x => x.Entity as MyTable)
.Where(t => --criteria--);
Or, using type testing with pattern matching in C# 7.0:
var addedEntities = dbContext.ChangeTracker.Entries()
.Where(x => x.State == EntityState.Added && x.Entity is MyTable t && --test t for criteria--)
.Select(x => x.Entity as MyTable);
Because you are only querying added entities, you can combine this with a database query to get all of the relevant objects. Note that List<T>.AddRange returns void, so build the combined list in two steps:
var allEntities = dbContext.MyTable.Where(t => --criteria--).ToList();
allEntities.AddRange(addedEntities);
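If it helps, here's an alternative sketch using DbSet&lt;T&gt;.Local, which exposes all tracked entities (including ones in the Added state) as an in-memory collection. The field name SomeField and the variable value are placeholders, not from the original code:

```csharp
// Query the database first...
var results = dbContext.MyTable
    .Where(t => t.SomeField == value)   // translated to SQL, misses unsaved adds
    .ToList();

// ...then append matching tracked entities not already in the list.
// Local is evaluated in memory, so it sees entities added before SaveChanges.
results.AddRange(dbContext.MyTable.Local
    .Where(t => t.SomeField == value && !results.Contains(t)));
```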
I think this is a good situation for Transactions. I am going to assume you are using EF 6 since you did not provide a version. =)
UPDATE2 changes
public void BulkInsertObj(List<MyTable> objList)
{
    using (var context = new dbContext())
    {
        using (var dbContextTransaction = context.Database.BeginTransaction())
        {
            try
            {
                foreach (var obj1 in objList)
                {
                    context.MyTable.Add(obj1);
                    // obj1 should be on the context now
                    var previousEntity = context.MyTable.FirstOrDefault(.....); // However you determine this
                    previousEntity.field = something;
                    var nextEntity = context.MyTable.FirstOrDefault(.....); // However you determine this
                    nextEntity.field = somethingElse;
                }
                context.SaveChanges();
                dbContextTransaction.Commit();
            }
            catch (Exception)
            {
                dbContextTransaction.Rollback();
            }
        }
    }
}
MSDN EF6 Transactions
My application uses Entity Framework, and there is one table called t_details which holds multiple records. When the application saves, this entity's records get deleted and re-added.
I also have a SQL db trigger on this table that executes when an insert/delete happens on it. Since Entity Framework deletes each entry one by one and inserts them one by one, the trigger fires for every single delete or insert and performance becomes slow.
Is there a way I can fire the trigger just once after all the deletes, and once at the end of all the inserts?
The trigger is currently created like this:
Create TRIGGER test.trigger_Data
ON test.t_details
AFTER INSERT,DELETE,UPDATE
AS
BEGIN
EXEC [test].[spu_call_details] -- this in turn will call some views/tables etc.
END
I don't think you can do what you want without modifying the application. In SQL Server, triggers are statement-level: they fire once per statement, and EF issues one statement per row, so you can't collapse them to once per batch from inside the trigger.
Here are some options:
Change the application to do all the deletes in a single statement. Then the trigger will be executed once for the transaction.
Change the application to indicate the end of a batch -- perhaps with an update on some table indicating the last time. Then use that for your trigger.
You could play games with an event queue. Instead of running the code, you could schedule it to run in, say, one minute. Then don't run the code if another delete has occurred in that time frame.
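For the first option, a rough EF6 sketch; the context name, the exact column name detail_id, and the detailIds variable are assumptions based on the question:

```csharp
// Delete all rows in one set-based statement instead of per-entity Removes,
// so the statement-level AFTER trigger fires exactly once.
// Building the IN list by string is acceptable here only because the ids are ints.
using (var context = new MyDbContext())
{
    var ids = string.Join(",", detailIds);  // detailIds: IEnumerable<int>
    context.Database.ExecuteSqlCommand(
        "DELETE FROM test.t_details WHERE detail_id IN (" + ids + ")");
}
```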
If you are only concerned with tracking this when records are added/updated/deleted by your application then you could leverage bounded contexts to action this on the applicable DbContext's SaveChanges:
Start by creating a separate DbContext that will service this Stored Procedure call or interact with the desired entities to track records for the changes. Create a reference to this DbContext as part of your application DbContext where you want to track the entity updates.
So as an example, a LoggingDbContext and an AppDbContext:
public class LoggingDbContext : DbContext
{
// ...
}
public class AppDbContext : DbContext
{
private readonly LoggingDbContext _loggingDbContext = null;
public AppDbContext(string connectionString, LoggingDbContext loggingDbContext)
: base(connectionString)
{
_loggingDbContext = loggingDbContext ?? throw new ArgumentNullException("loggingDbContext");
}
public override int SaveChanges()
{
// Do your logging call...
var log = new Log { Message = "SaveChanges Called!" };
_loggingDbContext.Logs.Add(log);
_loggingDbContext.SaveChanges();
return base.SaveChanges();
}
}
That SaveChanges override can call your stored procedure or whatever is necessary. If you want to differentiate between when entities are added, removed, etc., you can use the ChangeTracker to inspect the changes for particular change types or specific entities:
var updatedCustomers = ChangeTracker.Entries()
.Where(x => x.State == EntityState.Modified)
.Select(x => x.Entity)
.OfType<Customer>();
var insertedCustomers = ChangeTracker.Entries()
.Where(x => x.State == EntityState.Added)
.Select(x => x.Entity)
.OfType<Customer>();
var deletedCustomers = ChangeTracker.Entries()
.Where(x => x.State == EntityState.Deleted)
.Select(x => x.Entity)
.OfType<Customer>();
From there you could inspect the entities with .ToList() or .Select() for particular details, or just do a .Any() or .Count() to assess whether a particular entity type was added/updated/removed etc.
I use this pattern in my DbContexts for models that track things like LastModifiedBy user/date tracking. I have a CurrentUserLocator for resolving the currently logged-in user for the session, then a base class for "editable" entities which contains the CreatedBy/CreatedAt/LastModifiedBy/LastModifiedAt fields. This way, as my DbContext saves, the context's SaveChanges looks like this:
// These DbContexts do not allow for hard Deletes.
if (ChangeTracker.Entries().Any(x => x.State == EntityState.Deleted))
throw new ApplicationException("Hard deletes are not supported.");
var currentUser = Users.Single(x => x.UserId == CurrentUserLocator.CurrentUserId);
var updatedEntities = ChangeTracker.Entries()
.Where(x => x.State == EntityState.Modified)
.Select(x => x.Entity)
.Cast<EditableEntityBase>();
foreach (var entity in updatedEntities)
{
entity.LastModifiedBy = currentUser;
entity.LastModifiedAt = DateTime.Now;
}
var insertedEntities = ChangeTracker.Entries()
.Where(x => x.State == EntityState.Added)
.Select(x => x.Entity)
.Cast<EditableEntityBase>();
foreach (var entity in insertedEntities)
{
entity.CreatedBy = entity.LastModifiedBy = currentUser;
entity.CreatedAt = entity.LastModifiedAt = DateTime.Now;
}
return base.SaveChanges();
The context also tracks non-editable entities (lookups etc.) so this also helps guard against situations where somehow a non-editable entity gets inserted or updated. Since my system uses soft-deletes in this example it also guards against hard-deleting entities. This approach can be used to leverage a separate DbContext to call a Stored Procedure or otherwise modify other entities etc. without "tripping" on those change tracking hooks in SaveChanges.
Asking the same question differently!
It seems clear I need to elaborate on this question, because I have had no viable responses.
Based on this AutoMapper registration code:
Mapper.Initialize(cfg =>
{
cfg.AddCollectionMappers();
cfg.SetGeneratePropertyMaps<GenerateEntityFrameworkPrimaryKeyPropertyMaps<DbContext>>();
});
AutoMapper adds support for "updating" DbSet collections using this line:
Mapper.Map<List<DTO>, List<Entity>>(dtoCollection, entityCollection);
Saving changes through an open context should result in updating the database:
using (var context = factory.CreateContext())
{
    Mapper.Map<List<DTO>, List<Entity>>(dtoCollection, await context.dbSet.ToListAsync());
    await context.SaveChangesAsync();
}
This does nothing!
So, back to my original question. Calling the mapper with the DTOs and the current state of the entity collection returns an updated entity collection, based on the comparison mapping created here:
cfg.SetGeneratePropertyMaps<GenerateEntityFrameworkPrimaryKeyPropertyMaps<DbContext>>();
which produces the entity collection here:
var entities = Mapper.Map<List<DTO>, List<Entity>>(dtoCollection, await context.dbSet.ToListAsync());
Am I supposed to iterate the new collection and update EF manually using it? It's not clear what I am supposed to do at this point. Is this what I am supposed to do with the resulting collection?
// map dto's to entities
var entities = Mapper.Map(collection, await dbSet.ToListAsync());
// add new records
var toAdd = entities.Where(e => e.Id == 0);
dbSet.AddRange(toAdd);
// delete old records
var toDelete = entities.Where(entity => collection.All(e => e.Id > 0 && e.Id != entity.Id));
dbSet.RemoveRange(toDelete);
// update existing records
var toUpdate = entities.Where(entity => collection.All(i => i.Id > 0 && i.Id == entity.Id)).ToList();
foreach (var entity in toUpdate)
{
context.Entry(entity).State = EntityState.Modified;
}
await context.SaveChangesAsync();
This is my original question. If so, it seems redundant, so I feel like I am missing something.
I'd appreciate some useful feedback. Please help!
Thanks
EF DbSets are not collections. Basically, they represent a database table and provide query and DML operations for it.
Looks like you want to synchronize the whole table with the DTO list. You can do that by loading the whole table locally using the Load or LoadAsync methods, and then Map the DTO collection to the entity DbSet.Local property. The difference with your attempts is that the Local property is not a simple list, but observable collection directly bound to the context local store and change tracker, so any modification (Add, Remove) will be applied to the database.
Something like this:
await dbSet.LoadAsync();
Mapper.Map(dtoCollection, dbSet.Local);
await context.SaveChangesAsync();
I'm changing an entity by hand, and after that I'm trying to verify whether there is any entity in my DbContext that matches my changes. The answer I expected was "true"; however, it's "false".
As my code is very complex and with many rules I've created a simple example to try to explain the problem:
var propertyValues = new Dictionary<string, object>()
{
{"MyProperty1", "My value"},
{"MyProperty2", 10}
};
var entityId = 13;
var entityType = typeof(MyEntity);
// Applies the changes
this.ApplyChanges(db, entityType, entityId, propertyValues);
// This is "false"
var hasEntityWithValue = db.MyEntity.Any(p => p.Id == entityId && p.MyProperty1 != null);
// The "myEntity" is null
var myEntity = db.MyEntity.FirstOrDefault(p => p.Id == entityId && p.MyProperty1 != null);
// Gets the entity only by Id
myEntity = db.MyEntity.FirstOrDefault(p => p.Id == entityId);
// And when I compare the "MyProperty1" it's "true". Why?????
hasEntityWithValue = myEntity.MyProperty1 != null;
The "ApplyChanges" method:
private void ApplyChanges(DbContext db, Type entityType, int entityId,
Dictionary<string, object> propertyValues)
{
var entity = db.Set(entityType).Find(entityId);
foreach (var propertyValue in propertyValues)
{
var propertyInfo = entityType.GetProperty(propertyValue.Key);
// Sets the value
propertyInfo.SetValue(entity, propertyValue.Value);
}
db.ChangeTracker.DetectChanges();
}
I believe this is happening because when I query the entities I'm querying them in the database instead of the EntityFramework "cache".
But is there a way to force EntityFramework identify the changes when I query the entities in the DbContext by using the IQueryable extension methods (such as "Any" and "FirstOrDefault" methods)?
You're right. When you use 'Any', 'FirstOrDefault' or any other LINQ extension method that looks for data, a SQL query is used. Because of this, any changes to the objects are not seen (for filtering purposes) unless you call 'SaveChanges'.
There is a way to look at materialized objects, but you'll have to do it manually. You'll have to make a LINQ-to-Objects query ONLY on the materialized objects to see if what you want is there. And then, if it's not, make a regular LINQ-to-Entities query searching for it in the database. DO NOT mix these queries, or you may unleash hell.
To search the materialized objects:
context.ChangeTracker.Entries<MY_ENTITY>(); // strongly-typed, just an objects set
or
context.ChangeTracker.Entries(); // everything
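Putting the two steps together, a minimal sketch (entity and property names taken from the question):

```csharp
// 1) LINQ-to-Objects over the materialized (tracked) entities only.
var match = db.ChangeTracker.Entries<MyEntity>()
    .Select(e => e.Entity)
    .FirstOrDefault(p => p.Id == entityId && p.MyProperty1 != null);

// 2) Only if nothing tracked matched, fall back to a regular
//    LINQ-to-Entities query against the database.
if (match == null)
    match = db.MyEntity.FirstOrDefault(p => p.Id == entityId && p.MyProperty1 != null);
```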
Let's look at the first two statements:
var entityId = 13;
...
// This is "false"
var hasEntityWithValue = db.MyEntity.Any(p => p.Id == entityId && p.MyProperty1 != null);
// The "myEntity" is null
var myEntity = db.MyEntity.FirstOrDefault(p => p.Id == entityId && p.MyProperty1 != null);
Both of these send the same query to the database:
SELECT * FROM MyEntities WHERE ID = 13 AND MyProperty1 IS NOT NULL
This returns no records from the database because the database does not yet have the new data - there are no records saved in the database with an ID of 13 where MyProperty1 IS NOT NULL. This is because you have not yet called db.SaveChanges(). The first statement turns the result of that SQL statement into a value of false, whereas the second statement turns it into a value of null.
Moving on to the next statement:
// Gets the entity only by Id
myEntity = db.MyEntity.FirstOrDefault(p => p.Id == entityId);
This sends a query to the database like this:
SELECT * FROM MyEntities WHERE ID = 13
The database does have a MyEntity with an ID of 13, and it returns that MyEntity to EF. However, before EF returns the MyEntity to you, EF checks to see if it has a MyEntity with an ID of 13 in its cache. It does have a cached MyEntity with an ID of 13, so it returns the cached MyEntity. And the cached MyEntity just so happens to be the one you updated in your call to your custom ApplyChanges method.
// And when I compare the "MyProperty1" it's "true". Why?????
hasEntityWithValue = myEntity.MyProperty1 != null;
The reason this is true is because the entity returned to you is the one that is in the EF cache.
When you make a query with EF, it will send the query to the database, and if records are returned from the database, EF will check its cache to see if records with the same key are in the cache. If they already exist in the cache, the cached records will be returned in place of the records that were found in the database, even if the cached records are different from the database records. (For more information on how to get around this caching, see http://codethug.com/2016/02/19/Entity-Framework-Cache-Busting/)
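As a side note, if you ever want the opposite behavior, the database values rather than the cached, modified entity, AsNoTracking skips that cache substitution. A small sketch using the question's names:

```csharp
// Returns a fresh entity materialized from the database row,
// ignoring the modified copy held in the context's cache.
var fromDb = db.MyEntity
    .AsNoTracking()
    .FirstOrDefault(p => p.Id == entityId);
```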
The cache checking is done at the point the query is run. So you could make this call and you would see your updated data:
var data = db.MyEntity
.Where(p => p.Id == entityId)
.ToList()
.Where(p => p.MyProperty1 != null);
The first Where function is processed by the database. The second is processed in memory wherever your C# code is running. The ToList call forces the query that has been built so far to be sent to the database and run before any more filtering or sorting is done.
You can also use a transaction for this, but as you mention, this will lock up resources for the duration of the transaction. Assuming you are working with EF6, you can do this:
using (var transaction = db.Database.BeginTransaction())
{
// Applies the changes
this.ApplyChanges(db, entityType, entityId, propertyValues);
db.SaveChanges();
// Should be true
var hasEntityWithValue = db.MyEntity.Any(p => p.Id == entityId && p.MyProperty1 != null);
// At this point, queries to the database will see the updates you made
// in the ApplyChanges method
var isValid = ValidateSave();
if (isValid)
{
// Assuming no more changes were made since you called db.SaveChanges()
transaction.Commit();
}
else
{
transaction.Rollback();
}
}
After I talked to my coworkers, we decided to do something similar to @CodeThug's first suggestion. So I'm going to change the points of my code that query "MyEntity" using LINQ-to-Entities to materialize the entity:
myEntity = db.MyEntity.First(p => p.Id == entityId);
var hasEntityWithValue = myEntity.MyProperty1 != null;
I have "static" readonly entities which I simply load with QueryOver<T>().List<T>(). All their properties are not "lazy". So some of them have N+1 problem.
I tried to use Future to avoid N+1. But it looks like then NH considers entity properties as "lazy". And when I access them it even reloads entities from db one by one (leading to the same N+1 situation) despite that all entities were preloaded preliminary and should be cached in the session 1st level cache. Here is the code how I'm doing it:
var futures = new List<IEnumerable>();
futures.Add(s.QueryOver<DailyBonus>().Future<DailyBonus>());
futures.Add(s.QueryOver<DailyBonusChestContent>().Future<DailyBonusChestContent>());
// ... other entities ...
// all queries should be sent with first enumeration
// but I want to ensure everything is loaded
// before using lazy properties
foreach (IEnumerable future in futures)
{
if (future.Cast<object>().Any(x => false)) break;
}
// now everything should be in cache, right?
// so I can travel the whole graph without accessing db?
Serializer.Serialize(ms, futures); // wow, N+1 here!
I checked this behavior using hibernatingrhinos profiler.
So what is going on wrong here?
The only correct way of using futures for loading entities with collections is with Fetch, which means joins for each query:
var q = session.Query<User>().Where(x => x.Id == id);
var lst = new List<IEnumerable>
{
q.FetchMany(x => x.Characters).ToFuture(),
q.Fetch(x=>x.UpdateableData).ToFuture(),
session.QueryOver<User>().Where(x => x.Id == id)
.Fetch(x=>x.Characters).Eager
.Fetch(x => x.Characters.First().SmartChallengeTrackers).Eager
.Future()
};
var r = session.QueryOver<User>().Where(x => x.Id == id)
.TransformUsing(Transformers.DistinctRootEntity)
.Future();
foreach (IEnumerable el in lst)
{
foreach (object o in el)
{
}
}
return r.ToArray();
It's still better than joining everything in one query - NHibernate won't have to parse the thousands of rows introduced by join x join x join x join...
You can add normal select queries (without Fetch) to the same batch, but they won't be used for retrieving collections on another entity.
I have tried the following:
var itemtoremove = db.ItemGroups.Single(x => x.ID == _id);
if (itemtoremove != null)
{
db.ItemGroups.Remove(itemtoremove);
db.SaveChanges();
}
The query 'works', but it deletes more than one row! Any ideas? The table itself is part of a set of really terrible tables, but I've got to use them.
Can having multiple keys on a table affect what EF does?
Thanks folks.
If the entity has a navigation property and you have not set CascadeOnDelete(false), you could potentially remove more than one item.
// Check the item group for duplicates (just as a test).
var items = db.ItemGroups.Where(x => x.ID == _id); // Breakpoint here to make sure there is only one match. If there is, then you are deleting an item linked to the entity you are deleting.
var itemtoremove = db.ItemGroups.Single(x => x.ID == _id);
if (itemtoremove != null)
{
db.ItemGroups.Remove(itemtoremove);
db.SaveChanges();
}
I found a resolution.
I think the table having 2 PKs set up on it (both on CHAR fields) was causing EF to become confused. The table goes back years and is part of a group of tables that do not relate to one another properly. It's pretty poor, but that's another story!
So, I started from scratch again:
After re-importing the table in its original format, I did the following:
Removed the PK on 'GroupID' (should never have been a PK)
Added an ID column, set it as 'IsIdentity' so it will auto-generate an ID
Refreshed my EF model to include:
[DatabaseGenerated(DatabaseGeneratedOption.Identity)]
public int ID { get; set; }
Now only the one correct row is removed when SaveChanges is applied.
Thanks for your suggestions folks!