Clone dynamic object using Entity Framework Repository Pattern - c#

I'd like to be able to clone an object and its children.
In a previous project (EF 6.0), an implementation like this worked fine:
Job job = db.Jobs.AsNoTracking()
    .Include("Columns")
    .Include("Exclusions")
    .Include("Batches")
    .Include("OtherColumns")
    .First(x => x.Id == (int)jobId);
db.Jobs.Add(job);
db.SaveChanges();
A new job was added to the database, as well as new children.
Now, in a new project (EF Core 3.1.3), I'm using a Repository pattern and trying to create a clone function:
public async Task<int> CloneByIdAsync(int id, string childrenObjectsToInclude)
{
    // Get as no-tracking so all objects can be saved as new (no ids)
    var query = _dbContext.Set<T>().AsNoTracking().AsQueryable();
    string[] includes = childrenObjectsToInclude.Split(';');
    foreach (string include in includes)
        query = query.AsNoTracking().Include(include);
    T thing = query.AsNoTracking().First(x => x.Id == id); // We have to get the thing we want
    await _dbContext.Set<T>().AddAsync(thing); // Add to the collection
    return await _dbContext.SaveChangesAsync(); // Save the changes - should generate all new ids and items
}
You can see I've tried to use AsNoTracking() in a couple of places, but this doesn't seem to work. It retrieves the object and its children just fine, but when adding to the database, EF tries to insert the object with the same Id, so I get a SQL exception.
Am I doing something obviously wrong?

After trying a few things I eventually conceded that in EF Core you need to set the object's Id to 0 for it to be treated as a new object. I used something like the code below and manually detached all the child entities I wanted on the cloned object. With their Ids all set to 0, SaveChanges creates a new set of child objects correctly linked to the new parent object.
var project = await _dbContext.Projects
    .Include("FloorAreaTypes")
    .Include("FloorAreaTypes.FloorAreaCategories")
    .SingleAsync(x => x.Id == id);
var fts = project.FloorAreaTypes.ToList();
var fcs = fts.SelectMany(ft => ft.FloorAreaCategories).ToList();
DetachEntity(project);
DetachEntities(fts);
DetachEntities(fcs);
await _dbContext.AddAsync(project);
await _dbContext.AddRangeAsync(fts);
await _dbContext.AddRangeAsync(fcs);
await _dbContext.SaveChangesAsync();
private T DetachEntity<T>(T entity) where T : class
{
    _dbContext.Entry(entity).State = EntityState.Detached;
    var idProperty = entity.GetType().GetProperty("Id");
    // Ids must be set to 0 for a new object, even with no tracking
    idProperty?.SetValue(entity, 0);
    return entity;
}
private List<T> DetachEntities<T>(List<T> entities) where T : class
{
    foreach (var entity in entities)
    {
        this.DetachEntity(entity);
    }
    return entities;
}
With some more time, I'm sure this could be refactored to use reflection with recursive functions for child objects, driven by a set of Include strings.
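As a rough illustration of that idea, here is an untested sketch. The helper name `DetachGraph` is my own invention, not part of the repository above, and it assumes every entity exposes an int `Id` key and that each include string names a collection navigation property:

```csharp
// Sketch only: walks include paths such as "FloorAreaTypes" or
// "FloorAreaTypes.FloorAreaCategories", detaching and zeroing Ids as it goes.
private void DetachGraph(object entity, IEnumerable<string> includePaths)
{
    _dbContext.Entry(entity).State = EntityState.Detached;
    entity.GetType().GetProperty("Id")?.SetValue(entity, 0);

    foreach (var path in includePaths)
    {
        // The first segment belongs to this level; the remainder is passed down.
        var segments = path.Split('.');
        var navProp = entity.GetType().GetProperty(segments[0]);
        var children = navProp?.GetValue(entity) as IEnumerable;
        if (children == null) continue;

        var childPaths = segments.Length > 1
            ? new[] { string.Join(".", segments.Skip(1)) }
            : Array.Empty<string>();

        // Snapshot before detaching so relationship fix-up can't break the loop.
        foreach (var child in children.Cast<object>().ToList())
            DetachGraph(child, childPaths);
    }
}
```

Calling `DetachGraph(project, new[] { "FloorAreaTypes.FloorAreaCategories" })` before re-adding the graph would replace the manual DetachEntity/DetachEntities calls above; treat it as a starting point rather than production code.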

Related

EF Core disconnected update of collection navigation properties via full replacement

Using EF Core 5.0. I have a SPA page that loads a Group entity with its collection of Employee entities from the API:
var groupToUpdate = await context.Groups
    .Include(g => g.Employees)
    .FirstOrDefaultAsync(...);
// Used for the UI: a list of additional employees for selective adding
var employeeList = await context.Employees
    .Where(...)
    .ToListAsync();
The user then modifies the groupToUpdate entity via the Javascript UI, including some non-navigation properties such as name/notes.
On the same screen, the user adds some employees to the group, removes some employees from the group, and leaves some existing employees in the group intact. All employees are existing entities in the DB with existing primary keys. All changes done thus far are just to the disconnected entity in memory.
When the user clicks save, the groupToUpdate entity is sent to my backend code. Note that we did not keep track of which employees are added/removed/left alone, we just want to let this groupToUpdate completely override the old entity, notably replacing the old collection of Employees with the new one.
To achieve this, the backend code first loads the group again from the database to start tracking it in context. Then I attempt to update the entity, including replacing the old collection with the new one:
public async Task UpdateGroupAsync(Group groupToUpdate)
{
    var groupFromDb = await context.Groups
        .Include(g => g.Employees)
        .FirstOrDefaultAsync(...);
    // Update non-navigation properties such as groupFromDb.Note = groupToUpdate.Note...
    groupFromDb.Employees = groupToUpdate.Employees;
    await context.SaveChangesAsync();
}
Now, if the change to the Employees collection is a total replacement (all old ones removed, all new ones added), this method succeeds. But whenever some existing Employees are left alone, EF Core throws the exception:
The instance of entity type 'Employee' cannot be tracked because another instance with the key value ... is already being tracked
So it seems EF Core attempts to track both the Employee entities freshly loaded from the DB with groupFromDb and the ones from groupToUpdate, even though the latter are merely passed in as a parameter from a disconnected state.
My question is how to handle this kind of update with the least amount of complications? Is it necessary to keep track of the added/removed entities manually and adding/removing them instead of trying to replace the entire collection?
You have to instruct the ChangeTracker which operations are needed to update the navigation collection. Just replacing the collection is not the correct way.
This is an extension method that helps do that automatically:
context.MergeCollections(groupFromDb.Employees, groupToUpdate.Employees, x => x.Id);
Implementation:
public static void MergeCollections<T, TKey>(this DbContext context, ICollection<T> currentItems, ICollection<T> newItems, Func<T, TKey> keyFunc)
    where T : class
{
    List<T> toRemove = null;
    foreach (var item in currentItems)
    {
        var currentKey = keyFunc(item);
        var found = newItems.FirstOrDefault(x => currentKey.Equals(keyFunc(x)));
        if (found == null)
        {
            toRemove ??= new List<T>();
            toRemove.Add(item);
        }
        else
        {
            if (!ReferenceEquals(found, item))
                context.Entry(item).CurrentValues.SetValues(found);
        }
    }
    if (toRemove != null)
    {
        foreach (var item in toRemove)
        {
            currentItems.Remove(item);
            // If the item should be deleted from the DB: context.Set<T>().Remove(item);
        }
    }
    foreach (var newItem in newItems)
    {
        var newKey = keyFunc(newItem);
        var found = currentItems.FirstOrDefault(x => newKey.Equals(keyFunc(x)));
        if (found == null)
        {
            currentItems.Add(newItem);
        }
    }
}
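Plugged into the question's UpdateGroupAsync, only the collection assignment changes. This is a sketch: the lookup predicate and the `Note` property are assumed for illustration, not taken from the question.

```csharp
public async Task UpdateGroupAsync(Group groupToUpdate)
{
    // Predicate assumed: load the tracked copy of the same group.
    var groupFromDb = await context.Groups
        .Include(g => g.Employees)
        .FirstOrDefaultAsync(g => g.Id == groupToUpdate.Id);

    // Update non-navigation properties as before...
    groupFromDb.Note = groupToUpdate.Note;

    // Instead of: groupFromDb.Employees = groupToUpdate.Employees;
    // merge the disconnected collection into the tracked one by key.
    context.MergeCollections(groupFromDb.Employees, groupToUpdate.Employees, x => x.Id);

    await context.SaveChangesAsync();
}
```

Because the tracked entities are updated via SetValues and only genuinely new employees are Add-ed, the "another instance with the key value ... is already being tracked" conflict never arises.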

Duplicating Entity Framework entities

I have this EF query:
var records = mydata.Where(a => some condition).ToList();
Records is a list of Record objects. Record is a database table that has a one-to-many relationship called "relation".
The EF object represents that member variable as Collection<Relation>.
After the Where above, I see that the "relation" member variable contains a collection of 18 entities.
I need to duplicate those Record objects, so I detach them first, this way:
var detached = this.DetachEntities(records, db, "RecordId");
Where this.DetachEntities is defined this way:
private T DetachEntity<T>(T entity, Repositories.GestionActivosEntities db, string keyName) where T : class
{
    db.Entry(entity).State = EntityState.Detached;
    var keyProperty = entity.GetType().GetProperty(keyName);
    keyProperty?.SetValue(entity, 0);
    return entity;
}
private List<T> DetachEntities<T>(List<T> entities, Repositories.GestionActivosEntities db, string keyName) where T : class
{
    foreach (var entity in entities)
    {
        this.DetachEntity(entity, db, keyName);
    }
    return entities;
}
After I detached the entities, I am doing:
db.Record.AddRange(detached);
The problem is that after I detach the list, the relation is not copied as well, so the "relation" member variable contains no elements.
How can I keep the relation elements after detaching?
EDIT:
This shows the debugging session:
The breakpoint is placed in the line where the query is done. The cursor shows the line after the query was executed.
This screenshot shows the child records ValorCampo and Workflow
This screenshot shows the entity after detaching it. Note the child records are not present in the detached object
As I said, the problem only occurs when detaching. Child records are not kept; the database rows remain untouched.
I had the same problem; unfortunately, navigation properties are lost after detaching an item or when the entity state is changed to Detached.
What you can do is clone the entity. One way to do this is: Context.Entry(your_entity).CurrentValues.ToObject();
However, this will not clone the navigation properties either. If you want to fully clone an object along with its navigation properties, the easiest way for me was the AutoMapper library for C#.
Below is a sample usage:
var config = new MapperConfiguration(cfg => cfg.CreateMap<Record, Record>()); // map the type to itself to clone it
var mapper = new Mapper(config);
// or
var mapper = config.CreateMapper();
Record clonedObject = mapper.Map<Record>(originalObject);
After you clone and detach the original object, you can add the clone:
db.Record.Add(clonedObject);
And below is a generic extension method to do it:
public static object Map<T>(this T source)
{
    var sourceType = source.GetType();
    var baseType = ObjectContext.GetObjectType(sourceType); // strips the EF proxy type, if any
    var config = new MapperConfiguration(cfg => cfg.CreateMap(sourceType, baseType));
    var mapper = config.CreateMapper();
    var entity = mapper.Map(source, sourceType, baseType);
    return entity;
}
where you can call it like
var clonedObject = originalObject.Map();
Hope this helps!

How can I detect changes to properties in a specific entity in Entity Framework Core 2.x?

When a user saves changes to an object myObject, I'd like to log the fields that were updated to that object.
I can get an object with
var myObject = await context.MyObjects.SingleAsync(x => x.Id == id);
And I see that I can get an IEnumerable<PropertyEntry> with
var changes = context.Entry(myObject).Properties.Where(x => x.IsModified);
But in my changes list I don't see the field name anywhere. Also, it seems to take 2 full seconds to run this query in LINQPad. That doesn't seem right.
How do I complete the following statement?
Console.Write($"The field {what goes here?} was updated from {change.OriginalValue} to {change.CurrentValue}.");
Other StackOverflow questions I've found are for previous versions of Entity Framework, or override SaveChanges and don't look for specific entities.
Update! Got it.
public string GetChangeLog<T>(
    ApplicationDbContext context,
    T entity)
{
    var sb = new StringBuilder();
    var changes = context.Entry(entity).Properties.Where(x => x.IsModified);
    foreach (var change in changes)
    {
        var propertyBase = (IPropertyBase)change.Metadata;
        sb.Append($"\"{propertyBase.Name}\" was changed from \"{change.OriginalValue}\" to \"{change.CurrentValue}\"\n");
    }
    return sb.ToString();
}
Use the .Metadata property to retrieve an IPropertyBase. That will tell you what actually changed.
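For completeness, here is a sketch of how the helper might be called just before saving. The entity and property names are illustrative, not from the question:

```csharp
var myObject = await context.MyObjects.SingleAsync(x => x.Id == id);
myObject.Name = "New name";                // user edits a tracked entity
var log = GetChangeLog(context, myObject); // snapshot the modified properties
Console.Write(log);                        // e.g. "Name" was changed from ... to ...
await context.SaveChangesAsync();          // log BEFORE saving; SaveChanges clears IsModified
```

The ordering matters: once SaveChangesAsync completes, the change tracker accepts the changes and IsModified returns false for every property, so the log must be built first.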

EF: Duplicate Object with all child including sub sub children (Deep Copy)

I have a table with properties like:
Id Name ParentId
ParentId is a foreign key to primary column Id. Now lets say I have a few rows like: (Only showing ParentId from rows)
NULL
/ \
1 2
/ \
3 4
Now, let's say we want to copy the row whose ParentId is NULL, with all of its sub-objects.
var row = db.Table1.FirstOrDefault(x => x.Id == 1);
var new_row = new Table1();
var subrows = row.Table1.ToArray();
foreach (var subrow in subrows)
{
    db.Entry(subrow).State = System.Data.Entity.EntityState.Detached;
}
new_row.Table1 = subrows;
db.Table1.Add(new_row);
db.SaveChanges();
Result: New inserted structure like:
NULL
/ \
1 2
I am assuming that only one sublevel is being copied. How can I copy/insert all sublevels?
EDIT: Since detach was helping create a copy till one level, this is what I tried:
private void RecursiveDetach(Table1 parent)
{
    var subrows = parent.Table1.ToArray();
    foreach (var row in subrows)
    {
        if (row.Table1.Count() > 0)
        {
            RecursiveDetach(row);
        }
        db.Entry(row).State = System.Data.Entity.EntityState.Detached;
    }
}
However, now I am getting an error:
Collection was modified; enumeration operation may not execute.
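The "Collection was modified" error is consistent with detaching an entity triggering EF relationship fix-up, which mutates a navigation collection that is still being enumerated further up the recursion. One possible variant, a sketch not tested against the asker's model, snapshots each level and recurses before detaching anything at that level:

```csharp
private void RecursiveDetach(Table1 parent)
{
    // Snapshot the children BEFORE any state changes, so relationship
    // fix-up during a later Detach cannot invalidate this enumeration.
    var subrows = parent.Table1.ToList();

    foreach (var row in subrows)
    {
        // Recurse first: reading row.Table1 after row is detached
        // may no longer lazy-load the grandchildren.
        RecursiveDetach(row);
        db.Entry(row).State = System.Data.Entity.EntityState.Detached;
    }
}
```

The Count() guard from the original is dropped deliberately: recursing into an empty collection is harmless (ToList simply yields nothing), and checking Count() would trigger an extra lazy-load query per row.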
I've had to do this before. I've done it purely in code, recursively copying objects and sanitizing unique IDs where needed, but the cleanest approach I've ever built is serializing the object to XML, then de-serializing into a new object. The approach is less efficient, but fantastically flexible and easy to implement.
//Save object to a file. Returns filename.
//(Despite the method name, this sample serializes with BinaryFormatter;
//swap in an XML serializer if you need actual XML.)
public string SaveObjectAsXML(int id)
{
    //however you get your EF context, and disable proxy creation
    var db = GetContext();
    bool currentProxySetting = db.Configuration.ProxyCreationEnabled;
    db.Configuration.ProxyCreationEnabled = false;
    //get the data
    var item = db.GetItem(id); //retrieval will be unique to your setup, but I have
                               //a more generic solution if you need it. Make
                               //sure you have all the sub-items included
                               //in your object or they won't be saved.
    db.Configuration.ProxyCreationEnabled = currentProxySetting;
    //if no item is found, do whatever needs to be done
    if (item == null)
    {
        return string.Empty;
    }
    //I actually write my data to a file so I can save states if needed, but you could
    //modify the method to just spit out the serialized data instead
    Directory.CreateDirectory(DATA_PATH); //make sure the path exists to prevent write errors
    string path = $"{DATA_PATH}{id}{DATA_EXT}";
    var bf = new BinaryFormatter();
    using (FileStream fs = new FileStream(path, FileMode.Create))
    {
        bf.Serialize(fs, item);
    }
    return path;
}
//Load object from file. Returns ID.
public int LoadXMLData(string path)
{
    //make sure the file exists
    if (!File.Exists(path))
    {
        throw new Exception("File not found.");
    }
    //load data from file
    var db = GetContext();
    try
    {
        using (FileStream fs = new FileStream(path, FileMode.Open))
        {
            var item = (YourItemType)new BinaryFormatter().Deserialize(fs);
            db.YourItemTypes.Add(item);
            db.SaveChanges();
            return item.Id;
        }
    }
    catch (Exception)
    {
        //Exceptions here are common when copying between databases where
        //differences in config entries result in mismatches
        throw;
    }
}
The use is simple.
//save object
var savedObjectFilename = SaveObjectAsXML(myObjID);
//loading the item will create a copy
var newID = LoadXMLData(savedObjectFilename);
Best of luck!
Here's a second, totally different answer: recursively detach your whole object graph instead of just the parent object. The following is written as an extension method on your context object:
/// <summary>
/// Recursively detaches an item and its sub-items from EF. Assumes that all sub-objects are properties (not fields).
/// </summary>
/// <param name="item">The item to detach</param>
/// <param name="recursionDepth">Number of levels to go before stopping. object.Property is 1, object.Property.SubProperty is 2, and so on.</param>
public static void DetachAll(this DbContext db, object item, int recursionDepth = 3)
{
    //Exit if no remaining recursion depth
    if (recursionDepth <= 0) return;
    //detach this object
    db.Entry(item).State = EntityState.Detached;
    //get reflection data for all the properties we mean to detach
    Type t = item.GetType();
    var properties = t.GetProperties(BindingFlags.Public | BindingFlags.Instance)
        .Where(p => p.GetSetMethod()?.IsPublic == true) //get only properties we can set
        .Where(p => p.PropertyType.IsClass)             //only classes can be EF objects
        .Where(p => p.PropertyType != typeof(string))   //oh, strings. What a pain.
        .Where(p => p.GetValue(item) != null);          //only get set properties
    //if we're recursing, check here whether we should keep going
    if (!properties.Any()) return;
    foreach (var p in properties)
    {
        //handle generics
        if (p.PropertyType.IsGenericType)
        {
            //assume it's Enumerable. More logic can be built here if that's not true.
            IEnumerable collection = (IEnumerable)p.GetValue(item);
            foreach (var obj in collection)
            {
                db.Entry(obj).State = EntityState.Detached;
                DetachAll(db, obj, recursionDepth - 1);
            }
        }
        else
        {
            var obj = p.GetValue(item);
            db.Entry(obj).State = EntityState.Detached;
            DetachAll(db, obj, recursionDepth - 1);
        }
    }
}
The biggest thing to look out for is config-type properties: objects that represent data not directly related to the entity. These might create conflicts, so make sure your object doesn't include them.
Note:
This approach requires that all sub-objects you wish to copy be populated in advance, avoiding lazy loading. To ensure this, I use the following extension for my EF queries:
//Given a custom context object such that CustomContext inherits from DbContext AND contains an arbitrary number of DbSet collections
//which represent the data in the database (i.e. DbSet<MyObject>), this method fetches a queryable collection of object type T which
//will preload sub-objects specified by the array of expressions (includeExpressions) in the form o => o.SubObject.
public static IQueryable<T> GetQueryable<T>(this CustomContext context, params Expression<Func<T, object>>[] includeExpressions) where T : class
{
    //look through the context for a DbSet of the specified type
    var property = typeof(CustomContext).GetProperties()
        .Where(p => p.PropertyType.IsGenericType &&
                    p.PropertyType.GetGenericArguments()[0] == typeof(T))
        .FirstOrDefault();
    //if the property wasn't found, we don't have the queryable object. Throw an exception.
    if (property == null) throw new Exception("No queryable context object found for Type " + typeof(T).Name);
    //create a result of that type, then assign it to the dataset
    IQueryable<T> source = (IQueryable<T>)property.GetValue(context);
    //return, folding in the eager-load expressions
    return includeExpressions.Aggregate(source, (current, expression) => current.Include(expression));
}
This method assumes that you have a custom context object that inherits from DbContext and contains DbSet<> collections of your objects. It will find the appropriate DbSet<T> and return a queryable collection that pre-loads the specified sub-objects. These are specified as an array of expressions. For example:
//example for object type 'Order'
var includes = new Expression<Func<Order, object>>[] {
    o => o.SalesItems.Select(p => p.Discounts), //load the 'SalesItems' collection AND the 'Discounts' collection for each SalesItem
    o => o.Config.PriceList,                    //load the Config object AND the PriceList sub-object
    o => o.Tenders,                             //load the 'Tenders' collection
    o => o.Customer                             //load the 'Customer' object
};
To retrieve my queryable collection, I now call it as such:
var queryableOrders = context.GetQueryable(includes);
Again, the purpose here is to create a queryable object that will eagerly load only the sub-objects (and sub-sub-objects) that you actually want.
To get a specific item, use this like any other queryable source:
var order = context.GetQueryable(includes).FirstOrDefault(o => o.OrderNumber == myOrderNumber);
Note that you can also provide the include expression inline; however, you will need to specify the generic:
//you can provide includes inline if you just have a couple
var order = context.GetQueryable<Order>(o => o.Tenders, o => o.SalesItems).FirstOrDefault(o => o.OrderNumber == myOrderNumber);

Entity Framework, how to avoid this issue?

I have a method like this:
public FbUser FindUserByGraphOrInsert(dynamic json, bool commit = false)
{
    string graphId = json.id;
    EntityDataModelContext context = DataContext.GetDataContext();
    FbUser user = context.FbUsers.FirstOrDefault(u => u.FbGraphId == graphId);
    if (user == null)
    {
        user = new FbUser();
        user.FbGraphId = json.id;
        user.FbUsername = StringExtensions.UnicodeDecode(json.name);
        context.FbUsers.AddObject(user);
        if (commit)
            context.SaveChanges();
    }
    return user;
}
I call this method repeatedly in a loop (say, upwards of 80 times) with commit = false.
The thing is, I expected this method to let me know if the user is already in the context, but this doesn't seem to be the case.
The result is that when I finally save changes, I get a list of 80 users, where 27 are distinct.
I expect this method to return those 27, how could I change it to achieve this?
Do I really need to save changes every single time?
You can't simply do that. The problem is that each query will always hit the database by default, since EF has no way of knowing that you are querying the same data, or that there have been no underlying changes in the database since you opened the connection.
You can, however, check the ChangeTracker/ObjectStateManager for pending added objects and query that as well before deciding to add a new object.
Sample:
var addedObjects = context.ObjectStateManager.GetObjectStateEntries(System.Data.EntityState.Added);
var equalObjects = addedObjects.Select(e => e.Entity).OfType<MyEntity>().Where(x => x.Name == newObject.Name);
Based on Polity's answer, I implemented the following extension method, which worked:
public static IEnumerable<T> IncludeUnsaved<T>(this ObjectSet<T> set) where T : class
{
    var addedObjects = set.Context.ObjectStateManager.GetObjectStateEntries(System.Data.EntityState.Added);
    var equalObjects = addedObjects.Select(e => e.Entity).OfType<T>();
    return equalObjects.Concat(set);
}
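A sketch of how the extension might slot into the original method, so each lookup also sees users added earlier in the loop (a hypothetical rewrite, not from the answer):

```csharp
public FbUser FindUserByGraphOrInsert(dynamic json, bool commit = false)
{
    string graphId = json.id;
    EntityDataModelContext context = DataContext.GetDataContext();
    // Search pending (Added) users first, then the database, so 80 lookups
    // of 27 distinct graph ids result in only 27 inserts.
    FbUser user = context.FbUsers.IncludeUnsaved()
        .FirstOrDefault(u => u.FbGraphId == graphId);
    if (user == null)
    {
        user = new FbUser();
        user.FbGraphId = json.id;
        user.FbUsername = StringExtensions.UnicodeDecode(json.name);
        context.FbUsers.AddObject(user);
        if (commit)
            context.SaveChanges();
    }
    return user;
}
```

Note that IncludeUnsaved returns IEnumerable<T>, so the FirstOrDefault predicate runs in memory rather than being translated to SQL; for a handful of pending objects per loop iteration that trade-off is usually acceptable.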
