I have three nested Entity Framework entities: Company, CompanyDocument, and ListCompanyDocumentsType. I have also generated matching DTO classes to map them to.
My problem: each CompanyDocument has a ListCompanyDocumentsType navigation, and each type in turn holds a list of all documents of that type. So when I execute a query to pick all companies, it returns a list of companies and their documents, plus each document's type carrying a list of every document of that type in the database (a circular loop).
Is there any way to return each CompanyDocument with only its own document type? Or have I designed this incorrectly? As a workaround I used a mapped object and a foreach that copies the type name into a non-mapped DTO property. It works, but I'm not sure it is the correct approach.
I've also tried EF's .Include() to go up to ListCompanyDocumentsTypes, but it still returns all documents that each type has.
var config = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<Company, CompanyDto>();
    cfg.CreateMap<CompanyDocument, CompanyDocumentDto>();
});
var mapper = config.CreateMapper();
var newDtoTest = mapper.Map<List<CompanyDto>>(companiesReturn);
var db = new entities();
foreach (var companyDto in newDtoTest)
{
foreach (var companyDtoCompanyDocument in companyDto.CompanyDocuments)
{
companyDtoCompanyDocument.dtoTypeName = (await db.ListCompanyDocumentsTypes.FirstOrDefaultAsync(p=> p.id.Equals(companyDtoCompanyDocument.typeId))).typeName;
}
}
Thank you,
You should use ForMember(); with this method you can set the appropriate value for the specific field:
cfg.CreateMap<CompanyDocument, CompanyDocumentDto>()
    .ForMember(dest => dest.dtoTypeName, opt => opt.MapFrom(src => src.ListCompanyDocumentsTypes.typeName));
And companiesReturn should be an IQueryable<Company>, so your GetAll method needs to include CompanyDocuments and ListCompanyDocumentsTypes, and should look like:
public IQueryable<Company> GetAll()
{
return CompanyRepository.GetQueryable()
.Include(d => d.CompanyDocuments)
.ThenInclude(d => d.ListCompanyDocumentsTypes);
}
AutoMapper makes this mapping easy; you do not need additional loops, everything can be set up in the mapper configuration.
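Putting it together, here is a minimal sketch (assuming the GetAll() repository method above, and that ListCompanyDocumentsTypes is a single reference navigation on CompanyDocument; names follow the question):
var config = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<Company, CompanyDto>(); // the CompanyDocuments collection maps by convention
    cfg.CreateMap<CompanyDocument, CompanyDocumentDto>()
        .ForMember(dest => dest.dtoTypeName,
                   opt => opt.MapFrom(src => src.ListCompanyDocumentsTypes.typeName));
});
var mapper = config.CreateMapper();
// One Map call fills every document's type name; the per-document
// foreach and its extra database round trips are no longer needed.
var dtos = mapper.Map<List<CompanyDto>>(GetAll().ToList());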
Simplified model:
Profile {Guid Id, string Name, List<Property> Properties}
Property {Guid Id, string Name, List<Type> Types}
Type {Guid Id, string Key, string Value}
DbContext:
{
public DbSet<Profile> Profiles { get; set; }
}
I didn't include Properties and Types in the DbContext so I used the ModelBuilder:
modelBuilder.Entity<Property>().HasMany<Type>();
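For reference, that call sits in OnModelCreating; a minimal sketch (the context class name here is illustrative):
public class AppDbContext : DbContext
{
    public DbSet<Profile> Profiles { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Property and Type have no DbSet properties, so the relationship
        // is declared here and both entities are discovered via Profile.
        modelBuilder.Entity<Property>().HasMany<Type>();
    }
}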
In the Update service:
public async Task<Response> Update([FromBody] Profile profile)
{
var entity = await _context.Profiles
.Include(x => x.Properties)
.ThenInclude(x => x.Types)
.FirstOrDefaultAsync(x => x.Id == profile.Id);
foreach (var prop in profile.Properties)
{
var existingProp = entity.Properties.SingleOrDefault(a => a.Id == prop.Id);
//Update
if (existingProp != null)
{
var entry = _context.Entry(existingProp);
entry.State = EntityState.Modified;
existingProp.ChargeFrom(prop);//maps the new values to the db entity
_context.SaveChanges();
}
}
}
But the above code throws this exception at SaveChanges:
The instance of entity type 'Type' cannot be tracked because another
instance with the same key value for {'Id'} is already being tracked.
When attaching existing entities, ensure that only one entity instance
with a given key value is attached. Consider using
'DbContextOptionsBuilder.EnableSensitiveDataLogging' to see the
conflicting key values.
I marked the Types entity AsNoTracking:
.ThenInclude(x => x.Types).AsNoTracking()
and the problem is solved, but I don't know why this exception is thrown. Another thread suggests the DbContext might be used by another process and might be registered as a singleton, but in my case it's registered as scoped.
I marked the Types entity AsNoTracking:
.ThenInclude(x => x.Types).AsNoTracking()
and the problem is solved, but I don't know why this exception is thrown
The reason for the error will be because this line:
existingProp.ChargeFrom(prop);//maps the new values to the db entity
... will be attempting to copy the untracked Types from prop into existingProp. Using AsNoTracking removes the exception, but it will most likely result in duplicated data on SaveChanges where Type is set up with an identity key, or in duplicate-row exceptions otherwise. If you receive no exception, I would check the Types collection to see whether duplicate rows are appearing there.
When copying data across from an untracked entity to a tracked entity, you will want to ensure that only values, and not references, are copied across. Copying an untracked reference across, EF will treat that as a new entity by default. Even if you force its state over to Modified, the DbContext could already be tracking an entity with that ID.
If Property.Types is a collection of references, such as an association to a lookup, and these could change where associations are added and removed, then to apply changes you need to load the associated types from the database then use that to remove associations that are no longer valid and add ones that aren't currently associated.
For example: given a Property (PropertyA) with Types (Type1) and (Type2), if we edit that to have (Type1) and (Type3) we need to fetch Type1 & Type3 from the DbContext (tracked) then compare against the tracked PropertyA to determine to remove Type2 and add Type3
var entity = await _context.Profiles
.Include(x => x.Properties)
.ThenInclude(x => x.Types)
.SingleAsync(x => x.Id == profile.Id);
// Get the IDs for all Types we want to associate... In the above example this would
// ask for Type1 and Type3 if only the one property. We get a Distinct list because
// multiple properties might reference the same TypeId(s).
var allTypeIds = profile.Properties
.SelectMany(x => x.Types.Select(t => t.Id))
.Distinct()
.ToList();
// Load references to all Types that will be needed. Where associating new types, these will be referenced.
var existingTypes = _context.Set<Type>() // no DbSet<Type> is exposed on this context
.Where(x => allTypeIds.Contains(x.Id))
.ToList();
foreach (var prop in profile.Properties)
{
var existingProp = entity.Properties.SingleOrDefault(x => x.Id == prop.Id);
if (existingProp == null)
continue;
var updatedTypeIds = prop.Types.Select(x => x.Id).ToList();
var existingTypeIds = existingProp.Types.Select(x => x.Id).ToList();
var addedTypeIds = updatedTypeIds.Except(existingTypeIds).ToList();
var removedTypeIds = existingTypeIds.Except(updatedTypeIds).ToList();
var addedTypes = existingTypes
.Where(x => addedTypeIds.Contains(x.Id))
.ToList();
var removedTypes = existingProp.Types
.Where(x => removedTypeIds.Contains(x.Id))
.ToList();
foreach(var removedType in removedTypes)
existingProp.Types.Remove(removedType);
foreach(var addedType in addedTypes)
existingProp.Types.Add(addedType);
}
If instead the type is a child row that contains properties that can be updated then these values should be copied across between the updated data and the existing data state. This adds a considerable amount of work, though tools like AutoMapper can be configured to help. You still need to manage cases where Types can be added, removed, or have contents changed. That would apply to Properties as well, as your example only handles cases where a property is updated, not potentially added or removed.
Ultimately it can be beneficial to try and structure update scenarios to be as atomic as possible to avoid an update that is going to make changes to an entire object graph of entities, properties, and types but rather one update for just entity values, one for property values, and one for a single type update. This would apply also to adding a property, adding a type, removing a property, removing a type. While it may look like more code to break up operations like this, it keeps them very simple and straight forward rather than one big, complex method trying to compare the before and after to figure out what to add, remove, and update. Bugs hide in complex code, not simple methods. :)
When editing an object graph you should also avoid calling SaveChanges more than once. Instead of calling it within the loop of properties it should be done once when the loop is completed. The reason for this is that something like an exception on one of the properties would result in an incomplete/invalid data state being persisted. If you have 4 properties in the object being saved, and the 3rd one fails with an exception for any reason, the first 2 will be updated with the last two not persisting. Generally within an update operation, the update should follow an "all or nothing" approach to persistence.
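Applied to the question's Update method, that would look something like this sketch (ChargeFrom is the poster's own value-copying helper):
foreach (var prop in profile.Properties)
{
    var existingProp = entity.Properties.SingleOrDefault(a => a.Id == prop.Id);
    if (existingProp == null)
        continue;

    // existingProp is already tracked (it came from the Include above), so
    // copying values is enough; change tracking detects the modifications.
    existingProp.ChargeFrom(prop);
}

// Single commit: either the whole update persists, or none of it does.
await _context.SaveChangesAsync();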
Hopefully that helps explain the behavior you are seeing and gives you something to consider moving forward.
While debugging some performance issues I discovered that Entity Framework was loading a lot of records via lazy loading (900 extra query calls ain't fast!), even though I was sure I had the correct Include. I've managed to get this down to quite a small test case to demonstrate my confusion. The actual use case is more complex, so I don't have much scope to rework the signature of what I'm doing, but hopefully this is a clear example of the issue.
Documents have many related MetaInfo rows. I want to get all documents grouped by a MetaInfo row with a specific value, but I want all the MetaInfo rows included so I don't have to fire off a new request for each Document's MetaInfo.
So I've got the following Query.
ctx.Configuration.LazyLoadingEnabled = false;
var DocsByCreator = ctx.Documents
.Include(d => d.MetaInfo) // Load all the metaInfo for each object
.SelectMany(d => d.MetaInfo.Where(m => m.Name == "Author") // For each Author
.Select(m => new { Doc = d, Creator = m })) // Create an object with the Author and the Document they authored.
.ToList(); // Actualize the collection
I expected this to have all the Document / Author pairs, with each Document's MetaInfo property fully populated.
That's not what happens: I get the Document objects and the Authors just fine, but each Document's MetaInfo property contains ONLY the MetaInfo objects with Name == "Author".
If I move the Where clause out of the SelectMany it does the same, unless I move it to after the actualisation (which might not be a big deal here, but it is in the real application, as it means we pull back a huge amount more data than we want to deal with).
After playing with a bunch of different ways to do this, I think the issue really is the combination of a Select(...new...) with the Where and the Include. Doing the Select or the Where after actualisation makes the data appear the way I expected it to.
I figured it was an issue with the MetaInfo property of Document being filtered, so I rewrote it as follows to test the theory, and was surprised to find that this also gives the same (I think wrong) result.
ctx.Configuration.LazyLoadingEnabled = false;
var DocsByCreator = ctx.Meta
.Where(m => m.Name == "Author")
.Include(m => m.Document.MetaInfo) // Load all the metaInfo for Document
.Select(m => new { Doc = m.Document, Creator = m })
.ToList(); // Actualize the collection
Since we're not putting the Where on the Document.MetaInfo property, I expected this to bypass the problem, but strangely it doesn't: the documents still only have the "Author" MetaInfo objects.
I've created a simple test project and uploaded it to GitHub with a bunch of test cases; as far as I can tell they should all pass, but only the ones with premature actualisation pass.
https://github.com/Robert-Laverick/EFIncludeIssue
Anyone got any theories? Am I abusing EF / SQL in some way I'm missing? Is there anything I can do differently to get the same organisation of results? Is this a bug in EF that's just been hidden from view by the LazyLoad being on by default, and it being a bit of an odd group type operation?
This is a limitation in EF in that Includes will be ignored if the scope of the entities returned is changed from where the include was introduced.
I couldn't find the reference to this for EF6, but it is documented for EF Core (https://learn.microsoft.com/en-us/ef/core/querying/related-data — see "Ignored includes"). I suspect it is a limit in place to stop EF's SQL generation from going completely AWOL in certain scenarios.
So while var docs = context.Documents.Include(d => d.Metas) would return the Metas eager-loaded against the documents, as soon as you .SelectMany() you change what EF is supposed to return, so the Include statement is ignored.
If you want to return all documents, and include a property that is their author:
var DocsByCreator = ctx.Documents
.Include(d => d.MetaInfo)
.ToList() // Materialize the documents and their Metas.
.SelectMany(d => d.MetaInfo.Where(m => m.Name == "Author") // For each Author
.Select(m => new { Doc = d, Creator = m })) // Create an object with the Author and the Document they authored.
.ToList(); // grab your collection of Doc and Author.
If you only want documents that have authors:
var DocsByCreator = ctx.Documents
.Include(d => d.MetaInfo)
.Where(d => d.MetaInfo.Any(m => m.Name == "Author"))
.ToList() // Materialize the documents and their Metas.
.SelectMany(d => d.MetaInfo.Where(m => m.Name == "Author") // For each Author
.Select(m => new { Doc = d, Creator = m })) // Create an object with the Author and the Document they authored.
.ToList(); // grab your collection of Doc and Author.
This means you will want to be sure that all of your filtering logic is done above that first .ToList() call. Alternatively, you can consider resolving the Author meta after the query, such as when view models are populated, or via an unmapped "Author" property on Document that resolves it. Though I generally avoid unmapped properties, because if their use slips into an EF query, you get a nasty error at runtime.
Edit: Based on the requirement to skip & take I would recommend utilizing view models to return data rather than returning entities. Using a view model you can instruct EF to return just the raw data you need, compose the view models with either simple filler code or utilizing Automapper which plays nicely with IQueryable and EF and can handle most deferred cases like this.
For example:
public class DocumentViewModel
{
public int DocumentId { get; set; }
public string Name { get; set; }
public ICollection<MetaViewModel> Metas { get; set; } = new List<MetaViewModel>();
[NotMapped]
public string Author // This could be updated to be a Meta, or a specialized view model.
{
get { return Metas.SingleOrDefault(x => x.Name == "Author")?.Value; }
}
}
public class MetaViewModel
{
public int MetaId { get; set; }
public string Name { get; set; }
public string Value { get; set; }
}
Then the query:
var viewModels = context.Documents
.Select(x => new DocumentViewModel
{
DocumentId = x.DocumentId,
Name = x.Name,
Metas = x.Metas.Select(m => new MetaViewModel
{
MetaId = m.MetaId,
Name = m.Name,
Value = m.Value
}).ToList()
}).OrderBy(x => x.DocumentId) // EF requires an ordering before Skip/Take
.Skip(pageNumber * pageSize)
.Take(pageSize)
.ToList();
The relationship of an "author" to a document is implied, not enforced, at the data level. This solution keeps the entity models "pure" to the data representation and lets the code handle transforming that implied relationship into exposing a document's author.
The .Select() population can be handled by Automapper using .ProjectTo<TViewModel>().
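For example, a sketch of the same paged query via ProjectTo, assuming a MapperConfiguration named config that maps Document to DocumentViewModel and Meta to MetaViewModel:
using AutoMapper.QueryableExtensions; // provides ProjectTo

var viewModels = context.Documents
    .ProjectTo<DocumentViewModel>(config) // the maps become the SQL projection
    .OrderBy(x => x.DocumentId)           // Skip/Take need a deterministic order
    .Skip(pageNumber * pageSize)
    .Take(pageSize)
    .ToList();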
By returning view models rather than entities you avoid issues like this where .Include() operations get invalidated; you avoid the temptation to detach and reattach entities between different contexts; you improve performance and resource usage by selecting and transmitting only the data needed; and you avoid lazy-load serialization issues if you forget to disable lazy loading, or unexpected null data when you do.
I've been searching for an answer for hours. I've read many similar posts but still cannot get to the point.
For geolocation in an ASP.NET MVC website I need to switch between 'n' tables depending on the user's IP.
All tables are named 'GeoData_' plus a country-code suffix (IT, FR, US, ...). Moreover, as they all contain the same fields, they all implement the same 'GeoData' interface.
I know I can get the needed DbSet in the controller this way:
string countryTable = "GeoData_" + Server.HtmlEncode(Request.Cookies["_country"].Value.ToUpper());
var type = Assembly.GetExecutingAssembly()
.GetTypes()
.FirstOrDefault(t => t.Name == countryTable);
DbSet GeoContext = db.Set(type);
Now I simply cannot query this DbSet the way I used to, that is:
var searchResult = GeoContext
.Where(c => c.PlaceName.ToUpper().Contains(term.ToUpper()))
.Select(c => new
{
PlaceName = c.PlaceName,
PostalCode = c.PostalCode,
GeoDataID = c.ID,
Latitude = c.Latitude,
Longitude = c.Longitude
}
)
.Distinct()
.OrderBy(c => (c.PlaceName.ToUpper().IndexOf(term.ToUpper())))
.ToList();
because the model's fields (c.PlaceName etc.) can no longer be referenced on the non-generic GeoContext.
I suppose one way could be to use System.Linq.Dynamic and rewrite the entire query accordingly.
Is there a way to take advantage of all tables implementing the same interface to get a better workaround?
Try defining the DbSet class explicitly,
like
DbSet<Class> GeoContext = db.Set<Class>(); // the generic Set<TEntity>() overload takes no Type argument
I'm guessing you have this Class defined somewhere in your code.
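To actually lean on the shared interface, one option is a generic search method. This is only a sketch: IGeoData's members and the GeoResult type are assumptions based on the query in the question, and while EF Core translates interface members accessed through a generic constraint, classic EF6 can refuse them, in which case one small typed method per country table is the fallback.
// Assumed shape shared by every GeoData_XX table.
public interface IGeoData
{
    int ID { get; }
    string PlaceName { get; }
    string PostalCode { get; }
    double Latitude { get; }
    double Longitude { get; }
}

public class GeoResult
{
    public string PlaceName { get; set; }
    public string PostalCode { get; set; }
    public int GeoDataID { get; set; }
    public double Latitude { get; set; }
    public double Longitude { get; set; }
}

private static List<GeoResult> SearchGeo<T>(DbContext db, string term) where T : class, IGeoData
{
    var upper = term.ToUpper();
    return db.Set<T>()
        .Where(c => c.PlaceName.ToUpper().Contains(upper))
        .Select(c => new GeoResult
        {
            PlaceName = c.PlaceName,
            PostalCode = c.PostalCode,
            GeoDataID = c.ID,
            Latitude = c.Latitude,
            Longitude = c.Longitude
        })
        .Distinct()
        .OrderBy(c => c.PlaceName.ToUpper().IndexOf(upper))
        .ToList();
}

// Usage: a small dispatch replaces the reflection lookup.
// var results = countryCode == "IT" ? SearchGeo<GeoData_IT>(db, term)
//             : countryCode == "FR" ? SearchGeo<GeoData_FR>(db, term)
//             : SearchGeo<GeoData_US>(db, term);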
I am using WebAPI and Entity Framework to build a REST API that points to a large MSSQL database (~200 tables). The database is very normalized, so retrieving values that are useful to the consumer of the API requires a lot of drilling down into the tables.
In order to return useful data to the consumer, I have taken the approach of building models (or DTOs) using a factory pattern. However, I noticed that even though all the data was being returned in a nice format, there were performance issues due to lazy loading being enabled. In short, I was querying too much data even while I was returning only the data I needed.
So I resorted to turning off lazy loading and have attempted to grab the data explicitly using Include methods:
var accessions = db.AccessionParties
.Include(ap => ap.Accession.AccessionParties.Select(ap2 => ap2.Party))
.Include(ap => ap.Accession.AccessionParties.Select(ap2 => ap2.AccessionPartyPurposes.Select (app => app.PartyAccessionPurposeType)))
.Include(ap => ap.Accession.AccessionAnimals.Select(x => x.AnimalInformationType))
.Include(ap => ap.Accession.AccessionAnimals.Select(x => x.Specimens.Select(y => y.AccessionTestRequestSpecimens.Select(z => z.AccessionTestRequest.LabTestOffering.TestOffering))))
.ToList()
.Select(a => modelFactory.CreateAccessionModel(a.Accession));
Below is an example of the factory method I'm using to generate the model, which includes nested factory methods as well to shape my related data entities.
public AccessionModel CreateAccessionModel(Accession accession)
{
return new AccessionModel()
{
AccessionKey = accession.AccessionKey,
SubmittedDate = accession.SubmittedDate,
Parties = accession.AccessionParties
.Select(accessionParty => new { accessionParty = accessionParty, accessionParty.Party })
.Select(accessionParty => CreatePartyModel(accessionParty.Party)),
Animals = accession.AccessionAnimals.Select(accessionAnimal => CreateAccessionAnimalModel(accessionAnimal))
};
}
Are there any patterns or practices for handling the above situation? I've seen some examples of a method that allows you to pass in an array of include statements, but I cannot think of a way to handle this in an elegant, efficient, pragmatic way. Any input would be much appreciated.
You can use AutoMapper to map between the entities and the DTOs, and together with projection you can execute the query and load only the columns your DTOs need. Check http://automapper.org/ and https://github.com/AutoMapper/AutoMapper/wiki/Queryable-Extensions
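A minimal sketch of that approach against the accession example (assumptions: the context exposes an Accessions set, and the flattening done in the factory methods moves into ForMember clauses on these maps):
using AutoMapper;
using AutoMapper.QueryableExtensions;

var mapConfig = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<Accession, AccessionModel>();
    cfg.CreateMap<AccessionAnimal, AccessionAnimalModel>();
    cfg.CreateMap<Party, PartyModel>();
});

// ProjectTo composes the maps into the query's SELECT, so EF fetches only
// the columns the models need -- no Include chains and no lazy loading.
var accessions = db.Accessions
    .ProjectTo<AccessionModel>(mapConfig)
    .ToList();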
Hope that helps.
I do this in one of my methods by passing in an array of strings, like so (it's ugly because I'm using reflection to get the property names, but hopefully you get the idea):
private IQueryable<T> AddIncludes<T>(IDatabase db) // my Entity context class implements IDatabase
{
var props = typeof(IDatabase).GetProperties(BindingFlags.Public|BindingFlags.Instance);
IQueryable<T> ret = null;
foreach (var prop in props)
{
if (prop.PropertyType.Name == "IDbSet`1" &&
prop.PropertyType.GetGenericArguments()[0] == typeof(T))
{
ret = (IQueryable<T>)prop.GetValue(db, null);
break;
}
}
var includes = GetIncludes((DbContext)db, typeof(T)); // this returns an IEnumerable<string> of the includes
foreach (string include in includes) // replace string with a lambda
{
ret = ret.Include(include); // this is where the includes are added
}
return ret;
}
private ICollection<T> LoadObjectGraph<T>(IDatabase db, Expression<Func<T, bool>> filter) // Expression<> keeps the filter translatable to SQL instead of filtering in memory
{
var queryableWithIncludesAdded = AddIncludes<T>(db);
return queryableWithIncludesAdded.Where(filter).ToList();
}
Instead of using strings, you could use the same technique to pass lambdas to build your query.
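A sketch of that lambda-based variant (a hypothetical helper; EF6's expression-based Include() extension lives in System.Data.Entity and accepts reference-typed paths):
using System;
using System.Data.Entity; // expression-based Include()
using System.Linq;
using System.Linq.Expressions;

public static class QueryExtensions
{
    // Include paths are passed as expressions, so typos surface
    // at compile time instead of as runtime EF exceptions.
    public static IQueryable<T> AddIncludes<T>(this IQueryable<T> query,
        params Expression<Func<T, object>>[] includes) where T : class
    {
        foreach (var include in includes)
            query = query.Include(include);
        return query;
    }
}

// Usage (mirrors the accession query from the earlier example):
// var accessions = db.AccessionParties.AddIncludes(
//     ap => ap.Accession.AccessionParties.Select(ap2 => ap2.Party),
//     ap => ap.Accession.AccessionAnimals.Select(x => x.AnimalInformationType));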
I have a situation where I am querying an entity and putting a two-level include in the RIA Services call.
public IQueryable<Position> GetPositions(int programID)
{
return _positionRepository.All()
.Where(x => x.ProgramID == programID)
.Include("RecPositions.Person");
}
I'd like to get a handle on the Person entity on the front end. I have this working: the code below gives me a handle on the RecPositions, and in IntelliSense I can see the Person object. I'd like to abstract that entity out.
var test = _allRec.Select(x => x.RecPositions).ToList();
test now has my RecPositions, but I want to know how to write a lambda expression so I can get a handle on the Person object.
I came up with this; does anyone have any objections to this, or a better way?
var test = _allRec.SelectMany(x => x.RecPositions)
.Select(p => p.Person)
.ToList();
This seems to give me what I want.