Simplified model:
Profile {Guid Id, string Name, List<Property> Properties}
Property {Guid Id, string Name, List<Type> Types}
Type {Guid Id, string Key, string Value}
DbContext:
{
public DbSet<Profile> Profiles { get; set; }
}
I didn't include Properties and Types as DbSets in the DbContext, so I used the ModelBuilder:
modelBuilder.Entity<Property>().HasMany<Type>();
In the Update service:
public async Task<Response> Update([FromBody] Profile profile)
{
var entity = await _context.Profiles
.Include(x => x.Properties)
.ThenInclude(x => x.Types)
.FirstOrDefaultAsync(x => x.Id == profile.Id);
foreach (var prop in profile.Properties)
{
var existingProp = entity.Properties.SingleOrDefault(a => a.Id == prop.Id);
//Update
if (existingProp != null)
{
var entry = _context.Entry(existingProp);
entry.State = EntityState.Modified;
existingProp.ChargeFrom(prop);//maps the new values to the db entity
_context.SaveChanges();
}
}
}
But the above code throws this exception at SaveChanges:
The instance of entity type 'Type' cannot be tracked because another
instance with the same key value for {'Id'} is already being tracked.
When attaching existing entities, ensure that only one entity instance
with a given key value is attached. Consider using
'DbContextOptionsBuilder.EnableSensitiveDataLogging' to see the
conflicting key values.
I marked the Types entity AsNoTracking:
.ThenInclude(x => x.Types).AsNoTracking()
and the problem is solved, but I don't know why this exception is thrown. Some other threads mention that the DbContext might be used by another process, or might be registered as a singleton, but in my case it's registered as scoped.
The reason for the error is this line:
existingProp.ChargeFrom(prop);//maps the new values to the db entity
... which attempts to copy the untracked Types from prop into existingProp. Using AsNoTracking removes the exception, but it will most likely result in duplicated data on SaveChanges: duplicate Type rows inserted where Type uses an identity key, or duplicate-row exceptions otherwise. If you received no exception, I would check the Types collection to see if duplicate rows are appearing there.
When copying data across from an untracked entity to a tracked entity, you will want to ensure that only values, and not references, are copied across. If you copy an untracked reference across, EF will treat it as a new entity by default. Even if you force its state over to Modified, the DbContext could already be tracking another instance with that ID.
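To illustrate the difference with plain objects (no EF; Prop and its members are hypothetical stand-ins, not the asker's actual types). In EF6/EF Core, _context.Entry(existingProp).CurrentValues.SetValues(prop) performs the scalar-only copy for you, leaving navigation properties untouched:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical stand-in for the tracked entity loaded from the DbContext.
var tracked = new Prop { Id = Guid.NewGuid(), Name = "old name", Types = new List<string> { "Type1" } };
// Hypothetical stand-in for the untracked entity deserialized from the request body.
var incoming = new Prop { Id = tracked.Id, Name = "new name", Types = new List<string> { "Type1", "Type3" } };

// Value copy: safe. (In EF, entry.CurrentValues.SetValues(incoming) does this for scalar properties.)
tracked.Name = incoming.Name;

// Reference copy: NOT safe against a tracked entity. The tracked Prop would then point at
// untracked instances whose key values may already be tracked by the DbContext:
// tracked.Types = incoming.Types;

Console.WriteLine(tracked.Name);        // new name
Console.WriteLine(tracked.Types.Count); // 1

class Prop
{
    public Guid Id;
    public string Name;
    public List<string> Types;
}
```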
If Property.Types is a collection of references, such as an association to a lookup, and these could change where associations are added and removed, then to apply changes you need to load the associated types from the database then use that to remove associations that are no longer valid and add ones that aren't currently associated.
For example: given a Property (PropertyA) with Types (Type1) and (Type2), if we edit it to have (Type1) and (Type3), we need to fetch Type1 and Type3 from the DbContext (tracked), then compare against the tracked PropertyA to determine that Type2 should be removed and Type3 added.
var entity = await _context.Profiles
.Include(x => x.Properties)
.ThenInclude(x => x.Types)
.SingleAsync(x => x.Id == profile.Id);
// Get the IDs for all Types we want to associate... In the above example this would
// ask for Type1 and Type3 if only the one property. We get a Distinct list because
// multiple properties might reference the same TypeId(s).
var neededTypeIds = profile.Properties
.SelectMany(x => x.Types.Select(t => t.Id))
.Distinct()
.ToList();
// Load references to all Types that will be needed. Where associating new types, these will be referenced.
// Types is not exposed as a DbSet, so resolve it via Set<Type>().
var existingTypes = _context.Set<Type>()
.Where(x => neededTypeIds.Contains(x.Id))
.ToList();
foreach (var prop in profile.Properties)
{
var existingProp = entity.Properties.SingleOrDefault(x => x.Id == prop.Id);
if (existingProp == null)
continue;
var updatedTypeIds = prop.Types.Select(x => x.Id).ToList();
var existingTypeIds = existingProp.Types.Select(x => x.Id).ToList();
var addedTypeIds = updatedTypeIds.Except(existingTypeIds).ToList();
var removedTypeIds = existingTypeIds.Except(updatedTypeIds).ToList();
var addedTypes = existingTypes
.Where(x => addedTypeIds.Contains(x.Id))
.ToList();
var removedTypes = existingProp.Types
.Where(x => removedTypeIds.Contains(x.Id))
.ToList();
foreach(var removedType in removedTypes)
existingProp.Types.Remove(removedType);
foreach(var addedType in addedTypes)
existingProp.Types.Add(addedType);
}
_context.SaveChanges(); // persist once, after all properties have been reconciled
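The add/remove reconciliation above is plain LINQ set logic. Here is a standalone sketch of the Type1/Type2 to Type1/Type3 example using bare ID lists (no DbContext involved):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var existingTypeIds = new List<int> { 1, 2 }; // PropertyA currently has Type1 and Type2
var updatedTypeIds = new List<int> { 1, 3 };  // the incoming edit wants Type1 and Type3

var addedTypeIds = updatedTypeIds.Except(existingTypeIds).ToList();   // Type3: associate
var removedTypeIds = existingTypeIds.Except(updatedTypeIds).ToList(); // Type2: remove

Console.WriteLine(string.Join(",", addedTypeIds));   // 3
Console.WriteLine(string.Join(",", removedTypeIds)); // 2
```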
If instead the Type is a child row whose properties can be updated, then those values should be copied across from the updated data to the existing data state. This adds a considerable amount of work, though tools like AutoMapper can be configured to help. You still need to manage cases where Types can be added, removed, or have their contents changed. The same applies to Properties: your example only handles the case where a property is updated, not where one is added or removed.
Ultimately it can be beneficial to structure update scenarios to be as atomic as possible: rather than one update that changes an entire object graph of entities, properties, and types, have one update for just entity values, one for property values, and one for a single type update. The same applies to adding or removing a property or type. While it may look like more code to break operations up like this, it keeps each one simple and straightforward, rather than one big, complex method trying to compare the before and after to figure out what to add, remove, and update. Bugs hide in complex code, not simple methods. :)
When editing an object graph you should also avoid calling SaveChanges more than once. Instead of calling it within the loop over properties, call it once after the loop completes. The reason is that an exception on one of the properties would otherwise leave an incomplete/invalid data state persisted. If you have 4 properties in the object being saved and the 3rd fails with an exception for any reason, the first two are updated while the last two never persist. Generally, an update operation should follow an "all or nothing" approach to persistence.
Hopefully that helps explain the behavior you are seeing and gives you something to consider moving forward.
Related
My question is: why does Entity Framework Core automatically load nested collections when using .Include? From my understanding I am supposed to specify .ThenInclude, but even without it, they load.
The details
I have an unavoidable circular reference and specifically for that reason I avoided lazy loading, so that I could hard specify what should be included.
My data model - I have highlighted the circular reference:
A user should be able to subscribe to a bunch of categories.
A job should belong to one or more categories
What I want to achieve
I built an endpoint that takes 3 optional parameters, a userId, JobId and a collection of categories.
I want to return a collection of jobs filtered by the above parameters
My attempt that is causing the infinite loop
var user = await UserAccessor.GetAppUser();
var result = DataContext.Jobs
.Include(job => job.Customer)
.Include(job => job.Categories)
.AsNoTracking()
.AsQueryable();
if (request.JobId != null)
{
result = result.Where(job => job.Id == request.JobId);
}
if (request.UserEmail != null)
{
result = result.Where(job => job.Customer.Email == request.UserEmail);
}
if (request.Categories != null)
{
var categories = await CategoryService.ConvertStringToCategories(request.Categories);
result = result.Where(job => job.Categories.Intersect(categories).Any());
}
return Mapper.Map<List<Job>, List<JobDto>>(await result.ToListAsync());
If I put a result.ToList() anywhere in the code and look at the actual values, it seems to go into an infinite loop without me specifying .ThenInclude().
Have a look at the screenshot below; the infinite loop goes as follows:
Collection of jobs (this is the result variable)
Job has a property collection of category (this I included with .Include(job => job.Categories))
Category has a property collection of jobs (this should be null as I didn't include it with a .ThenInclude(), but it is not, arrrggg)
I'm having trouble making this next query work correctly
var result = await db.Set<Employee>()
.Include(ca => ca.Person)
.Include(ca=> ca.Person).ThenInclude(x=> x.GenderType)
.Include(ca=> ca.Position).ThenInclude(x=> x.Department)
.AsNoTracking()
.ToListAsync();
When this gets executed the Person entity is null, but only for new records. By this I mean there are already employees in the DB that were inserted directly with SQL, and for those records it works fine.
I thought the problem might be in how the records are saved, since two different entities are saved at the same time, with PersonId being the key correlating the Employee with the Person. Here is how it's done:
var person = await db.Set<Person>().AddAsync(obj.Person);
await db.SaveChangesAsync();
obj.PersonId = person.PersonId;
db.Entry(obj.Person).State = EntityState.Detached;
await db.Set<Employee>().AddAsync(obj);
await db.SaveChangesAsync();
I use EntityState.Detached since Employee.Person is already saved. This works fine for saving, but when I try to get all the Employee entities, the query returns null for Person even though Employee.PersonId is correct.
If I make a more "direct" query it works:
var query = from e in db.Set<Employee>()
join person in db.Set<Person>().AsNoTracking()
on new { e.PersonId, e.SubscriptionId}
equals new { person.PersonId, person.SubscriptionId}
select person;
So I'm sure the record is there; that's why I can't seem to find the problem.
PS: Sorry for the ambiguous question
UPDATE:
I realized why this was happening: the one-to-one FK configuration had an error. Since I didn't write that part of the code, I didn't notice it earlier.
This was the problem:
modelBuilder.Entity<Employee>(entity =>
{
entity.HasKey(e => new { e.EmployeeId, e.SubscriptionId });
entity.HasOne(d => d.Person)
.WithOne(p => p.Employee)
.HasForeignKey<Person>(d => new { d.PersonId, d.SubscriptionId })
.OnDelete(DeleteBehavior.ClientSetNull)
.HasConstraintName("FK_Employee_PersonId_SubscriptionId");
});
When it should have been like this
modelBuilder.Entity<Employee>(entity =>
{
entity.HasKey(e => new { e.EmployeeId, e.SubscriptionId });
entity.HasOne(d => d.Person)
.WithOne(p => p.Employee)
.HasForeignKey<Employee>(d => new { d.PersonId, d.SubscriptionId })
.OnDelete(DeleteBehavior.ClientSetNull)
.HasConstraintName("FK_Employee_PersonId_SubscriptionId");
});
As you can see, .HasForeignKey<Employee>(d => new { d.PersonId, d.SubscriptionId }) was HasForeignKey<Person>... I hope this can help someone facing the same problem.
I believe you may be over-complicating things with your detach + add, trying to manually ensure referenced entities are saved first. Under most normal scenarios, when entities are tracked normally, EF can manage this perfectly fine on its own. I also highly recommend defining your entities to use navigation properties or FK fields, not both: i.e. navigation properties plus shadow properties for the FK, or simply FK fields if you don't need any of the related entity's properties (or they are something like cached lookups). If you do use both, rely on the tracked navigation properties and do not set relationships by FK.
By using db.Entry(obj.Person).State = EntityState.Detached; you've basically told the DbContext to forget about that entity. I agree that if you later tell the DbContext to load an Employee with .Include(x => x.Person), it would be quite strange for that Person entity to be null. But perhaps you can avoid this "bug"/behavior entirely:
This code here is a smell:
var person = await db.Set<Person>().AddAsync(obj.Person);
await db.SaveChangesAsync();
obj.PersonId = person.PersonId;
EF manages FK assignments 100% automatically. When I see code like this, it hints at an SQL/ADO developer not trusting EF to manage the associations.
Taking the following simplified example code:
var person = new Person { Name = "Steve" };
var employee = new Employee { Title = "Developer", Person = person };
In SQL land, Employee has a PersonId, so we'd typically need to ensure the Person record is saved first, get its ID, and assign that to our Employee.PersonId column. Hence code like this:
context.Persons.Add(person);
context.SaveChanges(); // Generate PersonId.
employee.PersonId = person.PersonId;
context.Employees.Add(employee);
context.SaveChanges();
However, if the relationship is mapped in EF, this is completely unnecessary and can potentially lead to "already referenced" errors, which might be the reason behind messing with detaching entities. In reality, all you would need instead of all of the above is:
context.Employees.Add(employee);
context.SaveChanges();
When the employee is added, EF goes through the related entities. It finds a Person it doesn't know about, so it treats that as an added entity too. Because of the relationship mapping, it knows the Person must be inserted first, and Employee.PersonId is updated accordingly before the Employee is inserted.
Where people typically get tripped up when dealing with relationships is untracked instances. Let's say the "Person" record already exists and we're creating an employee we want to associate with Person ID#14. The most common example I see is when the Person was loaded from the DbContext, sent to the client, then passed back to the server, and developers assume it is still an "entity" rather than a deserialized POCO the DbContext has no knowledge of. For instance:
public void CreateEmployeeForPerson(Person person)
{
var employee = new Employee { Title = "Developer", Person = person };
Context.Employees.Add(employee);
Context.SaveChanges();
}
This ends up raising a confusing error that a row already exists. It is due to the person reference being treated as a new entity because it isn't tracked: EF wants to generate an INSERT statement for the Person as well as the Employee. By tinkering with attach state and AsNoTracking() on entity references you intend to use for updates and such, you can run into issues like this as well. It can be solved by using Attach to associate the instance with the context, though that is risky: if anything sets a modified state on it and the data has been tampered with by the client, unintentional changes could be persisted. Instead, we should look to always deal with tracked instances:
public void CreateEmployeeForPerson(int personId)
{
var person = Context.Persons.Single(x => x.PersonId == personId);
var employee = new Employee { Title = "Developer", Person = person };
Context.Employees.Add(employee);
Context.SaveChanges();
}
The person reference is known/tracked by the DbContext, and we've asserted that the PersonId actually exists in the DB via the Single call. Now when the Employee is added, its .Person reference points at an instance known to the DbContext, so EF just generates the appropriate INSERT statement for the Employee only.
While this might not pinpoint why .Include was not including your newly created Person instances, hopefully this can help simplify your persistence code overall and avoid weird behaviour around detached entities.
I'm trying to load some user info in code, but EF returns null.
foreach (var user in allAutoUsers)
{
Wallet wallet = db.CabinetUsers
.Find(user.IdCabinetUser)?
.Wallets
.FirstOrDefault(x => x.TypeCurrency == currency);
}
The user variable reports 1 wallet, but when I try to get it in the code above, it returns null.
Are there some ways to solve this problem?
Read up on LINQ expressions rather than relying on Find. You can run into issues if the relationships between your entities are not declared as virtual, which prevents EF from lazy loading them.
The issue with .Find() is that it returns the entity if it exists; however, accessing any related property the DbContext isn't already aware of requires a lazy-load call. That is easily missed if lazy loading is disabled or the member isn't virtual, and it becomes a performance issue when it is enabled.
Instead, LINQ allows you to query through the object graph directly to get what you want:
foreach (var user in allAutoUsers)
{
Wallet wallet = db.CabinetUsers
.Where(x => x.IdCabinetUser == user.IdCabinetUser)
.SelectMany(x => x.Wallets)
.SingleOrDefault(x => x.TypeCurrency == currency);
// do stuff with wallet...
}
This assumes that there will be only 1 wallet for the specified currency per user. When expecting 0 or 1, use SingleOrDefault. Use FirstOrDefault only when you are expecting 0 or many, want the "first" one and have specified an OrderBy clause to ensure the first item is predictable.
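As an aside, that Single-vs-First contract can be sanity-checked with plain LINQ-to-objects (illustrative data only; in-memory LINQ throws the same InvalidOperationException that EF does):

```csharp
using System;
using System.Linq;

var wallets = new[]
{
    new { User = 1, Currency = "USD" },
    new { User = 1, Currency = "EUR" },
    new { User = 2, Currency = "USD" },
};

// SingleOrDefault enforces "0 or 1": more than one match throws.
bool threw = false;
try { wallets.SingleOrDefault(w => w.Currency == "USD"); }
catch (InvalidOperationException) { threw = true; }
Console.WriteLine(threw); // True: two USD wallets exist

// FirstOrDefault is only predictable with an explicit ordering.
var first = wallets.Where(w => w.Currency == "USD").OrderBy(w => w.User).FirstOrDefault();
Console.WriteLine(first.User); // 1
```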
The per-user query above results in one query per user. To accomplish it with a single query for all users:
var userIds = allAutoUsers.Select(x => x.IdCabinetUser).ToList();
var userWallets = db.CabinetUsers
.Where(x => userIds.Contains(x.IdCabinetUser))
.Select(x => new
{
x.IdCabinetUser,
Wallet = x.Wallets
.SingleOrDefault(w => w.TypeCurrency == currency)
}).ToList();
From this I would consider expanding the Wallet projection with a Select for just the details in the Wallet you actually care about rather than a reference to the entire Wallet entity. This has the benefit of speeding up the query, reducing memory use, and avoiding lazy-load calls tripping things up if Wallet references any other entities that get touched later on.
For example if you only need the IdWallet, WalletName, TypeCurrency, and Balance:
// replace this line from above...
Wallet = x.Wallets
.SingleOrDefault(w => w.TypeCurrency == currency)
// with ...
Wallet = x.Wallets
.Where(w => w.TypeCurrency == currency)
.Select(w => new
{
w.IdWallet,
w.WalletName,
w.TypeCurrency,
w.Balance
}).SingleOrDefault() // ...
From there you can foreach to your heart's content without extra queries:
foreach ( var userWallet in userWallets)
{
// do stuff with userWallet.Wallet and userWallet.IdCabinetUser.
}
If you want to return the wallet details to a calling method or view, you cannot use an anonymous type for that (new { }). Instead, define a simple class for the data you want to return and Select into that, i.e. new WalletDTO { IdWallet = w.IdWallet, /* ... */ }. Even when you use entities, it is recommended to reduce them into DTOs or ViewModels rather than returning entities. Entities should not "live" longer than the DbContext that spawned them, otherwise you get all kinds of nasty behaviour popping up, like ObjectDisposedException and serialization exceptions.
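A minimal sketch of that DTO approach, using a hypothetical WalletDTO and in-memory stand-ins for the entities (the same Select shape works inside an EF query, where it also narrows the SQL to just these columns):

```csharp
using System;
using System.Linq;

// In-memory stand-ins for Wallet entities; InternalAudit stands for
// data you would NOT want leaking out of the data layer.
var wallets = new[]
{
    new { IdWallet = 1, WalletName = "Main", TypeCurrency = "USD", InternalAudit = "not for clients" },
};

var dtos = wallets
    .Select(w => new WalletDTO { IdWallet = w.IdWallet, WalletName = w.WalletName })
    .ToList();

Console.WriteLine($"{dtos[0].IdWallet} {dtos[0].WalletName}"); // 1 Main

public class WalletDTO
{
    public int IdWallet { get; set; }
    public string WalletName { get; set; }
}
```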
While debugging some performance issues I discovered that Entity framework was loading a lot of records via lazy loading (900 extra query calls ain't fast!) but I was sure I had the correct include. I've managed to get this down to quite a small test case to demonstrate the confusion I'm having, the actual use case is more complex so I don't have a lot of scope to re-work the signature of what I'm doing but hopefully this is a clear example of the issue I'm having.
Documents have Many MetaInfo rows related. I want to get all documents grouped by MetaInfo rows with a specific value, but I want all the MetaInfo rows included so I don't have to fire off a new request for all the Documents MetaInfo.
So I've got the following Query.
ctx.Configuration.LazyLoadingEnabled = false;
var DocsByCreator = ctx.Documents
.Include(d => d.MetaInfo) // Load all the metaInfo for each object
.SelectMany(d => d.MetaInfo.Where(m => m.Name == "Author") // For each Author
.Select(m => new { Doc = d, Creator = m })) // Create an object with the Author and the Document they authored.
.ToList(); // Actualize the collection
I expected this to have all the Document/Author pairs, and to have each Document's MetaInfo property filled.
That's not what happens, I get the Document objects, and the Authors just fine, but the Documents MetaInfo property ONLY has MetaInfo objects with Name == "Author"
If I move the Where clause out of the SelectMany it does the same, unless I move it to after the materialisation (which, while not a big deal here, is in the real application, as it means we're pulling a huge amount more data than we want to deal with).
After playing with a bunch of different ways to do this, it really looks like the issue is when you combine a Select(... new ...) with the Where and the Include. Doing the Select or the Where after materialisation makes the data appear the way I expected it to.
I figured it was an issue with the MetaInfo property of Document being filtered, so I rewrote it as follows to test the theory, and was surprised to find that this also gives the same (I think wrong) result.
ctx.Configuration.LazyLoadingEnabled = false;
var DocsByCreator = ctx.Meta
.Where(m => m.Name == "Author")
.Include(m => m.Document.MetaInfo) // Load all the metaInfo for Document
.Select(m => new { Doc = m.Document, Creator = m })
.ToList(); // Actualize the collection
Since we're not putting the Where on the Document.MetaInfo property, I expected this to bypass the problem, but strangely it doesn't; the documents still only appear to have "Author" MetaInfo objects.
I've created a simple test project with a bunch of test cases and uploaded it to GitHub. As far as I can tell they should all pass, but only the ones with premature materialisation pass.
https://github.com/Robert-Laverick/EFIncludeIssue
Anyone got any theories? Am I abusing EF / SQL in some way I'm missing? Is there anything I can do differently to get the same organisation of results? Is this a bug in EF that's just been hidden from view by the LazyLoad being on by default, and it being a bit of an odd group type operation?
This is a limitation in EF in that Includes will be ignored if the scope of the entities returned is changed from where the include was introduced.
I couldn't find the reference to this for EF6, but it is documented for EF Core (https://learn.microsoft.com/en-us/ef/core/querying/related-data; see "Ignored includes"). I suspect it is a limit in place to stop EF's SQL generation from going completely AWOL in certain scenarios.
So while var docs = context.Documents.Include(d => d.Metas) would return the Metas eager-loaded against each Document, as soon as you .SelectMany() you are changing what EF is supposed to return, so the Include statement is ignored.
If you want to return all documents, and include a property that is their author:
var DocsByCreator = ctx.Documents
.Include(d => d.MetaInfo)
.ToList() // Materialize the documents and their Metas.
.SelectMany(d => d.MetaInfo.Where(m => m.Name == "Author") // For each Author
.Select(m => new { Doc = d, Creator = m })) // Create an object with the Author and the Document they authored.
.ToList(); // grab your collection of Doc and Author.
If you only want documents that have authors:
var DocsByCreator = ctx.Documents
.Include(d => d.MetaInfo)
.Where(d => d.MetaInfo.Any(m => m.Name == "Author"))
.ToList() // Materialize the documents and their Metas.
.SelectMany(d => d.MetaInfo.Where(m => m.Name == "Author") // For each Author
.Select(m => new { Doc = d, Creator = m })) // Create an object with the Author and the Document they authored.
.ToList(); // grab your collection of Doc and Author.
This means you will want to be sure that all of your filtering logic is done above that first .ToList() call. Alternatively, you can resolve the Author meta after the query, such as when view models are populated, or with an unmapped "Author" property on Document that resolves it. Though I generally avoid unmapped properties, because if their use slips into an EF query you get a nasty error at runtime.
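To see why filtering after materialisation behaves differently: once the documents are in memory, the SelectMany only shapes the Doc/Creator pairs and can no longer affect what was loaded. A plain-objects sketch (hypothetical data, as if already materialised by the first ToList()):

```csharp
using System;
using System.Linq;

// Hypothetical in-memory documents, as if returned by the first ToList().
var docs = new[]
{
    new
    {
        Title = "Doc1",
        Metas = new[]
        {
            new { Name = "Author", Value = "Steve" },
            new { Name = "Keyword", Value = "EF" },
        }
    },
};

// The Where only filters the pairing; it cannot "unload" Metas.
var docsByCreator = docs
    .SelectMany(d => d.Metas.Where(m => m.Name == "Author")
        .Select(m => new { Doc = d, Creator = m.Value }))
    .ToList();

Console.WriteLine(docsByCreator[0].Creator);          // Steve
Console.WriteLine(docsByCreator[0].Doc.Metas.Length); // 2
```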
Edit: Based on the requirement to skip & take I would recommend utilizing view models to return data rather than returning entities. Using a view model you can instruct EF to return just the raw data you need, compose the view models with either simple filler code or utilizing Automapper which plays nicely with IQueryable and EF and can handle most deferred cases like this.
For example:
public class DocumentViewModel
{
public int DocumentId { get; set; }
public string Name { get; set; }
public ICollection<MetaViewModel> Metas { get; set; } = new List<MetaViewModel>();
[NotMapped]
public string Author // This could be updated to be a Meta, or a specialized view model.
{
get { return Metas.SingleOrDefault(x => x.Name == "Author")?.Value; }
}
}
public class MetaViewModel
{
public int MetaId { get; set; }
public string Name { get; set; }
public string Value { get; set; }
}
Then the query:
var viewModels = context.Documents
.Select(x => new DocumentViewModel
{
DocumentId = x.DocumentId,
Name = x.Name,
Metas = x.Metas.Select(m => new MetaViewModel
{
MetaId = m.MetaId,
Name = m.Name,
Value = m.Value
}).ToList()
}).Skip(pageNumber*pageSize)
.Take(pageSize)
.ToList();
The relationship of an "author" to a document is implied, not enforced, at the data level. This solution keeps the entity models "pure" to the data representation and lets the code handle transforming that implied relationship into exposing a document's author.
The .Select() population can be handled by Automapper using .ProjectTo<TViewModel>().
By returning view models rather than entities you avoid issues like this where .Include() operations get invalidated, avoid the temptation of detaching and reattaching entities between different contexts, and improve performance and resource usage by selecting and transmitting only the data needed; you also avoid lazy-load serialization issues if you forget to disable lazy loading, and unexpected null data when you do.
Consider following LINQ query:
var item = (from obj in _db.SampleEntity.Include(s => s.NavProp1)
select new
{
ItemProp1 = obj,
ItemProp2 = obj.NavProp2.Any(n => n.Active)
}).SingleOrDefault();
This runs as expected, but item.ItemProp1.NavProp1 is NULL.
As explained here, this is because the query actually changes after using Include(). But the question is: what is the solution in this situation?
Edit:
When I change the query like this, every things works fine:
var item = (from obj in _db.SampleEntity.Include(s => s.NavProp1)
select obj).SingleOrDefault();
From this article I can guess what the problem is... but the solution provided by the author does not work in my situation (because the final select uses an anonymous type rather than the entity type).
As you mentioned, Include is only effective when the final result of the query consists of the entities that should include the Include-d navigation properties.
So in this case Include has effect:
var list = _db.SampleEntity.Include(s => s.NavProp1).ToList();
The SQL query will contain a JOIN and each SampleEntity will have its NavProp1 loaded.
In this case it has no effect:
var list = _db.SampleEntity.Include(s => s.NavProp1)
.Select(s => new { s })
.ToList();
The SQL query won't even contain a JOIN; EF completely ignores the Include.
If in the latter query you want the SampleEntitys to contain their NavProp1s you can do:
var list = _db.SampleEntity
.Select(s => new { s, s.NavProp1 })
.ToList();
Now Entity Framework has fetched SampleEntitys and NavProp1 entities from the database separately, but it glues them together by a process called relationship fixup. As you see, the Include is not necessary to make this happen.
However, if Navprop1 is a collection, you'll notice that...
var navprop1 = list.First().s.Navprop1;
...will still execute a query to fetch Navprop1 by lazy loading. Why is that?
While relationship fixup does fill Navprop1 properties, it doesn't mark them as loaded. That only happens when Include loaded the properties. So now we have SampleEntitys all having their Navprop1s, but you can't access them without triggering lazy loading. The only thing you can do to prevent this is
_db.Configuration.LazyLoadingEnabled = false;
var navprop1 = list.First().s.Navprop1;
(or by preventing lazy loading by disabling proxy creation or by not making Navprop1 virtual.)
Now you'll get Navprop1 without a new query.
For reference navigation properties this doesn't apply; lazy loading isn't triggered for them even when it's enabled.
In Entity Framework core, things have changed drastically in this area. A query like _db.SampleEntity.Include(s => s.NavProp1).Select(s => new { s }) will now include NavProp1 in the end result. EF-core is smarter in looking for "Includable" entities in the end result. Therefore, we won't feel inclined to shape a query like Select(s => new { s, s.NavProp1 }) in order to populate the navigation property. Be aware though, that if we use such a query without Include, lazy loading will still be triggered when s.NavProp1 is accessed.
I know this will probably get a few laughs, but don't forget the obvious like I just did: the row in the database didn't actually have a foreign key reference! I should have checked the damn data first before thinking EF Include wasn't working. Grrr. 30 minutes of my life I won't get back.
If your model is defined properly it should work without any problems.
using System.Data.Entity;
var item = _db.SampleEntity
.Include(p => p.NavigationProperty)
.Select(p => new YourModel{
PropertyOne = p.Something,
PropertyTwo = p.NavigationProperty.Any(x => x.Active)
})
.SingleOrDefault(p => p.Something == true);
How did you find that item.ItemProp1.NavProp1 is null? EF uses proxies to load required properties when you try to access them.
What about
var item = (from obj in _db.SampleEntity.Include(s => s.NavProp1)
select obj).SingleOrDefault();
Assert.IsNotNull(item.NavProp1);
Assert.IsNotNull(item.NavProp2);
You can also try with
var item = (from obj in _db.SampleEntity.Include(s => s.NavProp1)
select new
{
ItemProp1 = obj,
NavProp1 = obj.NavProp1,
ItemProp2 = obj.NavProp2.Any(n => n.Active)
}).SingleOrDefault();
Assert.IsNotNull(item.NavProp1)
Of course I assume that you don't have any problems with EF navigation property mappings.