I have the following issue: I'm using a generic repository that detaches and reattaches the entity on the update operation to prevent it from being tracked by two different objects.
This is how the Update operation looks:
public void Update(TEntity entityToUpdate)
{
if (this.context.Entry(entityToUpdate).State != EntityState.Detached)
{
this.context.Entry(entityToUpdate).State = EntityState.Detached;
}
this.context.Entry(entityToUpdate).State = EntityState.Modified;
}
I'm using two entities, let's say User and Course:
public class User
{
public Guid UserId { get; set; }
public string Email { get; set; }
public ICollection<Course> Courses { get; set; }
}
and Course
public class Course
{
public string Name { get; set; }
public Guid CourseId { get; set; }
public virtual ICollection<User> Users { get; set; }
}
public class CourseDto
{
public string Name { get; set; }
public Guid CourseId { get; set; }
public virtual ICollection<Guid> Users { get; set; }
}
and I try to update the course's users with the following code:
public async Task<Course> Update(Guid existingCourseId, CourseDto courseModel)
{
var course = await courseRepository.Query(true)
.Include(c=> c.Users)
.Where(e => e.CourseId == existingCourseId )
.FirstOrDefaultAsync();
if (course == null)
{
return null;
}
course.Users = await FindUsersByIds(courseModel.Users);
course.Name = courseModel.Name;
courseRepository.Update(course);
await this.unitOfWork.SaveChangesAsync();
return course;
}
It doesn't work when I want to, for example, update only the Name property.
If the Users property doesn't change and there is at least one user, it tries to insert a record into the CourseUser join table, violating the primary key constraint, instead of 'noticing' that the row already exists in the database.
Edit:
Additionally, when I set Entry(entityToUpdate).State = EntityState.Unchanged before changing it to Modified, and move the repository.Update() call before overwriting the entity properties, everything works fine. If somebody could explain this behaviour to me I would be really grateful.
In a nutshell: simply don't do this detaching and setting the state to Modified. You're already doing the right thing by transporting the details to update via a DTO. Let EF's change tracking do its thing and just call SaveChanges().
I'm not sure what concern warrants the Detach to "prevent accessing". All this does is tell the DbContext to treat the item as untracked, but by then setting the state to Modified you immediately start tracking it again. The main reason you don't want to do this is that setting the entity state to "Modified" tells EF to generate an UPDATE statement for the entire record, regardless of whether any value actually changed, rather than an optimized UPDATE for only the values it detected had changed, and only if they had changed, as the sketch below illustrates.
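To make the difference concrete, here is a minimal sketch. It assumes a DbContext field named context that exposes a Courses DbSet (names chosen for illustration, not taken from your repository):

// Tracked entity: change tracking records that only Name changed,
// so SaveChanges issues UPDATE ... SET Name = ... for that one column.
var course = await context.Courses
    .SingleOrDefaultAsync(c => c.CourseId == existingCourseId);
course.Name = courseModel.Name;
await context.SaveChangesAsync();

// Detach + EntityState.Modified instead: EF no longer knows what changed,
// so it generates an UPDATE that writes every mapped column of the row.
context.Entry(course).State = EntityState.Modified;
await context.SaveChangesAsync();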
The issue you are likely seeing is because of this line:
course.Users = await FindUsersByIds(courseModel.Users);
I would bet money that your FindUsersByIds method is returning a non-tracked set of User entities, either via AsNoTracking() or by detaching those user entities before returning them. You have already eager loaded the users for the course you loaded. You might think assigning the collection tells EF to remove the existing references and replace them with the desired ones, but it doesn't. It just "adds" a set of Users that EF treats as new associations, whereby some of those associations already exist in the database.
As a general rule when working with entities: never, under any circumstances, reset/overwrite a collection of related entities, neither to clear it nor to change the values associated with an entity.
If the model contains a revised list of IDs, the proper way to update the set is to identify any users that need to be added and any that need to be removed, load references for the ones to add, and remove the others from the eager loaded relationship. Also, where you expect 0 or 1 result, use SingleOrDefault rather than FirstOrDefault. First methods should only be used with an OrderBy clause where more than one entry is possible and you want a reliable, repeatable first result. The exception is LINQ against in-memory ordered sets where you can guarantee one unique match; there First() performs faster, while Single() scans the entire set. With EF-generated queries, First() produces a TOP(1) SQL statement while Single() produces a TOP(2) statement so it can enforce that expectation, so the performance difference shrinks to almost nothing.
var course = await courseRepository.Query(true)
.Include(c=> c.Users)
.Where(e => e.CourseId == existingCourseId)
.SingleOrDefaultAsync();
if (course == null)
return null;
var existingUserIds = course.Users.Select(u => u.UserId);
var userIdsToAdd = courseModel.Users.Except(existingUserIds).ToList();
var userIdsToRemove = existingUserIds.Except(courseModel.Users).ToList();
if (userIdsToRemove.Any())
{
var usersToRemove = course.Users
.Where(u => userIdsToRemove.Contains(u.UserId))
.ToList();
foreach(var user in usersToRemove)
course.Users.Remove(user);
}
if (userIdsToAdd.Any())
{
var usersToAdd = await FindUsersByIds(userIdsToAdd); // Important! Ensure this does NOT detach or use AsNoTracking()
foreach(var user in usersToAdd)
course.Users.Add(user);
}
course.Name = courseModel.Name;
await this.unitOfWork.SaveChangesAsync();
return course;
Basically this inspects the IDs selected to pick out the ones to add and remove, then proceeds to modify the eager loaded collection if needed. The change tracking will take care of determining what, if any SQL needs to be generated.
If the Users collection is exposed as List<User>, you can use AddRange, and RemoveAll with a predicate, rather than the foreach loops; a sketch follows. Those methods aren't available when the collection is exposed as IList<User>, ISet<User>/HashSet<User>, or ICollection<User>.
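For example, a rough equivalent of the loops above, assuming Course.Users is declared as List<User>:

course.Users.RemoveAll(u => userIdsToRemove.Contains(u.UserId)); // List<T>.RemoveAll
course.Users.AddRange(usersToAdd);                               // List<T>.AddRange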
Edit:
Based on the error mentioned, I would suggest at least temporarily removing some variables from the equation, namely the particular implementation of the Repository and Unit of Work patterns, and working with a scoped DbContext to begin with.
For a start try this edit:
using (var context = new AppDbContext())
{
var course = await context.Courses
.Include(c=> c.Users)
.Where(e => e.CourseId == existingCourseId)
.SingleOrDefaultAsync();
if (course == null)
return null;
var existingUserIds = course.Users.Select(u => u.UserId);
var userIdsToAdd = courseModel.Users.Except(existingUserIds).ToList();
var userIdsToRemove = existingUserIds.Except(courseModel.Users).ToList();
if (userIdsToRemove.Any())
{
var usersToRemove = course.Users
.Where(u => userIdsToRemove.Contains(u.UserId))
.ToList();
foreach(var user in usersToRemove)
course.Users.Remove(user);
}
if (userIdsToAdd.Any())
{
var usersToAdd = await FindUsersByIds(userIdsToAdd); // Important! Ensure this does NOT detach or use AsNoTracking()
foreach(var user in usersToAdd)
course.Users.Add(user);
}
course.Name = courseModel.Name;
await context.SaveChangesAsync();
context.Entry(course).State = EntityState.Detached; // Necessary evil as this method is returning an entity.
return course;
}
This is only intended as a temporary measure to help identify whether your repository or unit of work could be leading to transient DbContexts being used to load and track entities. It could still have issues depending on what happens to the Course after this method returns it; in this case, since we are leaving the scope of the DbContext instance that is tracking it, we detach it. The next step would be to ensure that a DbContext, either directly or through the unit of work, can be injected and is guaranteed to be scoped to the web request (if web) or to a scope suited to the unit of work this operation is part of. (That's a bit more work for things like WPF desktop applications.)
Normally, for something like a web application, you want to ensure that the DbContext has a lifetime scoped to the web request (or shorter) but not transient (see the registration sketch below). We want all operations within a unit of work to reference the same DbContext instance; otherwise you end up working with entities that might be tracked by multiple DbContexts, or start introducing code that mixes tracked and untracked (detached) entities to get around problems when passing entity references around. Working with detached entities requires a lot of extra boilerplate and disciplined coding to ensure DbContexts are working with the correct, single reference to each entity, to avoid "already tracked" errors or duplicate data insertion / PK violation exceptions; i.e. pre-checking each and every DbSet.Local in an entity graph of related entities for any currently tracked instance with the same ID, and replacing references when attaching an entity graph to a DbContext that isn't already tracking that instance.
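As an illustration only, here is roughly what a per-request registration can look like. This sketch assumes ASP.NET Core with EF Core, where AddDbContext registers the context with a scoped (per-request) lifetime by default; AppDbContext, IUnitOfWork, UnitOfWork, and IRepository<> are placeholder names, not types from the question:

// Program.cs: one DbContext instance per web request, shared by every
// repository and unit of work resolved during that request.
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseSqlServer(builder.Configuration.GetConnectionString("Default")));
builder.Services.AddScoped<IUnitOfWork, UnitOfWork>();
builder.Services.AddScoped(typeof(IRepository<>), typeof(Repository<>));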
I have already read many posts about the Entity Framework many-to-many problem, and it's being a pain in the neck again.
Colaborador has an ICollection of Time and
Time has an ICollection of Colaborador.
Some people say it's necessary to attach the child entity before Add on the context (that didn't work for me: PK error).
I am using simple injector and my context is per request.
My associative table is mapped like this:
HasMany<Time>(c => c.Times)
.WithMany(t => t.Colaboradores)
.Map(ct =>
{
ct.MapLeftKey("ColaboradorId");
ct.MapRightKey("TimeId");
ct.ToTable("Receptor");
});
It creates the associative table in the database.
When I try to insert a Colaborador (entity), I add some Times (teams) to its list, add it to the DbContext and then SaveChanges().
When I do this, it creates a new Colaborador and inserts correctly into the associative table (the ids), but it also duplicates the Time.
var colaborador = Mapper.Map<ColaboradorViewModel, Colaborador>(colaboradorVm);
List<TimeViewModel> timesVm = new List<TimeViewModel>();
colaboradorVm.TimesSelecionados.ForEach(t => timesVm.Add(_serviceTime.BuscarPorId(t)));
colaborador.Times = Mapper.Map<ICollection<TimeViewModel>, ICollection<Time>>(timesVm);
The function BuscarPorId calls the Find method and returns a Time.
I have figured out that if I call Add, the entity marks the children's state as Added as well, but if I attempt to attach the Time or change its state to Unchanged, I get a primary key error...
foreach (var item in colaborador.Times)
{
lpcContext.Set<Time>().Attach(item);
//lpcContext.Entry(item).State = EntityState.Unchanged;
}
Is there any way to tell Entity Framework not to insert a specific child, so only the main and associative tables are populated?
Mapper.Map creates new Time objects which are not attached to your context, so you must attach them as Unchanged, but attaching them causes another error due to a duplicate PK because your context is already tracking the original copies of the Time entities. Using the Find method will retrieve these tracked and locally cached entities.
Find and use the entity already attached to your context:
Instead of:
colaborador.Times = Mapper.Map<ICollection<TimeViewModel>, ICollection<Time>>(timesVm);
Use:
var times = new List<Time>();
var dbSet = lpcContext.Set<Time>();
foreach( var t in timesVm )
{
var time = dbSet.Find( t.Id );
if( null == time )
{
time = Mapper.Map<TimeViewModel, Time>( t );
}
times.Add( time );
}
colaborador.Times = times;
I'm using an SQLite database and System.Data.SQLite 1.0.92.
There are 2 tables here:
Table Person:
PersonId
PersonName
Table Student:
StudentId
PersonId (FK referencing table Person)
StudentNo
Now every time I get the Persons Collection in EF5:
using (var ctx = new myEntities())
{
AllPersons = ctx.Persons.ToList();
}
Each person's Students collection is also included in the result, but I don't need it. Of course that's just an example; there are a lot of big tables with many references, and this always causes performance problems.
So I'm trying to keep the references out of my result, and I changed it to:
using (var ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
ctx.Configuration.LazyLoadingEnabled = false;
AllPersons= ctx.Persons.ToList();
}
Now it's fine, because the Students collections are always null.
But then I found that if I get Person and Student together:
using (var ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
ctx.Configuration.LazyLoadingEnabled = false;
AllPersons= ctx.Persons.ToList();
AllStudents = ctx.Student.ToList();
}
Now the references are still included.
So is there any way to prevent the references from being included in this situation?
Thank you.
Update
At some people's request, I'll explain why I need this:
1: When I convert it to JSON it becomes an endless loop. Even though I already use Json.NET's ReferenceLoopHandling, the JSON is big enough to crash the server (with no references, it's just a very small JSON).
2: Every time I get the client data and need to save it, it throws an exception about the model state until I set the collection to null.
Example:
using (myEntities ctx = new myEntities())
{
ctx.Configuration.LazyLoadingEnabled = false;
ctx.Configuration.ProxyCreationEnabled = false;
Person model= ThisIsAModel();
model.students = null; // This is the key: I need to set the students collection reference to null, otherwise it throws an exception
ctx.Entry(model).State = EntityState.Modified;
ctx.SaveChanges();
}
3: This is the most important problem. I already fetch all the data and cache it on the server, but it makes the loading time very long when the server starts (because there is so much data and so many references; that is the main problem). I don't know what kind of problem I'll run into next...
public List<Person> PersonsCache; // global cache
public List<Student> StudentsCache; // global cache
using (myEntities ctx = new myEntities())
{
ctx.Configuration.LazyLoadingEnabled = false;
ctx.Configuration.ProxyCreationEnabled = false;
// There are so many references and so much data that building the cache for the first time is very slow, even when I only get the Person model, just because some collections have reference problems. It is very slow...
PersonsCache = ctx.Persons.ToList();
StudentsCache= ctx.Student.ToList();
}
The Problem
As you said, when you load both the Parent and Child lists, even with lazy loading disabled, and then look at parent.Childs, you see the child items have been loaded too.
var db = new YourDbContext();
db.Configuration.LazyLoadingEnabled = false;
var parentList= db.YourParentSet.ToList();
var childList= db.YourChildSet.ToList();
What happened? Why are the children included in a parent?
The children under a parent entity are exactly the ones you loaded using db.YourChildSet.ToList(). In fact, Entity Framework never loads the children for a parent again, but because of the relation between parent and child in the edmx, they are listed there.
Does that affect performance?
Since the children are only loaded once, it has no impact on performance as far as loading data is concerned.
But for serialization's sake, or something else, how can I get rid of it?
You can use these solutions:
Solution 1:
Use 2 different instance of YourDbContext:
var db1 = new YourDbContext();
db1.Configuration.LazyLoadingEnabled = false;
var parentList = db1.YourParentSet.ToList();
var db2 = new YourDbContext();
db2.Configuration.LazyLoadingEnabled = false;
var childList = db2.YourChildSet.ToList();
Now when you look at parent.Childs there are no children in it.
Solution 2:
Use projection and shape your output as you wish.
var db1 = new YourDbContext();
db1.Configuration.LazyLoadingEnabled = false;
var parentList = db1.YourParentSet
.Select(x=>new /*Model()*/{
Property1=x.Property1,
Property2=x.Property2, ...
}).ToList();
This way there is nothing annoying left when serializing.
Using a custom model class is optional and in some cases recommended.
Additional Resources
As a developer who uses Entity Framework, reading these resources is strongly recommended:
Performance Considerations for Entity Framework 4, 5, and 6
Connection Management
I'll focus on your third problem because that seems to be your most urgent problem. Then I'll try to give some hints on the other two problems.
There are two Entity Framework features you should be aware of:
When you load data into a context, Entity Framework will try to connect the objects wherever they're associated. This is called relationship fixup, and you can't stop EF from doing it. So if you load Persons and Students separately, a Person's Students collection will contain students even though you didn't Include() them (demonstrated in the sketch after these two points).
By default, a context caches all data it fetches from the database. Moreover, it stores meta data about the objects in its change tracker: copies of their individual properties and all associations. So by loading many objects the internal cache grows, but also the size of the meta data. And the ever-running relationship fixup process gets slower and slower (although it may help to postpone it by turning off automatic change detection). All in all, the context gets bloated and slow like a flabby rhino.
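A quick way to see relationship fixup in action, using the entity names from the question (EF5/EF6-style DbContext assumed; the Students navigation property name is an assumption):

using (var ctx = new myEntities())
{
    ctx.Configuration.LazyLoadingEnabled = false;
    var persons = ctx.Persons.ToList();   // Students collections are still empty here
    var students = ctx.Student.ToList();  // fixup now wires each tracked Student to its Person
    // persons[0].Students is populated even though it was never Include()-d,
    // because both sides are tracked by the same context.
}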
I understand you want to cache data in separate collections for each entity. Two simple modifications will make this much quicker:
Evade the otherwise inevitable relationship fixup by loading each collection with a separate context
Stop caching (in the context) and change tracking by getting the data with AsNoTracking.
Doing this, your code will look like this:
public List<Person> PersonsCache;
public List<Student> StudentsCache;
using (myEntities ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
PersonsCache = ctx.Persons
.AsNoTracking()
.ToList();
}
using (myEntities ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
StudentsCache= ctx.Student
.AsNoTracking()
.ToList();
}
The reason for turning off ProxyCreationEnabled is that you'll get light objects and that you'll never inadvertently trigger lazy loading afterwards (throwing an exception that the context is no longer available).
Now you'll have cached objects that are not inter-related and that get fetched as fast as it gets with EF. If this isn't fast enough you'll have to resort to other tools, like Dapper.
By the way, your very first code snippet and problem description...
using (var ctx = new myEntities())
{
AllPersons = ctx.Persons.ToList();
}
Each person's Students collection is also included in the result;
...suggest that Entity Framework spontaneously performs eager loading (of students) without you Include-ing them. I have to assume that your code snippet is not complete. EF never, ever automatically executes eager loading. (Unless, maybe, you have some outlandish and buggy query provider).
As for the first problem, the serialization: you should be able to tackle that in a similar way as shown above. Just load the data you want to serialize in isolation and disable proxy creation, or, as suggested by others, serialize view models or anonymous types containing exactly what you need (a sketch follows).
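A rough sketch of that idea, assuming Json.NET (which the question already uses) and a hypothetical PersonDto view model:

List<PersonDto> dtos;
using (var ctx = new myEntities())
{
    dtos = ctx.Persons
        .AsNoTracking()
        .Select(p => new PersonDto { PersonId = p.PersonId, PersonName = p.PersonName })
        .ToList();
}
var json = JsonConvert.SerializeObject(dtos); // no proxies, no reference loops, small payload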
As for the second problem, the validation exception: I can only imagine this happening if you initialize a students collection with default, empty Student objects. These are bound to be invalid. If this is not the case, I suggest you ask a new question about this specific problem, showing ample detail about the involved classes and mappings. That shouldn't be dealt with in this question.
Explicitly select what you want to return from the database.
Use select new. With the select new clause, you can create new objects of an anonymous type as the result of a query and keep the references out. This syntax allows you to construct anonymous data structures, which are created as they are evaluated (lazily). Like this:
using (var ctx = new myEntities())
{
var AllPersons = ctx.Persons.Select(c => new { c.PersonId, c.PersonName }).ToList();
}
And you don't even need to disable lazy loading anymore.
After running the query above, the result contains only the selected properties.
This query allocates an anonymous type using select new { }, which requires you to use var. If you want to allocate a known type, add it to your select clause:
private IEnumerable<MyClass> AllPersons;//global variable
using (var ctx = new myEntities())
{
AllPersons = ctx.People
.Select(c => new MyClass { PersonId = c.PersonId, PersonName = c.PersonName }).ToList();
}
And:
public class MyClass
{
public string PersonId { get; set; }
public string PersonName { get; set; }
}
If the entities are auto-generated, copy and paste them into your own code and remove the generated relations, such as the child collection and the foreign key. Or, if you don't need all of this functionality, you could use a lightweight framework like Dapper (rough sketch below).
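A rough sketch of the Dapper route, assuming the System.Data.SQLite connection from the question, a connectionString placeholder, and the Person table's columns (add using Dapper; for the Query extension method):

using (var conn = new SQLiteConnection(connectionString))
{
    // Plain POCOs come back: no change tracking, no navigation-property fixup.
    var persons = conn.Query<Person>("SELECT PersonId, PersonName FROM Person").ToList();
}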
Normally your Student collection isn't filled from the database; it's filled when you access the property. In addition, if you use the ToList() method, Entity Framework reads the data from the database to fill your collection.
Please check these:
https://msdn.microsoft.com/en-us/data/jj574232.aspx#lazy
https://msdn.microsoft.com/en-us/library/vstudio/dd456846(v=vs.100).aspx
Is there any way to prevent the references from being included in this situation?
The solution to this seems to be very simple: don't map the association. Remove the Student collection. Not much more I can say about it.
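If the property has to stay on the class for some reason, an alternative is to keep it but exclude it from the model. A sketch with the EF5/EF6 fluent API, assuming a code-first model and a navigation property named Students:

// Inside your DbContext class:
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<Person>().Ignore(p => p.Students);
}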
Decorate the properties you don't want serialized with [IgnoreDataMember] if you are using .NET 4.5+ (example below):
https://msdn.microsoft.com/en-us/library/system.runtime.serialization.ignoredatamemberattribute(v=vs.110).aspx
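For example, assuming the navigation property is named Students (the attribute lives in System.Runtime.Serialization and is honored by Json.NET):

public class Person
{
    public int PersonId { get; set; }
    public string PersonName { get; set; }

    [IgnoreDataMember] // skipped by the serializer, still usable by EF
    public virtual ICollection<Student> Students { get; set; }
}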
It also sounds like you are trying to do table inheritance, which is a different problem with EF:
http://www.asp.net/mvc/overview/getting-started/getting-started-with-ef-using-mvc/implementing-inheritance-with-the-entity-framework-in-an-asp-net-mvc-application
http://www.entityframeworktutorial.net/code-first/inheritance-strategy-in-code-first.aspx
If I understand you correctly, you're just trying to make sure you only get what you specifically ask for, right?
This was mentioned a little above, but to do this correctly you just want to select an anonymous type.
var students = from s in _context.Students
select new
{
s.StudentId,
s.StudentNo
};
Then, when you want to update this collection/object, I'd recommend using GraphDiff. GraphDiff really helps with the problems of disconnected entities and updates (https://github.com/refactorthis/GraphDiff).
So your method would look similar to this:
void UpdateStudent(Student student)
{
_context.UpdateGraph(student, map => map
.AssociatedEntity(c => c.Person));
_context.SaveChanges();
}
This way, you're able to update whatever properties on an object, disconnected or not, and not worry about the association.
This is assuming that you correctly mapped your entities, and honestly, I find it easier to declare the object as a property, not just the ID, and use a mapping file to map it correctly.
So:
class Person
{
public int Id { get; set; }
public string Name { get; set; }
}
class Student
{
public int Id { get; set; }
public string StudentNo { get; set; }
public int PersonId { get; set; } // FK used by the mapping below
public Person Person { get; set; }
}
public class StudentMap : EntityTypeConfiguration<Student>
{
public StudentMap()
{
// Primary Key
HasKey(t => t.Id);
// Table & Column Mappings
ToTable("Students");
Property(t => t.Id).HasColumnName("StudentId");
// Relationships
HasRequired(t => t.Person)
.WithMany()
.HasForeignKey(d => d.PersonId);
}
}
Hopefully that makes sense. You don't need to create a view model, but you definitely can. This way does make it easier to map disconnected items back to the database though.
I had the exact same situation.
All I did to solve it was ask for Student.ToList() before I asked for Persons.ToList().
I didn't have to disable lazy loading. You just need to load the table that has the reference to the other table first; after that you can load the other table, and the first table's results are already in memory and don't get "fixed up" with all the references (sketch below).
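A sketch of that ordering, using the names from the question (whether fixup is actually avoided may depend on the EF version and tracking settings, so verify against your own model):

using (var ctx = new myEntities())
{
    AllStudents = ctx.Student.ToList(); // load the referencing (child) side first
    AllPersons = ctx.Persons.ToList();  // then the parent side
}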
They are automatically linked in the ObjectContext by their EntityKey. Depending on what you want to do with your Persons and Students, you can Detach them from the context:
using (var ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
ctx.Configuration.LazyLoadingEnabled = false;
AllPersons = ctx.Persons.ToList();
foreach (var c in AllPersons)
{
ctx.Entry(c).State = EntityState.Detached; // DbContext equivalent of ObjectContext.Detach
}
AllStudents = ctx.Student.ToList();
foreach (var c in AllStudents)
{
ctx.Entry(c).State = EntityState.Detached;
}
}
I'm a beginner and I wanted to know if what I am doing is right, because I've been querying a single table of 350 records and it took almost a minute to display! I think I'm doing something wrong.
So this is how I am doing it:
CODE INSIDE THE CLASS : SPOT
/// <summary>
/// Return The City Of this Spot
/// </summary>
/// <returns>City Of The Spot</returns>
public City GetCityOfThisSpot()
{
City value;
using (var ctx = new GhanDBEntities())
{
var firstOrDefault = ctx.Spots.FirstOrDefault(s => s.IdSpot == IdSpot);
var city = firstOrDefault.City;
value = city;
}
return value;
}
and then in my WinForm I use something like this:
CODE IN THE WINFORM :
List<City> listOfCities = new List<City>();
foreach (var spot in listOfspot)
{
listOfCities.Add(spot.GetCityOfThisSpot());
}
I think I shouldn't do it this way, because for each Spot I am creating a context and destroying it!? Can you correct me please?
I can give you a few suggestions:
You are using FirstOrDefault but not checking for null; I think it would be better to use Find, Single or First, or to add a null check (you won't get any speed benefit from this).
Most likely you will get a speed benefit from a batched request:
var ids = listOfspot.Select(p => p.IdSpot).ToList();
var listOfCities = ctx.Spots.Where(p => ids.Contains(p.IdSpot))
.Select(p => p.City).ToList();
The DbContext isn't the problem in this case.
GetCityOfThisSpot()
is making a database query, and the code in the WinForm runs that query 350 times...
Make a join so it happens in a single query; that will be much faster.
DbContext is a lightweight class which is not a problem to create for each query. But with a WinForms application that's just overkill; you can reuse the same context instance during the whole application lifetime. Note: for web applications you usually create a context instance which is reused for all queries during the same request.
So you don't need to dispose the context in a WinForms application. What you really need here is a navigation property for City in your Spot entity:
public virtual City City { get; set; }
In this case Entity Framework will lazy-load the city when you need it (remember to keep the context undisposed). Also, if you want to get the cities of many spots, just eager-load the cities when you are loading the spots:
var spots = ctx.Spots.Include(s => s.City);
In this case Entity Framework will join both tables and return spots with already loaded cities.
In both cases getting cities of spots will look like:
List<City> listOfCities = listOfspot.Select(s => s.City).ToList();
I also suggest you read Jon Gallant's article "Do I always have to call Dispose() on my DbContext objects?". There you can find a response from Diego Vega (the Senior SDE Lead on Entity Framework), which states that in common scenarios (if you don't open the database connection manually) you don't need to call Dispose on a DbContext.
We're using Entity Framework 4.1 for our data access, and while building up objects we started asking ourselves how chatty the application was going to be with the database. One item that we really started looking at is below:
public class MasterPreAward
{
public int ID { get; set; }
public int MemberID { get; set; }
public int CycleID { get; set; }
public virtual Cycle Cycle { get; set; }
public virtual Member Member { get; set; }
public virtual Status Status { get; set; }
public virtual ICollection<DataTracking> DataTrackings { get; set; }
public virtual ICollection<ReviewerAssignment> Reviewers { get; set; }
}
MasterPreAward is an entity generated from the database and has the navigation properties Cycle, Member, and Status along with two collections, DataTrackings and Reviewers. What we were wondering was: how does Entity Framework load the child objects based on these properties and bring back the data we use in the following model?
As you can see, we're passing in a MasterPreAward object and then accessing child properties which are loaded based on the MasterPreAward.
public ViewHeaderSummary(MasterPreAward masterPreAward)
{
MasterPreAwardId = masterPreAward.ID;
ClientId = masterPreAward.Cycle.Project.Program.ClientID;
ApplicationId = masterPreAward.MemberID;
ProgramId = masterPreAward.Cycle.Project.ProgramID;
ProjectId = masterPreAward.Cycle.ProjectID;
EventTypeId = masterPreAward.DataTrackings.FirstOrDefault(x=>x.Finished==true
&& x.EventTypeID==(int)FormEvents.Application).EventTypeID;
CycleId = masterPreAward.CycleID;
FormId = masterPreAward.Cycle.CycleForms.FirstOrDefault().FormID;
}
What we'd like to know is: is this the best way to access these properties, or should we really be thinking about doing this type of work in a different way?
I believe the default settings would be to lazy load each nested collection independently, which could cause a lot of database traffic.
The best way to verify the generated SQL is to start a SQL profiler and confirm the number of queries.
You can force EF to eagerly load related entities by calling the .Include method (sketch below). See here for more details.
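For instance, a sketch of eager loading the paths read in ViewHeaderSummary; dbContext, the MasterPreAwards set name, and the id variable are assumptions for illustration:

var masterPreAward = dbContext.MasterPreAwards
    .Include("Cycle.Project.Program")
    .Include("Cycle.CycleForms")
    .Include("DataTrackings")
    .Single(m => m.ID == id); // one query instead of several lazy loads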
You don't seem to query for full entities but only for a bunch of scalar values. In my opinion this would be a good candidate for a projection which collects all the needed values in a single database roundtrip:
var result = dbContext.MasterPreAwards
.Where(m => m.ID == masterPreAward.ID)
.Select(m => new
{
ClientId = m.Cycle.Project.Program.ClientID,
ProgramId = m.Cycle.Project.ProgramID,
ProjectId = m.Cycle.ProjectID,
EventTypeId = m.DataTrackings.Where(d => d.Finished
&& d.EventTypeID == (int)FormEvents.Application)
.Select(d => d.EventTypeID).FirstOrDefault(),
FormId = m.Cycle.CycleForms.Select(c => c.FormID).FirstOrDefault()
})
.Single();
MasterPreAwardId = masterPreAward.ID;
ClientId = result.ClientId;
ApplicationId = masterPreAward.MemberID;
ProgramId = result.ProgramId;
ProjectId = result.ProjectId;
EventTypeId = result.EventTypeId;
CycleId = masterPreAward.CycleID;
FormId = result.FormId;
As you can see, you need the DbContext to run such a query.
Your original way of lazily loading all related entities leads to 5 database queries as far as I can see (for Cycle, Project, Program, DataTrackings and CycleForms). Worst of all are the queries for DataTrackings.FirstOrDefault and CycleForms.FirstOrDefault, which actually load the full collections from the database into memory first and then execute FirstOrDefault in memory on the loaded collections, only to return a single element from which you then use a single property.
(Edit: the query for ApplicationId and CycleId is not necessary; code changed.)