Let's say I have a User entity and created a partial User class so I can add some methods (like with NHibernate). I added GetByID to make getting a user easier:
public static User GetByID(int userID)
{
using (var context = new MyEntities())
{
return context.Users.Where(qq => qq.UserID == userID).Single();
}
}
Now, somewhere in business logic I'd like to do something like this:
var user = User.GetByID(userID);
var posts = user.GetAllPostsForThisMonth();
foreach(var post in posts)
{
Console.WriteLine(post.Answers.Count);
}
GetAllPostsForThisMonth() is similar to GetByID - it creates a context and disposes it right after execution.
Normally I can't do this, because the context is already disposed by the time I call post.Answers.Count. This, I think, renders my methods useless... Or am I missing something? Can I use my entities like this at all? Or should I create a method for every single query I use (like post.GetAnswersCount())? Thanks in advance!
The behavior you're lamenting is actually good, because it keeps you from shooting yourself in the foot. If you had been allowed to do this, it would have caused n round-trips to the database (where n is the number of posts), and each of those round-trips would have pulled all the data for all the Answers, when all you wanted was the Count. This could have an enormous performance impact.
What you want to do is construct an object that represents all the information you expect to need from the database, and then construct a LINQ query that will actually load in all the information you expect to use.
public class PostSummary
{
public Post Post {get;set;}
public int AnswerCount {get;set;}
}
public IEnumerable<PostSummary> GetPostSummariesByUserAndDateRange(
int userId, DateTime start, DateTime end)
{
using (var context = new MyEntities())
{
return context.Posts
.Where(p => p.UserId == userId)
.Where(p => p.TimeStamp >= start && p.TimeStamp < end)
.Select(p => new PostSummary{Post = p, AnswerCount = p.Answers.Count()})
.ToList();
}
}
This produces a single SQL query and, in a single round-trip, produces exactly the information you wanted without loading in a ton of information you didn't want.
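For instance, the business-logic loop from the question could then be written against the summaries. Here is a minimal usage sketch (the month-boundary calculation is an assumption for illustration; userID is the variable from the question):
var start = new DateTime(DateTime.Today.Year, DateTime.Today.Month, 1);
var end = start.AddMonths(1);
// One query up front, then pure in-memory iteration - no lazy loads, no extra round-trips.
foreach (var summary in GetPostSummariesByUserAndDateRange(userID, start, end))
{
    Console.WriteLine(summary.AnswerCount);
}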
Update
If NHibernate works anything like Java's Hibernate, it won't do lazy loading after the context is disposed, either. Entity Framework does give you a lot of options along these lines: which one works best for you will depend on your particular situation. For example:
Rather than keeping your context scoped inside the data-access method, you can make the context last longer (like once per request in a web application), so that Lazy loading of properties will continue to work beyond your data-access methods.
You can eagerly load whatever entity associations you know you're going to need.
Here's an example of eager loading:
public List<Post> GetAllPostsAndAnswersForThisMonth()
{
using (var context = new MyEntities())
{
return context.Posts.Include("Answers")
.Where(p => p.UserID == UserID)
.ToList();
}
}
However, since Entity Framework basically constitutes your "Data Access" tier, I would still argue that the best practice will be to create a class or set of classes that accurately model what your business layer actually wants out of the data tier, and then have the data access method produce objects of those types.
One method is to explicitly load related objects that you know you will need before you dispose the context. This will make the related data available, with the downside that if you don't need the related info it is wasted time and memory to retrieve. Of course, you can also handle this with flags:
... GetAllPostsForThisMonth(bool includeAnswers)
{
using (var context = new MyEntities())
{
context.ContextOptions.LazyLoadingEnabled = false;
// code to get all posts for this month here
var posts = ...;
if (includeAnswers)
{
    foreach (var post in posts)
        if (!post.Answers.IsLoaded)
            post.Answers.Load();
}
return posts;
}
}
Related
I have following issue - I'm using generic repository to detach and reattach entity on update operation to prevent accessing it from two different objects.
This is how Update operation looks like:
public void Update(TEntity entityToUpdate)
{
if (this.context.Entry(entityToUpdate).State != EntityState.Detached)
{
this.context.Entry(entityToUpdate).State = EntityState.Detached;
}
this.context.Entry(entityToUpdate).State = EntityState.Modified;
}
When using two entities, let's say User and Course:
public class User
{
    public Guid UserId { get; set; }
    public string Email { get; set; }
    public ICollection<Course> Courses { get; set; }
}
and Course
public class Course
{
public string Name { get; set; }
public Guid CourseId { get; set; }
public virtual ICollection<User> Users { get; set; }
}
public class CourseDto
{
public string Name { get; set; }
public Guid CourseId { get; set; }
public virtual ICollection<Guid> Users { get; set; }
}
and I try to update the course's users with the following code:
public async Task<Course> Update(Guid existingCourseId, CourseDto courseModel)
{
var course = await courseRepository.Query(true)
.Include(c=> c.Users)
.Where(e => e.CourseId == existingCourseId )
.FirstOrDefaultAsync();
if (course == null)
{
return null;
}
course.Users = await FindUsersByIds(courseModel.Users);
course.Name = courseModel.Name;
courseRepository.Update(course);
await this.unitOfWork.SaveChangesAsync();
return course;
}
it doesn't work when I want to, for example, update only the Name property.
If the Users property doesn't change and there is at least one user, it will try to insert a record into the CourseUser join table, violating the primary key constraint, instead of 'noticing' that the row already exists in the database.
Edit:
Additionally, when I set Entry(entityToUpdate).State = EntityState.Unchanged before changing it to Modified, and move the repository.Update() call before overwriting the entity properties, it all works fine. If somebody could explain this behaviour to me I would be really grateful.
In a nutshell, simply don't do this detaching and setting the state to Modified. You're already doing the right thing by transporting the details to update via a DTO. Let EF's change tracking do its thing and just call SaveChanges().
I'm not sure what your concern is that warrants detaching to "prevent accessing". All this does is tell the DbContext to treat the item as untracked, but by then setting the state to Modified you're immediately tracking it again. The main reason you don't want to do this is that by setting the entity state to "Modified" you are telling EF to generate an UPDATE statement for the entire record, regardless of whether any value was actually modified, rather than letting it generate an optimized UPDATE for only the values it detected had changed, and only if they had changed.
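As a contrast to the detach-and-mark-Modified approach, here is a minimal sketch of letting the change tracker decide what to write (AppDbContext and the single-property update are assumptions for illustration, not the original code):
public void UpdateCourseName(Guid existingCourseId, string newName)
{
    using (var context = new AppDbContext()) // hypothetical context name
    {
        // Tracked because it was loaded from this context.
        var course = context.Courses.Single(c => c.CourseId == existingCourseId);
        // Change only what actually changed; the tracker records the difference.
        course.Name = newName;
        // Generates an UPDATE for just the changed column, or no UPDATE at all if nothing changed.
        context.SaveChanges();
    }
}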
The issue you are likely seeing is because of this line:
course.Users = await FindUsersByIds(courseModel.Users);
I would bet money that your FindUsersByIds method is returning a non-tracked set of User entities, either via AsNoTracking() or by detaching those user entities before returning them. You have already eager-loaded the users for the course you loaded. You might think this tells EF to remove the existing references and replace them with the desired ones, but it doesn't. It just "adds" a set of Users that EF will treat as new associations, whereby some of those associations already exist in the database.
As a general rule when working with entities: never, under any circumstances, reset/overwrite a collection of related entities - not to clear it, and not to change the associations of an entity.
If the model contains a revised list of IDs, then the proper way to update the set is to identify any users that need to be added and any that need to be removed, load references for the ones to add, and remove the others from the eager-loaded collection. Also, where you expect 0 or 1 result, use SingleOrDefault rather than FirstOrDefault. The First methods should only be used with an OrderBy clause, where you could expect more than one entry and want a reliable, repeatable first result. The exception would be LINQ against in-memory ordered sets where you can guarantee one unique match; there First() performs faster, while Single() scans the entire set. With EF generating the queries, First() produces a TOP(1) SQL statement and Single() a TOP(2), so the performance difference shrinks to a bare minimum in exchange for enforcing that expectation.
var course = await courseRepository.Query(true)
.Include(c=> c.Users)
.Where(e => e.CourseId == existingCourseId)
.SingleOrDefaultAsync();
if (course == null)
return null;
var existingUserIds = course.Users.Select(u => u.UserId);
var userIdsToAdd = courseModel.Users.Except(existingUserIds).ToList();
var userIdsToRemove = existingUserIds.Except(courseModel.Users).ToList();
if (userIdsToRemove.Any())
{
var usersToRemove = course.Users
.Where(u => userIdsToRemove.Contains(u.UserId))
.ToList();
foreach(var user in usersToRemove)
course.Users.Remove(user);
}
if (userIdsToAdd.Any())
{
var usersToAdd = await FindUsersByIds(userIdsToAdd); // Important! Ensure this does NOT detach or use AsNoTracking()
foreach(var user in usersToAdd)
course.Users.Add(user);
}
course.Name = courseModel.Name;
await this.unitOfWork.SaveChangesAsync();
return course;
Basically this inspects the selected IDs to pick out the ones to add and remove, then proceeds to modify the eager-loaded collection if needed. The change tracking will take care of determining what SQL, if any, needs to be generated.
If the Users collection is exposed as List<User> then you can use AddRange (and RemoveAll with a predicate) rather than the foreach loops. This won't work if the collection is exposed as IList<User>, ISet<User>, or ICollection<User>.
Edit:
Based on the error mentioned, I would suggest at least temporarily removing some of the variables from the equation - the particular implementation of the Repository and Unit of Work patterns - and working with a scoped DbContext to begin with. For a start, try this edit:
using (var context = new AppDbContext())
{
var course = await context.Courses
.Include(c=> c.Users)
.Where(e => e.CourseId == existingCourseId)
.SingleOrDefaultAsync();
if (course == null)
return null;
var existingUserIds = course.Users.Select(u => u.UserId);
var userIdsToAdd =
courseModel.Users.Except(existingUserIds).ToList();
var userIdsToRemove = existingUserIds.Except(courseModel.Users).ToList();
if (userIdsToRemove.Any())
{
var usersToRemove = course.Users
.Where(u => userIdsToRemove.Contains(u.UserId))
.ToList();
foreach(var user in usersToRemove)
course.Users.Remove(user);
}
if (userIdsToAdd.Any())
{
var usersToAdd = await FindUsersByIds(userIdsToAdd); // Important! Ensure this does NOT detach or use AsNoTracking()
foreach(var user in usersToAdd)
course.Users.Add(user);
}
course.Name = courseModel.Name;
await context.SaveChangesAsync();
context.Entry(course).State = EntityState.Detached; // Necessary evil, as this method is returning an entity.
return course;
}
This is only intended as a temporary measure to help identify whether your repository or unit of work could be leading to transient DbContexts being used to load and track entities. This could still have issues depending on what happens to the Course after this method returns it. In this case, since we are leaving the scope of the DbContext instance that is tracking it, we detach it. The next step would be to ensure that a DbContext - either directly, or accessible through the unit of work - can be injected, and that it is guaranteed to be scoped to the web request (if web) or to a scope suited to the unit of work this operation is part of. (A bit more work for things like WPF desktop applications.)
Normally, for something like a web application, you would want to ensure that a DbContext has a lifetime scoped to the web request (or shorter), but not transient. We want to ensure that all operations within a unit of work reference the same DbContext instance; otherwise you end up working with entities that might be tracked by multiple DbContexts, or start introducing code to mix tracked and untracked (detached) entities to get around problems when passing entity references around. Working with detached entities requires a lot of extra boilerplate and disciplined coding to ensure DbContexts are working with the correct, single reference of each entity, to avoid "already tracked" errors or duplicate data insertion / PK violation exceptions - i.e. pre-checking each and every DbSet.Local in an entity graph of related entities for any currently tracked instances with the same ID, and replacing references, when attaching an entity graph to a DbContext that isn't already tracking those instances.
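As a rough sketch of that idea (the class names AppDbContext, UnitOfWork and CourseRepository are illustrative, not part of the original code), the unit of work owns a single DbContext for the lifetime of the request and hands that same instance to every repository:
using System;
using System.Linq;
using System.Threading.Tasks;

public class UnitOfWork : IDisposable
{
    private readonly AppDbContext _context = new AppDbContext();

    // Every repository handed out here shares the same change tracker.
    public CourseRepository Courses { get { return new CourseRepository(_context); } }

    public Task<int> SaveChangesAsync() { return _context.SaveChangesAsync(); }

    public void Dispose() { _context.Dispose(); }
}

public class CourseRepository
{
    private readonly AppDbContext _context;

    public CourseRepository(AppDbContext context) { _context = context; }

    public IQueryable<Course> Query() { return _context.Courses; }
}
With a DI container you would register the UnitOfWork (or the DbContext itself) with a per-request lifetime, so that everything resolved during that request shares the same instance.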
I'm using an SQLite database and System.Data.SQLite 1.0.92.
There are 2 tables here:
Table Person:
PersonId
PersonName
Table Student:
StudentId
PersonId(reference table Person FK)
StudentNo
Now every time I get the Persons Collection in EF5:
using (var ctx = new myEntities())
{
AllPersons = ctx.Persons.ToList();
}
The AllPersons.Student collection is also included in the result.
But I don't need it. Of course that's just an example; there are a lot of big tables with many references, and this always causes performance problems because of that.
So I'm trying to keep it out of my result. So I changed it:
using (var ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
ctx.Configuration.LazyLoadingEnabled = false;
AllPersons= ctx.Persons.ToList();
}
Now it's fine, because the AllPersons.Student collection will always be null.
But then I found that if I get Person and Student together:
using (var ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
ctx.Configuration.LazyLoadingEnabled = false;
AllPersons= ctx.Persons.ToList();
AllStudents = ctx.Student.ToList();
}
Now the references are still included.
So is there any way to prevent the references from being included in this situation?
Thank you.
Update
At some friends' request, here is why I need this:
1: When I convert it to JSON it becomes an infinite loop. Even though I already use Json.NET's ReferenceLoopHandling, the JSON is so big it crashes the server. (Without the references it's just a very small JSON.)
2: Every time I get the client data and need to save it, it throws an exception about model state until I set the collection to null.
Example:
using (myEntities ctx = new myEntities())
{
ctx.Configuration.LazyLoadingEnabled = false;
ctx.Configuration.ProxyCreationEnabled = false;
Person model= ThisIsAModel();
model.students = null; // This is the key: I need to set the students collection reference to null, otherwise it will throw an exception
ctx.Entry(model).State = EntityState.Modified;
ctx.SaveChanges();
}
3: This is the more important problem. I already get all the data and cache it on the server, but it makes the loading time very long when the server starts (because there is so much data and there are so many references - that is the main problem). I don't know what kind of problem I'll run into next...
public List<Person> PersonsCache; // global cache
public List<Student> StudentsCache; // global cache
using (myEntities ctx = new myEntities())
{
ctx.Configuration.LazyLoadingEnabled = false;
ctx.Configuration.ProxyCreationEnabled = false;
// There are so many references and so much data that it is very slow the first time I build the whole cache, even when I only get the Person model and nothing else, just because some collections have reference problems. It is very slow...
PersonsCache = ctx.Persons.ToList();
StudentsCache= ctx.Student.ToList();
}
The Problem
As you said, when you load both the Parent and Child lists, even with lazy loading disabled, and then look in parent.Childs, you see the child items have been loaded too.
var db = new YourDbContext();
db.Configuration.LazyLoadingEnabled = false;
var parentList= db.YourParentSet.ToList();
var childList= db.YourChildSet.ToList();
What happened? Why are the children included in a parent?
The children under a parent entity are exactly the ones you loaded using db.YourChildSet.ToList(); - the very same instances. In fact, Entity Framework never loads the children for a parent again; but because of the relation between parent and child in the edmx, they are listed there.
Does that affect performance?
Given that the children are only loaded once, it has no performance impact from loading data.
But for serialization's sake (or something else), how can I get rid of it?
You can use these solutions:
Solution 1:
Use 2 different instances of YourDbContext:
var db1 = new YourDbContext();
db1.Configuration.LazyLoadingEnabled = false;
var parentList = db1.YourParentSet.ToList();
var db2 = new YourDbContext();
db2.Configuration.LazyLoadingEnabled = false;
var childList = db2.YourChildSet.ToList();
Now when you look in parent.Childs there is no Child in it.
Solution 2:
Use projection and shape the output to your will, then use that.
var db1 = new YourDbContext();
db1.Configuration.LazyLoadingEnabled = false;
var parentList = db1.YourParentSet
.Select(x=>new /*Model()*/{
Property1=x.Property1,
Property2=x.Property2, ...
}).ToList();
This way, when serializing, there is nothing annoying in the output.
Using a custom Model class is optional and in some cases is recommended.
Additional Resources
As a developer who uses Entity Framework, reading these resources is strongly recommended:
Performance Considerations for Entity Framework 4, 5, and 6
Connection Management
I'll focus on your third problem, because that seems to be the most urgent one. Then I'll try to give some hints on the other two.
There are two Entity Framework features you should be aware of:
When you load data into a context, Entity Framework will try to connect the objects wherever they're associated. This is called relationship fixup. You can't stop EF from doing that. So if you load Persons and Students separately, a Person's Students collection will contain students, even though you didn't Include() them.
By default, a context caches all data it fetches from the database. Moreover, it stores meta data about the objects in its change tracker: copies of their individual properties and all associations. So by loading many objects the internal cache grows, but also the size of the meta data. And the ever-running relationship fixup process gets slower and slower (although it may help to postpone it by turning off automatic change detection). All in all, the context gets bloated and slow like a flabby rhino.
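For example, turning off automatic change detection (a real EF5/EF6 configuration flag on DbContext) postpones that ever-running work during bulk loads; a minimal sketch:
using (var ctx = new myEntities())
{
    // DetectChanges now only runs when EF really needs it (e.g. during SaveChanges),
    // instead of on every Add/Attach/Find/Entry call while the cache is being filled.
    ctx.Configuration.AutoDetectChangesEnabled = false;
    // ... bulk-load data here ...
}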
I understand you want to cache data in separate collections for each entity. Two simple modifications will make this much quicker:
Avoid the relationship fixup by loading each collection with a separate context
Stop caching (in the context) and change tracking by getting the data with AsNoTracking.
Doing this, your code will look like this:
public List<Person> PersonsCache;
public List<Student> StudentsCache;
using (myEntities ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
PersonsCache = ctx.Persons
.AsNoTracking()
.ToList();
}
using (myEntities ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
StudentsCache= ctx.Student
.AsNoTracking()
.ToList();
}
The reason for turning off ProxyCreationEnabled is that you'll get light objects and that you'll never inadvertently trigger lazy loading afterwards (throwing an exception that the context is no longer available).
Now you'll have cached objects that are not inter-related and that get fetched as fast as it gets with EF. If this isn't fast enough you'll have to resort to other tools, like Dapper.
By the way, your very first code snippet and problem description...
using (var ctx = new myEntities())
{
AllPersons = ctx.Persons.ToList();
}
The AllPersons.Student collection is also included in the result.
...suggest that Entity Framework spontaneously performs eager loading (of students) without you Include-ing them. I have to assume that your code snippet is not complete. EF never, ever automatically executes eager loading. (Unless, maybe, you have some outlandish and buggy query provider).
As for the first problem, the serialization. You should be able to tackle that in a similar way as shown above. Just load the data you want to serialize in isolation and disable proxy creation. Or, as suggested by others, serialize view models or anonymous types exactly containing what you need there.
As for the second problem, the validation exception: I can only imagine this happening if you initialize a students collection with default, empty Student objects. These are bound to be invalid. If this is not the case, I suggest you ask a new question about this specific problem, showing ample detail about the involved classes and mappings. That shouldn't be dealt with in this question.
Explicitly select what you want to return from the Database.
Use select new. With the select new clause, you can create new objects of an anonymous type as the result of a query, without pulling in the references. This syntax allows you to construct anonymous data structures. These are created as they are evaluated (lazily). Like this:
using (var ctx = new myEntities())
{
var AllPersons = ctx.People.Select(c => new {c.PersonId, c.PersonName}).ToList();
}
And you don't even need to disable lazy loading anymore.
The query above currently allocates an anonymous type using select new { }, which requires you to use var. If you want to allocate a known type, add it to your select clause:
private IEnumerable<MyClass> AllPersons;//global variable
using (var ctx = new myEntities())
{
AllPersons = ctx.People
.Select(c => new MyClass { PersonId = c.PersonId, PersonName = c.PersonName }).ToList();
}
And:
public class MyClass
{
public string PersonId { get; set; }
public string PersonName { get; set; }
}
If the entities are auto-generated, then copy-paste them into your own code and remove the generated relations, such as the child collection and the foreign key. Or, if you don't need all of this functionality, you might be able to use a lightweight framework like Dapper.
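As a rough illustration of the Dapper suggestion (the DTO, method name and connection string are assumptions based on the tables described above), the query materializes plain objects with no navigation properties and no change tracking:
using System.Collections.Generic;
using System.Data.SQLite;
using System.Linq;
using Dapper;

public class PersonDto
{
    public int PersonId { get; set; }
    public string PersonName { get; set; }
}

public static class PersonLoader
{
    public static List<PersonDto> LoadPersons(string connectionString)
    {
        using (var connection = new SQLiteConnection(connectionString))
        {
            // Dapper maps the selected columns straight onto the DTO;
            // there is nothing to fix up or to trip over during serialization.
            return connection.Query<PersonDto>(
                "SELECT PersonId, PersonName FROM Person").ToList();
        }
    }
}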
Normally your Student collection isn't filled from the database; it's filled when you access the property. In addition, if you use the ToList() method, Entity Framework reads the data from the database to fill your collection.
Please check these:
https://msdn.microsoft.com/en-us/data/jj574232.aspx#lazy
https://msdn.microsoft.com/en-us/library/vstudio/dd456846(v=vs.100).aspx
Is there any way to prevent the references from being included in this situation?
The solution to this seems to be very simple: don't map the association. Remove the Student collection. Not much more I can say about it.
Decorate any properties with [IgnoreDataMember] if you are using 4.5+
https://msdn.microsoft.com/en-us/library/system.runtime.serialization.ignoredatamemberattribute(v=vs.110).aspx
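For example, here is a minimal sketch (assuming the Person/Student shapes from the question) that hides the navigation property from DataContract-based serializers; Json.NET honors [IgnoreDataMember] as well:
using System.Collections.Generic;
using System.Runtime.Serialization;

public class Student
{
    public int StudentId { get; set; }
    public string StudentNo { get; set; }
}

public class Person
{
    public int PersonId { get; set; }
    public string PersonName { get; set; }

    // Skipped during serialization, so the object graph stops here
    // instead of looping Person -> Students -> Person -> ...
    [IgnoreDataMember]
    public virtual ICollection<Student> Students { get; set; }
}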
It also sounds like you are trying to do table inheritance, which is a different problem with EF:
http://www.asp.net/mvc/overview/getting-started/getting-started-with-ef-using-mvc/implementing-inheritance-with-the-entity-framework-in-an-asp-net-mvc-application
http://www.entityframeworktutorial.net/code-first/inheritance-strategy-in-code-first.aspx
If I understand you correctly, you're just trying to make sure you only get what you specifically ask for right?
This was mentioned a little above, but to do this correctly you just want to select an anonymous type.
var students = from s in _context.Students
               select new
               {
                   s.StudentId,
                   s.StudentNo
               };
Then, when you want to update this collection/object, I'd recommend using GraphDiff. GraphDiff really helps with the problems of disconnected entities and updates (https://github.com/refactorthis/GraphDiff).
So your method would look similar to this:
void UpdateStudent(Student student){
_context.UpdateGraph(student, map =>
map
.AssociatedEntity(c => c.Person));
_context.SaveChanges();
}
This way, you're able to update whatever properties on an object, disconnected or not, and not worry about the association.
This is assuming that you correctly mapped your entities, and honestly, I find it easier to declare the object as a property, not just the ID, and use a mapping file to map it correctly.
So:
class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
}
class Student
{
    public int Id { get; set; }
    public string StudentNo { get; set; }
    public int PersonId { get; set; }
    public Person Person { get; set; }
}
public class StudentMap : EntityTypeConfiguration<Student>
{
public StudentMap()
{
// Primary Key
HasKey(t => t.Id);
// Table & Column Mappings
ToTable("Students");
Property(t => t.Id).HasColumnName("StudentId");
// Relationships
HasRequired(t => t.Person)
    .WithMany()
    .HasForeignKey(d => d.PersonId);
}
}
Hopefully that makes sense. You don't need to create a view model, but you definitely can. This way does make it easier to map disconnected items back to the database though.
I had the exact same situation.
All I did to solve it was ask for Student.ToList() before I asked for Persons.ToList().
I didn't have to disable lazy loading. You just need to load the table that holds the reference to the other table first; after that you can load the other table, and the first table's results are already in memory and don't get "fixed up" with all the references.
They are automatically linked in the ObjectContext by their EntityKey. Depending on what you want to do with your Persons and Students, you can Detach them from the ObjectContext:
using (var ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
ctx.Configuration.LazyLoadingEnabled = false;
AllPersons= ctx.Persons.ToList();
foreach(var c in AllPersons)
{
ctx.Detach(c);
}
AllStudents = ctx.Student.ToList();
foreach(var c in AllStudents )
{
ctx.Detach(c);
}
}
I'm a beginner and I wanted to know if what I am doing is right, because I've been querying a single table of 350 records and it took almost a minute to display! I think I'm doing something wrong.
So this is how I am doing it:
CODE INSIDE THE CLASS : SPOT
/// <summary>
/// Return The City Of this Spot
/// </summary>
/// <returns>City Of The Spot</returns>
public City GetCityOfThisSpot()
{
City value;
using (var ctx = new GhanDBEntities())
{
var firstOrDefault = ctx.Spots.FirstOrDefault(s => s.IdSpot == IdSpot);
var city = firstOrDefault.City;
value = city;
}
return value;
}
and then in my WinForm I use something like this:
CODE IN THE WINFORM:
List<City> listOfCities = new List<City>();
foreach (var spot in listOfspot)
{
listOfCities.Add(spot.GetCityOfThisSpot());
}
I think I shouldn't do it this way, because for each Spot I am creating a context and destroying it!? Can you correct me please?
I can give you a few suggestions:
You are using FirstOrDefault but not checking for null; I think it would be better to use Find, Single or First, or to add a null check (you won't get any speed benefit from this).
Most likely you will get a speed benefit from a batched request:
var ids = listOfspot.Select(p => p.IdSpot).ToList();
var listOfCities = ctx.Spots.Where(p => ids.Contains(p.IdSpot))
.Select(p => p.City).ToList();
DbContext isn't the problem in this case.
GetCityOfThisSpot()
is making a db-query, the code in the winform is running the query 350 times...
Make a join to make this much faster.
DbContext is a lightweight class which is not a problem to create for each query. But with a WinForms application that's just overkill - you can reuse the same context instance during the whole application lifetime. Note: for web applications you usually create a context instance which is reused for all queries during the same request.
So, you don't need to dispose the context in a WinForms application. What you really need here is a navigation property for City in your Spot entity:
public virtual City City { get; set; }
In this case Entity Framework will lazy-load the city when you need it (remember - keep the context undisposed). Also, if you want to get the cities of many spots, just eager-load the cities when you are loading the spots:
var spots = ctx.Spots.Include(s => s.City);
In this case Entity Framework will join both tables and return spots with already loaded cities.
In both cases getting cities of spots will look like:
List<City> listOfCities = listOfspot.Select(s => s.City).ToList();
I also suggest you read Jon Gallant's article "Do I always have to call Dispose() on my DbContext objects?" There you can find the response of Diego Vega (the Senior SDE Lead on Entity Framework), which states that in common scenarios (if you don't open a database connection manually) you don't need to call Dispose on a DbContext.
I don't know Linq2Sql very well yet, and I was wondering if there is a trick for this probably common MVVM scenario. I have a Linq2Sql data context containing domain models, but I am fetching data for my customized ViewModel object from it.
var query = from ord in ctx.Table_Orders
select new OrderViewModel()
{
OrderId = ord.OrderId,
OrderSum = ord.OrderSum,
OrderCurrencyId = ord.OrderCurrencyId,
OrderCurrencyView = ord.Currency.CurrencyText
};
So I want my ViewModel to include both the CurrencyId from the domain object and the CurrencyText from the related table, to show it nicely in the View.
This code works great. It generates one DB call with a join to fetch the CurrencyText. But the model is simplified; the real one has many more fields. I want to make the code reusable, because I have many different queries that return the same ViewModel. Right now every minor change to OrderViewModel requires lots of maintenance.
So I moved the code to OrderViewModel itself as a constructor.
public OrderViewModel(Table_Order ord)
{
OrderId = ord.OrderId,
OrderSum = ord.OrderSum,
OrderCurrencyId = ord.OrderCurrencyId,
OrderCurrencyView = ord.Currency.CurrencyText
}
And call it like this.
var query = from ord in ctx.Table_Orders
select new OrderViewModel(ord);
The Problem: the join is gone and the DB query is no longer optimised. Now I get 1+N calls to the database, fetching CurrencyText for every line.
Any comments are welcome. Maybe I have missed a different, better approach.
This is how far I could get on my own towards code reusability. I created a function that does the job and has multiple parameters; then I need to explicitly pass it everything that crosses the entity boundary.
var query = ctx.Table_Orders.Select(m =>
new OrderViewModel(m, m.Currency.CurrencyText));
The DB call is again optimized. But it still does not feel like I am there yet! What tricks do You know for this case?
EDIT : The final solution
Thanks to a hint by @Muhammad Adeel Zahid I arrived at this solution.
I created an extension for IQueryable
public static class Mappers
{
public static IEnumerable<OrderViewModel> OrderViewModels(this IQueryable<Table_Order> q)
{
return from ord in q
select new OrderViewModel()
{
OrderId = ord.OrderId,
OrderSum = ord.OrderSum,
OrderCurrencyId = ord.OrderCurrencyId,
OrderCurrencyView = ord.Currency.CurrencyText
};
}
}
Now I can do this to get the whole list:
var orders = ctx.Table_Order.OrderViewModels().ToList();
or this to get a single item, or anything in between with Where(x => ..)
var order = ctx.Table_Order
.Where(x => x.OrderId == id).OrderViewModels().SingleOrDefault();
And that completely solves this question. The SQL generated is perfect and the code to translate the objects is reusable. An approach like this should work with both LINQ to SQL and LINQ to Entities (not tested with the latter). Thank you again, @Muhammad Adeel Zahid.
Whenever we query the database, we mostly want either an enumeration of objects (more than one record in the db) or a single entity (one record in the db). You can write your mapping code in a method that returns an enumeration for the whole table, like this:
public IEnumerable<OrderViewModel> GetAllOrders()
{
return from ord in ctx.Table_Orders
select new OrderViewModel()
{
OrderId = ord.OrderId,
OrderSum = ord.OrderSum,
OrderCurrencyId = ord.OrderCurrencyId,
OrderCurrencyView = ord.Currency.CurrencyText
};
}
Now you may want to filter these records and return another enumeration, for example by CurrencyID:
public IEnumerable<OrderViewModel> GetOrdersByCurrency(int CurrencyID)
{
return GetAllOrders().Where(x=>x.CurrencyId == CurrencyID);
}
Now you may also want to find a single record out of all these view models:
public OrderViewModel GetOrder(int OrderID)
{
return GetAllOrders().SingleOrDefault(x=>x.OrderId == OrderID);
}
The beauty of IEnumerable is that the conditions keep being added to the query and it does not execute until it is needed, so your whole table will not be loaded unless you really want it, and you have kept your code in a single place. Now, if there are any changes in the ViewModel mapping or in the query itself, they only have to be made in the GetAllOrders() method; the rest of the code stays unchanged.
You can avoid the N+1 queries problem by having Linq2SQL eagerly load the referenced entities you need to construct your viewmodels. This way you can build one list of objects (and some referenced objects) and use it to construct everything. Have a look at this blog post.
One word of warning though: this technique (setting LoadOptions for the Linq2SQL data context) can only be done once per data context. If you need to perform a second query with a different eager loading configuration, you must re-initialize your data context. I automated this with a simple wrapper class around my context.
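A minimal sketch of that eager-loading setup (assuming the Table_Order/Currency relationship from the question; the data context name is hypothetical):
// DataLoadOptions lives in System.Data.Linq.
// It must be assigned before the first query, and only once per data context instance.
var ctx = new MyDataContext();
var loadOptions = new DataLoadOptions();
loadOptions.LoadWith<Table_Order>(o => o.Currency); // fetch Currency together with each order
ctx.LoadOptions = loadOptions;

// The orders arrive with Currency pre-loaded, so constructing the view models
// in memory no longer triggers one extra query per row.
var orders = ctx.Table_Orders.ToList();
var viewModels = orders.Select(ord => new OrderViewModel(ord)).ToList();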
This is a pretty vague/subjective question. I want to know if this is the best way to send/retrieve data to/from the browser using ajax calls. On the back end webservice, I want to use the entity framework. Below are two example functions.
The criteria for "best" is speed of writing code, readable code, and robust architecture.
Thanks for any feedback and suggestions and comments.
Get Function
[WebMethod]
public AjaxEmployee EmployeeGetById(int employeeID, bool getTimeOff)
{
using (Time_TrackerEntities ctx = new Time_TrackerEntities())
{
var results = from item in ctx.Employees
where item.ID == employeeID
orderby item.Last_Name
select new AjaxEmployee
{
ID = item.ID,
Employee_ID = item.Employee_ID,
First_Name = item.First_Name,
Middle_Name = item.Middle_Name,
Last_Name = item.Last_Name,
Supervisor_ID = item.Supervisor_ID,
Active = item.Active,
Is_Supervisor = item.Is_Supervisor
};
var emp = results.FirstOrDefault();
if (getTimeOff)
{
var results2 = from item2 in ctx.Time_Off
where item2.Employee_ID == emp.Employee_ID
select new AjaxTime_Off
{
ID = item2.ID,
Employee_ID = item2.Employee_ID,
Date_Off = item2.Date_Off,
Hours = item2.Hours
};
emp.Time_Off = results2.ToList<AjaxTime_Off>();
}
return emp;
}
}
Save Function
[WebMethod]
public bool EmployeeSave(AjaxEmployee emp)
{
using (Time_TrackerEntities ctx = new Time_TrackerEntities())
{
var results = from item in ctx.Employees
where item.ID == emp.ID
select item;
var myEmp = results.FirstOrDefault();
if (myEmp == null)
{
myEmp = new Employee();
ctx.Employees.AddObject(myEmp);
}
myEmp.Employee_ID = emp.Employee_ID;
myEmp.First_Name = emp.First_Name;
myEmp.Middle_Name = emp.Middle_Name;
myEmp.Last_Name = emp.Last_Name;
myEmp.Supervisor_ID = emp.Supervisor_ID;
myEmp.Active = emp.Active;
myEmp.Is_Supervisor = emp.Is_Supervisor;
return ctx.SaveChanges() > 0;
}
}
There are a few improvements to be made.
Save() method - don't copy values left-to-right, use EF's built-in logic
Instead of this:
myEmp.Employee_ID = emp.Employee_ID;
myEmp.First_Name = emp.First_Name;
myEmp.Middle_Name = emp.Middle_Name;
myEmp.Last_Name = emp.Last_Name;
myEmp.Supervisor_ID = emp.Supervisor_ID;
myEmp.Active = emp.Active;
myEmp.Is_Supervisor = emp.Is_Supervisor;
You can do this:
ctx.Employees.ApplyCurrentValues(emp);
What this does is look for an entity with the same key in the graph (which there is, since you have just retrieved it with FirstOrDefault()), and overwrite its scalar values with those of the entity you pass in - which is exactly what you're doing.
So your 7 lines become 1, and if you add any extra scalar properties you won't have to refactor your code. Just remember - it only works for scalar properties, not navigation properties.
Why build a query for primary key retrieval? Just pass a predicate to SingleOrDefault()
Instead of this:
var results = from item in ctx.Employees
where item.ID == emp.ID
select item;
var myEmp = results.FirstOrDefault();
Do this:
var myEmp = ctx.Employees.SingleOrDefault(x => x.ID == emp.Id);
Or even better, use a pipe/filter technique:
var myEmp = ctx.Employees.WithId(emp.Id).SingleOrDefault();
Where WithId is an IQueryable<Employee> extension method which filters the query based on the supplied employee ID. This allows de-coupling of filtering/business logic from your repository/DAL. It should go in your domain model, so you can have a nice fluent API for querying your domain entities via your ORM.
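For reference, such a filter extension might look like this (a sketch; the Employee entity and its ID property are taken from the code above):
using System.Linq;

public static class EmployeeQueries
{
    // Keeps the "find by primary key" filter in one place, so callers can
    // compose it onto any IQueryable<Employee> before the query executes.
    public static IQueryable<Employee> WithId(this IQueryable<Employee> source, int id)
    {
        return source.Where(e => e.ID == id);
    }
}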
When you're retrieving an entity via the primary key, you should always use SingleOrDefault() or Single(), never FirstOrDefault() or First(). If it's a primary key, there should only be one of them, so you should throw an exception if more than one exists, which is what SingleOrDefault() does. And as @Shiraz mentions, your FirstOrDefault() will crash the query below. You always need null checking when you use First/SingleOrDefault().
The same improvements can be made to your Get method.
Overall, there is nothing functionally wrong with your code - it just needs subtle improvements, null checking and exception handling.
The only functional improvement I highly recommend is refactoring your web service code into a generic repository, as the code is very trivial and can be reused across any entity. The web service shouldn't be concerned with transactions, primary keys, or EF logic whatsoever. It shouldn't even have a reference to the EF DLL. Encapsulate this logic behind a repository and delegate the persistence logic there (via an interface, of course).
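A rough outline of what that repository abstraction could look like (a sketch only; the names and the ObjectContext-based implementation are assumptions, the point being that the web service depends on the interface, not on EF):
using System;
using System.Linq;
using System.Linq.Expressions;

// The web service sees only this contract.
public interface IRepository<T> where T : class
{
    T SingleOrDefault(Expression<Func<T, bool>> predicate);
    void Add(T entity);
    int SaveChanges();
}

public class EfRepository<T> : IRepository<T>, IDisposable where T : class
{
    private readonly Time_TrackerEntities _ctx = new Time_TrackerEntities();

    public T SingleOrDefault(Expression<Func<T, bool>> predicate)
    {
        // CreateObjectSet<T> gives the ObjectSet for the entity type, queried with the predicate.
        return _ctx.CreateObjectSet<T>().SingleOrDefault(predicate);
    }

    public void Add(T entity)
    {
        _ctx.CreateObjectSet<T>().AddObject(entity);
    }

    public int SaveChanges()
    {
        return _ctx.SaveChanges();
    }

    public void Dispose()
    {
        _ctx.Dispose();
    }
}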
After making the changes I've mentioned above, your web service methods should have no more than 5-7 lines of code each.
You have far too much intelligence in your web service - it should be dumb and persistent ignorant.
I find that it's usually a pretty bad idea to try and use my entities directly on the data contract. It's possible, and works fine in certain cases, but anytime my object model gets even a little complex I start having to worry about the object graph in ways that I don't want to have to.
Instead - and this is regardless of the client, but it's applicable to a JS client just as much - I try to think of the data contract classes as pure data trucks (DTOs) with no mapping in EF at all. Those classes are just the documents I'm passing back and forth, the message body if you will. They might translate into commands on my model or they might be used to populate a query, or whatever, but they're not the entities themselves.
This, I find, simplifies things a great deal. It may feel like more code when you first write a simple service, but over the lifetime it makes things a great deal more maintainable.
And just as a side note, you should also consider separating your responsibilities a bit better. The web service class should not have the responsibility of directly creating and disposing of the data context; it should depend on a DAO or repository interface (or domain service) which handles all that stuff for you (and applies transactions as needed, etc.).
Your get method can crash.
If this line returns null:
var emp = results.FirstOrDefault();
Then this line will crash with a null reference exception:
where item2.Employee_ID == emp.Employee_ID
I would also put in some try/catch blocks with logging of errors.