Are different IQueryable objects combined?

I have a little program that needs to do some calculations on a date range. The range may contain about half a million records. I just looked at my DB and saw that a GROUP BY was executed.
I thought that the first line was executed on its own, and that afterwards I was just working with the data in RAM. But now I think the query builder combines the expressions.
var Test = db.Test.Where(x => x.Date > DateTime.Now.AddDays(-7));
var Test2 = (from p in Test
             group p by p.CustomerId into g
             select new { UniqueCount = g.Count() });
In my real-world app I have more subqueries that are based on the range selected by the first query. I think I just added a big overhead by letting the DB run separate selects.
Now I basically just call .ToList() after the first expression.
So my question is: am I right that the query builder combines different IQueryable objects when it builds the expression tree?

Yes, you are correct. LINQ queries are evaluated lazily: nothing runs until you materialize the results (via .ToList(), for example). At that point, Entity Framework looks at the whole composed query and builds a single SQL statement to represent it.
In this particular case, it's probably wiser not to evaluate the first query, because the SQL database is optimized for set-based operations like grouping and counting. Rather than forcing the database to send all the Test rows across the wire, materializing them as in-memory objects, and then grouping and counting locally, you will likely see better performance by letting the database return just the resulting counts.
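For example, here is a minimal sketch of that approach (assuming the Test entity has the Date and CustomerId columns used in the question); keeping the whole pipeline as a single IQueryable lets EF emit one SELECT containing both the WHERE and the GROUP BY:
var counts = db.Test
    .Where(x => x.Date > DateTime.Now.AddDays(-7))
    .GroupBy(x => x.CustomerId)
    .Select(g => new { CustomerId = g.Key, UniqueCount = g.Count() })
    .ToList(); // the single combined SQL statement executes here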

Related

Is Queryable.OrderBy unstable for SQL Server database?

OrderBy is stable for LINQ to Objects, but the MSDN documentation for Queryable.OrderBy doesn't mention whether it is stable or not.
I guess it depends on the provider implementation. Is it unstable for SQL Server? It looks that way. I took a quick look at the Queryable source code, but it is not obvious from there.
I need to order a collection before other operations, and I want to use IQueryable rather than IEnumerable for the sake of performance.
// All the timestamps are the same and I am getting inconsistent
// results by running it multiple times, first few pages return the same results
var result = data.OrderBy(i => i.TimeStamp).Skip(start).Take(length);
but if I use
var result = data.ToList().OrderBy(i => i.TimeStamp).Skip(start).Take(length);
it works just fine, but I lose the performance boost from LINQ to SQL. It seems the combination of Queryable OrderBy/Skip/Take produces inconsistent results.
SQL Code generated seems fine to me:
SELECT
...
FROM [dbo].[Table] AS [Extent1]
ORDER BY [Extent1].[TimeStamp] ASC
OFFSET 0 ROWS FETCH NEXT 10 ROWS ONLY
In LINQ to Entities, LINQ queries are translated into SQL queries, so the LINQ to Objects implementation of OrderBy doesn't matter. You should look at your database's implementation of ORDER BY instead. If you are using MS SQL, the docs state:
To achieve stable results between query requests using OFFSET and FETCH, the following conditions must be met:
(...)
The ORDER BY clause contains a column or combination of columns that are guaranteed to be unique.
So ORDER BY over equal values does not guarantee the same order each time, and paging over it can return different result sets. To solve this, simply sort by an additional column that has unique values, e.g. the id. So basically you will have:
var result = data
    .OrderBy(i => i.TimeStamp)
    .ThenBy(i => i.Id)
    .Skip(start)
    .Take(length);
I take it that by "stable" you mean consistent. Without an ORDER BY in a SQL query, the order of the data is not guaranteed from one run to the next; the server simply returns the rows in whatever order is most efficient for it. When you add the ORDER BY, it sorts the data, but since all of your sort values are the same, the sort imposes no meaningful order between the rows, and they can come back in any order. If you need a specific order, you will need to add a secondary sort column such as an ID.
It is best never to assume the order of data coming back from the server unless you explicitly define what that order is.

At what point do I need to off-load the work of a query to the DB?

I have a web app, and I'm connecting to a DB with entity framework.
To select all Employee records of a particular department, for example, I can quite easily write:
....Employees.Where(o => o.Department == "HR").ToList();
It works fine. But is it optimal?
Should this Where clause be incorporated into a stored procedure or view? Or does my Entity Framework code do the job of converting it to SQL anyway?
We've had performance problems in our team in the past from when people pull records into memory and then do the filtering in .net instead of at a database level. I'm trying to avoid this happening again so want to be crystal clear on what I must avoid.
If Employees is provided by Entity Framework, then the Where() will be translated into SQL and sent to the database. It is only at the point where you materialise the objects that the filters you have applied are turned into SQL. Anything after that point is just plain LINQ to Objects.
Methods that cause materialisation include .ToList() and .ToArray() (there are more, but these two are probably the most common).
If you want to see what is happening on SQL Server, you should open up the SQL Profiler and have a look at the queries that are being sent.
We've had performance problems in our team in the past from when people pull records into memory and then do the filtering in .net instead of at a database level.
Just as an addendum to Colin's answer, and to target the quote above, the way to avoid this is to make sure your database queries are fully constructed as IQueryable<T> first, before enumerating the results with a call such as .ToList() or .ToArray().
As an example, consider the following:
IEnumerable<Employee> employees = context.Employees;
// other code, before executing the following
var hrEmployees = employees.Where(o => o.Department == "HR").ToList();
Because employees is typed as IEnumerable<Employee>, the .ToList() enumerates the results by grabbing all of the employees from the context first, and then performs the filtering on the client. This isn't going to perform well if you've got a lot of employees to contend with, and it certainly isn't going to scale.
Compare that with this:
IQueryable<Employee> employees = context.Employees;
// other code, before executing the following
var hrEmployees = employees.Where(o => o.Department == "HR").ToList();
IQueryable<T> derives from IEnumerable<T>. The difference between them is that IQueryable<T> has a query provider built in, and the query you construct is represented as an expression tree. That means it's not evaluated until a call that enumerates the results of the query, such as .ToList().
In the second example above, the query provider will execute SQL to fetch only those employees that belong to the HR department, performing the filtering on the database itself.
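As a quick sanity check, in EF6 you can inspect the SQL a query will produce by calling ToString() on the IQueryable before materialising anything (a hedged sketch; the exact output varies by provider and version):
IQueryable<Employee> hrQuery = context.Employees.Where(o => o.Department == "HR");
Console.WriteLine(hrQuery.ToString()); // should print a SELECT that includes the WHERE clause
var hrEmployees = hrQuery.ToList();    // the filtered SQL query executes here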

EF LINQ ToList is very slow

I am using ASP.NET MVC (.NET 4.5) and EF6, code-first migrations.
I have this code, which takes about 6 seconds:
var filtered = _repository.Requests.Where(r => some conditions); // this is fast, conditions match only 8 items
var list = filtered.ToList(); // this takes 6 seconds, has 8 items inside
I thought that this is because of relations, it must build them inside memory, but that is not the case, because even when I return 0 fields, it is still as slow.
var filtered = _repository.Requests.Where(r => some conditions).Select(e => new {}); // this is fast, conditions match only 8 items
var list = filtered.ToList(); // this takes still around 5-6 seconds, has 8 items inside
Now, the Requests table is quite complex, has lots of relations, and holds ~16k items. On the other hand, the filtered list should only contain proxies to 8 items.
Why is the ToList() method so slow? I actually think the problem is not in the ToList() method itself, but is probably an EF issue or a design problem.
Has anyone had experience with anything like this?
EDIT:
These are the conditions:
_repository.Requests.Where(r => ids.Any(a => a == r.Student.Id) && r.StartDate <= cycle.EndDate && r.EndDate >= cycle.StartDate)
So basically, I am checking whether the Student id is in my id list and whether the dates overlap.
Your filtered variable contains a query, which is a question; it doesn't contain the answer. When you request the answer by calling .ToList(), that is when the query is executed. That is why the second line appears slow: only when you call .ToList() does your database actually execute the query.
This is called deferred execution. A Google search will give you more information about it.
If you show some of your conditions, we might be able to say why the query is slow.
In addition to Maarten's answer, I think the problem is one of two situations:
some condition is complex and results in complex, heavy joins or queries in your database
some condition filters on a column that does not have an index, which causes a full table scan and makes your query slow.
I suggest you start monitoring the queries generated by Entity Framework. It's very simple: you just need to set the Log property of your context and look at the results:
using (var context = new MyContext())
{
    context.Database.Log = Console.Write;
    // Your code here...
}
If you see something strange in the generated query, try to improve it by breaking the LINQ into parts; sometimes the queries Entity Framework generates are not very good.
If the query is okay, then the problem lies in your database (assuming there is no network problem).
Run your query with a SQL profiler and check what's wrong.
UPDATE
I suggest you:
add an index for the StartDate and EndDate columns in your table (one index for each column, not one for both)
ToList executes the query against the DB, while the first line does not.
Can you show the conditions code here?
To increase performance, you need to optimize the query and/or create indexes on the DB tables.
Your first line of code only returns an IQueryable. This is a representation of a query that you want to run, not the result of the query. The query itself only runs on the database when you call .ToList() on your IQueryable, because that is the first point at which you have actually asked for data.
Your adjustment to add the .Select only adds to the existing IQueryable query definition. It doesn't change which conditions have to execute. You have essentially changed the following, where you get back 8 records:
select * from Requests where [some conditions];
to something like:
select '' from Requests where [some conditions];
You will still have to run the full query with the conditions to find those 8 records, but for each one you only asked for an empty string, so you get back 8 empty strings.
The long and the short of this is that any performance problem you are having is coming from your "some conditions". Without seeing them, it is difficult to know. But I have seen people in the past add .Where clauses inside a loop before calling .ToList(), inadvertently creating a massively complicated query, as illustrated below.
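To make that concrete, here is a hedged illustration reusing the names from the question: successive .Where calls are ANDed together, so a loop like this builds one enormous WHERE clause (as written it would match nothing, since no row can equal every id at once):
IQueryable<Request> query = _repository.Requests;
foreach (var id in ids)
{
    query = query.Where(r => r.Student.Id == id); // each iteration ANDs on another condition
}
var results = query.ToList(); // the whole accumulated query executes here, all at once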
Jaanus, the most likely reason for this issue is the complexity of the SQL query generated by Entity Framework. I would guess that your filter condition involves checks against other tables.
Try inspecting the generated query with SQL Server Profiler, then copy it into Management Studio and check the estimated execution plan. As a rule, Management Studio generates index recommendations for your query; try to follow them.
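As a side note, the filter itself may be part of the problem. A hedged sketch using the names from the question: ids.Any(a => a == r.Student.Id) often translates to a correlated sub-query, whereas ids.Contains(...) usually maps to a simpler IN (...) list that providers tend to handle better:
var filtered = _repository.Requests
    .Where(r => ids.Contains(r.Student.Id)
        && r.StartDate <= cycle.EndDate
        && r.EndDate >= cycle.StartDate);
var list = filtered.ToList(); // one SQL query, with a WHERE ... IN (...), executes here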

Linq performance: does it make sense to move out condition from query?

There is a LINQ query:
int criteria = GetCriteria();
var teams = Team.GetTeams()
    .Where(team => criteria == 0 || team.Criteria == criteria)
    .ToList();
Does it make sense from performance (or any other point of view) to convert it into the following?
var teams = Team.GetTeams();
if (criteria != 0)
{
    teams = teams.Where(team => team.Criteria == criteria);
}
teams = teams.ToList();
I have a solid set of criteria; should I separate them and apply each criterion only when necessary? Or should I just apply them in one LINQ query and leave it up to .NET to optimize the query?
Please advise. Any thoughts are welcome!
P.S. I don't use Linq2Sql, just LINQ, i.e. pure C# code.
First, it is important to understand that LINQ actually comes in several varieties, which can mostly be divided by whether they use expression trees or compiled code to execute the query.
Linq2Sql (like any Linq provider that converts the query into another form in order to execute it) uses expression trees so that the query can be analyzed and converted into another form (often SQL). In this case it is possible for the provider to modify the query, potentially performing some level of optimization. I am not aware of any providers that currently do this.
Linq to Objects uses compiled code, and will always execute the query as written. There is no opportunity for the query to be optimized other than by the developer.
Second, all Linq queries are deferred: the query is not actually executed until you attempt to get results. A side effect of this is that you can build a query in several steps, and only the final query will be executed. The second example in the question still results in only one query being executed.
If you are using Linq2Sql, then both queries likely have roughly similar performance. However, the first example, extended to handle many criteria, could potentially lead to a poor, overly general execution plan that ultimately degrades performance. The trimmed query does not run this risk, and will generate an execution plan for each real permutation of criteria.
If you are using Linq to Objects, then the second query is definitely preferable, since the first query will execute the predicate passed to Where once for every input even when the condition always returns true.
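As a hedged sketch of how the second form extends to several criteria (the name filter and GetName() here are hypothetical, added purely for illustration), each predicate is composed only when it is active, and deferred execution means the combined query still runs only once:
int criteria = GetCriteria();
string name = GetName(); // hypothetical second criterion
IEnumerable<Team> teams = Team.GetTeams();
if (criteria != 0)
{
    teams = teams.Where(team => team.Criteria == criteria);
}
if (name != null)
{
    teams = teams.Where(team => team.Name == name);
}
var result = teams.ToList(); // the composed query executes once, here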
I think the second option is better because it checks the criteria value only once, whereas in the first query the criteria is checked for every row.
But I also believe you will not get any significant performance gain from it. It may give you better results on tables with a high number of records (theoretically).

Entity Framework + LINQ + "Contains" == Super Slow?

Trying to refactor some code that has gotten really slow recently and I came across a code block that is taking 5+ seconds to execute.
The code consists of 2 statements:
IEnumerable<int> StudentIds = _entities.Filters
    .Where(x => x.TeacherId == Profile.TeacherId.Value && x.StudentId != null)
    .Select(x => x.StudentId)
    .Distinct<int>();
and
_entities.StudentClassrooms
.Include("ClassroomTerm.Classroom.School.District")
.Include("ClassroomTerm.Teacher.Profile")
.Include("Student")
.Where(x => StudentIds.Contains(x.StudentId)
&& x.ClassroomTerm.IsActive
&& x.ClassroomTerm.Classroom.IsActive
&& x.ClassroomTerm.Classroom.School.IsActive
&& x.ClassroomTerm.Classroom.School.District.IsActive).AsQueryable<StudentClassroom>();
So it's a bit messy, but first I get a distinct list of ids from one table (Filters), then I query another table using it.
These are relatively small tables, but it's still 5+ seconds of query time.
I put this in LINQPad, and it showed that it was running the bottom query first and then running 1000 "distinct" queries afterwards.
On a whim I changed the StudentIds code by just adding .ToArray() at the end. This improved the speed 1000x; it now takes about 100ms to complete the same query.
What's the deal? What am I doing wrong?
This is one of the pitfalls of deferred execution in Linq: in your first approach, StudentIds is really an IQueryable, not an in-memory collection. That means using it in the second query will run the first query against the database again, each and every time.
Forcing execution of the first query by using ToArray() makes StudentIds an in-memory collection, and the Contains part in your second query runs over this fixed sequence of items. This gets mapped to something equivalent to a SQL WHERE StudentId IN (1, 2, 3, 4) clause.
This query will, of course, be much faster, since you determined the sequence once up front rather than every time the Where clause is executed. Without ToArray(), your second query would (I would think) be mapped to a SQL query with a WHERE EXISTS (...) sub-query that gets evaluated for each row.
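A hedged sketch of that fix, using the names from the question (the .Value assumes StudentId is a nullable int, which the != null filter suggests):
int[] studentIds = _entities.Filters
    .Where(x => x.TeacherId == Profile.TeacherId.Value && x.StudentId != null)
    .Select(x => x.StudentId.Value)
    .Distinct()
    .ToArray(); // the first query executes here, exactly once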
ToArray() materializes the initial query into your application's memory.
My guess would be that the query provider is not able to parse the expression StudentIds.Contains(x.StudentId); it probably expects StudentIds to be an array already loaded into memory, so it is likely querying the database over and over again during the parsing phase. The only way to know for sure is to set up the profiler.
If you need to do this on the DB server, use a join instead of Contains, as sketched below. If you need Contains to do what looks like a join problem, you are likely missing a surrogate primary key or a foreign key somewhere.
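For example, here is a hedged sketch of the join approach, with the entity and property names assumed from the question; the server matches the two tables directly, with no Contains at all:
var query = from sc in _entities.StudentClassrooms
            join f in _entities.Filters
                on (int?)sc.StudentId equals f.StudentId // cast assumes f.StudentId is nullable
            where f.TeacherId == Profile.TeacherId.Value
            select sc;
// add Distinct() if Filters can hold the same StudentId more than once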
You could also declare StudentIds as IQueryable instead of IEnumerable. This might give the query provider the hint it needs to treat StudentIds as an expression, i.e. data not already loaded into memory. I somewhat doubt this will help, but it is worth a try.
If all else fails, use ToArray(). This will load the initial StudentIds into memory.
