I am querying for values from a database in AWS Sydney (I am in New Zealand). Using Stopwatch I measured the query time, and it is wildly inconsistent: sometimes in the tens of milliseconds and sometimes in the hundreds of milliseconds, for the exact same query. I have no idea why.
var device = db.things.AsQueryable().FirstOrDefault(p => p.ThingName == model.thingName);
The things table only has 5 entries, and I have tried it without AsQueryable() with no difference. I am using Visual Studio 2013 and Entity Framework 6.1.1.
EDIT:
Because this is for a business, I cannot post much code. As another timing example, the same query went from 34 ms to 400 ms.
thanks
This can be related to cold vs. warm query execution.
The very first time any query is made against a given model, the Entity Framework does a lot of work behind the scenes to load and validate the model. We frequently refer to this first query as a "cold" query. Further queries against an already loaded model are known as "warm" queries, and are much faster.
You can find more information about this in the following article:
https://msdn.microsoft.com/en-us/library/hh949853(v=vs.113).aspx
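A common mitigation based on the cold/warm distinction above is to pay the model-loading cost once at application startup, so that user-facing requests only ever hit a warm model. A minimal sketch (the context type and entity set are assumptions, modeled on the question's code):

```csharp
using System.Linq;

public static class EfWarmup
{
    public static void Run()
    {
        using (var db = new MyDbContext()) // assumed context type
        {
            // Any trivial query forces EF to load and validate the model
            // and compile the mapping views, so later queries start warm.
            db.things.FirstOrDefault();
        }
    }
}
```

In a web application this would typically be called from `Application_Start`.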
One way to confirm this is to write a stored procedure and fetch the data through it (still using Entity Framework) to see whether the problem lies in the connection or in the query (Entity Framework) itself.
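A minimal sketch of that test (the stored procedure name `GetThingByName` and the `Thing` result class are assumptions), timing the same lookup through EF's raw-SQL path so it can be compared against the LINQ version:

```csharp
using System;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Linq;

// inside a method with an EF6 DbContext "db" and the question's "model" in scope:
var sw = Stopwatch.StartNew();
var device = db.Database
    .SqlQuery<Thing>("EXEC GetThingByName @name",       // assumed stored proc
        new SqlParameter("@name", model.thingName))
    .FirstOrDefault();
sw.Stop();
Console.WriteLine($"Stored-proc query took {sw.ElapsedMilliseconds} ms");
```

If the stored-procedure timings are stable while the LINQ timings are not, the variance is on the EF side rather than in the network or database.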
Related
I have a big database created by Entity Framework Core. This database stores roughly 5 million records. To improve query speed I'd like to aggregate the data of the previous days.
In this case I would like to execute a SQL command once every day at 00:00 and aggregate yesterday's data.
In the past I created stored procedures that were executed by a database job in MSSQL. But those databases were created manually, and now I'd like to get similar functionality using Entity Framework.
I have read that there shouldn't be any logic in the database. So how could I do this instead? (The article I got the background information from is: Can you create sql views / stored procedure using Entity Framework 4.1 Code first approach)
So I am looking for a good way to execute an "aggregation" function once a day and store the aggregated data in the database.
Use the method you used before! This is ideally solved by SQL Agent and a stored procedure; almost anything else will have more issues and worse performance.
If you really wanted to do it differently, you would need two parts:

- a scheduler; this will most likely be the OS one, but it has nowhere near as many features as SQL Agent.
- the actual program; a .NET app using EF will do this, but EF is not required: plain ADO.NET will work, as will any other data-access library.
The only reason to choose this route is if you had further requirements that SQL would be inappropriate for, so you needed a more general-purpose language.
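A minimal sketch of the "actual program" part (the connection string, table and column names are all assumptions): a small console app that aggregates yesterday's rows into a summary table, meant to be run daily at 00:00 by the OS scheduler (e.g. Windows Task Scheduler).

```csharp
using System;
using System.Data.SqlClient;

class DailyAggregator
{
    static void Main()
    {
        // Assumed connection string; adjust for your environment.
        const string connectionString =
            "Server=.;Database=MyDb;Integrated Security=true;";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = conn.CreateCommand())
        {
            // Aggregate yesterday's measurements into a daily summary table
            // (hypothetical Measurements/DailySummary schema).
            cmd.CommandText = @"
                INSERT INTO DailySummary (Day, Total)
                SELECT CAST(Timestamp AS date), SUM(Value)
                FROM Measurements
                WHERE Timestamp >= CAST(DATEADD(day, -1, GETDATE()) AS date)
                  AND Timestamp <  CAST(GETDATE() AS date)
                GROUP BY CAST(Timestamp AS date);";
            conn.Open();
            int rows = cmd.ExecuteNonQuery();
            Console.WriteLine($"Inserted {rows} summary row(s).");
        }
    }
}
```

Note that this keeps the aggregation logic in application code rather than the database, but the scheduling still lives outside your program.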
I'm using Entity Framework 6 on a SQL Server database to query an existing database (database first, so there's an EDMX in my project).
I've noticed that the first time I request an entity, it can take up to thirty seconds for the query to be executed. Subsequent queries to the same object then get completed in a matter of milliseconds. The actual SQL being executed is very fast so it's not a slow query.
I've found that Entity Framework generates views in the background and that this is the most likely culprit. What I haven't found, however, is a good solution for this. There's a NuGet package that can handle the view generation (EFInteractiveViews), but it hasn't been updated since 2014 and I can hardly find any information on how to use it.
What options do I have nowadays? I've tried initializing Entity Framework in Application_Start by running a few queries, but this doesn't seem to help much. It's also quite difficult to perform the real queries in Application_Start, because most queries use data from the current user (who is not yet logged on at Application_Start), so it's hard to run them in advance.
I've thought about creating an ashx file that constantly polls the application by calling the API to keep it alive. I've also set the Application Pool to "AlwaysRunning" so that EF doesn't restart when the app pool is recycled.
Does anyone have any tips or ideas on how I can resolve this or things I can try?
Thanks a lot in advance. I've spent the better part of two days already searching for a viable solution.
There are many ways to speed up Entity Framework; I will mention some of them:

- Turn off lazy loading (open the EDMX file, right-click anywhere, choose Properties, and set Lazy Loading Enabled to false).
- Use AsNoTracking().ToList(); when you want to update, use Attach and set the object's state to EntityState.Modified.
- Use indexes on your tables.
- Use paging; do not load all the data at once.
- Split your EDMX into several smaller ones and only include the entities you need on each page (this will improve performance).
- If you want to load related objects, "be eager, not lazy": use Include. You may need `using System.Data.Entity;` to use the lambda Include overloads.
Example of splitting your EDMX:
Suppose you have the following entities for a rent-a-car app: Country, City, Person, Car, Rent, Gender, Engine, Manufacturer, etc.

- If you are working on a screen to manage (CRUD) Person, you don't need Car, Rent or Manufacturer, so create ManagePerson.edmx containing (Country, City, Person, Gender).
- If you are working on managing (CRUD) Car, you don't need (Person, City, Gender, Rent), so create ManageCar.edmx containing (Car, Manufacturer, Country, Engine).
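The AsNoTracking/Attach pattern from the list above can be sketched as follows (the context and entity names are assumptions, not taken from the question):

```csharp
using System.Data.Entity;
using System.Linq;

public class PersonService
{
    public void RenameFirstPerson(string newName)
    {
        using (var db = new ManagePersonContext()) // assumed context type
        {
            // Read without change tracking: faster, returns a detached snapshot.
            var person = db.Persons.AsNoTracking().FirstOrDefault();
            if (person == null) return;

            person.Name = newName;

            // Re-attach the detached entity and mark it as modified so that
            // SaveChanges issues an UPDATE for it.
            db.Persons.Attach(person);
            db.Entry(person).State = EntityState.Modified;
            db.SaveChanges();
        }
    }
}
```

The trade-off is that marking the whole entity as Modified updates every column, so this fits best when you load and save complete objects.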
Entity Framework must first compile and translate your LINQ queries into SQL, but after that it caches them. The first hit on a query is always going to take a long time, but as you mention, after that the query runs very quickly.
When I first used EF it was constantly an issue brought up by testers, but when the system went live and was used frequently (and queries were cached) it wasn't an issue.
See Hadi Hassan's answer for general speed-up tips.
I'm having trouble with the high cost of the initial load in a model-first setup with 500+ tables. I've written a small test program to demonstrate this.
For the proof, the database is AdventureWorks with 72 tables, where the table with the largest row count is [Sales].[SalesOrderDetail] (121,317 records):
- EF5 without pre-generated views, performing a basic query (select * from SalesOrderDetail where condition): 4.65 seconds.
- EF5 with pre-generated views, same query: 4.30 seconds.
- EF6 without pre-generated views, same query: 6.49 seconds.
- EF6 with pre-generated views, same query: 4.12 seconds.
The source code has been uploaded to my TFS online at [https://diegotrujillor.visualstudio.com/DefaultCollection/EntityFramework-PerformanceTest]; please let me know the user(s) or email(s) to grant access so you can download and examine it. The test was performed on a local server pointing to .\SQLEXPRESS.
So far we can see some subtle differences and the picture doesn't look very daunting; however, the same scenario in a real production environment with 538 tables definitely goes in the wrong direction. I cannot attach the original code and a database backup due to size and privacy constraints (I can send some pictures or even share my desktop in a conference call to show it running live). I've executed hundreds of queries attempting to compare the generated output trace in SQL Server Profiler; when I paste and execute the captured statement directly, the time consumed is 0.00 seconds in the SQL Server editor.
In the live environment, EF5 without pre-generated views can take up to 259.8 seconds on a table with 8,049 records and 104 columns, executing a query very similar to the one mentioned above. It gets better with pre-generated views: 21.9 seconds. Once more, the statement captured in SQL Server Profiler takes 0.00 seconds to execute.
Nevertheless, in the live environment EF6 can take up to 49.3 seconds executing the same query without pre-generated views; with pre-generated views, 47.9 seconds. It looks like pre-generated views have no effect in EF6, or EF6 already pre-generates views as part of its core functionality, or something else; I don't know.
Thus, I had to downgrade to EF5, as mentioned in my recent post [http://blogs.msdn.com/b/adonet/archive/2014/05/19/ef7-new-platforms-new-data-stores.aspx?CommentPosted=true#10561183].
I've already performed the same tests with database first and code first, with the same results. I'm using the "Entity Framework Power Tools" add-in to pre-generate the views. Both projects, the real one and the test, target .NET Framework 4.0; Visual Studio 2013 is the IDE and SQL Server 2008 R2 SP2 the DBMS.
Any help would be appreciated. Thanks in advance.
I have a performance discrepancy between an EF query run through the web application and the Profiler-generated T-SQL run directly in a SQL query window.
Following is my EF query that executes through the web application:
IEnumerable<application> _entityList = context.applications
    .Include(context.indb_generalInfo.EntitySet.Name)
    .Include(context.setup_budget.EntitySet.Name)
    .Include(context.setup_committee.EntitySet.Name)
    .Include(context.setup_fund.EntitySet.Name)
    .Include(context.setup_appStatus.EntitySet.Name)
    .Include(context.appSancAdvices.EntitySet.Name)
    .Where(e => e.indb_generalInfo != null);

if (isIFL != null)
    _entityList = _entityList.Where(e => e.app_isIFL == isIFL);

int _entityCount = _entityList.Count(); // hits the database server at this line
Tracing the above EF query in SQL Profiler reveals that it took around 221,095 ms to execute. (The applications table has 30,000+ records, indb_generalInfo 11,000+ and appSancAdvices 30,000+.)
However, when I copy the T-SQL from Profiler and run it directly in a query window, it takes only around 4,000 ms.
Why is it so?
The venom in this query is in the first words: IEnumerable<application>. If you replace that by var (i.e. IQueryable<application>), the whole chain up to and including the final Count() will be translated into SQL. This will take considerably less time, because the amount of transported data is reduced to almost nothing.
Further, as bobek already mentioned, you don't need the Includes, since you're only counting context.applications items.
Apart from that, you will always notice overhead of using an ORM like Entity Framework.
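Putting both points together, a sketch of the corrected query (entity and property names are taken from the question's code): keeping the chain as IQueryable lets EF translate everything, including Count(), into a single SELECT COUNT on the server.

```csharp
using System.Linq;

// inside a method with the question's "context" and "isIFL" in scope:
var query = context.applications                // stays IQueryable<application>
    .Where(e => e.indb_generalInfo != null);    // no Includes needed for a count

if (isIFL != null)
    query = query.Where(e => e.app_isIFL == isIFL);

// One round-trip; the counting happens on the database server, and no
// entity rows are materialized or transported.
int entityCount = query.Count();
```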
That's because EF needs to translate your code into T-SQL first, which is costly as well. Look at this link: http://peterkellner.net/2009/05/06/linq-to-sql-slow-performance-compilequery-critical/ It shows how to compile your LINQ queries, which should help with speed. Also, do you really need that many tables for this query? Maybe you can find a way to filter and only pull out what you need?
EF definitely has a cost in terms of performance. But it also provides the flexibility to use stored procedures for complex T-SQL. In my opinion, though, that should be your last resort.
In case you are interested in EF performance:
http://msdn.microsoft.com/en-us/data/hh949853.aspx
However...

"EF Query in SQL Profiler it reveals that it took around 221,095 ms to execute."

then...

"copy the T-SQL from Profiler and run it directly from Query window"

Where the SQL came from is irrelevant.
Query Q1 took x milliseconds, based on SQL Profiler info. The exact same query Q1' takes less, also based on SQL Profiler. This means the source of the SQL isn't the issue; it implies environmental factors are involved.
The most obvious explanation: SQL Server has buffered many data pages and can serve the second, identical request much faster.
I'm using EF 4.3 Code First on SQL Server 2008. I run several test suites that delete and recreate the database with CreateIfNotExists. This works fine but is dog slow. It can take up to 15 seconds to create the database on the first call, and typically 3-6 seconds after that. I have several places where this is called. I've already optimized to call this as few times as I can. Is there something I can do to speed up database creation programmatically? I'm willing to go around EF to do this if that helps, but I would like to keep my database build in code and not go back to a SQL script. Thanks!
This works fine but is dog slow.
Yes. The point is to use a real database only for integration tests, which don't have to be executed so often; the whole set of integration tests is usually run only on the build server.
It can take up to 15 seconds to create the database on the first call
This is because of EF's slow initialization when unit testing (you can try switching to x86). The time is also consumed by view generation. Views can be pre-generated, which is usually done to reduce startup and initialization time of the real system; but for speeding up unit tests, view pre-generation will not help much, because you just move the time from the test to the build.
I'm willing to go around EF to do this if that helps, but I would like to keep my database build in code and not go back to a SQL
Going around EF would just mean using a plain old SQL script. The additional time needed for this operation may be spent generating that SQL. I think the SQL is not cached, because normal application execution doesn't need it more than once, but you can ask EF to give you at least the most important part of that SQL, cache it somewhere, and execute it yourself every time you need it. EF can give you the SQL for tables and constraints:
var dbSql = ((IObjectContextAdapter) context).ObjectContext.CreateDatabaseScript();
You just need your own small SQL to create the database, and then use them together. Even something like the following script should be enough:
CREATE DATABASE YourDatabaseName
USE YourDatabaseName
You must also turn off database generation in Code First to make this work and to take control of the process:
Database.SetInitializer<YourContextType>(null);
When executing the database-creation SQL you will need a separate connection string pointing to the master database.
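The pieces above can be combined into one helper, shown here as a sketch (the context type `YourContextType`, its connection-string constructor, and the database name are assumptions): create the database with plain SQL against master, then run EF's cached schema script, with the EF initializer turned off.

```csharp
using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Data.SqlClient;

public static class FastDbCreator
{
    // Cache the schema script for the whole test run; generating it is the
    // expensive part, executing it is cheap.
    private static string _schemaSql;

    public static void Recreate(string masterConnStr, string dbConnStr)
    {
        // Take control away from EF's database initializer.
        Database.SetInitializer<YourContextType>(null);

        using (var master = new SqlConnection(masterConnStr))
        using (var cmd = master.CreateCommand())
        {
            master.Open();
            cmd.CommandText =
                @"IF DB_ID('YourDatabaseName') IS NOT NULL
                      DROP DATABASE YourDatabaseName;
                  CREATE DATABASE YourDatabaseName;";
            cmd.ExecuteNonQuery();
        }

        using (var context = new YourContextType(dbConnStr))
        {
            // Ask EF for the DDL once, then reuse it on every recreate.
            if (_schemaSql == null)
                _schemaSql = ((IObjectContextAdapter)context)
                    .ObjectContext.CreateDatabaseScript();

            context.Database.ExecuteSqlCommand(_schemaSql);
        }
    }
}
```

The first call still pays EF's initialization cost to generate the script; subsequent recreates skip straight to executing the cached DDL.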