How to increase the server timeout in ASP.NET? - C#

We are using Linq to connect to the database, and in one situation we need to execute a big query. I can execute that query directly in SQL Server, but when I try to execute the same query through ASP.NET it shows a timeout error. Can you help me?
Also, we are using page methods => web methods to contact SQL Server.

Have a look at the answer to this question: Linq-to-SQL Timeout
You can set the command timeout on your DataContext object (https://msdn.microsoft.com/library/system.data.linq.datacontext.commandtimeout%28v=vs.110%29.aspx).
Example (from the linked answer):
using (MainContext db = new MainContext())
{
    // Applies to every command executed through this context instance
    db.CommandTimeout = 3 * 60; // 3 minutes (the value is in seconds)
}

You need to increase the CommandTimeout, not the ConnectionTimeout. ConnectionTimeout (suggested in another answer) is the amount of time the app allows for connecting to the database, not for running a command.
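To make the distinction concrete, here is a minimal sketch on plain ADO.NET objects (the database name and procedure name are made up; with Linq to SQL you would set DataContext.CommandTimeout as in the example above instead):
var connectionString =
    "Server=.;Database=MyDb;Integrated Security=true;Connect Timeout=15"; // hypothetical database
// "Connect Timeout" limits how long opening the connection may take;
// CommandTimeout limits how long a single command may run.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.MyLongRunningProc", connection)) // hypothetical proc
{
    command.CommandType = System.Data.CommandType.StoredProcedure;
    command.CommandTimeout = 3 * 60; // 3 minutes; the default is 30 seconds
    connection.Open();
    command.ExecuteNonQuery();
}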
You probably also want to look into improving the performance of your SQL query by adding indexes, etc. You could use SQL Profiler to capture the SQL statement that Linq to SQL generates, then run it with the actual execution plan in SSMS and see where it spends the most time. That is generally a good place to start.
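If you'd rather not set up SQL Profiler, Linq to SQL can also write the SQL it generates to any TextWriter via DataContext.Log. A rough sketch, reusing the MainContext from the example above (the table and column names are placeholders):
using (MainContext db = new MainContext())
{
    // Writes the generated SQL and its parameter values to the console;
    // paste that into SSMS and inspect the execution plan there.
    db.Log = Console.Out; // or a StreamWriter / StringWriter
    var rows = db.SomeTable.Where(x => x.SomeColumn == 42).ToList(); // placeholder query
}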

Related

Entity Framework first query is much slower than second

In my website, I am using ASP.NET MVC 5 with EF6.
I experience slow performance on the first call, even with only 20K records.
For example, I need to get all rows from the Person table.
First invocation: 7500 ms (after that, the second call takes only 1000 ms):
List<Person> persons = await _context.Person.ToListAsync(); // Time : 7500ms
List<Person> persons2 = await _context.Person.ToListAsync(); // Time : 1000ms
What I tried:
Disabled lazy loading in the EDMX schema
Refreshed the schema
The same query in SQL Server Management Studio takes 400 ms (and it's a really simple query without joins and conditions)
and it happens every time a client goes to the Person page.
I would have posted this in a comment, but it's too long.
There are many things that can factor into that time difference, ordered here from less likely/impactful to more likely/impactful:
The first query, once it reaches SQL Server (if that's the underlying engine), sometimes has to "warm up" the server. I doubt that this is the actual problem, since SQL Server probably doesn't have enough time to go "down" between your tries. Also, the execution plan shouldn't be too problematic for that query.
The first query has to open the communication channel. For example, if it has to route through VPNs, or simply open a SQL connection, that adds a delay.
Migrations: unless you manually force them, EF6 doesn't run migrations (and seeding) at the moment you create the DbContext. It waits for the first time it actually has to query, then builds the configuration and executes the migrations.
If you want to investigate, put a breakpoint in the OnModelCreating method and see when it's called. You can also run a query against an unrelated entity before these two queries and you'll see that the difference isn't because of caching (AFAIK, caching is only used with DbSet<T>.Find(...)).
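If that start-up cost turns out to be the culprit, one common mitigation is to pay it once at application start instead of on the first user request. A minimal sketch, assuming EF6 and a Global.asax-style entry point (MyDbContext is a placeholder for your context type):
protected void Application_Start()
{
    using (var context = new MyDbContext()) // placeholder context type
    {
        // Forces EF6 to build the model, run any migrations/seeding and
        // prepare its metadata now, so the first real query stays fast.
        context.Database.Initialize(force: false);
    }
}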

Disabling SQL Server's caching through .NET code

I am currently calling several stored procedures from some .NET code via SqlConnection. I'd like to disable the caching done by SQL Server so that I can measure performance periodically (I'm going to be comparing it to another server that likely won't have any cached data either). Is this possible without modifying the sprocs?
This is the code that I am currently using:
using (SqlConnection connection = new SqlConnection(/* connection string goes here */)) {
    SqlCommand command = new SqlCommand(procName, connection);
    command.Parameters.AddRange(parameters);
    command.CommandType = System.Data.CommandType.StoredProcedure;
    connection.Open();
    SqlDataReader r = command.ExecuteReader();
    // todo: read data here
    r.Close();
    connection.Close();
}
First thing: by "caching" here, I'm assuming you're referring to the execution plan cache. Once SQL Server figures out the best way to execute your statements, it stores the plan for a while. The problem this causes is commonly known as "parameter sniffing". This cache is what you clear when you run DBCC FREEPROCCACHE. Unfortunately, that's an admin-privileged command and it affects all connections.
The root of the problem is that your SQL probably performs differently with different sets of parameters. SQL Server only stores the execution plan of the first execution it sees, built around the parameters used for that execution. So if the arguments on the first execution are good for the common case, your app will perform fine. But once in a while the wrong arguments get used on the first execution, and your entire application can perform poorly.
There are a number of ways to optimize your SQL statement to reduce the impact of this, but it's not completely avoidable.
Generate the SQL dynamically - you take the performance hit of generating a query plan on each execution, but this may be worth it if using the wrong execution plan causes your query to never return. I suggest this path, though it is more cumbersome. I found SET STATISTICS TIME ON and SQL Profiler helpful in reducing the plan generation time. The biggest improvement came from using three-part naming (database.schema.table) for the tables.
Specify a "good" set of initial parameter values for your query with a query hint:
SELECT Col1, Col2
FROM dbo.MySchema.MyTab
WHERE Col1 = @Parameter
OPTION (OPTIMIZE FOR (@Parameter = 'value'));
This link describes the parameter sniffing problem fairly well. It was a bigger problem in SQL Server 2005; later versions do a better job of avoiding it.
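For the original goal of measuring cold-cache timings without modifying the sprocs, you can also clear the plan cache from the benchmark code right before each timed run. This is a rough sketch only, and as noted above DBCC FREEPROCCACHE needs admin-level rights and flushes plans for every connection, so keep it to a test server:
static void ClearPlanCache(string connectionString)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("DBCC FREEPROCCACHE;", connection))
    {
        connection.Open();
        command.ExecuteNonQuery(); // every cached plan on the instance is dropped
    }
}
Call it just before executing the stored procedure you want to time, so the next execution has to compile a fresh plan.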

LINQ Query not hitting the database

I'm having an issue with a really simple password-hash retrieval LINQ query. The problem is that if the user logs out and then tries to log back in, it just uses the cached values of the query without querying the database again. The query in question is the following:
using (var db = new DataModel.DatabaseContext())
{
    return (from emp in db.Employees where emp.Username == username select emp.Password).SingleOrDefault();
}
But when I break into the debugger, it seems that EF IS executing a reader on a separate thread! Then why do I think it isn't really querying the database? Well, the execution time is just too short. It messes up my async methods; it basically doesn't leave enough time for a MessageBox to be shown (it works properly when I call the method for the first time). Maybe the database itself has some transient options set up?
EDIT: I thought I had found out what the problem was, but this is just unreal. It executes the query on a remote server faster than a ping request, <0.001 s. I'm stumped.
It is because the first time you create a DbContext in your AppDomain (probably the first call to new YourDbContext() in your application) a lot of initialization and configuration happens under the hood, so it takes some time. After that (while the application keeps running) the process is fast enough that you don't notice it.
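If you want hard evidence that each call really does round-trip to the database rather than returning cached values, you can log the SQL that EF sends. A minimal sketch, assuming EF6's DbContext (DataModel.DatabaseContext and the query come from the question above):
using (var db = new DataModel.DatabaseContext())
{
    // Every command EF sends, including timings, goes to the debug output.
    db.Database.Log = s => System.Diagnostics.Debug.Write(s);
    var hash = (from emp in db.Employees
                where emp.Username == username
                select emp.Password).SingleOrDefault();
}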

Entity Framework stored procedure over remote connection

I'm using EF 4, and I'm stumped on another quirk... Basically, I have a fairly simple stored procedure that's responsible for retrieving data from SQL Server and returning a complex type. I have the stored procedure added to my model via a function import. The call is more or less the following:
using (ModelContainer context = GetNewModelContainer())
{
    return context.GetSummary(id, startDate, endDate, type).ToList();
}
I should mention that the code above executes over a remote SQL connection. It takes nearly 10 minutes to execute. However, using SQL Server Management Studio over the remote connection, the stored procedure executes almost instantaneously.
There are only 100 records or so that are returned, and each record has approximately 30 fields.
When I run the code above locally (no remote connection) against a backup of the customer's database, it executes without any delay.
I'm stumped as to what could be causing this performance hit. Ten minutes is unacceptable. I don't think it's the stored procedure. Could it be serialization overhead due to the remote connection? Any thoughts on how I can track down and correct the culprit?
The symptoms you are describing are those usually associated with an incorrectly cached query plan (due to parameter sniffing).
Ensure your statistics are up to date, and rebuild indexes if they are fragmented.
The canonical reference is: Slow in the Application, Fast in SSMS? An absolutely essential read.
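As a concrete starting point for the statistics/index advice, here is a sketch of doing it from .NET against one of the tables the procedure reads (dbo.Summary and connectionString are purely placeholders; normally you would just run the equivalent statements in SSMS):
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(
    "UPDATE STATISTICS dbo.Summary WITH FULLSCAN; " + // placeholder table
    "ALTER INDEX ALL ON dbo.Summary REBUILD;", connection))
{
    command.CommandTimeout = 0; // maintenance can run long; 0 = no timeout
    connection.Open();
    command.ExecuteNonQuery();
}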
Possible useful SO links:
Option Recompile makes query fast - good or bad?
Strange problem with SQL Server procedure execution plan
SQL Server function intermittent performance issues
Why is some sql query much slower when used with SqlCommand?

Why does a database query only go slow in the application?

I have a webpage that takes 10 minutes to run one query against a database, but the same query returns in less than a second when run from SQL Server Management Studio.
The webpage is just firing SQL at the database that is executing a stored procedure, which in turn is performing a pretty simple select over four tables. Again the code is basic ADO, setting the CommandText on an SqlCommand and then performing an ExecuteReader to get the data.
The webpage normally works quickly, but when it slows down the only way to speed it up again is to defragment the indexes on the tables being queried (different ones at different times), which doesn't seem to make sense when the same query executes so quickly manually.
I have had a look at this question but it doesn't apply as the webpage is literally just firing text at the database.
Does anyone have any good ideas why this is going slow one way and not the other?
Thanks
I would suspect parameter sniffing.
The cached execution plan used by your application's connection probably can't be reused by your SSMS connection due to different SET options, so SSMS generates a new, different plan.
You can retrieve the cached plans for the stored procedure with the query below, then compare them to see whether they differ (e.g. is the slow one doing index seeks and bookmark lookups where the other one does a scan?):
USE YourDatabase;

SELECT *
FROM sys.dm_exec_cached_plans
CROSS APPLY sys.dm_exec_sql_text(plan_handle) AS est
CROSS APPLY sys.dm_exec_query_plan(plan_handle) AS eqp
CROSS APPLY sys.dm_exec_plan_attributes(plan_handle) AS epa
WHERE est.objectid = OBJECT_ID('YourProcName')
  AND epa.attribute = 'set_options'
Is there any difference between the command text of the query in the app and the query you are executing manually? Since you said that reindexing helps performance (which also updates statistics), it sounds like it may be getting stuck on a bad execution plan.
You might want to run a SQL trace and capture the Showplan XML event to see what the execution plan looks like, and also capture the statement-completed events (though this can slow the server down if a lot of statements are coming through the system, so be careful) to be sure the statement sent to SQL Server is the same one you are running manually.
