MVC: Timeout issue - C#

I'm trying to extract the results of a SQL Server view in MVC. The view itself is relatively straightforward and UNIONs a couple of tables together - when run it takes around 2 seconds to return its rows. I've added the view to my model in MVC.
In my controller I have the following code, which is designed to return the values from the SQL view as JSON:
public JsonResult GetActivity(string LocalIdentifier)
{
    return Json(db.Activities.Where(r => r.LocalIdentifier == LocalIdentifier).ToList(), JsonRequestBehavior.AllowGet);
}
When I try to run this, supplying a valid LocalIdentifier, nothing happens for a while and then I get an exception (unhandled in user code) in Visual Studio. For reference, this would generally only return between 30 and 50 rows of data from SQL.
Looking at the inner exception I get this error:
"Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
I've done something similar elsewhere in my application and it works fine - I can't see anything wrong with my code in MVC. Is there anything else I could look at to help diagnose and fix this issue?
Update
Interestingly, I've just changed the code to extract just one row (just to see what happens - see below) and it runs instantly... could this be a problem with ToList()? Is there another way of achieving what I'm trying to do that I could try?
public JsonResult GetActivity(string LocalIdentifier)
{
    return Json(db.Activities.First(r => r.LocalIdentifier == LocalIdentifier), JsonRequestBehavior.AllowGet);
}

You should try running the resulting query directly on the SQL server. I expect that the query runs for more than 30 seconds, resulting in a timeout. The default command timeout is 30 seconds, so the behaviour you're seeing is expected.
There are two ways to deal with it:
speed up the query (maybe you're missing an index?)
increase the default timeout (not my preferred option - see the sketch below)
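If you do decide to raise the timeout, a minimal sketch, assuming an EF DbContext named db (EF6 exposes it on Database; older versions set it on the underlying ObjectContext):

// Sketch only: raise the command timeout for this context.
db.Database.CommandTimeout = 120; // seconds (EF6)
// EF5 / ObjectContext variant:
// ((IObjectContextAdapter)db).ObjectContext.CommandTimeout = 120;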

It depends on how db.Activities is defined. If it has navigation properties for other entities (meaning other tables in the database), then these could also be loaded by EF, even though you did not ask for them explicitly, because the JSON serializer walks each object and all of its properties. Half your database or more could end up being loaded, even from a simple statement like this.
This SO question has several answers that may help you understand better what is going on behind the scenes, and then fix it.
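One way to rule this out is to project only the columns you actually need before serializing, so the serializer never touches the navigation properties. A rough sketch (the column names besides LocalIdentifier are made up - substitute your own):

public JsonResult GetActivity(string LocalIdentifier)
{
    // Projecting to an anonymous type means EF only selects these columns and
    // the JSON serializer has no navigation properties to walk.
    var rows = db.Activities
                 .Where(r => r.LocalIdentifier == LocalIdentifier)
                 .Select(r => new { r.LocalIdentifier, r.ActivityDate, r.Description }) // hypothetical columns
                 .ToList();
    return Json(rows, JsonRequestBehavior.AllowGet);
}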

Enable profiling and look at the query that is actually sent (a quick way to capture it from code is sketched below)
Run that query in SSMS and build the query execution plan
Analyse the plan and make updates to the database - change the structure, build an index, change the query
If the query runs fine in the database, then the problem is probably on the ORM side and you should look for it there
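For the first step, if you happen to be on EF6, a quick sketch of capturing the generated SQL from code (older versions would need a profiler or an interceptor instead):

// EF6 only: log every command the context sends, including the one behind GetActivity.
db.Database.Log = sql => System.Diagnostics.Debug.Write(sql);
var test = db.Activities.Where(r => r.LocalIdentifier == "some-id").ToList(); // watch the Output window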

Related

C#: Get Feedback from SqlConnection Object?

I am currently facing an issue where it takes quite a while to process information from a server, and I would like to provide active feedback for the user to know what is going on while the application appears to be just sitting around.
A little about the application: it allows the user to pull all databases from a specified server along with all content within those databases. This can take quite a while sometimes, since some of our databases can reach 2TB in size. Each server can contain hundreds of databases and so as a result, if I try to load a server with 100 databases, and 30% of those databases are over 100GB in size, it takes a good couple of minutes before the application is able to run effectively again.
Currently I just have a simple loading message that says: "Please wait, this could take a while...". However, in my opinion, this is not really sufficient for something that can take a few minutes.
So as a result, I am wondering if there is a way to track the progress of the SqlConnection object as it is executing the specified query? If so, what kind of details would I be able to provide and are there any readily available resources to look over and better understand the solution?
I am hopeful that there is a way to do this without having to recreate the SqlConnection object altogether.
Thank you all for your help!
EDIT
Also, as another note: I am NOT looking for handouts of code here. I am looking for resources that will help me in this situation, if any are available. I have been looking into this issue for a few days already; I figure at this point the best place to ask for help is here.
Extra
A more thorough explanation: I am wondering if there is a way to provide the user with the names of the databases, tables, views, etc. that are currently being received.
For example:
Loading Table X From Database Y On Server Z
Loading Table A From Database B On Server Z
The SqlConnection has an InfoMessage event to which you can assign a method.
Furthermore, you have to set the FireInfoMessageEventOnUserErrors property to true.
Now you can do something like
private void OnInfoMessage(object sender, SqlInfoMessageEventArgs e)
{
    for (var index = 0; index < e.Errors.Count; index++)
    {
        var message = e.Errors[index];
        // use the message object
    }
}
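For completeness, a minimal wiring sketch (assuming an existing SqlConnection named connection):

// Attach the handler and opt in to receiving user errors as info messages.
connection.InfoMessage += OnInfoMessage;
connection.FireInfoMessageEventOnUserErrors = true; // errors up to severity 16 now raise the event instead of throwing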
Note that you should only evaluate messages with an error severity lower than 11 (anything higher is a 'real' error).
Depending on what command you are using, the server sometimes already generates such info messages itself (for example when verifying a backup).
Anyway, you can also use this to report progress from a stored procedure or a query.
Pseudocode (I'm not a SQL guy):
RAISERROR('START', 1, 1) WITH NOWAIT; -- will raise your OnInfoMessage method
WHILE (some condition)
BEGIN
    -- execute something
    RAISERROR('Done something', 1, 1) WITH NOWAIT; -- will raise your OnInfoMessage method
    -- etc.
END
Have fun with the parsing, because remember: such messages can also be generated by the server itself (so it is probably a good idea to start your own, relevant "errors" with a static marker sequence which cannot occur under normal circumstances).
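If you go down the prefix route, a sketch of a handler that only surfaces your own progress messages ("##PROGRESS##" is just an assumed marker you would add in the RAISERROR calls):

private void OnInfoMessage(object sender, SqlInfoMessageEventArgs e)
{
    foreach (SqlError error in e.Errors)
    {
        // Class is the severity; 11 and above are 'real' errors rather than info messages.
        if (error.Class < 11 && error.Message.StartsWith("##PROGRESS##"))
        {
            var progressText = error.Message.Substring("##PROGRESS##".Length);
            // hand progressText to the UI, e.g. via a progress callback or Invoke on the form
        }
    }
}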

C# SQL query blocks server memory

I'm still a bit of a newbie and I have been assigned the task of maintaining previously written code.
I have a web app that simulates SQL Management Studio, limiting delete options for example, so basic users don't wreck our servers.
Well, we have a function that expects a query or queries. It works fine, but our server's RAM gets blown up by complex queries; maybe it's not that much data, but it's casting XML and all that stuff that I still don't fully understand in SQL.
This is the actual function:
public DataSet ExecuteMultipleQueries(string queries)
{
    var results = new DataSet();
    using (var myConnection = new SqlConnection(_connectionString))
    {
        myConnection.Open();
        var sqlCommand = myConnection.CreateCommand();
        sqlCommand.Transaction = myConnection.BeginTransaction(IsolationLevel.ReadUncommitted);
        sqlCommand.CommandTimeout = AppSettings.SqlTimeout;
        sqlCommand.CommandText = queries.Trim();
        var dataAdapter = new SqlDataAdapter { SelectCommand = sqlCommand };
        dataAdapter.Fill(results);
        return results;
    }
}
I'm a bit lost, I've read many different answers but either I don't understand them properly or they don't solve my problems in any way.
I know I could use LINQ to SQL or Entity Framework; I tried them, but I really don't know how to use them with an "unknown" query. I could try to research more anyway, so if you think they will help me get to a solution, by all means, I will try to learn them.
So to the point:
The function seems to stop at dataAdapter.Fill(results) when debugging; at that point the server tries to answer the query and just consumes all of its RAM and locks up. How can I solve this? I thought maybe I could make SQL return a certain amount of data, store it in some collection, then continue returning data, and keep going until there is no more data to return from SQL, but I really don't know how to detect whether there is any data left to return.
I also thought about reading and storing the data in two different threads, but I don't know how data read in one thread can be stored in another thread asynchronously (and even less whether that would solve the issue).
So, yes, I don't have anything clear at all, so any guidance or tip would be highly appreciated.
Thanks in advance and sorry for the long post.
You can use pagination to fetch only part of the data.
Your code will be something like this:
dataAdapter.Fill(results, 0, pageSize, "Results"); // the DataSet overload also needs a source-table name
pageSize can be any size you want (100 or 250, for example).
You can get more information in this MSDN article.
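To keep going until there is no more data (the "how do I detect if there is any data left" part of the question), one rough sketch is to keep calling Fill and stop when a page comes back short. Note that this overload still reads and discards the skipped rows on the client, so true server-side paging (TOP or OFFSET/FETCH in the query) is kinder to the server:

const int pageSize = 250;   // arbitrary page size
var start = 0;
int fetched;
do
{
    var page = new DataSet();
    // "Results" is just the source-table name this Fill overload requires.
    fetched = dataAdapter.Fill(page, start, pageSize, "Results");
    // ...process or stream this page before fetching the next one...
    start += fetched;
} while (fetched == pageSize);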
In order to investigate, try the following:
Start SQL Profiler (it is usually installed along with SSMS and can be started from Management Studio's Tools menu)
Make sure you fill in some filters (either the NT username or at least the database you are profiling); this is to catch queries as specific as possible (i.e. only yours)
Include starting events so you can see when your query starts (e.g. RPC:Starting)
Start your application
Start the profiler just before issuing the query (filling the adapter)
Issue the query -> you should see the query start in the profiler
Stop the profiler so it does not catch other queries (it puts overhead on SQL Server)
Stop the application (no reason to mess with the server until the analysis is done)
Take the query into SQL Management Studio. I expect a SELECT that returns a lot of data. Do not run it as it is, but put a TOP on it to limit its results, e.g. SELECT TOP 1000 <some columns> FROM ...
If the TOPped SELECT runs slowly, you are returning too much data.
This may be due to returning some large fields such as N/VARCHAR(MAX) or VARBINARY(MAX). One possible solution is to exclude these fields from the initial SELECT and lazy-load that data as needed.
Check these steps and come back with your actual query, if needed.

Simple LINQ query gets NullReferenceException even though SQL Server table has data

I have a simple LINQ query that started failing suddenly. It worked up until yesterday and I have not been able to determine the problem.
The query is simply getting a complete view from SQL Server
using (JobEntities JE = new JobEntities())
{
    var BatchData = (from x in JE.vwBatchObjectActiveBatches select x).ToList();
}
Starting yesterday this line gets a
NullReferenceException (Object reference not set to an instance of an object)
My suspicion was that a user put in bad data causing the view to fail on SQL Server, but I have checked SQL Server itself and the view runs fine and populates with data.
This query was running in the middle of a larger function loading data from many places, so I have created a test case where I simply load the main window and run this query directly in the code behind to make sure that nothing else is affecting it, and the query still fails. All other Linq queries that I run in this project work still, only this one fails. The app is not under any production right now, and has been static for several months at least.
When I look at the JE in the watch window I can see the vwBatchObjectActiveBatches and it lists 164 records in the Local section -- this matches the view results on SQL Server. Expanding the Results View shows the null error again.
How can I find and fix whatever is causing this query to fail? Why does the results view show an error but the local Line shows the data that I am attempting to get?
It looks like your database returns NULL where Entity Framework does not expect or allow it. The data returned has to be in accordance with the definition of the data model objects.
Solution: either 'fix' the data, fix the query that produces it, or change the definition of your data model objects to allow NULL for the conflicting field(s).
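As a concrete, hypothetical example: if the view can now return NULL in a column the model maps to a non-nullable CLR type, the entity property has to become nullable too (in a database-first EDMX you would set Nullable to True on the property rather than editing generated code). The column name below is made up:

// Before: NULLs coming back for this column make materialization fail.
// public int BatchCount { get; set; }

// After: the property tolerates NULLs returned by vwBatchObjectActiveBatches.
public int? BatchCount { get; set; } // BatchCount is an illustrative column name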

Sequence contains no elements... Thing is, I know it should

Background:
Using C# in the ASP.Net code-behind to call a SQL stored procedure via LINQ-to-Entities.
Also, please note that I am a complete newbie to all of this.
Problem:
I'm calling a stored procedure to get the max_length of a column in my database.
It keeps returning "Sequence contains no elements" when using .First() and, alternatively, the default value when using .FirstOrDefault().
The problem is, I know it should be returning the number "4500" as I've run the query in SQL to see what should be there.
Any ideas on why it would be doing this?
I've searched high and low, but I can't find anyone else who has had this problem where they know it should be returning a value.
Code:
short? shorty = db.sp_GetMaxLength("ApplicantEssay").First();
The result of the SQL query used in the stored procedure:
Thanks so much for any assistance you can provide!!
Edit:
The only other code involved is the ADO.NET Entity auto-generated code.
At least, it's the only other code that the program steps through when I tried to examine what was going on.
I had a hard time looking at the return value and trying to figure it out, but obviously it's returning nothing:
public virtual ObjectResult<Nullable<short>> sp_GetMaxLength(string cOLUMN_NAME)
{
    var cOLUMN_NAMEParameter = cOLUMN_NAME != null ?
        new ObjectParameter("COLUMN_NAME", cOLUMN_NAME) :
        new ObjectParameter("COLUMN_NAME", typeof(string));

    return ((IObjectContextAdapter)this).ObjectContext.ExecuteFunction<Nullable<short>>("sp_GetMaxLength", cOLUMN_NAMEParameter);
}
Edit #2: I guess I should close this question out.
I messed around with some stuff in SQL, then put it back to what I had in the first place and re-executed the stored procedure. Following that, I deleted the Entity Model, reset web.config to its original pre-model state, closed Visual Studio, re-opened Visual Studio, and then re-added an ADO.NET Entity Model to the project. I then re-ran the code, stepped through it, and all of a sudden "4500" popped up as the returned value.
I am completely flummoxed, since all the code is exactly the same without changes, but I guess that's the power of scrapping and re-starting? No other explanation, though I hate it when it's as simple as that. :-(
Thanks to all who commented. :-)

LINQ to SQL "1 of 2 Updates failed" on "SubmitChanges()"

I am updating an object of type X and its children Y using LINQ to SQL, then submitting changes, and I get this error.
Example Code
X objX = _context.X.ToList().Where(x => x.DeletedOn == null).First();
objX.DeletedOn = DateTime.Now;
EntitySet<Y> objYs = objX.Ys;
Y objY = objYs[0];
objY.DeletedOn = DateTime.Now;
_context.SubmitChanges();
On SubmitChanges() I get an exception "1 of 2 Updates failed", no other information as to why that happened. Any ideas?
Also the exception type is
ChangeConflictException
Sooo, what was the cause of the problem? A trigger.
I ran a SQL Profiler trace and saw that when ObjY's DeletedOn property got updated, a trigger updated ObjX's CountOfX column (the value in the table), which led to an error because the SQL generated by LINQ to SQL still had the old CountOfX value in it.
Hence the conflict.
If you ever get this error, SQL Profiler is the best place to start your investigation.
ALSO, NOT RELATED TO THE QUESTION:
I am testing LINQ to SQL and the ADO.NET framework; weirdly, this error happened in LINQ to SQL but not with ADO.NET. But I like LINQ to SQL for its lazy loading. Waiting for EF to get out of beta.
I'm not sure what the cause of the error may be exactly, but there seem to be a number of problems with the example you've provided.
Using ToList() before the Where() method causes your context to read the entire table from the DB into memory and convert it to a list; then, in the same line, you immediately call Where, which discards the rows you've loaded but don't need. Why not just:
_context.X.Where(...
The Where method will return multiple items, but the second line in the example doesn't appear to be iterating through each item individually. It appears to be setting the DeletedOn property for the collection itself, but the collection wouldn't have such a property. It should fail right there.
You are using DateTime.Now twice in the code. Not a problem, except that this will produce ever so slightly different date values each time it is called. You should call DateTime.Now once and assign the result to a variable so that everything you use it on gets identical values.
At the point where you have "Y objY = objYs[0]" it will fail if there are no items in the Y collection for any given X. You'd get an index out of bounds exception on the array.
So given this example, I'm not sure if anyone could speculate as to why code modeled after this example might be breaking.
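For what it's worth, a cleaned-up sketch applying those points (still assuming the same X/Y model) might look like this:

var now = DateTime.Now;                                 // one timestamp for everything
X objX = _context.X.First(x => x.DeletedOn == null);    // filter in the database, no ToList() first
objX.DeletedOn = now;

foreach (Y objY in objX.Ys)                             // handles zero children without indexing errors
{
    objY.DeletedOn = now;
}

_context.SubmitChanges();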
In the LINQ to SQL data context diagram, select the entity and the field where the count is stored (a denormalized figure).
Now set UpdateCheck = Never.
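Behind that designer setting is the UpdateCheck value on the LINQ to SQL column attribute; roughly like this (names are illustrative, not the questioner's actual model):

[Column(DbType = "Int NOT NULL", UpdateCheck = UpdateCheck.Never)] // column no longer participates in concurrency checks
public int CountOfX { get; set; }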
I had this kind of issue. I was debugging, running single lines at a time, and it turned out another process was modifying the record.
My manual debugging process was slowing down the normal speed of the function. When I ran it all the way to a line after the SubmitChanges method, it succeeded.
My scenario would be less common, but the nature of this error is that the record has been superseded by another function/process. In my case it was another process.
