Setting session variable causes slowness in FileContentResult rendering in MVC - c#

Here's my predicament. I have a page in an MVC app that's displaying a list of search results via a partial view using an ajax call. The model is a List<List<string>> representing a dynamic data set, i.e., the users choose which columns they want returned and what order they come back in. All the view is doing is a nested loop which builds out the results table.
One of the potential returned fields is an image of a barcode which is being rendered by another method returning a FileContentResult. Normally this works great, it's slick and performant, to the point where I don't really notice all of the barcodes being rendered and downloaded at all, even in a data set that's hundreds of rows long.
The problem arises when I set a session variable using HttpContext.Current.Session, even something as simple as Session["thingy"] = "thingy";. When that happens there is a drastic performance hit with the barcode images. Result sets that would take a second to load fully are now suffering from image "pop in" for up to 10 seconds after the search button is hit. A few times an image has failed to load, giving an error to the effect of "the server is too busy right now".
Does anyone out there in overflowland have any insight into what could be causing this behavior? I've found a kludgy workaround but it involves unnecessary ajax calls and extra trips to the database.

So the issue was that ASP.NET serializes requests from the same session whenever anything is stored in the session, because session state takes an exclusive per-session lock. All of my calls to the barcode action were waiting for the previous one to finish before moving on, hence the pop-in.
The answer was in this link posted by Alexei. Oddly enough, it was the most downvoted answer that provided the easiest solution. I created a new controller for my images and refactored the barcode rendering action into it, then decorated the controller with [SessionState(SessionStateBehavior.Disabled)], so requests to that controller's actions never take the session lock and can be handled concurrently.
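A rough sketch of the refactored controller (BarcodeRenderer is a hypothetical stand-in for whatever actually draws the image):

using System.Web.Mvc;
using System.Web.SessionState;

// With session state disabled, these actions never acquire the session
// lock, so many barcode requests can be served at the same time.
[SessionState(SessionStateBehavior.Disabled)]
public class BarcodeController : Controller
{
    public FileContentResult Barcode(string code)
    {
        byte[] png = BarcodeRenderer.ToPng(code); // hypothetical rendering helper
        return File(png, "image/png");
    }
}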

I was having the same issues a while ago. Fixed it by setting EnableSessionState to ReadOnly in my web.config.
I thought it might have some negative side effects, but none so far. I even posted a question here on SO looking for comments.
See here: EnableSessionState = ReadOnly - possible side effects?
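For reference, the web.config change is a one-liner. A sketch: ReadOnly session access takes a shared reader lock instead of the exclusive writer lock, so concurrent requests from the same session are no longer serialized.

<system.web>
  <pages enableSessionState="ReadOnly" />
</system.web>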

Related

Webpage is sometimes unresponsive to postback

I am working on an ASP.NET project on VS 2010 that is completely local as of now and a strange problem keeps popping up which I can't seem to explain.
Essentially, sometimes when I perform an action that causes some sort of postback event (e.g. change the selection of a drop down list which repopulates a gridview based on the drop down list selection), the page will "flicker" like a normal postback but then result in a blank page.
Since this occurred rather infrequently, I decided to ignore it for the time being and focus on other aspects of the project. However, after implementing partial postbacks using AJAX, I think I may have found out what is happening.
With partial postbacks, the page no longer reloads every time a postback occurs, and I noticed that occasionally an action that should cause a postback does absolutely nothing.
So my hypothesis is that somehow, the event triggered causes a postback but then gets hung up somewhere? I'm rather new to web programming so I'm at a loss at this point.
Any help would be appreciated.
Thanks
You seem to have an intermittent error that affects the large majority of your website without giving you any errors or logging. This is colloquially known as 'a huge headache'.
Here are a few steps you can take to hopefully get more information on the problem:
On your browser/client side, install a developer add-on such as Firebug that allows you to step through your AJAX/JavaScript as you make the calls. Watch for anything that looks odd or that might imply you are failing to supply information critical for the postback. JavaScript's permissiveness means it can often do things that are logically nonsense without crashing or raising the informative errors you would expect from a stricter language like C#.
On the connection, install Fiddler and keep a log of all the requests and responses that go out over the line. When you manage to reproduce the error, inspect the request that went to the server very carefully to see if it's any different from others. If you get a response, investigate that too.
On the server side, try attaching the Visual Studio debugger to the web server process so you have visibility into when something is hit and when it's not. If you are able to do this, you can at least see whether the request is getting all the way into your code when it fails.
It may also help to ask if there have been any known hardware problems in your office/workspace recently. It sounds unlikely since everything else is working, but when dealing with a problem like this it's a good idea to cover all your bases.

Preventing Caching of Specific Images in Silverlight

It recently became apparent that my project is caching images. This is an issue because when a user uploads a new image, the change is not reflected until the browser is closed and reloaded (at least when debugging in IE). At the same time, I would like to avoid re-downloading images over and over for things that have not changed, as that would very much increase the data we are sending out.
I have tried a couple of solutions, here and here2.
The common factor seems to be that the variable that displays starts clean. But neither of those has worked for me.
I essentially am displaying images in two different ways.
1) I take a string and pass it into the source of an <Image />
2) I turn a string into a URI and turn that into a bitmap behind the scenes which then gets passed into the source of an <Image />
When the image gets updated server side the location of the user's image stays the same, only the data changes.
The coder doing the server-side work attempted a solution as well. He said he implemented some cache-preventing headers; the result was that the first time the image is requested after being updated, a new image is retrieved and displayed. Any other places the image is displayed do not get updated, however.
I guess my ideal solution would be that once the user uploads the new image I implement something that notifies anyone that uses that particular URI to grab a new version.
Does anyone know how to selectively stop caching?
I would try appending a timestamp to the URI of the image you are requesting; this should help stop the browser (or any proxies) from caching it,
e.g. http://www.example.com/myimage.jpg?ts=2011081107333454
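In Silverlight code-behind that might look something like this (a sketch; img is assumed to be the Image element in question, using System.Windows.Media.Imaging):

// Append a changing query parameter so the updated image is treated as a
// new resource rather than served from cache.
string url = "http://www.example.com/myimage.jpg?ts="
             + DateTime.UtcNow.ToString("yyyyMMddHHmmssffff");
img.Source = new BitmapImage(new Uri(url, UriKind.Absolute));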
First, let's clear up the somewhat ambiguous term "caching".
We do all sorts of caching all the time. Whenever we take the result of an expensive operation and store it for future use, to avoid repeating the expensive operation, we are in effect "caching". All frameworks, including Silverlight, do that sort of thing a lot.
However, whenever the term "caching" is used in the context of a web-based application, referring to a resource fetched over HTTP, it is the HTTP cache specification that comes to mind. This is not unreasonable: HTTP caches obviously play a major role, and getting the response header settings right on the server is important for correct operation.
An often-missed aspect of HTTP resource caching, though, is that the responsibility to honor cache headers lies only with the HTTP stack itself; the application using HTTP is not required to know anything about caching.
If the application then chooses to maintain its own "cache" of URIs to resources requested from the HTTP stack, it is not required to implement HTTP-compliant caching algorithms. If such a "cache" is asked to provide a specific application object matching a specified URI, it is entirely free to do so without reference to HTTP.
If the HTTP caching were the only cache to worry about then assuming your "server coder" has actually got the cache headers set correctly then all should be well. However there may still be an application layer cache involved as well.
Ultimately, Rob's suggestion makes sense in this case: "version" the URI with a query string value. However, it's not about preventing caching; caching at both the application and HTTP levels is a good thing. It's about ensuring that the resource referenced by the full URI is always the desired content.
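A sketch of that distinction, assuming the server tracks a revision number for each image: versioning by revision keeps unchanged images cacheable, whereas a timestamp forces a fresh fetch every time.

// Unchanged images keep their URI (and stay cached); only an updated
// image gets a new URI and is therefore re-fetched.
static string VersionedUri(string baseUri, int revision)
{
    return string.Format("{0}?v={1}", baseUri, revision);
}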

Concurrent asynchronous callbacks

A question for die-hard ASP.NET experts. I have spent much time trying to find an answer, or to work it out myself, but no luck so far.
This is an ASP.NET web application. I plan to improve page load time so that the user experience is better. I want to lazy-load sections of the page using UpdatePanels. I can make one UpdatePanel update itself right after the page loads, using a timer with a minimum interval. That works just fine, but the trouble begins when trying to do it with multiple UpdatePanels. Basically, all panels are updated, but sequentially rather than all at the same time.
Now, I have read that this is due to the fact that each async postback result carries the full page viewstate, and that asynchronous postbacks are serialized to prevent viewstate inconsistencies. Actually, they say only the last callback would be successful, so I am lucky to have them serialized, I guess.
And now the big question: has anyone found a way round it? In ASP.NET if possible. This would be a VERY valued answer probably not only for me.
Thanks, thanks, thanks (for working answer :-)
UpdatePanels are synchronous by design.
If you want to execute multiple requests concurrently, you'll need to use page methods, AJAX services, or raw AJAX. Any of those means giving up ViewState.
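For example, a page method is just a static method in the page's code-behind; since it carries no ViewState, multiple calls can run concurrently (the names below are assumed, and the page needs a ScriptManager with EnablePageMethods="true"):

[System.Web.Services.WebMethod]
public static string GetSectionHtml(string sectionId)
{
    // LoadSection is a hypothetical helper that builds the section's markup.
    return LoadSection(sectionId);
}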
If you want to render ASP.NET controls concurrently for multiple AJAX requests, you can make small independent ASPX files that contain the controls, send AJAX requests to them, and insert the rendered HTML into the DOM. In jQuery, you would do it like this: $('selector').load('something.aspx'). Note that neither postbacks nor viewstate would work.

Simulate the page lifecycle to grab the html from the UI layer

I'm working with a rather large .net web application.
Users want to be able to export reports to PDF. Since the reports are based on aggregation of many layers of data, the best way to get an accurate snapshot is to actually take a snapshot of the UI. I can take the html of the UI and parse that to a PDF file.
Since the UI may take up to 30 seconds to load but the results never change, I want to cache a PDF in a background thread as soon as the item gets saved.
My main concern with this method is that if I go through the UI, I have to worry about timeouts. While background threads and the like can last as long as they want, ASPX pages only live so long before they are terminated.
I have two ideas how to take care of this. The first idea is to create an aspx page that loads the UI, overrides render, and stores the rendered data to the database. A background thread would make a WebRequest to that page internally and then grab the results from the database. This obviously has to take security into consideration and also needs to worry about timeouts if the UI takes too long to generate.
The other idea is to create a page object and populate it manually in code, call the relevant methods by hand, and then grab the data from that. The problems with that method, aside from my having no idea how to do it, are that I'm afraid I may forget to call a method, or that something may not work correctly because it's not actually associated with a real session or web server.
What is the best way to simulate the UI of a page in a background thread?
I know of 3 possible solutions:
IHttpHandler
This question has the full answer. The general gist is that you capture the Response.Filter output by implementing your own readable stream and a custom IHttpHandler.
This doesn't let you capture a page's output remotely, however; it only allows you to capture the HTML that would be sent to the client, and the page has to be invoked. So if you use a separate page for PDF generation, something will have to call it.
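A minimal sketch of the filter part (assuming it is installed from the page or handler that renders the report; requires System.IO):

// A pass-through stream: everything ASP.NET writes to the response is
// copied into this buffer and still forwarded to the client.
public class CaptureFilter : MemoryStream
{
    private readonly Stream inner;
    public CaptureFilter(Stream inner) { this.inner = inner; }

    public override void Write(byte[] buffer, int offset, int count)
    {
        base.Write(buffer, offset, count);  // keep a copy
        inner.Write(buffer, offset, count); // pass through to the client
    }
}

// Installed early in the request, e.g. in Page_Load:
//   Response.Filter = new CaptureFilter(Response.Filter);
// Afterwards, Encoding.UTF8.GetString(filter.ToArray()) yields the HTML.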
WebClient
The only alternative I can see for doing that with ASP.NET is to use a blocking WebClient to request the page that is generating the HTML. Take that output and then turn it into a PDF. Before you do all this, you can obviously check your cache to see if it's in there already.
using (WebClient client = new WebClient())
{
    string result = client.DownloadString("http://localhost/yoursite");
}
WatiN (or other browser automation packages)
One other possible solution is WatiN, which gives you a lot of flexibility in capturing a browser's HTML. The drawback is that it needs to interact with the desktop. Here's their example:
using (IE ie = new IE("http://www.google.com"))
{
ie.TextField(Find.ByName("q")).TypeText("WatiN");
ie.Button(Find.ByName("btnG")).Click();
Assert.IsTrue(ie.ContainsText("WatiN"));
}
If the "the best way to get an accurate snapshot is to actually take a snapshot of the UI" is actually true, then you need to refactor your code.
Build a data provider that provides your aggregated data to both the UI and the PDF generator. Layer your system.
Then, when it's time to build the PDFs, you have only a single location to call, and no hacky UI interception/multiple-thread issues to deal with.
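The layering idea in a sketch (all names here are hypothetical):

// One provider feeds both consumers, so the PDF generator never has to
// touch the UI layer at all.
public interface IReportDataProvider
{
    ReportData GetReport(int reportId);
}

// The page binds its controls from GetReport(...); the background PDF job
// calls the same method and renders the PDF straight from the data.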

Set ASP.net executionTimeout in code / "refresh" request

I have an ASP.NET page that creates some Excel sheets and sends them to the user. The problem is, sometimes I get HTTP timeouts, presumably because the request runs longer than executionTimeout (110 seconds by default).
I just wonder what my options are to prevent this, without wanting to generally increase the executionTimeout in web.config?
In PHP there is set_time_limit, which can be used inside a function to extend its life, but I did not see anything like that in C#/ASP.NET.
How do you handle long-running functions in ASP.net?
If you want to increase the execution timeout for this one request you can set
HttpContext.Current.Server.ScriptTimeout
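For example (the value is in seconds; 300 here is just an assumed generous limit):

// Raise the execution timeout for the current request only.
HttpContext.Current.Server.ScriptTimeout = 300;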
But you still may have the problem of the client timing out which you can't reliably solve directly from the server. To get around that you could implement a "processing" page (like Rob suggests) that posts back until the response is ready. Or you might want to look into AJAX to do something similar.
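A sketch of the processing-page idea (ExcelJobStore and the handler name are hypothetical):

// The processing page re-requests itself until the background job is done,
// then redirects to the finished download.
if (!ExcelJobStore.IsReady(jobId))
    Response.AppendHeader("Refresh", "5"); // ask the browser to retry in 5 seconds
else
    Response.Redirect("DownloadExcel.ashx?job=" + jobId);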
I've not really had to face this issue too much yet myself, so please keep that in mind.
Is there not any way you can run the process asynchronously and specify a callback method to fire once it completes, then keep the page in a "we are processing your request..." loop? You could then open this up to add some nice UI enhancements as well.
Just kinda thinking out loud. That would probably be the sort of thing I would like to do :)
