I have a .NET MVC web app for data reporting. When the app first loads, it requests a lot of data from remote servers then caches it locally in web server memory. Before the cache loads, there's no way to respond to incoming requests. Every request that comes in while the cache is loading must wait for the thread loading the cache.
If I use a critical section (lock) on the caching code, all the requests will block. That's a huge waste of resources and I could even exhaust my IIS worker thread pool.
If I use async requests, the web requests will all return before the cache is loaded. I can't "callback" a web request which has already returned its contents to the client!
How can I manage the threads properly? Is there maybe a way to move all requests onto a single thread that asynchronously waits for the cache to load, and then move them back out to individual threads once the cache is loaded?
If I use a critical section (lock) on the caching code, all the requests will block.
That is true. You can use SemaphoreSlim.WaitAsync to wait asynchronously; it behaves like a lock without blocking a thread. You probably also want a strategy for the case where many requests queue up (thousands). You can use a second semaphore for that, with a max count of 1000 and a wait timeout of zero: if the wait fails, you know that 1000 or more requests are already queued, and you can fail the request immediately.
The big downside is that now all your requests have an async component. Maybe you can handle this in one central place, such as an async MVC action filter. Otherwise you will be forced to make all MVC actions async, which is a headache.
Be sure to correctly configure all ASP.NET and IIS queues.
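The two-semaphore approach described in this answer could be sketched like this; `CacheGate` and the loader delegate are illustrative names, not from the original app:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// One shared instance per application; the loadCache delegate stands in for
// your real cache-filling code.
public class CacheGate
{
    private readonly SemaphoreSlim _initLock = new SemaphoreSlim(1, 1);
    private readonly SemaphoreSlim _queueGate = new SemaphoreSlim(1000, 1000);
    private readonly Func<Task> _loadCache;
    private volatile bool _loaded;

    public CacheGate(Func<Task> loadCache) { _loadCache = loadCache; }

    public bool IsLoaded { get { return _loaded; } }

    public async Task EnsureLoadedAsync()
    {
        if (_loaded) return;

        // Fail fast when ~1000 requests are already queued up.
        if (!await _queueGate.WaitAsync(TimeSpan.Zero))
            throw new InvalidOperationException("Too many queued requests.");
        try
        {
            // Async "lock": waiting requests give their threads back to the pool.
            await _initLock.WaitAsync();
            try
            {
                if (!_loaded)
                {
                    await _loadCache();
                    _loaded = true;
                }
            }
            finally { _initLock.Release(); }
        }
        finally { _queueGate.Release(); }
    }
}
```

An async action filter (or the first awaited call in each action) would call `EnsureLoadedAsync()` before touching the cache.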
If I use async requests
Not sure you understand what async means in the context of ASP.NET. Async request processing is an implementation detail of the server; the client can't detect it, and the request is not completed prematurely. Async I/O (or any other form of async waiting) does not cause the request to end early.
Don't load the cache on a new thread. If you want no requests to be served until the cache is ready, load it in Application_Start; requests are served again once the application has finished loading.
Does the app require restarting often? If not, you can have some type of fetch routine that requests updated data from the remote web servers and swaps out all relevant caches in one critical section.
In essence, fetch the data in the background and commit all at once (you can lock during the commit to avoid contention over proper values).
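The fetch-and-swap could be sketched like this; `SwappableCache` and the fetch delegate are illustrative names. Building the replacement cache on the side and publishing it with a single reference assignment means readers never observe a half-updated cache:

```csharp
using System;
using System.Collections.Generic;

public class SwappableCache
{
    // volatile so readers always see the most recently published reference.
    private volatile Dictionary<string, string> _cache =
        new Dictionary<string, string>();

    public void Refresh(Func<Dictionary<string, string>> fetchUpdatedData)
    {
        var fresh = fetchUpdatedData();  // long-running remote fetch, off the hot path
        _cache = fresh;                  // reference assignment is atomic in .NET
    }

    public string Lookup(string key)
    {
        var snapshot = _cache;           // one consistent snapshot per read
        string value;
        snapshot.TryGetValue(key, out value);
        return value;
    }
}
```

Because the swap is a single atomic reference write, a lock is only needed if multiple refreshers could run concurrently.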
Related
I'm maintaining an ASP.NET Core web application that needs to repeatedly run some background threads. I know it's not a good design, but currently I have to fix its major issues with minimum effort. Now I wonder: should I worry about these background threads interfering with the web server's handling of users' HTTP requests?
Question is simple but I can't find any clear answer for it:
What is the difference between threads that are created in application like this:
Task.Run(() => { /* some parallel job */ });
and worker threads of IIS that handle http requests?
Do they come from the same thread pool, or do they reside in separate pools?
According to this it's all one pool: "ASP.NET Core already runs app code on normal Thread Pool threads." In other words, there isn't a separate max for threads serving requests vs. background threads.
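A quick self-contained way to confirm that `Task.Run` work lands on thread-pool threads, the same pool that serves requests:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class Demo
{
    static void Main()
    {
        // Task.Run queues the delegate to the ThreadPool, so the flag is true.
        bool onPoolThread = Task.Run(() => Thread.CurrentThread.IsThreadPoolThread)
                                .GetAwaiter().GetResult();
        Console.WriteLine(onPoolThread);  // True
    }
}
```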
The biggest difference is that IIS is aware of the threads it creates itself for an incoming request. IIS is not aware of any threads you create yourself.
When an app pool is recycled, or IIS is shut down, it waits until all requests have finished processing (that is, until the threads it created for each request have finished) and then kills the process. If you create any threads that outlive the request (for example, a background thread started before sending the response back to the client), IIS has no idea that thread is still running and could kill the whole process at any time.
If you don't return a response until all the threads are complete, then you won't have that specific problem.
The other issue is that you may hit the maximum number of allowable threads. Then all kinds of weird performance issues would happen. But that depends on how many threads you are creating and how many HTTP requests are coming in.
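In ASP.NET Core specifically, a hosted service avoids the "host doesn't know about your thread" hazard, because the host starts the work itself and signals it on graceful shutdown. A minimal sketch, where the job body is illustrative:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

// Registered with services.AddHostedService<RecurringWorker>(); the host runs
// ExecuteAsync at startup and cancels stoppingToken on shutdown.
public class RecurringWorker : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            DoOneUnitOfWork();  // hypothetical repeated job
            await Task.Delay(TimeSpan.FromSeconds(30), stoppingToken);
        }
    }

    private void DoOneUnitOfWork()
    {
        // ... the actual background job goes here ...
    }
}
```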
I'm working in an environment with a lightweight web application (ASP.NET MVC 4.5), with pretty much all of the actual work being implemented in a separate web service application (ASP.NET Web API).
In nearly all cases, all the web site proper does is deliver pages that make AJAX calls to the web service site. But I have one use case in which a page needs to do an HTTP POST to the main web site, and then have the controller action make a call against the web service and wait for a response.
Microsoft has, of course, a how-to page:
http://www.asp.net/web-api/overview/web-api-clients/calling-a-web-api-from-a-net-client
On that page, it says:
Many of the HttpClient methods are async, because they perform network I/O. In the RunAsync method, I'll show the correct way to invoke those methods asynchronously. It's OK to block the main thread in a console application, but in a GUI application, you should never block the UI thread.
So, the question: is blocking in a controller action while we wait for a response from the web service going to cause a problem? Yes, it means the response to the browser is delayed, but that seems inevitable given the circumstances.
My expectation would be that this wouldn't be an issue. Multiple requests come into the web service simultaneously, and for this one request to block for a bit wouldn't have any unusual impact on overall performance. But I thought I'd ask.
When you await in your action, you're actually just returning the current thread to the app's thread pool. So it really won't make much difference whether you do your server-side work synchronously or asynchronously, unless you're getting enough requests to exhaust your thread pool.
1. A request comes in and is assigned a free thread from the thread pool (if no thread is free, the request sits and waits until one is).
2. Web API creates your controller and executes your action.
3. When your action awaits, the thread is returned to the thread pool.
4. When the awaited task is done, a thread is assigned back to the request.
5. The action finishes its work and returns, or awaits again, which returns the thread to the thread pool once more.
With that in mind, it's clear that executing the action synchronously won't hold your API back unless the thread pool is running out of available threads. In that case, the synchronous operation would hurt you, because new requests sit waiting to be assigned a thread. If it was async, that operation would return its thread to the thread pool, so new requests would be taken care of sooner.
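For comparison, a hedged sketch of the async version of such an action. The service URL and model type are made up here; `PostAsJsonAsync` comes from the Web API client libraries that the linked Microsoft article uses:

```csharp
// Sketch only: ReportRequest and the URL are illustrative, not from the post.
public async Task<ActionResult> Submit(ReportRequest model)
{
    using (var client = new HttpClient())
    {
        // While this await is pending, the request's thread goes back to the pool.
        HttpResponseMessage response =
            await client.PostAsJsonAsync("http://services.example.com/api/report", model);
        response.EnsureSuccessStatusCode();

        string body = await response.Content.ReadAsStringAsync();
        return Content(body, "application/json");
    }
}
```

In a real app you would typically share one `HttpClient` instance rather than creating one per request.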
I am trying to get some asynchronous work done with the System.Threading.Tasks.Task class. The scenario is simple: I have a web app, and in one button click event I start a Task which must check some outside service for a couple of minutes. It is not a heavy task; all it does is send a request every 5 seconds and get a response, but it must do that for at least a couple of minutes. I don't want the user to wait until this task gets the job done, so after I have started the task, I immediately return to the user, saying that the task started and he/she will be informed when it is done. I wonder if this task I created will cause any problems, since I returned and ended the HTTP response.
This type of "asynchronous work" isn't possible by using the Task type. As I mention on my blog, async does not change the HTTP protocol; you still get one response per request, that's it!
The ideal ASP.NET app does not do any work outside of a request/response pair. There are ways to make it work (also described on my blog), but it's almost never recommended.
The proper solution is to split up the processing. A web site (or service) should start the processing by placing a request into persistent storage (e.g., Azure queue), a separate worker service (e.g., Azure worker role / Win32 service) would do the polling and put the results into persistent storage (e.g., Azure table), and the web site/service could poll that.
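In outline, that split might look like the following; `IWorkQueue`, `IResultStore`, and `PollOutsideServiceAsync` are hypothetical stand-ins for, say, an Azure queue, an Azure table, and your outside-service call:

```csharp
// Web tier: persist the request and return immediately; the client gets a
// job id it can poll for results later.
public Guid StartCheck(string target)
{
    var jobId = Guid.NewGuid();
    _workQueue.Enqueue(jobId, target);            // IWorkQueue: persistent storage
    return jobId;
}

// Worker service, running in a separate process: the actual 5-second polling
// loop lives here, safely outside any request/response pair.
public async Task RunJobAsync(Guid jobId, string target, CancellationToken ct)
{
    var deadline = DateTime.UtcNow + TimeSpan.FromMinutes(2);
    while (DateTime.UtcNow < deadline)
    {
        string status = await PollOutsideServiceAsync(target);  // hypothetical call
        await _resultStore.SaveAsync(jobId, status);            // IResultStore
        await Task.Delay(TimeSpan.FromSeconds(5), ct);
    }
}
```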
You can consider using a message-based service bus; the MSDN tutorial Building Distributed Apps with NHibernate and Rhino Service Bus will be very useful.
If you just return from a standard ASP.NET request, then wouldn't you expect the HttpResponse to end? Starting up a task won't by itself hold the HttpResponse open; for that you'd need to stream your response and block on the server until your task is finished, which is presumably not what you want to do.
Maybe you should look at some AJAX on the client that periodically pings the server to see if the task has finished, or at HTML5 push notifications if you know your browser is going to support them.
You can use this: http://www.asp.net/web-forms/tutorials/aspnet-45/using-asynchronous-methods-in-aspnet-45 but IMHO, AJAX with a web service is much better.
I am following the following sample to create a very simple Comet in ASP.NET 4.5. What is the best way of showing progress on an Ajax call?
I have also downloaded the sample from http://www.aaronlerch.com/blog/2007/07/08/creating-comet-applications-with-aspnet/, but I am not getting any response from the server.
Has Response.Flush changed in ASP.NET 4.5?
Update: I just removed the Thread.Sleep, and everything works now.
I don't think there was any radical change; ASP.NET 4.5 just adds support for asynchronously flushing a response:
Asynchronously flushing a response
Sending responses to an HTTP client can take considerable time when the client is far away or has a low-bandwidth connection. Normally ASP.NET buffers the response bytes as they are created by an application. ASP.NET then performs a single send operation of the accrued buffers at the very end of request processing.
If the buffered response is large (for example, streaming a large file to a client), you must periodically call HttpResponse.Flush to send buffered output to the client and keep memory usage under control. However, because Flush is a synchronous call, iteratively calling Flush still consumes a thread for the duration of potentially long-running requests.
ASP.NET 4.5 adds support for performing flushes asynchronously using the BeginFlush and EndFlush methods of the HttpResponse class. Using these methods, you can create asynchronous modules and asynchronous handlers that incrementally send data to a client without tying up operating-system threads. In between BeginFlush and EndFlush calls, ASP.NET releases the current thread. This substantially reduces the total number of active threads that are needed in order to support long-running HTTP downloads.
I have a webpage with a button that generates some files to a server path. (It takes somewhere from 5 to 20 minutes). I want to create an async task that will continue executing even after the user closes the browser. Is it possible to do this with asp.net 4 and C#?
You do not control the thread pool in an ASP.NET application. You cannot even guarantee that a request will complete on the same thread it started on. Threads you create draw on the same pool the web server uses, so you can use up all the request threads and leave your web server unable to process requests.
You should implement a Windows service that hosts a WCF service that you can call from within your web application. In the service you can then fire off a thread to process the long-running process. At the end of that process, you can update a status flag (e.g., from Processing to Complete) that the user can poll to determine whether the files are done processing.
I would recommend using Topshelf to implement your windows service, it will save you much headache.
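A minimal Topshelf host might look like this; the service class, name, and Start/Stop bodies are illustrative:

```csharp
using Topshelf;

public class FileGenerationService
{
    public void Start() { /* kick off the worker thread/loop here */ }
    public void Stop()  { /* signal the worker to finish and wait */ }
}

public class Program
{
    public static void Main()
    {
        // Topshelf turns this console app into an installable Windows service.
        HostFactory.Run(x =>
        {
            x.Service<FileGenerationService>(s =>
            {
                s.ConstructUsing(name => new FileGenerationService());
                s.WhenStarted(svc => svc.Start());
                s.WhenStopped(svc => svc.Stop());
            });
            x.RunAsLocalSystem();
            x.SetServiceName("FileGenerationService");
            x.SetDisplayName("File Generation Service");
        });
    }
}
```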
Actually, it is recommended that you not do this. Instead, create a service (e.g., a Windows service) that performs your processing asynchronously. In your web application, create one method that starts the process and another that polls the service to determine whether processing has completed.
There are a number of reasons for this, but one of the biggest is that the default and recommended configuration for webservers allows the server to kill long-running requests.
Either I didn't understand what you want to do, or you don't need to do a thing.
Once the request has been sent, request processing continues whether or not the user closes the browser. You don't need to do anything.
That's the fabulous nature of stateless web applications...
Creating a new thread or using the thread pool is the easiest way to create a runaway (fire-and-forget) task.
Note that there is no guarantee the process will stay alive for the duration of a long task, so be prepared to handle partial completion and manual restarts (e.g., an app pool recycle due to a config change).
The easiest way is to put your task on the ThreadPool. Thread pool threads stay alive even after the web page has finished rendering. The code would look like the following:
// In the method that starts the work:
object someData = new object();
ThreadPool.QueueUserWorkItem(new WaitCallback(ProcessAsync), someData);
// The callback runs on a thread-pool thread:
static void ProcessAsync(object stateInfo)
{
    string dataToString = stateInfo.ToString();
    // ... long-running work here ...
}
You have to create a thread that does the long-running task. Have a look at the below:
http://kiranpatils.wordpress.com/2010/01/01/performing-a-long-running-task-with-asp-net/
Anyway, whatever you start on the server will continue running even if the user closes the browser (until you recycle the app pool or restart the web server).