Hi,
I have an ASP.NET application where I have added a web service that contains a "fire and forget" method. When this method is executed it starts a loop (0-99999), and on every iteration it reads an XML file and saves it to the database.
The problem is that this action takes a couple of hours and usually ends with a ThreadAbortException.
I have seen that you can increase the executionTimeout, and this is how:
<httpRuntime executionTimeout="604800"/>
<compilation debug="true"/>
But this does not help.
I have also tried adding a Thread.Sleep inside the loop. If I set it to 500 the loop gets about halfway, and if I set it below 100 it only gets through a couple of thousand iterations before the ThreadAbortException.
How can I solve this?
Don't run the loop inside the web service. Instead, have it in a console app, a winforms app, or possibly even a windows service. Use the web service to start up the other program.
A web service is basically a web page, and asp.net web pages are not meant to host long running processes.
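For example, a minimal sketch of the hand-off (the executable path is illustrative, and the app-pool identity needs permission to start it):

[WebMethod]
public void StartImport()
{
    // Kick off the external worker and return right away;
    // the console app does the hours-long loop, not ASP.NET.
    System.Diagnostics.Process.Start(@"C:\Tools\XmlImporter.exe");
}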
This article does not directly answer your question, but contains a snippet of info relevant to my answer:
http://msdn.microsoft.com/en-us/magazine/dd296718.aspx
However, when the duration of the operation grows longer than the typical ASP.NET session duration (20 minutes) or requires multiple actors (as in my hiring example), ASP.NET does not offer sufficient support. You may recall that the ASP.NET worker processes automatically shut down on idle and periodically recycle themselves. This will cause big problems for long-running operations, as state held within those processes will be lost.
and the article is a good read, at any rate. It may offer ideas for you.
Not sure if this is 'the answer', but when you receive the web service call you could consider dispatching the action onto another thread. That could then run until completion. You would want to consider how you ensure that someone doesn't kick off two of these processes simultaneously though.
I have an ASP.NET application where I have added a web service that contains a "fire and forget" method. When this method is executed it starts a loop (0-99999), and on every iteration it reads an XML file and saves it to the database.
Let's not get into why I find this approach quite... hm... bad for many reasons (such as a recycle in the middle of the run). I would queue the request, then return, and have a queue listener do the processing with transactional integrity.
Anyhow, what you CAN do is:
Queue a work item for a thread-pool thread to pick up (see the sketch below).
Return immediately.
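A minimal sketch of that queue-and-return shape (method names are illustrative):

[WebMethod]
public void StartProcessing()
{
    // Hand the loop to a thread-pool thread and return immediately.
    // Note this still dies if the worker process recycles mid-run.
    ThreadPool.QueueUserWorkItem(_ => ProcessAllXmlFiles());
}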
Besides that, web services and the like are not a good place for hours-long running processes. Kick off a workflow and handle it separately.
Related
Background: I have a simple ASP.NET Core 3.1 site. Very rarely (three or four times per week), a user might fill out a form that triggers an email to be sent.
I don't want to delay the page response while running the 'send email' operation (even though it only takes a second or two), so from everything I've read, it seems like the code that should handle the email should be a background worker/hosted service, and the Razor pages code should place the data object to be sent in a collection that gets monitored by the background service.
What I'm not fully understanding is why this is necessary in modern ASP.NET Core.
If I were doing this in a normal C# application (not ASP.NET), I'd simply make the 'send email' method async (it's using MailKit, which has async methods) and call the async method without awaiting, allowing the work to be done on the thread pool while the response thread continues.
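For instance, in a plain console app the fire-and-forget would just be (a sketch; SendEmailAsync stands in for the MailKit call):

// Deliberately not awaited; the send continues on a thread-pool thread.
_ = SendEmailAsync(message);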
But existing answers and blog posts say that calling an async method without an await in ASP is dangerous, due to the fact that IIS can restart ASP processes (application pool recycling).
Yet, most things I've read say Application Recycling is an artifact of old ASP when memory leaks were common, and it's not really a thing on .Net Core. Additionally, many ASP applications aren't even hosted in IIS anymore.
Further, as far as I can tell, IHostedService/Background Worker objects aren't doing anything special - they don't seem to add any additional threading; they just look like singletons that have additional notification for environment startup and shutdown.
So:
Is calling a fire-and-forget async method in ASP.NET Core still considered poor practice, especially if the fire and forget task is short-lived? If so, why? [see edit below for clarification]
Other than notifications for shutdown, is there any reason why a background service is considered better than borrowing a managed thread-pool thread (via Task.Run or QueueBackgroundWorkItem)? Wouldn't waking a background service (if it was awaiting an item being placed in a collection) consume a pool thread in the same way?
Edit: I acknowledge that starting a task, and reporting success to the user, when there's a chance that operation could be terminated, is poor form. There's benefit to being notified of a shutdown and being able to finalize tasks.
Perhaps a better question is, does the old behavior of cycling still exist in modern ASP (on IIS or Kestrel)? Are there other reasons an orderly shutdown might be triggered (other than server shutdown/manual stop)?
I would still call it a poor practice.
The main concern here as well as in the referenced post is mainly about the promise of task completion.
Without being aware of the ghost background tasks, the runtime will not be able to notify the tasks to gracefully stop. This may or may not cause serious issues depending on the status of the tasks at the point the termination occurs.
Using a fire-and-forget task often means your task is at risk of having to start all over again when the process restarts, and sometimes that is not possible due to loss of context. Imagine your fire-and-forget task calls another web API with parameters provided by a web request; those parameters are likely to be wiped from memory if the process restarts.
And remember, recycling is not always triggered by IIS or the server. It can also be triggered by people. Say your application runs into a memory leak issue and you decide to recycle the app process every hour as temporary relief; then you need to make sure you don't break your background tasks.
In terms of hosting - it is still possible to host ASP.NET Core applications in-process, in which case the app pool gets recycled by IIS after a configured time period, or by default every 29 hours.
In terms of lifetime - hosted services are types you register with DI, so DI features can be used; for example, a hosted service that implements IDisposable will be properly cleaned up upon shutdown.
Frankly, background tasks and hosted services both allow you to do fire and forget. But when you need reliability and resilience, hosted services win.
To answer the second half of your question, the app will wait for all hosted services' StopAsync methods to finish before shutting down. As long as you await your Tasks in the hosted service, this effectively means you can assume your Tasks will be allowed to finish running before the app shuts down. The app could still be force-shutdown, which in that case, nothing is guaranteed anymore.
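As a sketch of that pattern (assuming the shared collection is a Channel<EmailMessage> registered as a singleton; EmailWorker, EmailMessage, and SendAsync are illustrative names, not part of the question):

using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class EmailWorker : BackgroundService
{
    private readonly Channel<EmailMessage> _queue;

    public EmailWorker(Channel<EmailMessage> queue)
    {
        _queue = queue;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        // Reading ends (via the token) when the host signals shutdown, and
        // because the host awaits this Task, in-flight sends get to finish.
        await foreach (var message in _queue.Reader.ReadAllAsync(stoppingToken))
        {
            await SendAsync(message); // e.g. MailKit's async send
        }
    }

    private Task SendAsync(EmailMessage message) => Task.CompletedTask; // stand-in
}

The Razor page side then just calls _queue.Writer.TryWrite(message) and returns, instead of sending the email inline.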
If you need more guarantees about your background tasks, you should move them to run in a separate process. You could use something like Runly to make it easier to break out functionality into background jobs. It also makes it easy to provide real-time feedback to the user so that you are not lying to the user when you say "everything is done" while something is still running in the background.
Full disclosure: I cofounded Runly.
I need to execute an infinite while loop and want to initiate the execution in global.asax.
My question is how exactly should I do it? Should I start a new Thread or should I use Async and Task or anything else? Inside the while loop I need to do await TaskEx.Delay(5000);
How do I do this so it will not block any other processes and will not create memory leaks?
I use VS10, AsyncCTP3, MVC4.
EDIT:
public async void SignalRConnectionRecovery()
{
    while (true)
    {
        Clients.SetConnectionTimeStamp(DateTime.UtcNow.ToString());
        await TaskEx.Delay(5000);
    }
}
All I need is to run this as a single global instance for as long as the application is alive.
EDIT:SOLVED
This is the final solution in Global.asax
protected void Application_Start()
{
    Thread signalRConnectionRecovery = new Thread(SignalRConnectionRecovery);
    signalRConnectionRecovery.IsBackground = true;
    signalRConnectionRecovery.Start();
    Application["SignalRConnectionRecovery"] = signalRConnectionRecovery;
}

protected void Application_End()
{
    try
    {
        Thread signalRConnectionRecovery = (Thread)Application["SignalRConnectionRecovery"];
        if (signalRConnectionRecovery != null && signalRConnectionRecovery.IsAlive)
        {
            signalRConnectionRecovery.Abort();
        }
    }
    catch
    {
        // ignore - nothing useful to do if the thread is already gone
    }
}
I found this nice article about how to use async worker: http://www.dotnetfunda.com/articles/article613-background-processes-in-asp-net-web-applications.aspx
And this:
http://code.msdn.microsoft.com/CSASPNETBackgroundWorker-dda8d7b6
But I think for my needs this one will be perfect:
http://forums.asp.net/t/1433665.aspx/1
ASP.NET is not designed to handle this kind of requirement. If you need something to run constantly, you would be better off creating a windows service.
Update
ASP.NET is not designed for long running tasks. It's designed to respond quickly to HTTP requests. See Cyborgx37's answer or Can I use threads to carry out long-running jobs on IIS? for a few reasons why.
Update
Now that you finally mentioned you are working with SignalR, I see that you are trying to host SignalR within ASP.NET, correct? I think you're going about this the wrong way, see the example NuGet package referenced on the project wiki. This example uses an IAsyncHttpHandler to manage tasks.
You can start a thread in your Global.asax, however it will only run until your ASP.NET process gets recycled. This happens at least once a day, or whenever no one is using your site. If the process gets recycled, the only way the thread is restarted again is when you get a hit on your site, so the thread is not running continuously.
To get a continuous process it is better to start a Windows service.
If you do the 'in process' solution, it really depends on what you are doing. The thread itself will not cause you any problems with memory or deadlocks, but you should add a mechanism to stop your thread when the application stops. Otherwise restarting will take a long time, because the runtime will wait for your thread to stop.
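A sketch of such a stop mechanism, using an event instead of Thread.Abort (names are illustrative):

static readonly ManualResetEvent Stop = new ManualResetEvent(false);

static void RecoveryLoop()
{
    // WaitOne doubles as the 5-second delay and as the stop signal.
    while (!Stop.WaitOne(TimeSpan.FromSeconds(5)))
    {
        Clients.SetConnectionTimeStamp(DateTime.UtcNow.ToString());
    }
}

// In Application_End, instead of Abort():
// Stop.Set();
// signalRConnectionRecovery.Join(TimeSpan.FromSeconds(10));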
This is an old post, but as I was searching for this, I would like to report that in .NET 4.5.2 there is a native way to do it with QueueBackgroundWorkItem.
Take a look at this post: https://blogs.msdn.microsoft.com/webdev/2014/06/04/queuebackgroundworkitem-to-reliably-schedule-and-run-background-processes-in-asp-net/
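For reference, a minimal sketch of that API (System.Web.Hosting, .NET 4.5.2+; SendPendingEmailsAsync is an illustrative method, not part of the post):

using System.Web.Hosting;

// The runtime tracks registered work items and delays appdomain shutdown
// (for up to roughly 30 seconds) so they can finish.
HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
{
    await SendPendingEmailsAsync(cancellationToken);
});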
MarianoC
It depends what you are trying to accomplish in your while loop, but in general this is the kind of situation where a Windows Service is the best answer. Installing a Windows Service is going to require that you have admin privileges on the web server.
With an infinite loop you end up with a lot of issues regarding the Windows message pump. This is the thing that keeps a Windows application alive even when the application isn't "doing" anything. Without it, a program simply ends.
The problem with an infinite loop is that the application is stuck "doing" something, which prevents other applications (or threads) from "doing" their thing. There have been a few workarounds, such as the DoEvents in Windows Forms, but they all have some serious drawbacks when it comes to responsiveness and resource management. (Acceptable on a small LOB application, maybe not on a web server.) Even if the while-loop is on a separate thread, it will use up all available processing power.
Asynchronous programming is really designed more for long-running processes, such as waiting for a database to return a result or waiting for a printer to come online. In these cases, it's the external process that is taking a long time, not a while-loop.
If a Windows Service is not possible, then I think your best bet is going to be setting up a separate thread with its own message pump, but it's a bit complicated. I've never done it on a web server, but you might be able to start an Application. This will provide a message pump for you and allow you to respond to Windows events, etc. The only problem is that this is going to start a Windows application (either WPF or WinForms), which may not be desirable on a web server.
What are you trying to accomplish? Is there another way you might go about it?
I currently have an application which is basically a wrapper for ~10 "LongRunning" Tasks. Each thread should keep running indefinitely, but sometimes they lock up or crash, and sometimes the wrapper app spontaneously exits (I haven't been able to track that down yet). Additionally, the wrapper application can currently only be running for one user, and that user has to be the one to restart the threads or relaunch the whole app.
I currently have a monitor utility to let me know when the threads stop doing work so that they can be manually restarted, but I'd like to automatically restart them instead. I'd also like the wrapper to be available to everyone to check the status of the threads, and for the threads to be running even when the wrapper isn't.
Based on these goals, I think I want to separate the threads into a Windows Service, and convert the wrapper into something which can just connect to the service to check its status and manipulate it.
How would I go about doing this? Is this a reasonable architecture? Should I turn each thread into a separate service, or should I have a single multi-threaded service?
Edit: All the tasks log to the same set of output files (via a TextWriter.Synchronized(StreamWriter)), and I would want to maintain that behavior.
They also all currently share the same database connection, which means I need to get them all to agree to close the connection at the same time when it's necessary. However, if they were split up they could each use their own database connection, and I wouldn't need to worry about synchronizing that. I actually suspect that this step is one of the current failure points, so splitting it up would be a Good Thing.
I would suggest you stay with one multithreaded service if possible. Just make sure that threads are handled correctly when a Service Stop is triggered. Put stop flags inside blocks of code that take a long time to execute, so your service stays responsive to the Stop event. Log any exceptions and make sure to wait for all threads to exit before the service finally stops. This will prevent the same "task" from running in multiple threads.
Maintaining one service is, in the end, easier than maintaining multiple services.
Splitting to multiple services would be reasonable if you require some separate functionalities that can run or not beside each other.
I don't think moving the threads to a Windows Service removes any of the problems. The service will still crash randomly and the threads will still exit randomly.
I assume that your long-running tasks implement a kind of worker loop. Wrap the body of that loop in a try-catch and log all exceptions. Don't rethrow them so that the task does not ever exit. Examine the logs to find the bugs.
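A sketch of that shape (names are illustrative):

void WorkerLoop(CancellationToken token)
{
    while (!token.IsCancellationRequested)
    {
        try
        {
            DoOneUnitOfWork();
        }
        catch (Exception ex)
        {
            // Log and keep looping so a single bad item cannot end the task.
            Log(ex);
        }
    }
}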
I’m looking for the best way of using threads considering scalability and performance.
In my site I have two scenarios that need threading:
UI trigger: for example the user clicks a button, the server should read data from the DB and send some emails. Those actions take time and I don’t want the user request getting delayed. This scenario happens very frequently.
Background service: when the app starts it trigger a thread that run every 10 min, read from the DB and send emails.
The solutions I found:
A. Use thread pool - BeginInvoke:
This is what I use today for both scenarios.
It works fine, but it uses the same threads that serve the pages, so I think I may run into scalability issues. Can this become a problem?
B. No use of the pool – ThreadStart:
I know starting a new thread takes more resources than using a thread pool.
Can this approach work better for my scenarios?
What is the best way to reuse the opened threads?
C. Custom thread pool:
Because my scenarios occur frequently, maybe the best way is to start a new thread pool?
Thanks.
I would personally put this into a different service. Make your UI action write to the database, and have a separate service which either polls the database or reacts to a trigger, and sends the emails at that point.
By separating it into a different service, you don't need to worry about AppDomain recycling etc - and you can put it on an entire different server if and when you want to. I think it'll give you a more flexible solution.
I do this kind of thing by calling a webservice, which then calls a method using a delegate asynchronously. The original webservice call returns a Guid to allow tracking of the processing.
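A sketch of that shape on .NET Framework (Process and the status tracking are illustrative):

[WebMethod]
public Guid StartWork(string data)
{
    var id = Guid.NewGuid();
    Action work = () => Process(id, data);
    // Fire-and-forget on a thread-pool thread; callers track progress by id.
    work.BeginInvoke(work.EndInvoke, null);
    return id;
}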
For the first scenario, use ASP.NET Asynchronous Pages. Async pages are a very good choice when it comes to scalability, because during async execution the HTTP request thread is released and can be re-used.
I agree with Jon Skeet, that for second scenario you should use separate service - windows service is a good choice here.
Out of your three solutions, don't use BeginInvoke. As you said, it will have a negative impact on scalability.
Between the other two, if the tasks are truly background and the user isn't waiting for a response, then a single, permanent thread should do the job. A thread pool makes more sense when you have multiple tasks that should be executing in parallel.
However, keep in mind that web servers sometimes crash, AppPools recycle, etc. So if any of the queued work needs to be reliably executed, then moving it out of process is probably a better idea (such as into a Windows Service). One way of doing that, which preserves the order of requests and maintains persistence, is to use Service Broker. You write the request to a Service Broker queue from your web tier (with an async request), and then read those messages from a service running on the same machine or a different one. You can also scale nicely that way by simply adding more instances of the service (or more threads in it).
In case it helps, I walk through using both a background thread and Service Broker in detail in my book, including code examples: Ultra-Fast ASP.NET.
Using C# ASP.NET I want to program a queue. I want to have X number of a process. When it finishes it should take the next item on the list and process it. I figure the most simple way is to insert and delete it from an SQL database. My problem is:
How do I start this when I add the first item? Do I launch a separate thread? AFAIK every connection to my development environment and server is its own thread, so I would need to lock something, launch a thread to process the list, then unlock and let the thread keep going until it's done. So: 1) Should I be launching threads? If so, what kind? (I haven't done any multithreading in C# yet.) 2) Should I have a static mutex in my ASP.NET project, and lock it when launching threads? (Static variables are still shared across ASP.NET connections/threads, correct?) Or should I not be doing this and launch them a different way?
NOTE: I may want to launch 2 processes instead of 1 and I may want to launch other processes for other things (example 2 FFmpeg + 5 ImageMagick.)
A typical ASP.NET application will actually be sharing a thread for multiple requests (although it is possible to configure it to use one thread per request). I wouldn't recommend changing it to use one thread per request though.
Also, any work being done during an ASP.NET request has to be completed by the time you finish returning your response to the client, or it will be terminated. This includes any child threads you spawn.
Your best bet here is to set up MSMQ (or perhaps even SQS from Amazon) and have a Windows service that pulls messages off the queue and processes them. The process would look like: the web request writes a message describing the job to the queue and returns immediately, and the service dequeues and processes jobs, X at a time.
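A minimal sketch of the enqueue side with System.Messaging (the queue path is illustrative):

using System.Messaging;

const string QueuePath = @".\Private$\jobs";

if (!MessageQueue.Exists(QueuePath))
    MessageQueue.Create(QueuePath);

using (var queue = new MessageQueue(QueuePath))
{
    // The Windows service receives this message and runs the job.
    queue.Send("job-42");
}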
First off, it sounds like what you really want is a Windows Service.
That said, if you're committed to using ASP.NET for this, the following might work:
Make a single .aspx page whose sole purpose is to process one unit of work.
Have another page (HTML will do) that uses JavaScript to asynchronously load your first page (using something like jQuery's $.load() method).
When $.load() returns, you'll know the job is complete and you can make another request to process the next job.
Optionally, you can put something into the returned page to indicate whether or not there are any remaining units of work left in the queue. This would allow you to throttle back on the client-side when there's not any work to be done.
Client-Side Example
<script type="text/javascript">
    $(document).ready(function () {
        processJob();
    });

    function processJob() {
        $('#result').load("ProcessOneJob.aspx", function () {
            // called when ProcessOneJob.aspx comes back
            processJob();
        });
    }
</script>
Pros to this approach:
Relatively simple to implement
No need to deal with threads/locking
No service to install
Cons
Relies entirely on having a machine somewhere with a browser open, pointing at this page.
A brainstorm answer. I don't know if it will work.
The issue is that threads get terminated when a request finishes, so create the thread outside of a request, in Application_Start, to avoid the problem.
To keep everything organized and simple, have a master class that acts as a utility, controlling how many of each process you would like to launch and doing the actual launching. Call the class in void Application_Start(object sender, EventArgs e) to create the initial master thread, which will launch any processes (or threads) you'll need. Allow it to put itself to sleep, and use the utility methods to wake it up and pass messages. Then handle the rest as needed :)
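A sketch of that master-class idea (all names here are illustrative):

public static class ProcessMaster
{
    private static readonly AutoResetEvent Wake = new AutoResetEvent(false);

    // Called once from Application_Start.
    public static void Start()
    {
        var master = new Thread(Run) { IsBackground = true };
        master.Start();
    }

    // Called whenever an item is added to the work list.
    public static void Signal()
    {
        Wake.Set();
    }

    private static void Run()
    {
        while (true)
        {
            Wake.WaitOne(); // sleep until there is work
            // Dequeue items and launch up to the configured number of
            // FFmpeg / ImageMagick processes here...
        }
    }
}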