I have a piece of functionality in my web application that kicks off a long running report on my server. Currently, I have an ajax method on my client that is constantly running to check if the currently running report is complete. It does this by repeatedly calling a method that queries the database to determine if a given report's status has changed from Running to either Error or Complete. Once the report is finished, I perform an ajax load to get the data.
I'm looking at implementing SignalR to add some additional functionality to other pages, and I figured this would be a good test case to get things going. My main concern is how to alert the client when the report is complete. Using SignalR, I can simply say something like:
public class ReportHub : Hub
{
    public async Task ReportComplete(string userId, ReportRunStatus guid)
    {
        await Clients.User(userId).SendAsync("ReportComplete", guid);
    }
}
However, I want to try to avoid putting a long running loop on the server as I'm afraid this could degrade performance as operations scale up. Is there a better way to handle checking the report status and alerting clients than simply polling until completion? Or is there some easy way to constantly be looking at the table and alerting on completed reports?
I have the following WebAPI 2 method:
public HttpResponseMessage ProcessData([FromBody]ProcessDataRequestModel model)
{
    var response = new JsonResponse();

    if (model != null)
    {
        // check whether there are old records to process
        var records = _utilityRepo.GetOldProcesses(model.ProcessUid);

        if (records.Count > 0)
        {
            // there is an active process, so attach the new process to it
            _utilityRepo.InsertNewProcess(records[0].ProcessUid);
            response.message = "Process added to ProcessUid: " + records[0].ProcessUid.ToString();
        }
        else
        {
            // this is a new process, so apply the adjustment rules
            var settings = _utilityRepo.GetSettings(model.Uid);

            // create a new process
            var newUid = Guid.NewGuid();

            // if it's a new adjustment
            if (settings.AdjustmentUid == null)
            {
                settings.AdjustmentUid = Guid.NewGuid();

                // create the new adjustment information
                _utilityRepo.CreateNewAdjustment(settings.AdjustmentUid.Value);
            }

            // if the process was created, insert the new body
            if (_utilityRepo.CreateNewProcess(newUid))
            {
                _utilityRepo.InsertNewBody(newUid, model.Body, true);
            }

            // start the AWS Lambda function timer
            _utilityRepo.AWSStartTimer();

            response.message = "Process created";
        }

        response.success = true;
        response.data = null;
    }

    return Request.CreateResponse(response);
}
The above method can sometimes take 3-4 seconds to run (some DB calls and other calculations), and I don't want the user to wait until all of that work is done.
I would like the user to hit the Web API method and get a success response almost immediately, while the server finishes all the processing in the background.
Any clue on how to implement async/await to achieve this?
If you don't need to return a meaningful response it's a piece of cake. Wrap your method body in a lambda you pass to Task.Run (which returns a Task). No need to use await or async. You just don't await the Task and the endpoint will return immediately.
However if you need to return a response that depends on the outcome of the operation, you'll need some kind of reporting mechanism in place, SignalR for example.
Edit: Based on the comments to the original post, my recommendation would be to wrap the code in await Task.Run(()=>...), i.e., indeed await it before returning. That will allow the long-ish process to run on a different thread asynchronously, but the response will still await the outcome rather than leaving the user in the dark about whether it finished (since you have no control over the UI). You'd have to test it though to see if there's really any performance benefit from doing this. I'm skeptical it'll make much difference.
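For illustration, here's a rough sketch of both variants; JsonResponse and ProcessDataRequestModel are the types from the question, and DoProcessing is a made-up placeholder for the original method body:
public HttpResponseMessage ProcessDataFireAndForget([FromBody]ProcessDataRequestModel model)
{
    // Not awaited: the work runs on a thread-pool thread and the endpoint returns immediately.
    // Any failure after this point is not reported to the caller.
    Task.Run(() => DoProcessing(model));
    return Request.CreateResponse(new JsonResponse { success = true, message = "Process started" });
}

public async Task<HttpResponseMessage> ProcessDataAwaited([FromBody]ProcessDataRequestModel model)
{
    // Awaited: the work still runs on a thread-pool thread, but the response
    // waits for the outcome, so the caller knows whether it finished.
    await Task.Run(() => DoProcessing(model));
    return Request.CreateResponse(new JsonResponse { success = true, message = "Process created" });
}

// Placeholder for the database calls and calculations in the original action.
private void DoProcessing(ProcessDataRequestModel model) { /* ... */ }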
2020-02-14 Edit:
Hooray, my answer's votes are no longer in the negative! I figured having had the benefit of two more years of experience I would share some new observations on this topic.
There's no question that asynchronous background operations running in a web server is a complex topic. But as with most things, there's a naive way of doing it, a "good enough for 99% of cases" way of doing it, and a "people will die (or worse, get sued) if we do it wrong" way of doing it. Things need to be put in perspective.
My original answer may have been a little naive, but to be fair the OP was talking about an API that was only taking a few seconds to finish, and all he wanted to do was save the user from having to wait for it to return. I also noted that the user would not get any report of progress or completion if it is done this way. If it were me, I'd say the user should suck it up for that short of a time. Alternatively, there's nothing that says the client has to wait for the API response before returning control to the user.
But regardless, if you really want to get that 200 right away JUST to acknowledge that the task was initiated successfully, then I still maintain that a simple Task.Run(()=>...) without the await is probably fine in this case. Unless there are truly severe consequences to the user not knowing the API failed, on the off chance that the app pool was recycled or the server restarted during those exact 4 seconds between the API return and its true completion, the user will just be ignorant of the failure and will presumably find out next time they go into the application. Just make sure that your DB operations are transactional so you don't end up in a partial success situation.
Then there's the "good enough for 99% of cases" way, which is what I do in my application. I have a "Job" system which is asynchronous, but not reentrant. When a job is initiated, we do a Task.Run and begin to execute it. The code in the task always holds onto a Job data structure whose ID is returned immediately by the API. The code in the task periodically updates the Job data with status, which is also saved to a database, and checks to see if the Job was cancelled by the user, in which case it wraps up immediately and the DB transaction is rolled back. The user cancels by calling another API which updates said Job object in the database to indicate it should be cancelled. A separate infinite loop periodically polls the job database server side and updates the in-memory Job objects used by the actual running code with any cancellation requests. Fundamentally it's just like any CancellationToken in .NET but it just works via a database and API calls. The front end can periodically poll the server for job status using the ID, or better yet, if they have WebSockets the server pushes job updates using SignalR.
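To make that concrete, here is a minimal sketch of the pattern; the Job shape, IJobRepository, and status strings are all illustrative, not the actual implementation:
public class Job
{
    public Guid Id { get; set; }
    public string Status { get; set; }        // e.g. "Running", "Completed", "Failed", "Cancelled"
    public int ProgressPercent { get; set; }
    public bool CancelRequested { get; set; } // set when the user calls the cancel API
}

public class JobRunner
{
    private readonly IJobRepository _jobs;    // hypothetical persistence layer backing the Job table

    public JobRunner(IJobRepository jobs) { _jobs = jobs; }

    // Called from the API action; the job ID is returned to the client immediately.
    public Guid Start(Action<Job> work)
    {
        var job = new Job { Id = Guid.NewGuid(), Status = "Running" };
        _jobs.Save(job);

        Task.Run(() =>
        {
            try
            {
                // The work periodically updates job.ProgressPercent and checks
                // job.CancelRequested (refreshed from the database by the polling loop).
                work(job);
                job.Status = job.CancelRequested ? "Cancelled" : "Completed";
            }
            catch
            {
                job.Status = "Failed";
            }
            finally
            {
                _jobs.Save(job);              // the front end polls this, or SignalR pushes the update
            }
        });

        return job.Id;
    }
}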
So, what happens if the app domain is lost during the job? Well, first off, every job runs in a single DB transaction, so if it doesn't complete the DB rolls back. Second, when the ASP.NET app restarts, one of the first things it does is check for any jobs that are still marked as running in the DB. These are the zombies that died upon app pool restart but the DB still thinks they're alive. So we mark them as KIA, and send the user an email indicating their job failed and needs to be rerun. Sometimes it causes inconvenience and a puzzled user from time to time, but it works fine 99% of the time. Theoretically, we could even automatically restart the job on server startup if we wanted to, but we feel it's better to make that a manual process for a number of case-specific reasons.
Finally, there's the "people will die (or worse, get sued) if we get it wrong" way. This is what some of the other comments are more directed to. This is where you have to break down all jobs into small atomic transactions that are tracked in a database at every step, and which can be picked up by any server (the same or maybe another server in a farm) at any time. If it's really top notch, multiple servers can even work on the same job concurrently, depending on what it is. It requires carefully coding every background operation with this in mind, constantly updating a database with your progress, dealing with concurrent changes to the database (because now the entire operation is no longer a single atomic transaction), etc. Needless to say, it's a LOT of effort. Yeah, it would be great if it worked this way. It would be great if every app did everything to this level of perfection. I also want a toilet made out of solid gold, but it's just not in the cards now is it?
So my $0.02 is, again, let's have some perspective. Do the cost benefit analysis and unless you're doing something where lives or lots of money is at stake, aim for what works perfectly well 99%+ of the time and only causes minor inconvenience when it doesn't work perfectly.
I have two web pages. On one page I upload a file and process its data, which takes a long time to complete. On the other page I simply render data from the database.
I have implemented this application in C# MVC.
My requirement is that once the user uploads the file, the file processing starts in the background and the user can navigate to other pages.
Can we achieve this through an asynchronous controller?
You are saying that processing the data takes a lot of time. Using an asynchronous controller, you will free up the web server to serve other requests, however the request will complete in the same time as it would when invoked synchronously. (source: https://msdn.microsoft.com/en-us/library/ee728598%28v=vs.100%29.aspx)
If you do not want your user to wait, add a job queue to your stack, tell the user that you've accepted the file and are processing it, and notify him when the operation completes.
There are many job queue implementations available in .NET, a concrete suggestion would depend on whether you're running on "full" .NET or .NET Core.
Using async controllers won't do what you want here, although you should still use them as a first step. Async controllers will just free up server threads so that more requests can be processed; without async, any long-running operations will block the threads they're using and stop other requests from being processed. If enough threads are tied up in long-running work, other client requests will get rejected.
You'll also (or instead) need to look into a different mechanism to process the file, for example the API action could just put the file in a folder and another (non-web service) process could monitor that folder and pick up new files to process. Alternatively you could look at queuing or message bus technology, this adds more complexity but also gives you safety around queue processing.
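For example, the "drop the file in a folder" variant could be as simple as a separate console or service project along these lines (the path and ProcessFile are assumptions for illustration):
using System;
using System.IO;

class FileProcessorWorker
{
    static void Main()
    {
        // Watch the folder the web action saves uploaded files into (example path).
        var watcher = new FileSystemWatcher(@"C:\uploads");
        watcher.Created += (sender, e) =>
        {
            try
            {
                ProcessFile(e.FullPath);   // the long-running work, outside the web process
            }
            catch (Exception ex)
            {
                Console.Error.WriteLine("Failed to process {0}: {1}", e.FullPath, ex);
            }
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for new files. Press Enter to exit.");
        Console.ReadLine();
    }

    static void ProcessFile(string path)
    {
        // placeholder for the actual processing logic
    }
}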
The other thing to consider is how you report validation issues or errors back to the uploading client, you could do some checks in the API action but you'll probably still need to consider how to notify clients when an error occurs during processing of a file. How you best do this will depend on your system.
I came across a nice little tool that has been added to ASP.NET in v4.5.2.
I am wondering how safe it is and how one can effectively utilize it in an ASP.NET MVC or Web API scenario.
I often want to run a quick and simple fire-and-forget task in my web applications. For example:
Sending emails
Sending push notifications
Logging analytics or errors to the db
Now typically I just create a method like
public async Task SendEmailAsync(string to, string body)
{
    //TODO: send email
}
and I would use it like so:
public async Task<ActionResult> Index()
{
    ...
    await SendEmailAsync(User.Identity.Username, "Hello");
    return View();
}
Now my concern with this is that I am delaying the user in order to send my email to them. This doesn't make much sense to me.
So I first considered just doing:
Task.Run(()=> SendEmailAsync(User.Identity.Username, "Hello"));
However, when reading up about this, it is apparently not the best thing to do in an IIS environment (I'm not 100% sure on the specifics).
So this is where I came across HostingEnvironment.QueueBackgroundWorkItem(x=> SendEmailAsync(User.Identity.Username, "Hello"));
This is a very quick and easy way to offload the send email task to a background worker and serve up the users View() much quicker.
Now I am aware this is not for tasks running longer than 90 seconds, and that execution is not 100% guaranteed.
But my question is:
Is HostingEnvironment.QueueBackgroundWorkItem() sufficient for sending emails, push notifications, db queries, etc. in a standard ASP.NET web site?
It depends.
The main benefit of QueueBackgroundWorkItem is the following, emphasis mine (source):
Differs from a normal ThreadPool work item in that ASP.NET can keep track of how many work items registered through this API are currently running, and the ASP.NET runtime will try to delay AppDomain shutdown until these work items have finished executing.
Essentially, QueueBackgroundWorkItem helps you run tasks that might take a couple of seconds by attempting not to shut down your application while there's still a task running.
Running a normal database query or sending out a push notification should be a matter of a couple hundred milliseconds (or a few seconds); neither should take a very long time and should thus be fine to run within QueueBackgroundWorkItem.
However, there's no guarantee for the task to finish — as you said, the task is not awaited. It all depends on the importance of the task to execute. If the task must complete, it's not a good candidate for QueueBackgroundWorkItem.
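For completeness, a minimal usage sketch, assuming an MVC controller and the SendEmailAsync placeholder from the question (requires using System.Web.Hosting; note I've used User.Identity.Name here):
public ActionResult Index()
{
    // Capture anything you need from the request context up front; the request
    // will be long gone by the time the background item runs.
    var userName = User.Identity.Name;

    // ASP.NET will try to delay AppDomain shutdown until this finishes,
    // but completion is still not guaranteed.
    HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
    {
        await SendEmailAsync(userName, "Hello");
    });

    return View(); // returned without waiting for the email
}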
I've not dealt much with Async/threads/Tasks other than some web services.
I'm using MVC4. I have existing code which takes some time to run. It is using an existing method in the service layer, which uses various other areas in further layers.
Essentially I was hoping to be able to make an ASync call from the Asynccontroller to that method. However it appears that I would need to change/create another method to implement all the Task & await keywords, quite a hefty job altering all the way down the chain.
Is it possible to call/'fire' a synchronous method in this manner?
I want the long process (creating some documents in the background) to continue running even if the user closes their browser. However if the user still has the browser open then I would like to return a notification to them.
Is there a better way to fire a background task to execute from the MVC Application?
I think you're trying to use async for something it cannot do. As I describe on my blog, async does not change the HTTP protocol.
Is it possible to call/'fire' a synchronous method in this manner?
Sort of. You can use Task.Run if you have CPU-bound work that you want to move off the UI thread in a desktop/mobile application. But there is no point in doing that in an ASP.NET MVC application.
I want the long process (creating some documents in the background) to continue running even if the user closes their browser. However if the user still has the browser open then I would like to return a notification to them.
The problem with this is that you'd be returning early from an ASP.NET request, and (as I describe on my blog), that's quite dangerous.
A proper solution would be to queue the work in a reliable queue (e.g., Azure queue or MSMQ), have an independent backend for processing (e.g., Azure worker role / web job or Win32 service), and use something like SignalR for notification.
As soon as you attempt to do work in an ASP.NET process without a request context, then you run into the danger that your process may exit without completing the work. If you are OK with this, then you can use the BackgroundTaskManager type from my blog above to minimize the chance of that happening (but keep in mind: it can still happen).
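As a rough illustration of the "reliable queue plus separate backend" shape, here is what the enqueue side might look like with Azure Storage Queues (the classic WindowsAzure.Storage package); the queue name, connection string key, and action are all made up for the example:
public async Task<ActionResult> StartDocumentGeneration(Guid documentRequestId)
{
    var account = CloudStorageAccount.Parse(
        ConfigurationManager.ConnectionStrings["Storage"].ConnectionString);
    var queue = account.CreateCloudQueueClient().GetQueueReference("document-jobs");
    await queue.CreateIfNotExistsAsync();

    // An independent backend (web job, worker role, or Win32 service) dequeues
    // this message, creates the documents, and notifies the user via SignalR.
    await queue.AddMessageAsync(new CloudQueueMessage(documentRequestId.ToString()));

    return new HttpStatusCodeResult(202); // accepted; the work continues elsewhere
}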
I'm writing a small internal web app.
I need to do a very long operation (which can take about an hour) at the press of a button and let the user know once the operation completes with its result.
In other words, I need the button to start the async operation and refresh the view with the results once it is finished (after about an hour run).
Edit: The operation could use some sort of refresh mechanism in the View. The result can be sent to the view after a small lag (it doesn't have to refresh in real time).
You could use SignalR, but that might be a little bit overkill for this. Another option is to set up another controller action which checks if the task has been completed. Then, on the client side, you could use jQuery to make ajax requests to that controller action. When the action comes back as complete you can show an alert or otherwise update the page.
$.ajax({
    type: 'GET',
    url: 'http://mysite.info/tasks/checkComplete/5',
    success: function (response) {
        if (response == 'true') {
            alert('Task complete');
        }
    }
});
As for what happens on the server side, I don't think this is a case where I would use async/await. If your task is really going to run for an hour, I have a feeling you're going to run into timeout issues, etc. I would have a controller action which is used to start the task, but all it does is put the request to start in the database. I would then have an external "worker" which checks for requests in that database and performs the tasks. Once the task is complete it would update that database entry to mark it as complete. Then the "CheckComplete" controller action from my example above could check the database to see if the task is in fact completed.
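A rough sketch of the server side of that approach (the repository and status values are invented for illustration):
public class TasksController : Controller
{
    private readonly ITaskRepository _tasks;    // hypothetical data access layer

    public TasksController(ITaskRepository tasks) { _tasks = tasks; }

    // POST /tasks/start: only records the request; the external worker does the real work.
    [HttpPost]
    public ActionResult Start()
    {
        var taskId = _tasks.CreateRequest();    // inserts a row with status "Pending"
        return Json(new { taskId });
    }

    // GET /tasks/checkComplete/5: polled by the jQuery snippet above.
    public ActionResult CheckComplete(int id)
    {
        var isComplete = _tasks.GetStatus(id) == "Complete";
        return Content(isComplete ? "true" : "false");
    }
}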
As suggested in the comments, I recommend you to take a look at SignalR, it's not that hard to implement.
You'll have to implement a server on your website, send a message to the server when your long operation is complete, and refresh your view when the message is received.
The tutorials from Microsoft should be clear enough for that.
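For the server side, a minimal sketch with classic SignalR 2 (hub and method names are just examples; requires the Microsoft.AspNet.SignalR package):
// The hub itself can stay empty; clients only need to connect to it.
public class ProgressHub : Hub
{
}

public static class ProgressNotifier
{
    // Call this from wherever the long-running operation completes.
    public static void NotifyComplete(string resultSummary)
    {
        var hubContext = GlobalHost.ConnectionManager.GetHubContext<ProgressHub>();
        // The client subscribes to "operationComplete" and refreshes the view when it arrives.
        hubContext.Clients.All.operationComplete(resultSummary);
    }
}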
A low tech solution from the age way before AJAX and WebSockets was to simply continually write data to the response (usually a progress report).
So instead of doing the work asynchronously, you would call something like
Response.Write("<script>reportProgress(" + progress + ");</script>");
Response.Flush();
once in a while. It has its issues, but it might still be a good way if you have to support weird platforms or very old browsers.
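Sketched out slightly more fully as an IHttpHandler (DoWorkStep is a placeholder, and the page this streams into is assumed to define a JavaScript reportProgress function):
public class LongTaskHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/html";

        for (int step = 1; step <= 10; step++)
        {
            DoWorkStep(step);   // one slice of the long-running operation
            var progress = step * 10;
            context.Response.Write("<script>reportProgress(" + progress + ");</script>");
            context.Response.Flush();   // push the partial response to the browser right away
        }
    }

    private void DoWorkStep(int step)
    {
        // placeholder for the actual work
    }
}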
I would set a flag on the server when the work is finished.
In the MVC app you can create a timer object that checks, for example once a minute, whether the server has finished the work, and run that timer on a new thread so it doesn't block the app.