Handling aborted requests in ASP.NET Core - c#

Say I have the following action
public async Task<IActionResult> UploadFile()
{
    var form = await Request.ReadFormAsync();
    // do something with form contents
    return Ok();
}
and it's called with a large, say 100MB, file.
How can I handle what happens when a user cancels/aborts the request (e.g. it has uploaded 50 MB and the user stops the request)? Is there a way to detect this? Is this even a valid question to be asking?
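(Not from the original post, but as a hedged sketch: one way to detect a client abort in ASP.NET Core is the HttpContext.RequestAborted cancellation token, which is signalled when the client disconnects, and which ReadFormAsync accepts.)
public async Task<IActionResult> UploadFile()
{
    var aborted = HttpContext.RequestAborted; // signalled when the client disconnects
    try
    {
        var form = await Request.ReadFormAsync(aborted);
        // do something with form contents
        return Ok();
    }
    catch (OperationCanceledException) when (aborted.IsCancellationRequested)
    {
        // The client stopped the upload part-way through; clean up any partial state here.
        // (The exact exception type can vary by server, so checking the token is the safer signal.)
        return new EmptyResult();
    }
}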

Related

Process incoming FileStream asynchronously

I'm reading a file from a user upload, and it was working synchronously. I needed to change it in order to immediately send a "received" alert to the user, then read the file asynchronously while the user periodically polls back to see if the read has finished.
Here is what my code looks like right now:
public FileUpload SaveFile(Stream stream)
{
    FileUpload uploadObj = //instantiate the return obj
    var task = Task.Run(async () => await ProcessFileAsync(stream));
    return uploadObj;
}
public async Task ProcessFileAsync(Stream stream)
{
    StreamReader file = new StreamReader(stream);
    CsvReader csv = new CsvReader(file, CultureInfo.InvariantCulture);
    while (await csv.ReadAsync())
    {
        //read the file
    }
}
The issue I'm having is that by the time I call csv.ReadAsync(), the Stream object has already been disposed. How do I access the Stream when I want SaveFile() to return a value to the user, given that the act of returning disposes the Stream object?
The point here is that you're working within the constraints of ASP.NET, which abstracts away a lot of the underlying HTTP stuff.
When you say you want to process a user-uploaded file asynchronously, you want to step out of the normal order of doing things with HTTP and ASP.NET. You see, when a client sends a request with a body (the file), the server receives the request headers and kicks off ASP.NET to tell your application code that there's a new request incoming.
It hasn't even (fully) read the request body at this point. This is why you get a Stream to deal with the request, and not a string or a filename - the data doesn't even have to have arrived at the server yet! Just the request headers, which inform the web server about the request.
If you return a response at that point, then as far as HTTP and ASP.NET are concerned, you're done with the request, and you cannot continue reading its body.
Now what you want to do is read the request body (the file) and process it after sending a response to the client. You can do that, but then you'll still have to read the request body - because if you return something from your action method before reading the request, the framework will think you're done with it and dispose the request stream. That's what's causing your exception.
If you'd use a string, or model binding, or anything that involves the framework reading the request body, then yes, your code will only execute once the body has been read.
The short-term solution that would appear to get you going, is to read the request stream into a stream that you own, not the framework:
var myStream = new MemoryStream();
await stream.CopyToAsync(myStream);
myStream.Position = 0; // rewind the copy before handing it to the background task
Task.Run(async () => await ProcessFileAsync(myStream));
Now you'll have read the entire request body and saved it in memory, so ASP.NET can safely dispose the request stream and send a response to the client.
But don't do this. Starting fire-and-forget tasks from a controller is a bad idea. Keeping uploaded files in memory is a bad idea.
What you actually should do, if you still want to do this out-of-band:
Save the incoming file as an actual, temporary file on your server
Send a response to the client with an identifier (the temporarily generated filename, for example a GUID)
Expose an endpoint that clients can use to request the status using said GUID
Have a background process continuously scan the directory for newly uploaded files and process them
For the latter you could use hosted services or third-party tools like Hangfire.
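A minimal sketch of the first two steps, assuming an uploadsFolder path and a FileUpload type with Id and Status members (both are illustrative, not from the original code):
public async Task<FileUpload> SaveFile(Stream stream)
{
    // 1. Persist the upload as a temporary file named by a GUID
    var id = Guid.NewGuid();
    var tempPath = Path.Combine(uploadsFolder, id + ".csv");
    using (var file = File.Create(tempPath))
    {
        await stream.CopyToAsync(file);
    }

    // 2. Hand the identifier back so the caller can return it to the client;
    //    a status endpoint and a background worker (hosted service, Hangfire, ...)
    //    take it from here.
    return new FileUpload { Id = id, Status = "Received" };
}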
If the environment warrants it, you'll need to either block for the result:
var result = task.Result;
//do stuff
...or
public async Task<FileUpload> SaveFile(Stream stream)
{
    var uploadObj = //instantiate the return obj
    await ProcessFileAsync(stream);
    return uploadObj;
}
See here for a thorough discussion on fire-and-forget if you go that route:
Web Api - Fire and Forget

await blocks the request

I have an action in my controller in which there is an async function call
public async Task<ActionResult> Subscribe(AspNetUser user, string newUploadPath)
{
    //do some work
    await DocServiceImpl.CopyUserAllModels(user, newUploadPath);
    //do some work
    return RedirectToAction("List", "ClientDashboard");
}
and this function uploads a lot of files to Azure CDN, so it takes a long time.
The problem is that the client who makes this request has to wait until the CopyUserAllModels function finishes; during all this waiting time the client sees the page reloading in the browser.
I've tried not waiting at all and calling it without await:
DocServiceImpl.CopyUserAllModels(user, newUploadPath);
but I've read that this is bad practice, and besides that, in this case I found that some of the files weren't uploaded at all, so without await it doesn't work properly in my case (I couldn't understand why).
My question is: how can I finish the request earlier, and then do all the work in the CopyUserAllModels function afterwards?
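This has the same shape as the pattern described earlier on this page: hand the long-running work to a background processor and return immediately. A hedged sketch using Hangfire (assuming Hangfire is configured in the application, that CopyUserAllModels can run safely outside the request, and that its arguments can be serialized for the job store):
public ActionResult Subscribe(AspNetUser user, string newUploadPath)
{
    //do some work

    // Enqueue the slow CDN copy as a persistent background job instead of awaiting it here;
    // unlike a bare fire-and-forget call, Hangfire stores and retries the job
    BackgroundJob.Enqueue(() => DocServiceImpl.CopyUserAllModels(user, newUploadPath));

    //do some work
    return RedirectToAction("List", "ClientDashboard");
}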

MVC HttpClient multiple post requests, get mismatched responses

The scenario
I need to show N reports on a web page. The reports need to be requested from an external service. The time for the service to generate a report can vary from 2 seconds to 50 seconds, depending on the requested content.
To call the service I use HttpClient in an async action. To generate 1 report I call the service once. To generate 5 reports I call it 5 times and so on.
The Problem
Let's suppose we request 3 reports, BigReport, MediumReport and SmallReport, with known generation times of 1 minute, 30 seconds and 2 seconds respectively, and we call the service in the following order:
BigReport, MediumReport, SmallReport
The result of the HTTP calls will be as follows:
The response for BigReport contains SmallReport (which is the quickest to be generated)
MediumReport will be correct
The response for SmallReport contains BigReport (which is the longest and the last)
Basically, although the HTTP calls are different, because they are made over a very short period of time and are all still "active", the server responds on a first-arrived, first-served basis, instead of serving each call with its exact response.
The Code
I have a Request controller with an async action like this:
public async Task<string> GenerateReport(string blockContent)
{
    var formDataContent = new MultipartFormDataContent
    {
        AddStringContent(userid, "userid"),
        AddStringContent(passcode, "passcode"),
        AddStringContent(outputtype, "outputtype"),
        AddStringContent(submit, "submit")
    };
    var blockStream = new StreamContent(new MemoryStream(Encoding.Default.GetBytes(blockContent)));
    blockStream.Headers.Add("Content-Disposition", "form-data; name=\"file\"; filename=\"" + filename + "\"");
    formDataContent.Add(blockStream);
    using (var client = new HttpClient())
    {
        using (var message = await client.PostAsync(Url, formDataContent))
        {
            var report = await message.Content.ReadAsStringAsync();
            return report;
        }
    }
}
The action is being called from a view via Ajax, like this
//FOREACH BLOCK, CALL THE REPORT SERVICE
$('.block').each(function(index, block) {
    var reportActionUrl = "Report/GenerateReport/" + block.Content;
    //AJAX CALL GetReportAction
    $(block).load(reportActionUrl);
});
Everything works fine if I convert the action from async to sync, by removing async Task and, instead of awaiting the response, just getting the result as
var result = client.PostAsync(Url, formDataContent).Result;
This makes everything run synchronously and it works fine, but the waiting time for the user is much longer. I would really like to avoid this by making parallel calls or similar.
Conclusions and questions
The problem itself makes sense after also inspecting it with Fiddler, since we have multiple open HTTP requests pending almost simultaneously.
I suppose I need some sort of handler or something to identify and match request/response, but I don't know the name of the "domain" I need to look into. So far, my questions are:
What is the technical name of "making multiple http calls in parallel"?
If the problem is understandable, what is name of the problem? (concurrency, parallel requests queuing, etc..?)
And of course, is there any solution?
Many thanks.
With a "bit" of delay, I post the solution.
The problem was that the filename parameter was incorrectly called filename instead of blockname. This was causing the very weird behaviour, as a file could have had many blocks.
The lesson learned was that in case of very weird behaviour, in this case with a HttpClient call, analyse all the possible parameters and test it with different values, even if it doesn't make too much sense. At worst it can throw an error.
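Presumably the fix was to build the Content-Disposition header from the per-block name rather than the shared file name; something like this (blockname is an assumption about the variable that was available, not shown in the original code):
blockStream.Headers.Add("Content-Disposition", "form-data; name=\"file\"; filename=\"" + blockname + "\"");
With a distinct name per request, each response can be matched to the block that asked for it.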

About ASP.NET Web API - Async and Await

I have the following Asynchronous method inside my AsyncController:
public async Task<Dashboard> GetFeeds()
{
    var movies = new HttpClient().GetStringAsync("http://netflix/api/MyMovies");
    var tweets = new HttpClient().GetStringAsync("http://twitter/api/MyTweets");
    await Task.WhenAll(movies, tweets);
    Dashboard dash = new Dashboard();
    dash.Movies = Deserialize<Movies>(movies.Result);
    dash.Tweets = Deserialize<Tweets>(tweets.Result);
    return dash;
}
In this method I call two different APIs, each with a different response time. What I can't understand about Task<> is why I have to wait for both to return before updating my client, given that I'm creating new threads.
Imagining that I render the result of each API in a PartialView, the result I thought I would get is:
-First my Movies list (it only takes 5s) -> show it to my user
-And then my list of Tweets -> show it to my user
But what I see is:
-Until the Twitter request finishes, I can't render the data I got from Netflix on screen for my user.
The big question is: does a Task<> serve only to make the processing finish faster?
Can't I render the information on the screen according to the turnaround time of each API call I made?
This is the call to my method
public async Task<ActionResult> Index()
{
    var feeds = await GetFeeds();
    return View(feeds);
}
I confess I'm very confused; or maybe I haven't understood the concept of Task<>.
The way ASP.NET MVC works is that a single controller action handles a single HTTP request, and produces a single HTTP response. This is true whether the action is synchronous or asynchronous.
In other words (as I explain on my blog), async doesn't change the HTTP protocol. To return an "initial result" to the client browser and update the page (or part of the page) with other data, you'll need to use a technology designed for that: AJAX, or SignalR.
For more information, see the "Asynchronous Code Is Not a Silver Bullet" section of my MSDN article on async ASP.NET.
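As a hedged illustration of the AJAX route (the action names and partial views below are assumptions, not from the original code): expose the two feeds as separate actions and let the page request each one independently, so whichever finishes first is rendered first.
public async Task<ActionResult> Movies()
{
    // Fetch and render only the movies feed
    var json = await new HttpClient().GetStringAsync("http://netflix/api/MyMovies");
    return PartialView(Deserialize<Movies>(json));
}

public async Task<ActionResult> Tweets()
{
    // Fetch and render only the tweets feed
    var json = await new HttpClient().GetStringAsync("http://twitter/api/MyTweets");
    return PartialView(Deserialize<Tweets>(json));
}
The Index view then issues two AJAX calls (e.g. jQuery load) into separate placeholders; each section updates as soon as its own request completes, instead of waiting on Task.WhenAll for both.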

Web API allow only single async task

What is a proper way of handling only a single async action at a time? For example, I need to import a large file, and while it is being imported I need to disable that option to ensure that a second import is not triggered.
What comes in mind that:
[HttpPost]
public async Task<HttpResponseMessage> ImportConfigurationData()
{
    if (HttpContext.Current.Application["ImportConfigurationDataInProcess"] as bool? ?? false)
        return Request.CreateErrorResponse(HttpStatusCode.InternalServerError, "Task still running");
    HttpContext.Current.Application["ImportConfigurationDataInProcess"] = true;
    string root = HttpContext.Current.Server.MapPath("~/App_Data");
    var provider = new MultipartFormDataStreamProvider(root);
    await Request.Content.ReadAsMultipartAsync(provider);
    //actual import
    HttpContext.Current.Application["ImportConfigurationDataInProcess"] = false;
    return Request.CreateResponse(HttpStatusCode.OK, true);
}
But it seems like a very hard-coded solution. What is a proper way of handling this?
Another thing is that it doesn't work properly on the client side, as it still waits for a response. So is it possible for the user to just send the file to the server and not wait until the import finishes, but instead reload the page once the file has been sent, without waiting for the await work to complete?
async does not change the HTTP protocol (as I explain on my blog). So you still just get one response per request.
The proper solution is to save a "token" (and import data) for the work in some reliable storage (e.g., Azure table/queue), and have a separate processing backend that does the actual import.
The ImportConfigurationData action would then check whether a token already exists for that data, and fault the request if found.
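A minimal sketch of that idea (tokenStore, importQueue and ComputeDataId are illustrative placeholders, not a specific library API): the action only records the work and returns, and a separate backend process drains the queue and performs the import.
[HttpPost]
public async Task<HttpResponseMessage> ImportConfigurationData()
{
    var dataId = ComputeDataId(Request); // however the import data is identified

    // If a token already exists for this data, an import is already queued or running
    if (await tokenStore.ExistsAsync(dataId))
        return Request.CreateErrorResponse(HttpStatusCode.Conflict, "Import already in progress");

    // Persist the token and the uploaded data in reliable storage (e.g. Azure blob + queue),
    // then return immediately; the separate worker does the actual import
    await tokenStore.AddAsync(dataId);
    await importQueue.EnqueueAsync(dataId, await Request.Content.ReadAsStreamAsync());

    return Request.CreateResponse(HttpStatusCode.Accepted, dataId);
}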
