ASP.NET background processing blocks status or UI feedback - C#

I know this question has been asked many times, but my problem is a little different.
I have a page that lets users download and upload an Excel file. Generating the file for download takes approximately 2 minutes, so I added checkpoints that update the database with a status ("started processing", "working on header", etc.). I have done the same thing for upload.
I also have an AJAX request that polls the database at a fixed interval and prints the status to the user as feedback ("started processing", "working on header", etc.).
The problem is, I get the feedback only when the process is complete. It looks like the session is blocked during the background process, and any other requests (the AJAX ones) only complete once the background process is over. The AJAX polling makes roughly 10 requests at 4-second intervals, and I get all 10 responses back only at the end.
I have tried two iframes and also frames, one running the AJAX polling and the other running the process; that doesn't work. I tried separate browsers (process running in IE, AJAX running in FF) and that works, so I know my code works. Can anybody advise? Thanks.
P.S. My environment is IIS 6 and ASP.NET 3.5 with MVC 1.0; the browser is IE 6.0.

Your browser has a limitation on the number of connections that can be working concurrently.
I believe IE has a limit of 2 connections per host. That means that even if you are firing AJAX requests, you can only have two requests running at the same time.
That is most likely why you're not seeing results until the end: the browser is busy with the other connections and doesn't get to the status request until the work is already done. It also explains why it works from two different browsers, since they don't share the same connection limit.
Here's an article that details the issue.

This is exactly what I was looking for:
(asynchronous-processing-in-asp-net-mvc-with-ajax-progress-bar)
Using a delegate's BeginInvoke (which returns an IAsyncResult) fixed the blocked session for me.
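The delegate BeginInvoke pattern mentioned above can be sketched roughly as follows for ASP.NET MVC 1.0. This is only an illustration: the controller, GenerateExcel, and ReadStatusFromDb names are made up, and the real status actions must not take a writable session lock, or they will still queue behind each other.

```csharp
using System;
using System.Web.Mvc;

public class ReportController : Controller
{
    private delegate void ExcelWorker(Guid jobId);

    public ActionResult Start()
    {
        var jobId = Guid.NewGuid();
        ExcelWorker worker = GenerateExcel;
        // Fire-and-forget: BeginInvoke runs the work on a ThreadPool
        // thread and returns an IAsyncResult immediately, so this
        // request completes and releases its session lock right away.
        worker.BeginInvoke(jobId, null, null);
        return Json(new { jobId });   // the browser polls Status with this id
    }

    public ActionResult Status(Guid jobId)
    {
        // Reads the checkpoint text ("working on header", ...) that the
        // worker writes to the database at each stage.
        return Json(new { status = ReadStatusFromDb(jobId) });
    }

    private void GenerateExcel(Guid jobId)
    {
        // Long-running Excel generation; writes checkpoint rows as it goes.
    }

    private string ReadStatusFromDb(Guid jobId)
    {
        return "...";   // placeholder for the database lookup
    }
}
```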

Related

Throttle outgoing connection to external API

I'm currently developing a website in ASP.NET Core 2.2. The site uses an external API, but I have one big problem and don't know how to solve it: the external API has a limit of 10 requests per second per IP. If 11 users click a button on my site and call the API at the same time, the API can cut me off for a couple of hours. The API owner tells clients to take care of not exceeding the limit themselves. Do you have any idea how to do this?
P.S. Of course, a million users is a joke, but I want the site to be publicly available :)
That 10 requests/s is a hard limit and it seems like there's no way around it, so you have to solve it on your end.
There are a couple of options:
Call the API directly from JavaScript. This way each user gets their own 10 requests/s instead of 10 requests/s shared across all your users (recommended).
Queue the requests and send out at most 10/s (highly not recommended: it ties up your thread pool and can block everyone from accessing your site when requests come in faster than they go out).
Drop requests on the server side when you reach the 10/s limit and have the client retry later (the wait time grows without bound when requests come in faster than they go out).
And depending on the content returned by the API, you might be able to cache it on the server side to avoid having to request it from the third party again.
In this scenario you would need to account for the possibility that you can't process requests in real time. You wouldn't want to have thousands of requests waiting on access to a resource that you don't control.
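The caching idea can be sketched with ASP.NET Core's built-in IMemoryCache. The ApiProxy class, the CallExternalApiAsync method, and the 30-second lifetime are all assumptions for illustration; the right lifetime depends on how stale the API's data may be.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public class ApiProxy
{
    private readonly IMemoryCache _cache;

    public ApiProxy(IMemoryCache cache)
    {
        _cache = cache;
    }

    public Task<string> GetAsync(string resource)
    {
        // Identical requests within the cache window never hit the
        // rate-limited API; only the first caller pays the cost.
        return _cache.GetOrCreateAsync(resource, entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(30);
            return CallExternalApiAsync(resource);   // your real API call
        });
    }

    private Task<string> CallExternalApiAsync(string resource)
    {
        return Task.FromResult("...");   // placeholder for the HTTP call
    }
}
```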
I second the answer about calling the API from the client, if that's an option.
Another option is to keep a counter of current requests, limit it to ten, and return a 503 error for any request that exceeds that capacity. That's practical if you don't expect to exceed ten concurrent requests often (or ever) but want to be sure that, in the odd chance it happens, it doesn't shut down this feature of your site.
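A minimal sketch of that counter approach, assuming an ASP.NET Core 2.2 controller. Interlocked keeps the count correct under concurrent requests; the route and CallExternalApiAsync are illustrative placeholders, and note this caps concurrent calls rather than calls per second, which may or may not match the API's exact rule.

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public class ProxyController : ControllerBase
{
    private static int _inFlight;
    private const int Limit = 10;

    [HttpGet("proxy")]
    public async Task<IActionResult> Proxy()
    {
        if (Interlocked.Increment(ref _inFlight) > Limit)
        {
            Interlocked.Decrement(ref _inFlight);
            // Tell the client to retry later instead of queueing server-side.
            return StatusCode(503);
        }
        try
        {
            var result = await CallExternalApiAsync();
            return Ok(result);
        }
        finally
        {
            Interlocked.Decrement(ref _inFlight);
        }
    }

    private Task<string> CallExternalApiAsync()
    {
        return Task.FromResult("...");   // placeholder for the real call
    }
}
```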
If you actually expect large volumes where you would exceed ten concurrent requests then you would need to queue the requests, but do it in a process separate from your web application. As mentioned, if you have tons of requests waiting for the same resource your application will become overloaded. You could enqueue the request with an entirely different process, and then the client would have to poll your application with occasional requests to see if there's a response.
The big flaw in this last scenario is that it means your users could end up waiting a long time because your application depends on a finite resource that you cannot scale. You can manage it in a way that keeps your application from failing, but not in a way that makes it respond quickly.

Increasing the max concurrent handled requests on a WCF service

I have a mock service that receives a request, loads an XML file from disk, waits 2 seconds, and returns the XML content.
The wait is done using Task.Delay to avoid blocking a thread.
My problem is that my application allows only 10 concurrent requests, while the others wait in the queue for the previous ones to finish.
Here is a screenshot of the Fiddler timeline for 30 requests:
The first 10 requests finish within 2 seconds.
The second 10 requests finish within 4 seconds.
The third 10 requests finish within 6 seconds.
I've tried multiple configuration changes, registry updates, and other tweaks found while googling for a solution, and none of them helped.
How can I achieve something like this?
What version of Windows are you running? Are you on Windows Server or a client (desktop) edition? Client editions of Windows are limited to 10 concurrent connections. Try deploying to a server and testing again.

asp.net connection reset with long running process

In an ASP.NET Web Forms page, I keep getting a connection reset error message. The page does some long-running processing (about 2-5 minutes).
I have no problem when the web request comes from the same machine as the web server, but when the request originates across the network, I get a connection reset error about 1:30 or 2 minutes into waiting for a response.
I have set the in web.config for this application and put the application in its own application pool.
What else can I try?
Edit
The purpose of this page is to accept input from the user, calculate something, and send the result back to them. The long running calculation isn't something I can offload until a later time.
A common way to handle this is to kick off a background thread to process the data, but immediately return an identifier to the browser, normally as a link. When the link is clicked, the server checks whether processing is complete: if it is, show the results; if not, display a "please wait" message and the link again, or auto-refresh the page...
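A rough Web Forms sketch of that pattern, with hypothetical names throughout (RunLongCalculation, SaveResult, CheckResult.aspx): start the work on a background thread, hand the user a token, and let a second page check for the result.

```csharp
using System;
using System.Threading;
using System.Web.UI;

public partial class Calculate : Page
{
    protected void StartButton_Click(object sender, EventArgs e)
    {
        string token = Guid.NewGuid().ToString("N");
        ThreadPool.QueueUserWorkItem(delegate
        {
            var result = RunLongCalculation();   // your 2-5 minute job
            SaveResult(token, result);           // e.g. a database row keyed by token
        });
        // Return immediately; the browser never waits on the calculation.
        Response.Redirect("CheckResult.aspx?token=" + token);
    }

    private object RunLongCalculation() { return null; }
    private void SaveResult(string token, object result) { }
}

// In CheckResult.aspx: if SaveResult has stored something for the token,
// render the result; otherwise render "please wait" with a meta-refresh
// or the same link again.
```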
Do not make the browser wait this long. Make the request asynchronous so it returns right away. If you need to make the user wait, do it client-side with JavaScript.

Problems Sending Large Volume of Emails using ASP.Net

I'm having an issue sending large volumes of email from an ASP.NET application. I won't post the code; instead I'll explain what's going on. The code should send emails to 4000 recipients but seems to stall at around 385-387.
The code creates the content for the email in a string.
It then selects a list of email addresses to send to.
Looping through the data via a DataReader, it picks out each email address and sends an email.
The email sending is done by a separate method which can handle failures and returns its outcome.
As each record is sent, I add an XML node to an XML document to log each attempt.
The loop seems to end prematurely, and the XML document is saved to disk.
Now I know the code works: I have run it locally against the same SMTP machine and it worked fine with 500 records. Granted, there was less content, but I can't see how that would make any difference.
I don't think the page itself times out, but even if it did, I was sure .NET would continue processing the page, even if the user saw a page timeout error.
Any suggestions appreciated, because I'm pretty stumped.
You're sending lots of emails during the span of a single request? IIS will kill a request if it takes longer than a certain (configurable) amount of time.
You need a separate process to do work like this, whether that's a Timer you start from within global.asax, a Thread which checks for a list of emails in a database or App_Data directory, a service you send requests to via WCF, or some combination of these.
The way I've handled this in the past is to queue the emails into a SQL Server table and then launch another thread to actually process/send the emails. Another aspx utility page can give me the status of the queue or restart the processing.
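That queue-table approach might look something like the sketch below. The EmailQueue table, its columns, and the SendEmail/MarkAsSent helpers are all made up for illustration; the key point is that the page only INSERTs rows, while a single background worker drains the queue at its own pace.

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Threading;

public static class EmailWorker
{
    public static void Start(string connectionString)
    {
        var worker = new Thread(delegate ()
        {
            while (true)
            {
                // Pull a small batch of unsent rows from the queue table.
                var batch = new DataTable();
                using (var conn = new SqlConnection(connectionString))
                using (var adapter = new SqlDataAdapter(
                    "SELECT TOP 50 Id, Address, Body FROM EmailQueue WHERE Sent = 0",
                    conn))
                {
                    adapter.Fill(batch);
                }

                foreach (DataRow row in batch.Rows)
                {
                    SendEmail((string)row["Address"], (string)row["Body"]);
                    MarkAsSent((int)row["Id"]);   // UPDATE EmailQueue SET Sent = 1 ...
                }

                Thread.Sleep(5000);   // poll the queue again in a few seconds
            }
        });
        worker.IsBackground = true;
        worker.Start();
    }

    private static void SendEmail(string address, string body) { }
    private static void MarkAsSent(int id) { }
}
```

A status page can then simply count the Sent = 0 rows to report queue depth.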
I also highly recommend that you use an existing, legitimate third-party mailing service for your SMTP server if you are sending mail to the general public. Otherwise you run the risk of your ISP shutting off your mail access or (worse) your own server being blacklisted.
If the web server has a timeout setting, it will kill the page if it runs too long.
I recommend you check the value of HttpServerUtility.ScriptTimeout: if this is set, the request will be shut down once it has run for that length of time.
Something that could help is to go completely old-school: combine some Response.Write calls with a few Response.Flush calls to send data back to the client browser. This tends to keep the request alive (it certainly worked on an old ASP.NET 1.1 site we had).
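A rough version of that old-school trick, assuming a recipients list and a SendEmail helper like the ones the question describes: push a byte or two to the client between batches so nothing in the chain drops the idle connection.

```csharp
// Inside the page handler, while looping over the recipients:
for (int i = 0; i < recipients.Count; i++)
{
    SendEmail(recipients[i]);      // your existing send method

    if (i % 50 == 0)
    {
        Response.Write(".");       // visible progress dots in the browser
        Response.Flush();          // force the bytes out to the client now
    }
}
```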
Also, take into account when this script runs. The server may be configured to recycle the application periodically (by default every 29 hours in IIS); if your server is set to something like 24 hours and that coincides with the time your script runs, you could be seeing that too. The fact that the script logs its attempts probably rules that out, though, unless your XML document is badly formed?
All that being said, I'd go with Will's answer of using a separate process (not just a thread hosted by the site), or, as Bryan said, go with a proper mailing service, which will also help you with bounce-backs, click tracking, reporting, open counts, etc.

Stop a SQL insert in a for or while loop

I created a loop that runs 100 times to insert data into the DB.
When I close my browser I expect it to stop, but it keeps looping in the background and filling my DB. How can I stop it?
Thanks
No guarantees that it would work, but you could try the HttpResponse.IsClientConnected property.
for (int i = 0; i < 100; i++)
{
if (!Response.IsClientConnected) break;
// insert the next row
}
It sounds like you have started off this process from an ASP.NET form with the server-side code doing the looping in response to something (either the ASP.NET form loading, or clicking a button etc).
Note that closing your browser window will NOT stop any long-running code that you have invoked on the "server". Although it may be running on the same machine, the "server" here is the IIS/Cassini process that serves your webpage to the "client" (your browser).
In order to stop this, you'll need to stop or shut down IIS or Cassini (the small web server Visual Studio uses when running a web app on your local development machine), or stop the SQL Server process.
EDIT: I noticed the downvotes for the other answer that mentions stopping IIS/SQL. This, of course, is a "last-gasp" mechanism for stopping a long-running process invoked by the "client" but running on the "server". If this were a "real" application and you wished to let the user cancel a long-running operation, an approach like the one Marc suggests would be required (i.e. poll the web page with AJAX "refreshes" and respond to subsequent user input such as a cancel button).
If you are doing this at the server in a single HTTP request (for example, responding to a button event in your code-behind), then the server doesn't really care much about the browser. If you want the inserts to stop when the browser does, I would suggest doing this via an AJAX loop instead, but note that this will be considerably slower, as you will have 100 round trips and 100 separate sets of processing on the server.
How about a band-aid solution:
Let your webpage send keep-alive AJAX calls to the server every 5 seconds or so, and abort the loop if a keep-alive is not received :)
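A hedged sketch of that keep-alive idea: an AJAX ping updates a timestamp, and the insert loop aborts if the browser has gone quiet. Storing the timestamp in Application state is just one simple option (and assumes a single user/loop); the KeepAlive endpoint, InsertNextRow, and the 15-second cutoff are all illustrative.

```csharp
// Called by the AJAX ping every ~5 seconds (e.g. a page method or handler):
public void KeepAlive()
{
    Application["lastPing"] = DateTime.UtcNow;
}

// Inside the insert loop:
for (int i = 0; i < 100; i++)
{
    object ping = Application["lastPing"];
    DateTime lastPing = ping is DateTime ? (DateTime)ping : DateTime.UtcNow;

    if (DateTime.UtcNow - lastPing > TimeSpan.FromSeconds(15))
        break;              // browser stopped pinging - abort the loop

    InsertNextRow(i);       // your existing insert
}
```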
Shut down SQL Server and/or IIS.
Edit: ...assuming that you made a programming mistake, your DB keeps filling up, and you desperately want to stop it; that is how I understood your question.