Long-running process in ASP.NET - C#

My situation is this:
A page was created that runs a long process. The process consists of:
- Reading a .csv file; for each row of the file an invoice is created, and at the end of the process a success message is shown.
For this it was decided to use an UpdatePanel so that the process runs asynchronously and an UpdateProgress can be displayed while waiting for it to finish. The ScriptManager's AsyncPostBackTimeout property was set to 7200 (2 hours), and the timeout was also increased in the app's web.config on both the QA and production servers.
Tests were made on localhost and on a QA server and it works very well; the problem arises when testing the functionality on the production server.
What happens is this: the file is loaded and the process starts. During this period the UpdateProgress is shown, but after only 1 or 2 minutes execution ends without displaying the final message, as if the process had been truncated. Reviewing the invoices, only the first 10 or so records of the file were created (from a file with 50, 100 or more rows).
I would appreciate some help with this, because I don't know what could be wrong.

ASP.NET is not suited for long-running processes.
The default request execution timeout is 110 seconds (90 for .NET 1.0/1.1). You can increase this, but it is not recommended.
If you must do it, here is the setting:
<system.web>
...
<httpRuntime executionTimeout="180"/>
...
</system.web>
Refer to the httpRuntime documentation.
Better: pass this work on to a Windows service, a WCF service, or a stand-alone exe,
and use your page to get the status of the process from that application.
Here is an example that shows how to use workflows for long-running processes.
You move the bulk of the processing out of ASP.NET, freeing its threads to handle page requests.
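A minimal sketch of what that hand-off could look like, assuming an external worker process (all names here - IInvoiceJobService, JobStatus, the page controls - are illustrative, not from the question). The page only starts the job and polls cheaply for status, instead of holding one request open for hours:

```csharp
using System;

// Contract the external worker (Windows service / WCF / exe) could expose.
public interface IInvoiceJobService
{
    Guid StartJob(string csvPath);   // kick off processing, return a job id
    JobStatus GetStatus(Guid jobId); // cheap call the page can poll
}

public class JobStatus
{
    public int RowsProcessed { get; set; }
    public int TotalRows { get; set; }
    public bool IsComplete { get; set; }
    public string Error { get; set; }
}

// In the page: start once, then poll from a Timer inside the UpdatePanel.
// protected void btnStart_Click(object sender, EventArgs e)
// {
//     ViewState["JobId"] = _service.StartJob(uploadedCsvPath);
// }
// protected void tmrPoll_Tick(object sender, EventArgs e)
// {
//     JobStatus status = _service.GetStatus((Guid)ViewState["JobId"]);
//     lblProgress.Text = status.RowsProcessed + " / " + status.TotalRows;
// }
```

With this shape, the request that started the job finishes immediately, so no IIS or ASP.NET timeout applies to the invoice processing itself.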

Related

ASP.net: Concurrent file upload Fails after N number of large uploads

I am working on an ASP.NET application (WebForms, ASP.NET 2.0, Framework 3.5). It is a 32-bit application running on IIS 7.0 under Windows Server 2008 R2 SP1.
I am facing an issue with large file uploads - files of 20 MB or more. The application is able to upload large files; however, after N uploads, the next set of uploads keeps failing until IIS is restarted.
The application supports concurrent file uploads. A single large file upload always works; only when we start uploading more than one file at a time does one of the uploads get stuck.
I looked at the temp folders where posted file data gets written and noticed that when the issue happens, the failing upload never starts from the server's point of view: no temp file is ever generated, and after a few seconds the request fails.
When things fail:
- CPU is fine
- w3wp.exe stands at 2 GB memory usage (against 4 GB total RAM)
- w3wp.exe does not crash, as the other pages of the application still work fine
I tried using Wireshark to inspect the network traffic, but it only shows ERR_CONNECTION_RESET. Apart from that, I have no clue.
I suspect the following, but am not sure how to confirm or fix either:
1) To handle concurrent uploads, the server needs to keep up with the data rate coming from the client side, and when it cannot, it fails internally. This could be due to the server's inability to serve concurrent requests.
2) Frequent large uploads increase the memory footprint of the application to a point where it can no longer handle concurrent uploads, because RAM is still required to write the files to the temporary location in chunks.
Here is my web.config setting:
<httpRuntime maxRequestLength="2097151" executionTimeout="10800" enableVersionHeader="false"/>
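One thing worth checking (an assumption on my part, since only httpRuntime is shown): on IIS 7, maxRequestLength is not the only limit. Request filtering has its own cap, maxAllowedContentLength, which defaults to 30000000 bytes (~28.6 MB) and also has to be raised for large uploads. A sketch of the additional setting, sized to match the 2 GB maxRequestLength above:

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <!-- in bytes; default is 30000000 (~28.6 MB) -->
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>
```

Note that maxRequestLength is in kilobytes while maxAllowedContentLength is in bytes; the effective limit is whichever is smaller.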
From the implementation perspective:
1) The client side is written in JavaScript; it creates a FormData object and sends the XHR to the server.
2) The server has a method which gets called once the complete file has been copied to the server's temp directory; we extract the file data from the Request.Files collection and process it further.
When the issue happens, the server method gets called, but Request.Files comes back empty.
Please let me know if anyone has good insight that could guide me to the root cause and a fix.
UPDATE:
Client side code representation:
// Set HTTP headers
_http.setRequestHeader("x-uploadmethod", "formdata");
_http.setRequestHeader("x-filename", "Name of file");

// Prepare form data
var data = new FormData();
data.append("Name of file", fileContents); // fileContents: the File/Blob being uploaded

// Send the XHR request
_http.send(data);
Server side code representation:
HttpFileCollection files = Request.Files;
int Id = objUpload.UploadMyAssets(files[0]);
The logic in UploadMyAssets takes files[0] as an HttpPostedFile and then moves ahead with application-specific logic.
Thanks
I had the same issue. It turns out the default ASP.NET session state manager blocks async streams over HTTPS (HTTP/2); it didn't happen over plain HTTP.
I resolved this by using SessionStateBehavior.ReadOnly for the controller class. Related to this post:
ASP.Net Asynchronous HTTP File Upload Handler
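For reference, a minimal sketch of how that looks in ASP.NET MVC: the SessionState attribute on the controller changes how the session-state lock is taken for all its actions (UploadController is a hypothetical name, not from the answer):

```csharp
using System.Web.Mvc;
using System.Web.SessionState;

// ReadOnly means requests from the same session are no longer serialized
// by the exclusive session lock, so a second concurrent upload is not
// blocked behind the first.
[SessionState(SessionStateBehavior.ReadOnly)]
public class UploadController : Controller
{
    // Upload actions go here; they can read Session but not write to it.
}
```

If the controller doesn't need session access at all, SessionStateBehavior.Disabled avoids the lock entirely.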

How to get and set a persistent variable server side in a web application

I am trying to save a simple int on the server side so that any user can log in and update it. I thought this would be a simple task and began by trying the Settings designer; however, I couldn't change the scope from "Application" to "User", as application settings are read-only.
I know I could save and change the variable in an XML file, but I thought there must be a simpler way.
I have tried user profiles, but it isn't working - any ideas? (I have also used Context.Profile.)
<profile>
<providers>
<clear/>
<add name="AspNetSqlProfileProvider" type="System.Web.Profile.SqlProfileProvider" connectionStringName="ApplicationServices" applicationName="/"/>
</providers>
<properties>
<add name="FilmNumber" type="int" allowAnonymous="true" defaultValue="2"/>
</properties>
</profile>
Code:
//******** Get poster number ********
int LastPosterNumber = (int)HttpContext.Current.Profile.GetPropertyValue("FilmNumber");

string strFileName = FileField.PostedFile.FileName;
string c = (LastPosterNumber + 1).ToString();
string dirPath = System.Web.HttpContext.Current.Server.MapPath("~") + "/Images/FilmPosters/" + c + ".jpg";
FileField.PostedFile.SaveAs(dirPath);

//****** Save new poster number ******
HttpContext.Current.Profile.SetPropertyValue("FilmNumber", int.Parse(c));
HttpContext.Current.Profile.Save();
Try to avoid Settings: they require your config file to be modifiable, and the concept of "user settings" does not exist in ASP.NET anyway, because the application always runs under the identity of the IIS worker process (w3wp.exe) - the .NET Settings API is not aware of ASP.NET Membership or any other notion of "users".
Your best option is to use a DBMS, Application state, a static field in your application (if the value doesn't need to persist beyond the lifespan of w3wp.exe), or a file on disk. You don't have to use XML serialization; you can write the value out manually (just be sure to lock around file access, because ASP.NET applications are multi-threaded).
In my applications I store only the connection string and the bare minimum of initialization settings in web.config; per-user settings I store in a database table, with a simple API layer over it (usually with application-level settings plus inheritance, or "EffectiveSettings"). Note that this is completely different, as far as the implementation is concerned, from .NET's Settings API, which I avoid for various reasons, including those already given in this answer.
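A minimal sketch of the "file on disk, with locking" option described above (the class name, file path and zero default are my assumptions, not the answer's exact implementation):

```csharp
using System.IO;
using System.Web.Hosting;

public static class FilmNumberStore
{
    // Dedicated lock object: serializes access across concurrent requests.
    private static readonly object _sync = new object();

    private static string PathToFile
    {
        get { return HostingEnvironment.MapPath("~/App_Data/counter.txt"); }
    }

    public static int Get()
    {
        lock (_sync)
        {
            return File.Exists(PathToFile)
                ? int.Parse(File.ReadAllText(PathToFile))
                : 0; // assumed default when no value has been saved yet
        }
    }

    public static void Set(int value)
    {
        lock (_sync)
        {
            File.WriteAllText(PathToFile, value.ToString());
        }
    }
}
```

Storing the file under App_Data keeps it non-downloadable, and unlike web.config, writing to it does not restart the application.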
Notes on IIS w3wp.exe lifespan:
IIS may terminate or "recycle" w3wp.exe at any time for a variety of reasons, and any in-memory state will be lost - which is why your ASP.NET application must persist its state to long-term storage at the earliest opportunity. Reasons include:
Inactivity. If an application pool's worker process has not handled a request for the idle timeout period (20 minutes by default), IIS will shut the process down.
Recycling. IIS takes pre-emptive measures against worker processes leaking resources by terminating and restarting them periodically (every 29 hours by default).
Unresponsiveness. IIS pings worker processes and gives them a strict timeout to respond (90 seconds by default). If no response comes back, the process is restarted.
Reaching a memory limit. Similar to the time-based recycling described above, IIS will restart a worker process if it reaches a configured memory limit (which is why it's important to manage your resources).
Reaching a request limit. Again, similar to the time-based recycling, IIS will restart a worker process after it has served a configured number of requests. This is disabled by default, but your host might have enabled it.

How to reset an application variable daily

I am writing a program that records service calls and the treatment given. We have a number of users who open and close calls, and I want to show at all times the total number of calls opened today, the total number closed today, and the difference between them. I thought of doing it with an Application variable, but I would have to reset these variables to 0 every day. Where would I do that? I thought of Global.asax, but in which event? The application runs all the time, so I suppose Application_Start wouldn't be appropriate. So where? Thank you.
You could configure the Periodic Restart settings for application pool recycling in IIS:
The element contains configuration settings that allow you to control when an application pool is recycled. You can specify that Internet Information Services (IIS) 7 recycle the application pool after a time interval (in minutes) or at a specific time each day. You can also configure IIS to base the recycle on the amount of virtual memory or physical memory that the worker process in the application pool is using, or configure IIS to recycle the application pool after the worker process has processed a specific number of requests.
But this has the side effect of taking the application offline while the pool restarts, so any user connected at that time will lose their session. This can be minimized by restarting the application at a time when no users are connected, such as during the night.
The following config snippet sets the application pool to recycle daily at 3:00 A.M.:
<add name="Example">
  <recycling logEventOnRecycle="Schedule">
    <periodicRestart>
      <schedule>
        <clear />
        <add value="03:00:00" />
      </schedule>
    </periodicRestart>
  </recycling>
  <processModel identityType="NetworkService" shutdownTimeLimit="00:00:30" startupTimeLimit="00:00:30" />
</add>
I'd keep a date variable holding the last time the counter was reset, and check that the date is "today" on every access to the counter.
Unless you have critical performance constraints, I'd say that's the way to go.
Sample lazy-reset code to call whenever you update the counter:
// lock() requires a reference type, so use a dedicated lock object
// rather than trying to lock on the int counter itself.
private static readonly object _counterLock = new object();
private static int myCounter;
private static DateTime lastDateCounterWasReset;

lock (_counterLock)
{
    if (DateTime.Now.Date != lastDateCounterWasReset)
    {
        lastDateCounterWasReset = DateTime.Now.Date;
        myCounter = 0;
    }
    myCounter++;
}
We'd need to know more about how you'd like to store those variables (myCounter and lastDateCounterWasReset), but they could live basically anywhere (database, filesystem, etc.)
I would store the calls in a database and do a SELECT that groups by the current day to get the totals for display.
That way the totals reset automatically when a new day starts, and you don't need to worry about IIS resets destroying your in-memory data.
If you don't want the performance hit of querying too often, there are a number of caching options available.
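A sketch of the database approach, assuming a ServiceCalls table with OpenedAt and ClosedAt datetime columns (the table, column names and connection handling are all illustrative):

```csharp
using System.Data.SqlClient;

public static class CallStats
{
    // Counts today's opened and closed calls directly from the database,
    // so the "daily reset" happens naturally when the date changes.
    public static void GetTodaysTotals(string connectionString,
                                       out int opened, out int closed)
    {
        const string sql = @"
            SELECT
                SUM(CASE WHEN CAST(OpenedAt AS date) = CAST(GETDATE() AS date)
                         THEN 1 ELSE 0 END),
                SUM(CASE WHEN ClosedAt IS NOT NULL
                          AND CAST(ClosedAt AS date) = CAST(GETDATE() AS date)
                         THEN 1 ELSE 0 END)
            FROM ServiceCalls";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                reader.Read();
                // SUM over an empty table returns NULL, hence the checks.
                opened = reader.IsDBNull(0) ? 0 : reader.GetInt32(0);
                closed = reader.IsDBNull(1) ? 0 : reader.GetInt32(1);
            }
        }
    }
}
```

The result of this query could then be cached for, say, a minute in the ASP.NET Cache to avoid hitting the database on every page view.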
I suppose you could use the Application_BeginRequest method, with a flag recording whether the reset has already run that day.
Another option is a scheduled task that hits a hidden URL to perform the reset.

Problem with calling a console application (a WCF service client) from a WebForm

I am using an ASP.NET WebForms application to run an existing console application, which gets all records from the DB and sends them through a third-party WCF service. Locally everything works fine: when I run the application it opens the console, gets the records and sends them. But now I have pushed my files to the test server, along with the exe and related config files. When I access the application through the browser (test URL) I get the same error message time and again, and I don't see the console window. Sometimes everything works fine, but never twice in a row.
The error message is:
"There was no endpoint listening at '.....svc' that could accept the message. This is often caused by an incorrect address or SOAP action."
System.Net.WebException: The remote name could not be resolved
at System.Net.HttpWebRequest.GetRequestStream
at System.ServiceModel.Channels.HttpOutput.WebRequestHttpOutput.GetOutputStream()
The code I use in the WebForm to call the console application is:
ProcessStartInfo p = new ProcessStartInfo();
p.Arguments = _updateNow.ToString();
p.FileName = "something";
p.UseShellExecute = false; // tried true too, without luck
Process.Start(p);
The error message says "there is no endpoint", which sounds like a problem with the WCF service, but if I double-click the executable on the test server there is no problem. What could the problem be, or should I move the console application's functionality into my main WebForms application?
Update: after adding Thread.Sleep(3000) after Process.Start(p), I have no problem. So it seems the main application is not waiting for the batch process to complete. How do I solve this properly?
It seems there is a short delay between starting the console application and the WCF service becoming initialised and available to use - this is to be expected.
You could either:
Work around the issue using Thread.Sleep(), possibly combined with a couple of catch-and-retry blocks; or
Have the console application report to the creating process when it is ready to receive requests (for example by having it write to standard output and using redirected streams).
However, at this point I'd reconsider the architecture slightly: starting a new process is relatively costly, and on top of that, initialising a WCF service is also relatively costly. If this is done once per request then, as well as the timing issues above, you are incurring performance penalties.
Is it not possible to change the architecture so that a single external process (for example a Windows service) handles all requests, instead of spawning a new process each time?
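A sketch of the redirected-streams idea from the second option above (the "READY" marker is an assumed convention that the console application would have to print once its WCF plumbing is up; "something.exe" is the placeholder name from the question):

```csharp
using System.Diagnostics;

public static class ConsoleRunner
{
    public static void RunAndWaitForReady(string arguments)
    {
        ProcessStartInfo psi = new ProcessStartInfo();
        psi.FileName = "something.exe";     // placeholder, as in the question
        psi.Arguments = arguments;
        psi.UseShellExecute = false;        // required for stream redirection
        psi.RedirectStandardOutput = true;

        using (Process process = Process.Start(psi))
        {
            // Block until the child signals readiness on stdout.
            string line;
            while ((line = process.StandardOutput.ReadLine()) != null)
            {
                if (line == "READY")
                    break; // the console app has finished initialising
            }
            process.WaitForExit(); // then wait for the batch work to complete
        }
    }
}
```

Unlike a fixed Thread.Sleep(3000), this waits exactly as long as initialisation actually takes, and WaitForExit() ensures the web request doesn't return before the batch has finished.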

Why is file uploading failing in ASP.Net MVC?

I am uploading files to an ASP.NET MVC application using HttpWebRequest but, for some reason unknown to me, the upload fails intermittently.
I know the file is good, since if you try enough times it does eventually upload and can be viewed on the server just fine. When it fails, neither the server nor the client reports any error directly related to the upload; the upload just stops partway through, at a random time and location, and my MVC action method is called without the file having been received (Request.Files.Count == 0).
This only seems to be a problem in our production environment over DSL. The test and development environments work fine, and production works fine from the office (a really fast connection to the servers) but fails when run from home over DSL.
As you can see below, the point where it fails is pretty basic.
[Authorize]
[AcceptVerbs(HttpVerbs.Put | HttpVerbs.Post)]
[ValidateInput(false)]
public int UploadScene(int sceneID, int tourID, string name, int number, PhotoType photoType)
{
SceneInfo scene;
if (Request.Files.Count < 1) throw new InvalidOperationException("Image file not uploaded.");
// process file...
}
It is probably configuration, but I can't figure out what. We are running in a cluster (we have 4 web servers), so it might have something to do with that, but I am testing against a single server (I can address the machine by name and verify that it is processing my requests). I have also made sure that it runs in its own app pool. What else should I check?
We are using IIS6 and .Net 3.5 on the servers.
Have you tried wrapping your form in a proper <form> tag with the multipart/form-data encoding type?
<% using (Html.BeginForm("Action", "Controller", FormMethod.Post, new { #enctype = "multipart/form-data" })) { %>
I checked the event viewer and noticed that the app pool was recycling due to a virtual memory limit check. I turned that off and was able to upload over 20 images without a problem.
Of course, this doesn't explain why recycling causes the file upload to fail immediately. I was under the impression that the old pool would continue processing any existing requests until they complete or the shutdown time limit is reached (we have it set to 10 minutes in order to handle file uploads).
