Why is file uploading failing in ASP.Net MVC?

I am uploading files to an ASP.Net MVC application using HttpWebRequest but, for some reason unknown to me, the upload fails intermittently.
I know the file is good, since if you retry enough times it does eventually upload and can be viewed on the server just fine. When it fails, neither the server nor the client reports any error directly related to the upload; the upload just stops partway through, at a random point and time, and my MVC action method is invoked without the file having been loaded (Request.Files.Count == 0).
This only seems to be a problem in our production environment over DSL. The test and development environments work fine, and production works fine from inside the office (really fast connection to the servers), but it fails when run from home over DSL.
As you can see below, the point where it fails is pretty basic.
[Authorize]
[AcceptVerbs(HttpVerbs.Put | HttpVerbs.Post)]
[ValidateInput(false)]
public int UploadScene(int sceneID, int tourID, string name, int number, PhotoType photoType)
{
    SceneInfo scene;
    if (Request.Files.Count < 1) throw new InvalidOperationException("Image file not uploaded.");
    // process file...
}
It seems that it is probably configuration, but I can't figure out what it might be. We are running in a cluster (we have 4 web servers), so it might have something to do with that, but I am testing against a single server (I can address the machine by name and verify that it is processing my requests). I have also made sure that the application is running in its own app pool. What else should I check?
We are using IIS 6 and .Net 3.5 on the servers.

Have you tried wrapping your form in the proper <form> tag?
<% using (Html.BeginForm("Action", "Controller", FormMethod.Post, new { @enctype = "multipart/form-data" })) { %>
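For reference, that helper renders markup roughly like this (a sketch; the action URL depends on your routing). Note that the file input inside the form must have a name attribute, or it will never show up in Request.Files:
<form action="/Controller/Action" method="post" enctype="multipart/form-data">
    <input type="file" name="imageFile" />
    <input type="submit" value="Upload" />
</form>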

I checked the event viewer and noticed the app pool was recycling due to a virtual memory check. I turned that off and was able to upload over 20 images without a problem.
Of course, this doesn't explain why recycling causes the file upload to fail immediately. I was under the impression that the old pool would continue processing any existing requests until they complete or hit the shutdown time limit (we have it set to 10 minutes specifically to handle file uploads).
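In case it helps anyone else, this is roughly where that limit lives on IIS 6 (a sketch; the app pool name and AdminScripts path are whatever your setup uses, and a value of 0 disables memory-based recycling):
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/AppPools/MyAppPool/PeriodicRestartMemory 0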

Related

Unable to determine if a file is on a web server because the various methods of determining the directory do not work

I am developing an application in ASP.net (VS2015, C#); the development environment is a Win10 Pro machine. I can use any of the various methods to obtain the working directory and check whether a particular file exists on the dev PC, but not on the web server. I have tried the methods laid out in:
Get current application physical path within Application_Start
All of them work on the dev PC, but when used on the web server none of them return the working directory. The server is a Windows Server 2016 data server running IIS 10. The issue is that the web site I am putting together works fine, except for displaying GrapeCity ActiveReports (AR15) reports. The web page containing their web viewer opens just fine, but when the viewer looks for a report file (MyReport.rdlx) it says File Not Found, even though the global.aspx file points to the root directory. I have absolutely no idea, and tech support is not sure either. Is this an IIS issue that prevents the code from locating and verifying the file? Any direction would be much appreciated; this has been very frustrating and time consuming.
AppDomain.CurrentDomain.BaseDirectory does not work, and neither does HttpRuntime.AppDomainAppPath nor any of the others; the request comes back blank.
string filename = AppDomain.CurrentDomain.BaseDirectory + "SPU01_Dates.rdlx";
if (File.Exists(filename))
{
    Response.Write("YES");
}
else
{
    Response.Write("NO");
    Response.Write("<br/>");
    Response.Write(filename);
}
All this just returns nothing.
Thanks.
Try this code:
if (File.Exists(Server.MapPath("~/SPU01_Dates.rdlx")))
Check if a file exists on the server
In my test it returned YES and worked well. Did you put the "SPU01_Dates.rdlx" file in the root folder?
In the development environment it returned YES, and when I deployed it to IIS it returned NO. I found that during the deployment process the rdlx file was not deployed with the project, so I recreated one in the deployed folder, and then it returned YES.
The test suggests that AppDomain.CurrentDomain.BaseDirectory is the most accurate way to get the file path. When you test this code in IIS, does it return NO or empty? Returning empty means that this piece of code has not been executed.
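If it is still unclear where the app is actually looking, a quick diagnostic sketch (all three calls are standard ASP.NET APIs) is to write the candidate paths out from the deployed server:
Response.Write(Server.HtmlEncode(AppDomain.CurrentDomain.BaseDirectory) + "<br/>");
Response.Write(Server.HtmlEncode(HttpRuntime.AppDomainAppPath) + "<br/>");
Response.Write(Server.HtmlEncode(Server.MapPath("~/")) + "<br/>");
If these print correctly but the File.Exists check still fails, the file simply was not deployed to that folder.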

File gets "locked" during a move operation and IIS gives a 401 error

I have a Windows service running under the Local System account.
What my program does is:
if (File.Exists(outputPath))
{
    File.Delete(outputPath);
}
File.Move(archivePath, outputPath);
The folder in question is an IIS application folder (its application pool identity is ApplicationPoolIdentity) located under c:\MyAppFolder.
My Windows service does this a few times a day, and my clients check whether a new version exists every 5 minutes (0, 5, 10, 15...) and download that file.
From time to time, the file somehow gets "locked" on the file system, and then:
IIS gives a 401 error
The file cannot be deleted
My first question: how can I reproduce this situation?
One patch my colleagues applied is:
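// re-enable ACL inheritance on the file so it picks up the folder's permissions again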
var fs = File.GetAccessControl(outputPath);
fs.SetAccessRuleProtection(false, false);
File.SetAccessControl(outputPath, fs);
Despite this patch, it seems the error occurred again.
I may apply this solution as well.
Are those solutions enough, or even necessary?
Again, my first question is the important one: how to reproduce the issue and understand why this happens.
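Whatever the root cause turns out to be, one common mitigation (a sketch, not your exact code; it assumes both paths are on the same volume) is to retry the delete-and-move with a short back-off, so a transient lock from an in-flight IIS read or an antivirus scan does not fail the whole operation:
const int maxAttempts = 5;
for (int attempt = 1; ; attempt++)
{
    try
    {
        if (File.Exists(outputPath))
        {
            File.Delete(outputPath);
        }
        File.Move(archivePath, outputPath);
        break;
    }
    catch (IOException)
    {
        if (attempt >= maxAttempts) throw;
        Thread.Sleep(500 * attempt); // back off, then retry
    }
}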

ASP .Net MVC download error that requires recycling the IIS application pool

I currently maintain a nearly 3-year-old ASP .Net MVC website. The application runs on IIS (now IIS 7) and uses the ASP .Net 4 framework. It is used by clients almost every day and handles a lot of file upload/download traffic. It also uses ELMAH for unhandled-exception handling. The application ran well until a few months ago, when users began reporting that they cannot download files, with no error message: the download process just does nothing, and there is no entry in the browser console either. After some checking, we found that all pages with a download function use the HTTP response directly:
Response.Clear();
Response.Cache.SetCacheability(HttpCacheability.Private);
Response.Expires = -1;
Response.Buffer = true;
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Length", Convert.ToString(file_in_bytes.Length));
Response.AddHeader("Content-Disposition"
, string.Format("{0};FileName=\"{1}\"", "attachment", fileName));
Response.AddHeader("Set-Cookie", "fileDownload=true; path=/");
Response.BinaryWrite(hasil);
Response.End();
Nothing seems wrong (there are no compile or runtime errors on the development server). We've also checked ELMAH's log, but no related error message appears there. The problem temporarily disappears after our server management team recycles the application pool in IIS.
This site also shares its application pool with another web application, and when the error occurs, both applications are affected. Only the download function is affected; other functions such as database retrieval and insert/edit/delete continue to work fine.
I also checked the web server's Event Viewer, but there are no errors there either. The very odd thing for us is that the error temporarily disappears after we recycle the application pool, and then after several days, weeks, or months it suddenly appears again.
Is there any log that we've missed? Or is there perhaps something wrong with the download code? And why is it temporarily fixed by recycling the application pool?
Another note: the files the users download average 500 KB to 2 MB, in zip format containing several PDF files.
Update: after a few more hours of investigating, I found that this web application uses different download methods: some use the HTTP response directly like the code above, and some return a FileContentResult, but both use jquery.fileDownload on the client side. I also found this method in several controllers that have a file download method in this app:
private void CheckAndHandleFileResult(ActionExecutedContext filterContext)
{
    var httpContext = filterContext.HttpContext;
    var response = httpContext.Response;
    if (filterContext.Result is FileContentResult)
    {
        //jquery.fileDownload uses this cookie to determine that
        //a file download has completed successfully
        response.AppendCookie(new HttpCookie(CookieName, "true")
            { Path = CookiePath });
    }
    else
    {
        //ensure that the cookie is removed in case someone did
        //a file download without using jquery.fileDownload
        if (httpContext.Request.Cookies[CookieName] != null)
        {
            response.AppendCookie(new HttpCookie(CookieName, "true")
                { Expires = DateTime.Now.AddYears(-1), Path = CookiePath });
        }
    }
}
Actually, I'm not really sure whether that method is related to this error or not, but it is called from a method that overrides System.Web.Mvc.Controller's OnActionExecuted, and it contains the logic for adding the file-download cookie when the result is a FileContentResult, or deleting the cookie when the result is not a FileContentResult and the cookie exists. Is it possible that the cookie is accidentally not deleted/cleared after it is created? And since the download method is called by nearly 100 users every day, could the cookies pile up and crash the IIS worker process?
I've also checked some references about cookies and their relation to IIS session state (my app uses In-Proc state). Am I close? Or did I miss something?
Is there a reason why Response.Buffer is set to true? When buffering is enabled, the response is sent only after all processing is completed. Can you disable it by setting it to false and see if that works? This could be the reason you have to recycle the app pool. You can also check whether you are facing these issues - Help link
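If you try that, a minimal sketch of the unbuffered variant (assuming hasil holds the file bytes, as in your code) streams the content in chunks instead of holding the whole response in memory until Response.End:
Response.Buffer = false;
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Length", hasil.Length.ToString());
Response.AddHeader("Content-Disposition", string.Format("attachment;FileName=\"{0}\"", fileName));
const int chunkSize = 64 * 1024;
for (int offset = 0; offset < hasil.Length && Response.IsClientConnected; offset += chunkSize)
{
    int count = Math.Min(chunkSize, hasil.Length - offset);
    Response.OutputStream.Write(hasil, offset, count);
    Response.Flush(); // push each chunk to the client immediately
}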

ASP.net: Concurrent file upload Fails after N number of large uploads

I am working on an ASP.net (WebForms, asp.net 2.0, Framework 3.5) application. It is a 32-bit application running on IIS 7.0, on Windows 2008 R2 SP1.
I am facing an issue with large file uploads (files of more than 20 MB or so). The application is able to upload large files; however, after N uploads, the next set of uploads keeps failing until IIS is restarted.
The application supports concurrent file uploads. A single large file upload always works; only when we start uploads for more than one file does one of the uploads get stuck.
I looked at the temp folders where posted file data gets written and noticed that when the issue happens, the upload of the failing file never starts from the server's point of view: it never generates a temp file, and after a few seconds the request fails.
When things fail:
CPU is all OK
W3wp stands at 2 GB memory usage (against 4 GB total RAM)
W3wp does not crash, as the other pages of the application still work fine
I tried using Wireshark to inspect the network traffic, but it only shows ERR_CONNECTION_RESET. Apart from that, I am not getting any clue.
I suspect the things below but am not sure how to confirm or fix them.
1) For concurrent uploads, the server needs to keep up with the data rate coming from the client side, and when it is unable to match that, it must be failing internally. This could be due to the server's inability to serve concurrent requests.
2) Frequent large uploads increase the memory footprint of the application to an extent where it cannot handle concurrent uploads, because RAM is still required to dump the files to the temporary location in chunks.
Here is my web.config setting:
<httpRuntime maxRequestLength="2097151" executionTimeout="10800" enableVersionHeader="false"/>
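Since this runs on IIS 7, it may also be worth double-checking the request-filtering limit, which applies on top of httpRuntime's maxRequestLength (note the units differ: maxAllowedContentLength is in bytes, maxRequestLength in KB, and the IIS default is about 30 MB). A sketch of the matching setting:
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>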
From the implementation perspective:
1) We have a client-side implementation written in JavaScript, which creates the FormData and sends the XHR to the server.
2) The server has a method which gets called when the complete file has been copied to the server's temp directory; we extract the file data from the Request.Files collection and then process it further.
When the issue happens, the server method gets called, but Request.Files comes back empty.
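For what it's worth, a quick way to see what actually reached ASP.NET in the failing case is to log the raw request metadata when Files is empty (the logger here is illustrative, not part of the original code):
if (Request.Files.Count == 0)
{
    // a truncated body usually still reports the original Content-Length
    log.Warn(string.Format("Upload failed: Content-Length={0}, Content-Type={1}",
        Request.ContentLength, Request.ContentType));
}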
Please let me know if anyone has good insight into this that can guide me to the root cause and a fix.
UPDATE:
Client-side code representation:
// Set HTTP headers
_http.setRequestHeader("x-uploadmethod", "formdata");
_http.setRequestHeader("x-filename", fileName);
// Prepare the form data
var data = new FormData();
data.append(fileName, fileContents);
// Send the XHR request
_http.send(data);
Server-side code representation:
HttpFileCollection files = Request.Files;
int Id = objUpload.UploadMyAssets(files[0]);
The logic in UploadMyAssets takes files[0] as an HttpPostedFile and then moves ahead with application-specific logic.
Thanks
I had the same issue. It turns out the ASP.NET default session manager blocks async streams over https (HTTP/2). It didn't happen over http (non-SSL).
I resolved this by using SessionStateBehavior.ReadOnly for the controller class. Related to this post:
ASP.Net Asynchronous HTTP File Upload Handler
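A minimal sketch of that fix (the controller name is illustrative): marking the controller's session state read-only stops concurrent requests from the same session from serializing on the session lock:
[SessionState(SessionStateBehavior.ReadOnly)]   // System.Web.Mvc / System.Web.SessionState
public class UploadController : Controller
{
    // upload actions go here
}
The WebForms equivalent is EnableSessionState="ReadOnly" in the @ Page directive.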

Long process in ASP.NET

My situation is this:
A page was created that runs a long process. The process reads a .csv file and creates an invoice for each row; at the end, it shows a success message.
For this, it was decided to use an UpdatePanel so that the process runs asynchronously and an UpdateProgress can be displayed while waiting for the process to finish. To support this, AsyncPostBackTimeout = 7200 (2 hours) was set on the ScriptManager, and the timeout was also increased in the app's web.config on both the QA and production servers.
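For reference, the ScriptManager setting described is roughly this (a sketch of the markup):
<asp:ScriptManager ID="ScriptManager1" runat="server" AsyncPostBackTimeout="7200" />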
Tests were made on localhost and on a QA server, and it works very well; the problem arises when testing the functionality on the production server.
What happens there:
The file is loaded and the process starts. During this period the UpdateProgress runs, but it only takes 1 or 2 minutes and then execution ends without displaying the final message, as if the process were truncated. Reviewing the invoices created, only the first 10 or so rows of the file are processed (from files with 50, 100, or more rows).
I would appreciate help with this, because I don't know what could be wrong.
ASP.NET is not suited for long-running processes.
The default page timeout for IIS is 110 seconds (90 for .NET 1.0). You can increase this, but it is not recommended.
If you must do it, here is the setting:
<system.web>
  ...
  <httpRuntime executionTimeout="180"/>
  ...
</system.web>
Refer to httpRuntime.
Hand this work off to a Windows service, WCF, or a standalone exe.
Use your page to get the status of the process from that application.
Here is an example that shows how to use workflows for long-running processes.
You move the bulk of the processing out of ASP.NET and free its threads to handle page requests.
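A minimal sketch of that hand-off pattern (all names and paths here are illustrative): the page only records a job request, a Windows service watches the drop folder and does the hour-long CSV work, and the page polls a status file:
protected void btnStart_Click(object sender, EventArgs e)
{
    // hand the uploaded CSV to the service via a drop folder
    string jobId = Guid.NewGuid().ToString("N");
    File.Copy(uploadedCsvPath, Path.Combine(@"C:\Jobs\Pending", jobId + ".csv"));
    Session["JobId"] = jobId;
}

protected void tmrPoll_Tick(object sender, EventArgs e)
{
    // the service writes progress like "37/100 invoices created"
    string statusFile = Path.Combine(@"C:\Jobs\Status", (string)Session["JobId"] + ".txt");
    lblStatus.Text = File.Exists(statusFile)
        ? File.ReadAllText(statusFile)
        : "Waiting for the service to pick up the job...";
}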
