I have an HTTP PUT Web API method in my MVC application which receives files from the client side and puts them into server storage.
As the files might be large, I avoid buffering them in memory (to prevent out-of-memory exceptions); instead I use MultipartFormDataStreamProvider to stream each file into a temp folder and move it to its final destination later.
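For illustration, here is a minimal sketch of what such a controller method might look like; the controller shape, folder paths, and the move-to-final-destination step are assumptions, not the original code (it presumes a Web API ApiController with System.Net, System.Net.Http, System.IO and System.Web available):

public async Task<HttpResponseMessage> Put()
{
    if (!Request.Content.IsMimeMultipartContent())
        return Request.CreateResponse(HttpStatusCode.UnsupportedMediaType);

    // Stream each part to disk instead of buffering the whole request in memory.
    var tempFolder = HttpContext.Current.Server.MapPath("~/App_Data/Temp");
    var provider = new MultipartFormDataStreamProvider(tempFolder);
    await Request.Content.ReadAsMultipartAsync(provider);

    foreach (var fileData in provider.FileData)
    {
        // LocalFileName is the temp file written by the provider; the original
        // client file name, if needed, is in fileData.Headers.ContentDisposition.FileName.
        var finalPath = Path.Combine(@"D:\Storage", Path.GetFileName(fileData.LocalFileName));
        File.Move(fileData.LocalFileName, finalPath);
    }

    return Request.CreateResponse(HttpStatusCode.OK);
}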
Everything works perfectly except that it won't upload files larger than 2097148 KB (roughly 2 GB).
Once I give it a file larger than that, it starts streaming into the temp folder and then stops once the file size reaches 2097148 KB.
I have the following attributes in my web.config file:
maxRequestLength="5097151",
requestLengthDiskThreshold="50971",
maxAllowedContentLength="4242880000".
Also, in IIS I have set the Maximum allowed content length (Bytes) to 4242880000.
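For reference, this is roughly where those values sit in web.config (the placement shown is an assumption about the file's layout; note that maxRequestLength and requestLengthDiskThreshold are in KB, while maxAllowedContentLength is in bytes):

<system.web>
  <httpRuntime maxRequestLength="5097151" requestLengthDiskThreshold="50971" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="4242880000" />
    </requestFiltering>
  </security>
</system.webServer>

The Maximum allowed content length setting in IIS Manager's Request Filtering feature writes this same maxAllowedContentLength value.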
Is there any other place which might cause this to happen?
Update
It seems that even under IIS 10 with .NET 4.6.1 the request is denied (400 Bad Request), even though all the limits are set to allow it.
Digging further, it seems that this has been rejected at Microsoft.
In .NET 4.0 and earlier there is a 2 GB limitation in ASP.NET; that was fixed in .NET 4.5. However, this fix makes little sense because IIS itself does not support file uploads over 2 GB.
The only way to upload files over 2 GB to an IIS-hosted server is to break the file into pieces and upload it piece by piece. Here are clients that can upload by breaking a file into segments:
IT Hit Ajax File Browser
Sample WebDAV Browser
Note that these clients require your server to support PUT with a Range header.
Another solution is to create an HttpListener-based server. HttpListener has much less functionality compared to IIS, but it does not have any upload limitations.
source
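For the HttpListener route mentioned above, a rough sketch of a receiver might look like this; the prefix, target folder, and single-threaded loop are assumptions kept simple for illustration, and a real service would need error handling and asynchronous processing:

using System;
using System.IO;
using System.Net;

var listener = new HttpListener();
listener.Prefixes.Add("http://+:8080/upload/");
listener.Start();

while (true)
{
    HttpListenerContext context = listener.GetContext();   // blocks until a request arrives
    if (context.Request.HttpMethod == "PUT")
    {
        // Stream the request body straight to disk; HttpListener does not impose the 2 GB cap.
        var target = Path.Combine(@"D:\Uploads", Guid.NewGuid() + ".bin");
        using (var output = File.Create(target))
        {
            context.Request.InputStream.CopyTo(output);
        }
        context.Response.StatusCode = 201;
    }
    context.Response.Close();
}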
Related
Short Version
Is there a more efficient or less resource-intensive way in C# to create a zip file of a folder recursively than using System.IO.Compression.ZipFile.CreateFromDirectory?
Long Version
I have a REST API running in an Azure App Service (scaled at P2V2: 420 total ACU and 7 GB memory). One of my endpoints accepts a POST request and generates a zip file using the following code. It then returns the name of the generated zip file, wrapped in a standard JSON format.
System.IO.Compression.ZipFile.CreateFromDirectory(sourcePath, outputPath);
This runs fine on my local machine but appears to have scaling issues on the cloud server. When I scale up to the next-level App Service plan (P3V2: 820 ACU, 14 GB memory) it runs fine, but without scaling up the API returns a 503 Service Unavailable message. I am not sure which resource is constrained (CPU, memory, disk, etc.), but I don't think it's a server timeout issue since the call only runs for about 10-20 seconds. The generated zip file contains roughly 400 files and is about 200 MB compressed.
My question is: since scaling up costs an extra $120 per month, is there a way to generate the zip file more efficiently so I don't need to scale up the server? Would it help at all to split the zip creation into separate create-archive and add-file steps, using multiple calls rather than a single CreateFromDirectory(...)?
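One way to try that second idea is a sketch along these lines, assuming System.IO.Compression (plus the ZipFileExtensions assembly for CreateEntryFromFile); it streams one source file at a time into the archive, and CompressionLevel.Fastest is an assumption that trades compression ratio for CPU time:

using (var zipStream = new FileStream(outputPath, FileMode.Create))
using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create))
{
    foreach (var file in Directory.EnumerateFiles(sourcePath, "*", SearchOption.AllDirectories))
    {
        // Keep the entry path relative to the source folder so the layout is preserved.
        var entryName = file.Substring(sourcePath.Length).TrimStart(Path.DirectorySeparatorChar);
        archive.CreateEntryFromFile(file, entryName, CompressionLevel.Fastest);
    }
}

Note that CreateFromDirectory also has an overload accepting a CompressionLevel, so the explicit loop mainly buys per-file control rather than a fundamentally different memory profile.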
Other info:
This endpoint is not being hit by anyone but me
I can try to create multiple smaller zip files to see if this helps with resource consumption, but this is not ideal for my use case.
This is not the only thing running on this server which is why it's already scaled up so much
Just to clarify, the zip file does actually get created on the server: I can see a log entry written just before the response goes back to the user, and I can see the zip file in Kudu. But Azure intercepts the 200 API response for some reason and sends a 503 instead.
I have very simple code that downloads files from a web server, and here it is. As I said, it is very basic:
// use the web client to download
using (var client = new WebClient())
{
    // download locally
    client.DownloadFile(from, to);
}
But for some clients the file does not download completely and no exception is thrown. These clients come from different locations and all show the same behavior: WebClient downloads exactly 10 MB of ANY file above 10 MB. An 8 MB file comes down as 8 MB, a 20 MB file as 10 MB, a 34 MB file as 10 MB. The frustrating thing is we have to ask those users to stop using the software.
This issue is not related to the computer itself: a lot of the affected users are on laptops where the download works fine from home but not at work, and for some it's the exact reverse, failing at home but working at work. The behavior even differs between clients within the same physical office.
We have talked to their IT departments: they have no problem browsing our HTTP directory and downloading many files over 10 MB, it works perfectly, and they state they have never seen such an issue. The problem seems to spread more and more, and since the last Windows 10 update many more clients have started to have it.
As a side note, this download code has been unchanged and running for 5 years with nearly no issues.
Does anyone know why a download would complete without any error (in try..catch) without having downloaded the whole file? And why would all these different affected clients be cut off at exactly 10 MB?
I wanted to add that we tried reinstalling the .NET Framework for these users in the past, thinking it must be an issue with that, without any result.
I just edited to add one extra detail: the files they are trying to download are in an anonymous-access folder, so no login is required, and it is browsable. All users with the issue can use Chrome and Edge to navigate to the folder, right-click, and download, and the file is complete that way. Only .NET cannot download files above 10 MB on their PCs.
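As a diagnostic, one option is to keep the same WebClient call but compare the size on disk with the Content-Length the server reported, so a silently truncated download at least becomes visible in your logs (a sketch using the same from/to variables as above):

using (var client = new WebClient())
{
    client.DownloadFile(from, to);

    var reported = client.ResponseHeaders?[HttpResponseHeader.ContentLength];
    var actual = new FileInfo(to).Length;

    if (reported != null && long.Parse(reported) != actual)
    {
        // The server promised more bytes than were written; log and retry here.
        Console.WriteLine($"Truncated download: expected {reported} bytes, got {actual}.");
    }
}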
I am trying to tackle an issue where uploading files above 30 MB to a web app results in a gateway timeout error.
Prior to that, the system was receiving a 404.13 error, so I changed the web.config file to support much larger files by increasing maxAllowedContentLength; this worked initially. However, it is now receiving that 504 error.
The way the system works:
1. A file is selected and uploaded to the web app using a third-party uploader tool (this part works without issues).
2. That file (an IFormFile) is then processed by the web app to determine what type of file it is and how to process it.
3. Once complete, the file stream is sent to an Amazon S3 bucket in a PutObjectAsync request (see the sketch after this list).
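For context, step 3 is essentially the pattern below; the bucket name, key, and region are placeholders, and processedStream stands in for the stream produced in step 2, so this is a sketch of the AWS SDK call rather than the actual code:

using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

var s3 = new AmazonS3Client(RegionEndpoint.EUWest1);
var request = new PutObjectRequest
{
    BucketName = "my-upload-bucket",   // placeholder bucket
    Key = formFile.FileName,           // placeholder key naming scheme
    InputStream = processedStream      // stream produced in step 2
};
await s3.PutObjectAsync(request);

For larger files, the SDK's TransferUtility performs a multipart upload instead of a single PutObject, which is usually less sensitive to per-request timeouts.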
I believe the issue is occurring on step three, but this is hard to diagnose as it does not occur in a local environment. What is strange is that I can upload a 29 MB file without issue, and it is processed quickly, but the moment I go above that, it seems to hang on that last step. Has anyone had a similar issue before? What steps did you take to resolve it?
Edit: previously this was a 404.13 error.
We figured out that the issue was not related to IIS or AWS directly. Instead, we were encoding video files into a desired format on the server prior to upload to AWS S3. This process took too long for larger files, and resulted in the Gateway timeout error.
Andy, you were correct. It was happening at exactly the two-minute mark, which is the (default) value our server timeout is configured to.
Scenario:
I have a client & server architecture. The client program captures multiple displays connected to a machine and saves each capture as a JPG file in a folder. The minimum rate is 5 images per second per display. The same folder is shared over the network.
I have a Windows service running on a server-grade machine which pulls the files as soon as they are created in the shared folder. These files are rendered in a browser through an ASP.NET page via an img tag, like live streaming. They are also used to make a video later.
Problem:
Once every 8-10 days I see a slowdown of the file copy process, where the client machine stacks up more than 30,000 images in the folder but the server cannot pull them for some reason.
With the help of the Red Gate profiler I found that only the file copy process was stuck and could not move the files. After some time the server's pull process covered all the lag and came back on track. To enumerate files I am using the Fast Directory Enumerator; more info here: http://www.codeproject.com/Articles/38959/A-Faster-Directory-Enumerator
Initially we tried a push implementation where the client pushed the images to a server folder, but we hit a similar performance issue more frequently.
I confirmed that there was no network connectivity issue and that CPU utilization was low when the process lagged. I have also handled the case where a file cannot be moved because it is still being accessed by another process (see the sketch below).
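For completeness, that locked-file handling is roughly the sketch below; the retry count and delay are arbitrary illustration values rather than the production numbers, and it assumes System.IO and System.Threading:

static bool TryMoveFile(string source, string destination, int maxAttempts = 5)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        try
        {
            File.Move(source, destination);
            return true;
        }
        catch (IOException)
        {
            // The capture client may still be writing the JPG; back off briefly and retry.
            Thread.Sleep(200);
        }
    }
    return false;   // give up for now and let the next polling pass retry
}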
What could be the reason for this delay?
Is there a better option for moving the files to the server?
When uploading a file, the upload appears to restart partway through.
Part of our management system involves uploading files to be processed on the server. The files can be quite large (around 70 MB), normally text and image files. It is a C# MVC application on IIS 7.
This all works fine on the development setup but not on our live system hosted on Amazon Web Services.
I have 3 test files:
70 MB, 35 MB and 10 MB.
The 10 MB file reaches 100%, displays a 'waiting for page' message in the status bar, and then starts uploading again.
The 35 MB file reaches 40% and then restarts the upload.
The 70 MB file reaches 18% and then restarts the upload.
All of the restarts happen between 2 minutes and 2 minutes 20 seconds into the upload. The same behavior is seen in Chrome and Firefox.
I've set the httpRuntime maxRequestLength and executionTimeout to be sufficiently large. Debug is set to false in the live system.
I've also set the requestFiltering maxAllowedContentLength to be sufficiently large.
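For reference, this is roughly where those settings live and what their units are (the numbers below are placeholders, not recommendations): executionTimeout is in seconds and only applies when compilation debug is false, maxRequestLength is in KB, and maxAllowedContentLength is in bytes.

<system.web>
  <httpRuntime maxRequestLength="204800" executionTimeout="600" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="209715200" />
    </requestFiltering>
  </security>
</system.webServer>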
It doesn't seem like the code in the controllers is being reached. I think it's a config issue, but I'm not sure what I've missed.
I've looked at "Connection Reset due to large file upload but configs should permit upload", which seems similar, except that my issue occurs at around the 2-minute mark, not straight away.
Update: I've run through the process again with Fiddler, and it returns a 504 Gateway Timeout status.
Update: this process works OK with smaller files (1 MB to 6 MB tested so far).