Download is failing in .NET - C#

Hi, I am having issues with this code:
// Function will return the number of bytes processed
// to the caller. Initialize to 0 here.
int bytesProcessed = 0;

// Assign values to these objects here so that they can
// be referenced in the finally block.
Stream remoteStream = null;
Stream localStream = null;
WebResponse response = null;

// Use a try/catch/finally block, as both the WebRequest and Stream
// classes throw exceptions on error.
try
{
    // Create a request for the specified remote file name.
    WebRequest request = WebRequest.Create(remoteFilename);
    request.Method = "GET";
    string credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes(uName + ":" + pwd));
    request.Headers[HttpRequestHeader.Authorization] = "Basic " + credentials;
    if (request != null)
    {
        // Send the request to the server and retrieve the
        // WebResponse object.
        response = request.GetResponse();
        if (response != null)
        {
            // Once the WebResponse object has been retrieved,
            // get the stream object associated with the response's data.
            remoteStream = response.GetResponseStream();

            // Create the local file.
            localStream = File.Create(localFilename);

            // Allocate a 1 KB buffer.
            byte[] buffer = new byte[1024];
            int bytesRead;
            long totalBytesToProcess = response.ContentLength;

            // Simple do/while loop to read from the stream until
            // no bytes are returned.
            do
            {
                // Read data (up to 1 KB) from the stream.
                bytesRead = remoteStream.Read(buffer, 0, buffer.Length);

                // Write the data to the local file.
                localStream.Write(buffer, 0, bytesRead);

                // Increment total bytes processed.
                bytesProcessed += bytesRead;
                log(resourcesPath + "/BytesReceived.txt", bytesProcessed.ToString() + "/" + totalBytesToProcess.ToString(), false);
            } while (bytesRead > 0);
        }
    }
}
catch (Exception ex)
{
    Response.Write(ex);
    // log(resourcesPath + "/Logs.txt",);
}
finally
{
    // Close the response and stream objects here
    // to make sure they're closed even if an exception
    // is thrown at some point.
    if (response != null) response.Close();
    if (remoteStream != null) remoteStream.Close();
    if (localStream != null) localStream.Close();
}

// Return total bytes processed to caller.
return bytesProcessed;
This was able to download small files of up to about 200 MB, but unfortunately it fails when the file size soars higher, to more than 1 GB. I have also tried WebClient's DownloadFileAsync, but it fails too. Is there another way to handle large files?

Do not allocate a buffer as large as the expected file size. Use a fixed-size buffer instead:
byte[] byteBuffer = new byte[65536];
Otherwise, if the file is 1 GiB in size, you allocate a 1 GiB buffer and then try to fill the whole buffer in one call. That call may return fewer bytes, but you have still allocated the whole buffer. Note also that the maximum length of a single array in .NET is limited by a 32-bit index, which means that even if you recompile your program for 64-bit and actually have enough memory available, you still cannot hold a file larger than 2 GB in a single array.
For your reference visit this link :
How to change this code to download file bigger than 2GB?
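The core of the fix is to stream with a small fixed buffer so that memory use stays constant regardless of file size. A minimal sketch of that pattern over plain Streams (CopyInChunks is an illustrative helper name, not part of the original code):

```csharp
using System;
using System.IO;

static class StreamCopier
{
    // Copies source to destination in fixed-size chunks and returns the
    // total number of bytes copied. Memory use stays at bufferSize bytes
    // no matter how large the source is.
    public static long CopyInChunks(Stream source, Stream destination, int bufferSize = 65536)
    {
        byte[] buffer = new byte[bufferSize];
        long total = 0;
        int bytesRead;
        // Read may return fewer bytes than requested; 0 means end of stream.
        while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, bytesRead);
            total += bytesRead;
        }
        return total;
    }
}
```

In the download code above, source would be the response stream and destination the local FileStream; on .NET 4 and later, Stream.CopyTo implements the same loop.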

Related

Request stream fails to write

I have to upload a large file to the server with the following code snippet:
static async Task LordNoBugAsync(string token, string filePath, string uri)
{
    HttpWebRequest fileWebRequest = (HttpWebRequest)WebRequest.Create(uri);
    fileWebRequest.Method = "PATCH";
    fileWebRequest.AllowWriteStreamBuffering = false; // this line tells it to upload in chunks
    fileWebRequest.ContentType = "application/x-www-form-urlencoded";
    fileWebRequest.Headers["Authorization"] = "PHOENIX-TOKEN " + token;
    fileWebRequest.KeepAlive = false;
    fileWebRequest.Timeout = System.Threading.Timeout.Infinite;
    fileWebRequest.Proxy = null;

    using (FileStream fileStream = File.OpenRead(filePath))
    {
        fileWebRequest.ContentLength = fileStream.Length; // have to provide the length in order to upload in chunks
        int bufferSize = 512000;
        byte[] buffer = new byte[bufferSize];
        int lastBytesRead = 0;
        int byteCount = 0;
        Stream requestStream = fileWebRequest.GetRequestStream();
        requestStream.WriteTimeout = System.Threading.Timeout.Infinite;
        while ((lastBytesRead = fileStream.Read(buffer, 0, bufferSize)) != 0)
        {
            if (lastBytesRead > 0)
            {
                await requestStream.WriteAsync(buffer, 0, lastBytesRead);
                // for some reason this doesn't really write to the stream, but the buffer does have content, >60MB
                byteCount += bufferSize;
            }
        }
        requestStream.Flush();
        try
        {
            requestStream.Close();
            requestStream.Dispose();
        }
        catch
        {
            Console.Write("Error");
        }
        try
        {
            fileStream.Close();
            fileStream.Dispose();
        }
        catch
        {
            Console.Write("Error");
        }
    }
    // ...getting response parts...
}
In the code, I make an HttpWebRequest and push the content to the server in chunks. The code works perfectly for any file under 60 MB.
I tried a 70 MB PDF. The buffer array has different content on each iteration, yet the request stream does not seem to get written. byteCount also reaches 70 MB, showing the file is read properly.
Edit (more info): I set a breakpoint at requestStream.Close(). It clearly takes ~2 minutes for the request stream to be written for 60 MB files, but only about 2 ms for 70 MB files.
My calling code:
Task magic = LordNoBugAsync(token, nameofFile, path);
magic.Wait();
I am sure the calling code is correct (it works for files from 0 B to 60 MB).
Any advice or suggestion is much appreciated.
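One small accounting issue in the loop above, separate from the write problem: byteCount += bufferSize adds the full buffer size even on the last, partial chunk, so the running total can overshoot the real file length. Counting lastBytesRead instead keeps it exact. A sketch against in-memory streams (CopyCounted is an illustrative name, assuming nothing beyond the loop shown in the question):

```csharp
using System;
using System.IO;

static class ChunkAccounting
{
    // Copies source to destination with a fixed-size buffer and returns the
    // exact number of bytes written. Adding lastBytesRead (not bufferSize)
    // keeps the count correct on the final, partial chunk.
    public static int CopyCounted(Stream source, Stream destination, int bufferSize)
    {
        byte[] buffer = new byte[bufferSize];
        int byteCount = 0;
        int lastBytesRead;
        while ((lastBytesRead = source.Read(buffer, 0, bufferSize)) != 0)
        {
            destination.Write(buffer, 0, lastBytesRead);
            byteCount += lastBytesRead; // not bufferSize
        }
        return byteCount;
    }
}
```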

Download speed test

I am writing an app in C# to measure and display download speed. I have the following code to download a 62 MB file in chunks, which seems to work well for my purposes. I plan to extend this to measure the time required for each chunk, so it can be graphed.
Before doing so, I have a few questions to make sure this is actually doing what I think it is doing. Here is the code:
private void DownloadFile()
{
    string uri = ConfigurationManager.AppSettings["DownloadFile"].ToString();
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri(uri));
    int intChunkSize = 1048576; // 1 MB chunks
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        byte[] buffer = new byte[intChunkSize];
        int intStatusCode = (int)response.StatusCode;
        if (intStatusCode >= 200 && intStatusCode <= 299) // success
        {
            Stream sourceStream = response.GetResponseStream();
            MemoryStream memStream = new MemoryStream();
            int intBytesRead;
            bool finished = false;
            while (!finished)
            {
                intBytesRead = sourceStream.Read(buffer, 0, intChunkSize);
                if (intBytesRead > 0)
                {
                    memStream.Write(buffer, 0, intBytesRead);
                    // gather timing info here
                }
                else
                {
                    finished = true;
                }
            }
        }
    }
}
The questions:
1. Does response contain all the data when it is instantiated, or just the header info? response.ContentLength does reflect the correct value.
2. Even though I am using a 1 MB chunk size, the actual bytes read (intBytesRead) in each iteration is much less, typically 16384 bytes (16 KB), but sometimes 1024 (1 KB). Why is this?
3. Is there any way to force it to actually read 1 MB chunks?
4. Does it serve any purpose here to actually write the data to the MemoryStream?
Thanks.
Dan
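On the second question: Stream.Read is only guaranteed to return at least one byte (or zero at end of stream), never the full count requested; a network stream hands back whatever has arrived so far. If full chunks are really required, the read must be looped. A sketch under that assumption (ShortReadStream and ReadFull are illustrative names; the short-read stream merely simulates network behavior):

```csharp
using System;
using System.IO;

// Simulates a network stream: never returns more than 10 bytes per Read,
// which is exactly the short-read behavior the question observes.
class ShortReadStream : MemoryStream
{
    public ShortReadStream(byte[] data) : base(data) { }
    public override int Read(byte[] buffer, int offset, int count)
        => base.Read(buffer, offset, Math.Min(count, 10));
}

static class FullReader
{
    // Loops Read until count bytes are gathered or the stream ends;
    // returns the number of bytes actually read.
    public static int ReadFull(Stream stream, byte[] buffer, int offset, int count)
    {
        int total = 0;
        while (total < count)
        {
            int n = stream.Read(buffer, offset + total, count - total);
            if (n == 0) break; // end of stream
            total += n;
        }
        return total;
    }
}
```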

Download .xls file from a URL

I'm trying to download an .xls file from a URL:
http://www.site.com/ff/excel/file.aspx?deven=0
I'm using this code, but when the download completes the file is not downloaded properly. How can I download this file correctly?
string remoteFilename = "http://www.site.com/ff/excel/file.aspx?deven=0";
string localFilename = "D:\\1\\1.xls";
Stream remoteStream = null;
Stream localStream = null;
WebResponse response = null;
try
{
    // Create a request for the specified remote file name
    WebRequest request = WebRequest.Create(remoteFilename);
    if (request != null)
    {
        // Send the request to the server and retrieve the
        // WebResponse object
        response = request.GetResponse();
        response.ContentType = "application/vnd.ms-excel";
        if (response != null)
        {
            // Once the WebResponse object has been retrieved,
            // get the stream object associated with the response's data
            remoteStream = response.GetResponseStream();
            // Create the local file
            localStream = File.Create(localFilename);
            // Allocate a 1 KB buffer
            byte[] buffer = new byte[1024];
            int bytesRead;
            // Simple do/while loop to read from the stream until
            // no bytes are returned
            do
            {
                // Read data (up to 1 KB) from the stream
                bytesRead = remoteStream.Read(buffer, 0, buffer.Length);
                // Write the data to the local file
                localStream.Write(buffer, 0, bytesRead);
            } while (bytesRead > 0);
        }
    }
}
catch (Exception e)
{
    MessageBox.Show(e.Message);
}
finally
{
    // Close the response and stream objects here
    // to make sure they're closed even if an exception
    // is thrown at some point
    if (response != null) response.Close();
    if (remoteStream != null) remoteStream.Close();
    if (localStream != null) localStream.Close();
}
MessageBox.Show("file downloaded");
Use WebClient, it's much simpler:
using (WebClient webClient = new WebClient())
{
    webClient.DownloadFile(remoteFilename, localFilename);
}
if (File.Exists(localFilename))
    MessageBox.Show("File Downloaded");
Try flushing with localStream.Flush() AFTER your do {} while(); you might also want to wrap the stream in a using statement.
For example:
// Create the local file
using (localStream = File.Create(localFilename)) {
    // Allocate a 1 KB buffer
    byte[] buffer = new byte[1024];
    int bytesRead;
    // Simple do/while loop to read from the stream until
    // no bytes are returned
    do {
        // Read data (up to 1 KB) from the stream
        bytesRead = remoteStream.Read(buffer, 0, buffer.Length);
        // Write the data to the local file
        localStream.Write(buffer, 0, bytesRead);
    } while (bytesRead > 0);
    localStream.Flush();
}
This is how I download Excel files using a FilePathResult.
public FilePathResult DownloadFile(int ID)
{
    var log = _db.Logs.FirstOrDefault(x => x.LogID == ID);
    // Download the spreadsheet
    string fileName = string.Format("{0}.xlsx", ID);
    string path = _directory + "\\" + fileName;
    return File(path, "application/vnd.ms-excel", string.Format("{0}.xlsx", log.ReportTitle));
}

The specified argument is outside the range of valid values - C#

I keep getting this error:
The specified argument is outside the range of valid values.
When I run this code in C#:
string sourceURL = "http://192.168.1.253/nphMotionJpeg?Resolution=320x240&Quality=Standard";
byte[] buffer = new byte[200000];
int read, total = 0;

// create HTTP request
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(sourceURL);
req.Credentials = new NetworkCredential("username", "password");

// get response
WebResponse resp = req.GetResponse();

// get response stream
// Make sure the stream gets closed once we're done with it
using (Stream stream = resp.GetResponseStream())
{
    // A larger buffer size would be beneficial, but it's not going
    // to make a significant difference.
    while ((read = stream.Read(buffer, total, 1000)) != 0)
    {
        total += read;
    }
}

// get bitmap
Bitmap bmp = (Bitmap)Bitmap.FromStream(new MemoryStream(buffer, 0, total));
pictureBox1.Image = bmp;
This line:
while ((read = stream.Read(buffer, total, 1000)) != 0)
Does anybody know what could cause this error or how to fix it?
Thanks in advance
Does anybody know what could cause this error?
I suspect total (or rather, total + 1000) has gone outside the range of the array; you'll get this error if you try to read more than 200 KB of data.
Personally I'd approach it differently - I'd create a MemoryStream to write to, and a much smaller buffer to read into, always reading as much data as you can, at the start of the buffer - and then copying that many bytes into the stream. Then just rewind the stream (set Position to 0) before loading it as a bitmap.
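A sketch of the approach described above, assuming any readable Stream in place of the response stream (ReadToMemory is an illustrative helper name):

```csharp
using System;
using System.IO;

static class ResponseBuffering
{
    // Reads all of input into a rewound MemoryStream using a small fixed
    // buffer, always reading into offset 0 of the buffer. Unlike a fixed
    // preallocated array, the result can grow to any size.
    public static MemoryStream ReadToMemory(Stream input)
    {
        var output = new MemoryStream();
        byte[] buffer = new byte[8 * 1024];
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, read);
        }
        output.Position = 0; // rewind before handing to Bitmap.FromStream
        return output;
    }
}
```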
Or just use Stream.CopyTo if you're using .NET 4 or higher:
Stream output = new MemoryStream();
using (Stream input = resp.GetResponseStream())
{
    input.CopyTo(output);
}
output.Position = 0;
Bitmap bmp = (Bitmap)Bitmap.FromStream(output);

Created Custom File Downloads. MD5 Hash Doesn't Match

Using ASP.NET MVC I was creating a file downloader. The problem with the built-in ASP.NET MVC functions is that they don't work for extremely large file downloads, and in some browsers they don't pop up the save-as dialog. So I rolled my own using an article from MSDN: http://support.microsoft.com/kb/812406. The problem now is that the files download perfectly, but the MD5 checksums don't match because the file size is slightly different on the server than in the download (even though 1000 tests show that the downloads execute just fine). Here is the code:
public class CustomFileResult : ActionResult
{
    public string File { get; set; }

    public CustomFileResult(string file)
    {
        this.File = file;
    }

    public override void ExecuteResult(ControllerContext context)
    {
        Stream iStream = null;
        // Buffer to read 10K bytes in chunks:
        byte[] buffer = new Byte[10000];
        // Length of the chunk just read:
        int length;
        // Total bytes left to read:
        long dataToRead;
        // Identify the file name.
        string filename = System.IO.Path.GetFileName(this.File);
        try
        {
            // Open the file.
            iStream = new System.IO.FileStream(this.File, System.IO.FileMode.Open,
                System.IO.FileAccess.Read, System.IO.FileShare.Read);
            // Total bytes to read:
            dataToRead = iStream.Length;
            context.HttpContext.Response.ContentType = "application/octet-stream";
            context.HttpContext.Response.AddHeader("Content-Disposition", "attachment; filename=" + filename);
            // Read the bytes.
            while (dataToRead > 0)
            {
                // Verify that the client is connected.
                if (context.HttpContext.Response.IsClientConnected)
                {
                    // Read the data into the buffer.
                    length = iStream.Read(buffer, 0, 10000);
                    // Write the data to the current output stream.
                    context.HttpContext.Response.OutputStream.Write(buffer, 0, length);
                    // Flush the data to the HTML output.
                    context.HttpContext.Response.Flush();
                    buffer = new Byte[10000];
                    dataToRead = dataToRead - length;
                }
                else
                {
                    // Prevent an infinite loop if the user disconnects.
                    dataToRead = -1;
                }
            }
        }
        catch (Exception ex)
        {
            // Trap the error, if any.
            context.HttpContext.Response.Write("Error : " + ex.Message);
        }
        finally
        {
            if (iStream != null)
            {
                // Close the file.
                iStream.Close();
            }
            context.HttpContext.Response.Close();
        }
    }
}
And the execution:
return new CustomFileResult(file.FullName);
Try using the
Response.TransmitFile(string fileName)
method.
It's really good and has some built-in safeguards against OutOfMemory exceptions as well.
http://msdn.microsoft.com/en-us/library/12s31dhy(v=vs.80).aspx
Turns out the issue was a missing header.
context.HttpContext.Response.AddHeader("Content-Length", iStream.Length.ToString());
Adding that header solved the problem.
Once you start writing to the OutputStream, try flushing the OutputStream itself instead of flushing the response:
context.HttpContext.Response.OutputStream.Flush()
Your problem is here:
length = iStream.Read(buffer, 0, 10000);
// Write the data to the current output stream.
context.HttpContext.Response.OutputStream.Write(buffer, 0, length);
Every loop you read into a buffer of exactly 10,000 bytes and write it to the stream. That means every file someone downloads will be a multiple of 10,000 bytes in size. So if I were to download a file that is 9,998 bytes from your site, the file I got would be 10,000 bytes, meaning that the hash would never match: my file would have 2 null bytes at the end of it.
You need to add a check to make sure that the amount of data left to read is >= 10K, and if it is not, resize your byte array to the exact amount that is left and transmit that. That should fix the hash mismatch.
Try something like this:
if (context.HttpContext.Response.IsClientConnected)
{
    // Read the data into the buffer.
    if (dataToRead >= 10000)
    {
        byte[] buffer = new byte[10000];
        length = iStream.Read(buffer, 0, 10000);
        context.HttpContext.Response.OutputStream.Write(buffer, 0, length);
    }
    else
    {
        byte[] buffer = new byte[(int)dataToRead];
        length = iStream.Read(buffer, 0, buffer.Length);
        context.HttpContext.Response.OutputStream.Write(buffer, 0, length);
    }
    // Flush the data to the HTML output.
    context.HttpContext.Response.Flush();
    dataToRead = dataToRead - length;
}
