File download For Current Request - c#

I need to send a file as the HTTP response for the current HTTP request.
Until now I have used code like this:
System.Uri uri = System.Web.HttpContext.Current.Request.Url;
HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(
    Path.Combine(uri.ToString(), filename));
httpRequest.Method = "GET";
using (HttpWebResponse httpResponse = (HttpWebResponse)httpRequest.GetResponse())
{
    using (Stream responseStream = httpResponse.GetResponseStream())
    {
        using (FileStream localFileStream = new FileStream(
            Path.Combine(localFolder, filename), FileMode.Open))
        {
            int bytesRead;
            while ((bytesRead = responseStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                totalBytesRead += bytesRead;
                localFileStream.Write(buffer, 0, bytesRead);
            }
        }
    }
}
But with this code the request is only sent and I never get any response...
Is this possible?

You should get the file off disk then use the Response.OutputStream to write the file directly to the response. Make sure to set the correct content headers so the browser will know what is coming.
FileInfo file = new FileInfo(Path.Combine(localFolder, filename));
int len = (int)file.Length, bytes;
Response.ContentType = "text/plain"; // Set the file type here
Response.AddHeader("Content-Disposition", "attachment;filename=" + filename);
Response.AppendHeader("Content-Length", len.ToString());
byte[] buffer = new byte[1024];
using (Stream stream = File.OpenRead(file.FullName))
{
    while (len > 0 && (bytes = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        Response.OutputStream.Write(buffer, 0, bytes);
        len -= bytes;
    }
}

Not sure, but it looks like you're making a web request, getting the response stream, then attempting to buffer it out to localFolder. If so, FileMode.Open looks suspect ("should open an existing file..."?). Maybe use FileMode.Create.
MSDN ref
Also, your web app needs to have write permissions to localFolder.
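If that is the issue, a minimal sketch of the corrected inner block might look like this (assuming httpResponse, localFolder and filename from the question; the 4 KB buffer size is arbitrary):
// Sketch only: FileMode.Create creates the local file (or overwrites an existing one).
byte[] buffer = new byte[4096];
long totalBytesRead = 0;
using (Stream responseStream = httpResponse.GetResponseStream())
using (FileStream localFileStream = new FileStream(
    Path.Combine(localFolder, filename), FileMode.Create, FileAccess.Write))
{
    int bytesRead;
    while ((bytesRead = responseStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        totalBytesRead += bytesRead;
        localFileStream.Write(buffer, 0, bytesRead);
    }
}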

File size on disk with WebResponse?

Is it possible to retrieve the file size on disk using WebResponse?
If not, how can I retrieve the file size?
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(finalUrl);
request.AllowAutoRedirect = true;
request.KeepAlive = false;
request.ProtocolVersion = HttpVersion.Version10;
request.ServicePoint.ConnectionLimit = 24;
//request.Method = "HEAD";
var contentType = string.Empty;
using (WebResponse webResponse = request.GetResponse())
{
    contentType = webResponse.ContentType;
    using (Stream stream = webResponse.GetResponseStream())
    {
        using (MemoryStream memoryStream = new MemoryStream())
        {
            Byte[] buffer = new Byte[1024];
            Int32 bytesRead;
            while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                memoryStream.Write(buffer, 0, bytesRead);
            }
            documentFile.Data = memoryStream.ToArray();
        }
    }
}
I think the only way to get the file size from the site before downloading the file is to read the ContentLength header, and there is no guarantee that the server provides it.
var fileSize = webResponse.ContentLength;
Or, after downloading the file, you can get the length of the stream to retrieve the size:
var fileSize = memoryStream.Length;
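If you only need the size and not the content, you could also try a HEAD request (as the commented-out line in the question hints). This is just a sketch, reusing finalUrl from the question, and it still depends on the server choosing to send Content-Length:
// Sketch: request only the headers; ContentLength is -1 if the server did not send it.
HttpWebRequest headRequest = (HttpWebRequest)WebRequest.Create(finalUrl);
headRequest.Method = "HEAD";
using (WebResponse headResponse = headRequest.GetResponse())
{
    long fileSize = headResponse.ContentLength;
    Console.WriteLine("Reported size: {0} bytes", fileSize);
}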

Image upload to FTP issue

I am using the following code to upload an image to my FTP server, but I have a problem. If I check the dimensions (height and width) of the image before uploading, a .png file is created on the FTP server but it's empty (or an invalid format); if I remove the dimension check, the image uploads correctly. Does anyone have any idea why?
public ActionResult UploadFile(int type, HttpPostedFileBase imagefile)
{
    // Check image height and width
    using (System.Drawing.Image image = System.Drawing.Image.FromStream(imagefile.InputStream, true, true))
    {
        if (image.Width > 160 || image.Height > 160)
        {
            // do something here
        }
    } // end check image height and width

    FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftpPath" + "/" + imagefile.FileName);
    request.Credentials = new NetworkCredential("ftpUserName", "ftpPassword");
    request.Method = WebRequestMethods.Ftp.UploadFile;

    var sourceStream = imagefile.InputStream;
    Stream requestStream = request.GetRequestStream();
    request.ContentLength = sourceStream.Length;

    int BUFFER_SIZE = imagefile.ContentLength;
    byte[] buffer = new byte[BUFFER_SIZE];
    int bytesRead = sourceStream.Read(buffer, 0, BUFFER_SIZE);
    do
    {
        requestStream.Write(buffer, 0, bytesRead);
        bytesRead = sourceStream.Read(buffer, 0, BUFFER_SIZE);
    } while (bytesRead > 0);
    sourceStream.Close();
    requestStream.Close();

    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    response.Close();
}
As mentioned, you are reading the entire stream when you load it into your Image. However, I don't think you can reset the position (i.e. "Seek") on a NetworkStream (your InputStream). Once you have read it, it is gone.
One thing you can do, though, is create a MemoryStream and use Stream.CopyTo to copy the contents into that. Then you can do anything you like with it, including resetting the position to 0 to "read it a second time".
//example of resetting a stream named "s"
s.Position = 0;
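A rough sketch of that approach, assuming the imagefile parameter and the placeholder FTP settings from the question:
// Sketch: buffer the upload in memory so the same bytes can be read twice.
var memory = new MemoryStream();
imagefile.InputStream.CopyTo(memory);

// First pass: validate the image dimensions.
memory.Position = 0;
using (var image = System.Drawing.Image.FromStream(memory))
{
    if (image.Width > 160 || image.Height > 160)
    {
        // reject or resize here
    }
}

// Second pass: rewind and upload the same bytes to FTP.
memory.Position = 0;
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftpPath" + "/" + imagefile.FileName);
request.Credentials = new NetworkCredential("ftpUserName", "ftpPassword");
request.Method = WebRequestMethods.Ftp.UploadFile;
using (Stream requestStream = request.GetRequestStream())
{
    memory.CopyTo(requestStream);
}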
Thanks guys for the replies, they were very helpful, but I fixed this issue by making slight changes to my code as shown below: I check the image dimensions after reading the input stream. Maybe someone will find this helpful.
public string ftpUpload(HttpPostedFileBase imagefile, string filename)
{
    var sourceStream = imagefile.InputStream;
    int BUFFER_SIZE = imagefile.ContentLength;
    byte[] buffer = new byte[BUFFER_SIZE];
    int bytesRead = sourceStream.Read(buffer, 0, BUFFER_SIZE);

    if (!CheckLogoDimension(sourceStream))
    {
        sourceStream.Close();
        return "error";
    }

    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpRootPath + "/" + filename);
    request.Credentials = new NetworkCredential(ftpUserName, ftpPassword);
    request.Method = WebRequestMethods.Ftp.UploadFile;

    Stream requestStream = request.GetRequestStream();
    request.ContentLength = sourceStream.Length;
    do
    {
        requestStream.Write(buffer, 0, bytesRead);
        bytesRead = sourceStream.Read(buffer, 0, BUFFER_SIZE);
    } while (bytesRead > 0);
    sourceStream.Close();
    requestStream.Close();

    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    response.Close();
    return "Success";
}

Upload large file to ftp C#

I'm uploading large files to an FTP site using this code.
Code
using (FileStream fs = new FileStream(FileLoc, FileMode.Open, FileAccess.Read))
{
    string ftpUrl = string.Format("{0}/{1}", uploadUrl, uploadFileName);
    FtpWebRequest requestObj = FtpWebRequest.Create(ftpUrl) as FtpWebRequest;
    requestObj.Method = WebRequestMethods.Ftp.UploadFile;
    requestObj.Credentials = new NetworkCredential(Uid, Pass);
    using (Stream requestStream = requestObj.GetRequestStream())
    {
        byte[] buffer = new byte[8092];
        int read = 0;
        while ((read = fs.Read(buffer, 0, buffer.Length)) != 0)
        {
            requestStream.Write(buffer, 0, read);
        }
        requestStream.Flush();
        if (fs != null)
        {
            fs.Close();
            fs.Dispose();
        }
        if (requestStream != null)
        {
            requestStream.Close();
            requestStream.Dispose();
        }
    }
}
Sometimes this code uploads files fine, but sometimes it uploads only part of the file, not the complete file, and it doesn't give any error.
Can anyone help me understand why it sometimes uploads only part of the file instead of the whole file?
Here's the code we use for uploading to FTP. It looks very similar to your own. Nevertheless, I post it for your reference, as we haven't had any such reported failures.
private void UploadFile(string fileToUpload)
{
    Output = new Uri(Path.Combine(Output.ToString(), Path.GetFileName(fileToUpload)));
    FtpWebRequest request = WebRequest.Create(Output) as FtpWebRequest;
    request.Method = WebRequestMethods.Ftp.UploadFile;
    // In order to work with Microsoft Windows Server 2003 + IIS, we can't use passive mode.
    request.UsePassive = false;
    request.Credentials = new NetworkCredential(username, password);

    // Copy the contents of the file to the request stream.
    Stream dest = request.GetRequestStream();
    FileStream src = File.OpenRead(fileToUpload);
    int bufSize = (int)Math.Min(src.Length, 1024);
    byte[] buffer = new byte[bufSize];
    int bytesRead = 0;
    int numBuffersUploaded = 0;
    do
    {
        bytesRead = src.Read(buffer, 0, bufSize);
        // Write only the bytes actually read, not the whole buffer.
        dest.Write(buffer, 0, bytesRead);
        numBuffersUploaded++;
    }
    while (bytesRead != 0);
    dest.Close();
    src.Close();

    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    if (response.StatusCode != FtpStatusCode.ClosingData)
    {
        Console.Error.WriteLine("Request {0}: Error uploading file to FTP server: {1} ({2})", Id, response.StatusDescription, response.StatusCode);
    }
    else
    {
        Console.Out.WriteLine("Request {0}: Successfully transferred file to {1}", Id, Output.ToString());
    }
}
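For what it's worth, on .NET 4 or later a simpler variant avoids the hand-rolled buffering entirely by using Stream.CopyTo; this is only a sketch, reusing the ftpUrl, FileLoc, Uid and Pass names from the question's code:
// Sketch: let the framework copy the file into the request stream.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpUrl);
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential(Uid, Pass);
using (FileStream src = File.OpenRead(FileLoc))
using (Stream dest = request.GetRequestStream())
{
    src.CopyTo(dest);
}
// Reading the response confirms the server finished storing the file.
using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
{
    Console.WriteLine("Upload status: {0}", response.StatusDescription);
}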

Return stream reader from FTP response is good practice or not

I have a method that downloads a file over FTP, but I do not save the file locally; instead I parse it in memory from the FTP response. My question is: is returning a StreamReader after getting the FTP response stream good practice? I do not want to do the parsing and other work in the same method.
var uri = new Uri(string.Format("ftp://{0}/{1}/{2}", "somevalue", remotefolderpath, remotefilename));
var request = (FtpWebRequest)FtpWebRequest.Create(uri);
request.Credentials = new NetworkCredential(userName, password);
request.Method = WebRequestMethods.Ftp.DownloadFile;
var ftpResponse = (FtpWebResponse)request.GetResponse();
/* Get the FTP Server's Response Stream */
ftpStream = ftpResponse.GetResponseStream();
return responseStream = new StreamReader(ftpStream);
For me there are two disadvantages to using the stream directly; if you can live with them, you shouldn't waste memory or disk space:
1. In this stream you cannot seek to a specific position; you can only read the contents as they come in.
2. Your internet connection could suddenly drop and you would get an exception while parsing and processing your file. Either split the parsing and processing, or make sure your processing routine can handle a file being processed a second time (after a failure halfway through the first attempt).
To work around these issues, you could copy the stream to a MemoryStream:
using (var ftpStream = ftpResponse.GetResponseStream())
{
    var memoryStream = new MemoryStream();
    var buffer = new byte[4096];
    int bytesRead;
    while ((bytesRead = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        memoryStream.Write(buffer, 0, bytesRead);
    }
    memoryStream.Flush();
    memoryStream.Position = 0;
    return memoryStream;
}
If you are working with larger files, I prefer writing to a file; this way you minimize the memory footprint of your application:
using (var ftpStream = ftpResponse.GetResponseStream())
{
    var fileStream = new FileStream(Path.GetTempFileName(), FileMode.CreateNew);
    var buffer = new byte[4096];
    int bytesRead;
    while ((bytesRead = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        fileStream.Write(buffer, 0, bytesRead);
    }
    fileStream.Flush();
    fileStream.Position = 0;
    return fileStream;
}
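On the calling side, the returned stream can then be consumed and disposed like any other stream; a hypothetical example (GetReportStream is a placeholder for whichever of the methods above you use):
// Sketch: the caller owns the returned stream and disposes it after parsing.
using (Stream reportStream = GetReportStream())
using (var reader = new StreamReader(reportStream))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // parse each line here
    }
}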
I find it more practical to return a response stream when you are performing an HttpWebRequest. If you are using FtpWebRequest, it means you are working with files. I would read the response stream into a byte[] and return the file content of the downloaded file, so you can easily work with the System.IO.File classes to handle it.
Thanks Carlos, that was really helpful. I just return the byte[]:
byte[] buffer = new byte[16 * 1024];
using (MemoryStream ms = new MemoryStream())
{
    int read;
    while ((read = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ms.Write(buffer, 0, read);
    }
    return ms.ToArray();
}
and used the byte[] in the method like this:
public async Task ParseReport(byte[] bytesRead)
{
    Stream stream = new MemoryStream(bytesRead);
    using (StreamReader reader = new StreamReader(stream))
    {
        string line = null;
        while (null != (line = reader.ReadLine()))
        {
            string[] values = line.Split(';');
        }
    }
    stream.Close();
}

How to stream video content in asp.net?

I have the following code which downloads video content:
WebRequest wreq = (HttpWebRequest)WebRequest.Create(url);
using (HttpWebResponse wresp = (HttpWebResponse)wreq.GetResponse())
using (Stream mystream = wresp.GetResponseStream())
{
    using (BinaryReader reader = new BinaryReader(mystream))
    {
        int length = Convert.ToInt32(wresp.ContentLength);
        byte[] buffer = new byte[length];
        buffer = reader.ReadBytes(length);

        Response.Clear();
        Response.Buffer = false;
        Response.ContentType = "video/mp4";
        //Response.BinaryWrite(buffer);
        Response.OutputStream.Write(buffer, 0, buffer.Length);
        Response.End();
    }
}
But the problem is that the whole file downloads before being played. How can I make it stream and play while it's still downloading? Or is this up to the client/receiver application to manage?
You're reading the entire file into a single buffer, then sending the entire byte array at once.
You should read into a smaller buffer in a while loop.
For example:
byte[] buffer = new byte[4096];
while (true)
{
    int bytesRead = myStream.Read(buffer, 0, buffer.Length);
    if (bytesRead == 0) break;
    Response.OutputStream.Write(buffer, 0, bytesRead);
}
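Putting that together with the response setup from the question (a sketch only; mystream and the content type come from the code above, and the Response.Flush / IsClientConnected calls are optional additions):
// Sketch: forward the upstream response to the client in small chunks
// instead of buffering the whole video in memory first.
Response.Clear();
Response.Buffer = false;
Response.ContentType = "video/mp4";

byte[] chunk = new byte[4096];
int bytesRead;
while ((bytesRead = mystream.Read(chunk, 0, chunk.Length)) > 0)
{
    if (!Response.IsClientConnected) break;   // stop if the client went away
    Response.OutputStream.Write(chunk, 0, bytesRead);
    Response.Flush();                         // push the chunk out immediately
}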
This is more efficient for you, especially if you need to stream a video from a file on your server, or even if the file is hosted on another server.
File on your server:
context.Response.BinaryWrite(File.ReadAllBytes(HttpContext.Current.Server.MapPath(_video.Location)));
File on external server:
var wc = new WebClient();
context.Response.BinaryWrite(wc.DownloadData(new Uri("http://mysite/video.mp4")));
Have you looked at Smooth Streaming?
Look at sample code here
