ASP.NET: Serve a large video as content with a one-time URL - C#

I need to serve a very large video (1 GB) as content, and I am using the following code to do it.
protected void Page_Load(object sender, EventArgs e)
{
    // check the token
    // proceed if the token is valid
    Response.Clear();
    Response.Buffer = false;
    Response.ContentType = "video/mp4";

    string filePath = Server.MapPath("videos/vid.mp4");
    //Context.Response.BinaryWrite(File.ReadAllBytes(filePath));
    const int chunkSize = 1024; // read the file in chunks of 1 KB
    using (var file = File.OpenRead(filePath))
    {
        int bytesRead;
        byte[] buffer = new byte[chunkSize];
        while ((bytesRead = file.Read(buffer, 0, buffer.Length)) > 0)
        {
            // write only the bytes actually read, not the whole buffer,
            // otherwise the final chunk carries stale data
            Context.Response.OutputStream.Write(buffer, 0, bytesRead);
        }
    }
    Response.End();
}
Questions
Is this the right way to do it?
The video plays, but the seek bar does not work.
How do I serve the video to the client in chunks?
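On the seek bar: browsers can generally only seek when the server honors HTTP Range requests and replies with 206 Partial Content; always streaming the file from byte zero gives the player nothing to seek against. A minimal sketch of single-range handling, reusing the file path from the question (range validation, multi-range requests, and If-Range handling omitted):

protected void Page_Load(object sender, EventArgs e)
{
    // check the token as before, then serve the requested byte range
    string filePath = Server.MapPath("videos/vid.mp4");
    long fileLength = new FileInfo(filePath).Length;
    long start = 0, end = fileLength - 1;

    string rangeHeader = Request.Headers["Range"]; // e.g. "bytes=1048576-"
    if (!string.IsNullOrEmpty(rangeHeader) && rangeHeader.StartsWith("bytes="))
    {
        string[] parts = rangeHeader.Substring(6).Split('-');
        if (parts[0].Length > 0) start = long.Parse(parts[0]);
        if (parts.Length > 1 && parts[1].Length > 0) end = long.Parse(parts[1]);
        Response.StatusCode = 206; // Partial Content
        Response.AppendHeader("Content-Range",
            string.Format("bytes {0}-{1}/{2}", start, end, fileLength));
    }

    Response.ContentType = "video/mp4";
    Response.AppendHeader("Accept-Ranges", "bytes");
    Response.AppendHeader("Content-Length", (end - start + 1).ToString());
    Response.Buffer = false;

    using (var file = File.OpenRead(filePath))
    {
        file.Seek(start, SeekOrigin.Begin);
        byte[] buffer = new byte[64 * 1024];
        long remaining = end - start + 1;
        int read;
        while (remaining > 0
            && (read = file.Read(buffer, 0, (int)Math.Min(buffer.Length, remaining))) > 0)
        {
            if (!Response.IsClientConnected) break; // stop if the player disconnects
            Response.OutputStream.Write(buffer, 0, read);
            remaining -= read;
        }
    }
    Response.End();
}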

Related

Not able to download and open a file stored on the server in ASP.NET

I have tried to download and open a file stored on the server, but I am not able to open the file after the download. Please find my sample code below. I have set Content-Disposition to inline, as mentioned in other solutions to this problem, but it still doesn't work: the file just downloads and doesn't open. Any help, please?
protected void Page_Load(object sender, EventArgs e)
{
    string filePath = this.Server.MapPath("~/Scripts/48A4-4972-AEBE-8E7C11AE64AF");
    string fileName = "crma.jpg";
    FileStream stream = null;
    HttpResponse response = HttpContext.Current.Response;
    long dataToRead;
    try
    {
        stream = new FileStream(filePath, FileMode.Open);
        dataToRead = stream.Length;
        response.ClearContent();
        response.AddHeader("Content-Disposition", "inline; filename=" + "\"" + fileName + "\"");
        response.AddHeader("Content-Length", dataToRead.ToString());
        response.ContentType = "application/jpg";
        byte[] buffer = new Byte[10000];
        while (dataToRead > 0)
        {
            if (response.IsClientConnected)
            {
                int length = stream.Read(buffer, 0, 10000);
                response.OutputStream.Write(buffer, 0, length);
                response.Flush();
                buffer = new Byte[10000];
                dataToRead = dataToRead - length;
            }
            else
            {
                dataToRead = -1;
            }
        }
    }
    finally
    {
        if (stream != null)
            stream.Close();
        response.End();
        response.Close();
    }
}
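Incidentally, one likely culprit here is the content type: "application/jpg" is not a registered MIME type, and browsers that don't recognize a type tend to download the file instead of rendering it inline. The registered type for JPEG is:

// "application/jpg" is not standard; use the registered JPEG type instead
response.ContentType = "image/jpeg";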

Failed to download files when deployed to Azure App Service

I have large-file download code that runs fine on IIS, but it fails on Azure App Service: the file downloads well and then stops when it reaches 1 GB. Is there a problem with the Azure App Service settings? Please let me know what the problem is.
This is the .ashx handler code.
public void ProcessRequest(HttpContext context)
{
    Stream stream = null;
    int bytesToRead = 10000;
    byte[] buffer = new Byte[bytesToRead];
    string Url = context.Request.QueryString["Url"];
    string FileName = HttpUtility.UrlEncode(context.Request.QueryString["FileName"]).Replace("+", "%20");
    try
    {
        HttpWebRequest fileReq = (HttpWebRequest)HttpWebRequest.Create(Url);
        HttpWebResponse fileResp = (HttpWebResponse)fileReq.GetResponse();
        if (fileReq.ContentLength > 0)
            fileResp.ContentLength = fileReq.ContentLength;
        stream = fileResp.GetResponseStream();
        var resp = HttpContext.Current.Response;
        resp.ContentType = "application/octet-stream";
        resp.AddHeader("Content-Disposition", "attachment; filename=\"" + FileName + "\"");
        resp.AddHeader("Content-Length", fileResp.ContentLength.ToString());
        int length;
        do
        {
            if (resp.IsClientConnected)
            {
                length = stream.Read(buffer, 0, bytesToRead);
                resp.OutputStream.Write(buffer, 0, length);
                resp.Flush();
                buffer = new Byte[bytesToRead];
            }
            else
            {
                // cancel the download if the client has disconnected
                length = -1;
            }
        } while (length > 0); // repeat until no data is read
    }
    finally
    {
        if (stream != null)
        {
            // close the input stream
            stream.Close();
        }
    }
}
To be honest, I didn't have any trouble using your code to download files larger than 1 GB, even after publishing to Azure.
First, HttpWebRequest doesn't have any artificial size limit. You might want to consider different download code, because it's hard to diagnose further without details from the error log and the download process.
Here is an issue that might inspire you: C# - Is there a limit to the size of an httpWebRequest stream?
If you want to try other code, try this:
static void Main(string[] args)
{
    // HttpWebRequestDownload is a custom helper class, not part of the framework
    HttpWebRequestDownload hDownload = new HttpWebRequestDownload();
    string downloadUrl = "http://speedtest.tele2.net/10MB.zip";
    hDownload.DownloadProgressChanged += HDownloadOnDownloadProgressChanged;
    hDownload.DownloadFileCompleted += delegate(object o, EventArgs args)
    {
        Debug.WriteLine("Download finished and saved to: " + hDownload.downloadedFilePath);
    };
    hDownload.Error += delegate(object o, string errMessage)
    {
        Debug.WriteLine("Error has occurred!! => " + errMessage);
    };
    hDownload.DownloadFile(downloadUrl);
}

// static so it can be subscribed from Main
private static void HDownloadOnDownloadProgressChanged(object sender, HttpWebRequestDownload.ProgressEventArgs e)
{
    Debug.WriteLine("progress: " + e.TransferredBytes + " => " + e.TransferredPercents);
}
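If the HttpWebRequestDownload helper isn't available, the built-in WebClient exposes equivalent progress and completion events. A minimal console sketch using the same test URL:

using System;
using System.Net;

class Program
{
    static void Main()
    {
        var wc = new WebClient();
        wc.DownloadProgressChanged += (s, e) =>
            Console.WriteLine("progress: " + e.BytesReceived + " bytes => " + e.ProgressPercentage + "%");
        wc.DownloadFileCompleted += (s, e) =>
            Console.WriteLine("Download finished and saved to: 10MB.zip");
        wc.DownloadFileAsync(new Uri("http://speedtest.tele2.net/10MB.zip"), "10MB.zip");
        Console.ReadLine(); // keep the console app alive while the download runs
    }
}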

Uploading image as attachment in RESTful WCF Service

I am trying to upload an image as an attachment in a REST WCF service, and I am getting the following error:
"Access to path 'C:\ImageUpload' is denied"
I have granted Full Control permissions on this folder, so I don't understand why I am getting this error.
I am new to WCF, and most of the code I gathered from online resources.
I'd appreciate it if you could let me know whether there is any mistake in my code.
Here is my code.
REST WCF Service Code:
[OperationContract]
[WebInvoke(UriTemplate = "uploadImage/{parameter1}")]
void uploadImage(Stream fileStream);

public void uploadImage(Stream fileStream)
{
    string filePath = @"C:\ImageUpload";
    FileStream filetoUpload = new FileStream(filePath, FileMode.Create);
    byte[] byteArray = new byte[10000];
    int bytesRead, totalBytesRead = 0;
    do
    {
        bytesRead = fileStream.Read(byteArray, 0, byteArray.Length);
        totalBytesRead += bytesRead;
    }
    while (bytesRead > 0);
    filetoUpload.Write(byteArray, 0, byteArray.Length);
    filetoUpload.Close();
    filetoUpload.Dispose();
}
This is my test client code (a simple .aspx web page):
protected void btnUpload_Click(object sender, EventArgs e)
{
    string file = FileUpload1.FileName;
    RESTService1Client client = new RESTService1Client();
    byte[] bytearray = null;
    string name = "";
    if (FileUpload1.HasFile)
    {
        name = FileUpload1.FileName;
        Stream stream = FileUpload1.FileContent;
        stream.Seek(0, SeekOrigin.Begin);
        bytearray = new byte[stream.Length];
        int count = 0;
        while (count < stream.Length)
        {
            bytearray[count++] = Convert.ToByte(stream.ReadByte());
        }
    }
    WebClient wclient = new WebClient();
    wclient.Headers.Add("Content-Type", "image/jpeg");
    client.uploadImage(FileUpload1.FileContent);
}
It likely has nothing to do with WCF or your code. It is highly probable that the permissions on that folder are insufficient for the IIS process user. By default, the ASP.NET user is Network Service.
Try creating a new Windows user just for your ASP.NET application, grant that user explicit read/write access to the upload folder, and then use impersonation to make ASP.NET run as that user.
http://www.codeproject.com/Articles/107940/Back-to-Basic-ASP-NET-Runtime-Impersonation
Rewrite the server side as follows:
REST WCF Service Code:
[OperationContract]
[WebInvoke(UriTemplate = "uploadImage/{parameter1}/{parameter2}")]
void uploadImage(Stream fileStream, string fileName);

public void uploadImage(Stream fileStream, string fileName)
{
    string filePath = @"C:\ImageUpload\";
    using (FileStream filetoUpload = new FileStream(filePath + fileName, FileMode.Create))
    {
        byte[] byteArray = new byte[10000];
        int bytesRead = 0;
        do
        {
            bytesRead = fileStream.Read(byteArray, 0, byteArray.Length);
            if (bytesRead > 0)
            {
                filetoUpload.Write(byteArray, 0, bytesRead);
            }
        }
        while (bytesRead > 0);
    }
}
and the client side as follows:
protected void btnUpload_Click(object sender, EventArgs e)
{
    if (FileUpload1.HasFile)
    {
        RESTService1Client client = new RESTService1Client();
        client.uploadImage(FileUpload1.FileContent, Path.GetFileName(FileUpload1.FileName));
    }
}
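As a side note, you can also exercise the service without the generated RESTService1Client proxy by posting the raw bytes directly. A sketch against a hypothetical local address (the exact route depends on how your UriTemplate binds; adjust the URL to your actual endpoint):

// Hypothetical endpoint URL; the route shape follows the UriTemplate above.
string uploadUrl = "http://localhost/RESTService1.svc/uploadImage/"
    + Path.GetFileName(FileUpload1.FileName);
using (var wclient = new WebClient())
{
    wclient.Headers.Add("Content-Type", "image/jpeg");
    wclient.UploadData(uploadUrl, "POST", FileUpload1.FileBytes);
}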

How can we show a progress bar for an upload with FtpWebRequest

I am uploading files to FTP using FtpWebRequest, and I need to show how much of the upload is done.
So far my code is:
public void Upload(string filename, string url)
{
    FileInfo fileInf = new FileInfo(filename);
    string uri = "ftp://" + url + "/" + fileInf.Name;
    FtpWebRequest reqFTP;
    //string uri = "ftp://" + Host + "/public_html/testing/blogtest/" + fileInf.Name;
    // Create an FtpWebRequest object from the URI provided
    reqFTP = (FtpWebRequest)FtpWebRequest.Create(new Uri(uri));
    // Provide the WebPermission credentials
    reqFTP.Credentials = new NetworkCredential(Username, Password);
    // By default KeepAlive is true, and the control connection is not closed
    // after a command is executed.
    reqFTP.KeepAlive = false;
    //reqFTP.UsePassive = true;
    // Specify the command to be executed.
    reqFTP.Method = WebRequestMethods.Ftp.UploadFile;
    // Specify the data transfer type.
    reqFTP.UseBinary = true;
    // Notify the server of the size of the uploaded file
    reqFTP.ContentLength = fileInf.Length;
    // The buffer size is set to 2 KB
    int buffLength = 2048;
    byte[] buff = new byte[buffLength];
    int contentLen;
    // Open a file stream (System.IO.FileStream) to read the file to be uploaded
    FileStream fs = fileInf.OpenRead();
    // Stream to which the file to be uploaded is written
    Stream strm = reqFTP.GetRequestStream();
    // Read from the file stream 2 KB at a time
    contentLen = fs.Read(buff, 0, buffLength);
    // Until the stream content ends
    while (contentLen != 0)
    {
        // Write content from the file stream to the FTP upload stream
        strm.Write(buff, 0, contentLen);
        contentLen = fs.Read(buff, 0, buffLength);
    }
    // Close the file stream and the request stream
    strm.Close();
    fs.Close();
}
The easiest approach is to use a BackgroundWorker and put your code into the DoWork event handler. Report progress with BackgroundWorker.ReportProgress.
The basic idea:
private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    var ftpWebRequest = (FtpWebRequest)WebRequest.Create("ftp://example.com");
    ftpWebRequest.Method = WebRequestMethods.Ftp.UploadFile;
    using (var inputStream = File.OpenRead(fileName))
    using (var outputStream = ftpWebRequest.GetRequestStream())
    {
        var buffer = new byte[1024 * 1024];
        int totalReadBytesCount = 0;
        int readBytesCount;
        while ((readBytesCount = inputStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            outputStream.Write(buffer, 0, readBytesCount);
            totalReadBytesCount += readBytesCount;
            var progress = totalReadBytesCount * 100.0 / inputStream.Length;
            backgroundWorker1.ReportProgress((int)progress);
        }
    }
}

private void backgroundWorker1_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    progressBar.Value = e.ProgressPercentage;
}
Make sure WorkerReportsProgress is enabled:
backgroundWorker1.WorkerReportsProgress = true;
With BackgroundWorker you can also easily implement upload cancellation.
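For completeness, a sketch of the wiring for the handlers above (e.g. in the form constructor), assuming the worker is named backgroundWorker1:

backgroundWorker1.WorkerReportsProgress = true;
backgroundWorker1.DoWork += backgroundWorker1_DoWork;
backgroundWorker1.ProgressChanged += backgroundWorker1_ProgressChanged;
// kick off the upload on a thread-pool thread
backgroundWorker1.RunWorkerAsync();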
A trivial example of an FTP upload using FtpWebRequest with a WinForms progress bar, using the Task class:
private void button1_Click(object sender, EventArgs e)
{
    // Run Upload on a background thread
    Task.Run(() => Upload());
}

private void Upload()
{
    string url = "ftp://ftp.example.com/remote/path/file.zip";
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(url);
    request.Credentials = new NetworkCredential("username", "password");
    request.Method = WebRequestMethods.Ftp.UploadFile;
    using (Stream fileStream = File.OpenRead(@"C:\local\path\file.zip"))
    using (Stream ftpStream = request.GetRequestStream())
    {
        progressBar1.Invoke(
            (MethodInvoker)delegate { progressBar1.Maximum = (int)fileStream.Length; });
        byte[] buffer = new byte[10240];
        int read;
        while ((read = fileStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            ftpStream.Write(buffer, 0, read);
            progressBar1.Invoke(
                (MethodInvoker)delegate { progressBar1.Value = (int)fileStream.Position; });
        }
    }
}
The core upload code is based on:
Upload and download a file to/from FTP server in C#/.NET
A cancellable approach using the async/await pattern's IProgress<T> interface, taking advantage of overlapped I/O where available. Refer to KB156932 to determine whether your scenario qualifies. The cancellation token is checked before opening the streams; after that, cancellation is delegated to the streams' async methods while the file is being transferred.
I have done very little benchmarking, but I suspect this is only practical when sending large files. The performance of overlapped I/O may degrade with smaller files, and especially with smaller buffer sizes.
public async Task FtpAsync(string sourceFile, Uri destinationUri, string user, SecureString password, IProgress<decimal> progress, CancellationToken token)
{
    const int bufferSize = 128 * 1024; // 128 KB buffer
    progress.Report(0m);
    var request = (FtpWebRequest)WebRequest.Create(destinationUri);
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential(user, password);
    token.ThrowIfCancellationRequested();
    using (var fileStream = new FileStream(sourceFile, FileMode.Open, FileAccess.Read, FileShare.Read, bufferSize, true))
    {
        using (var ftpStream = await request.GetRequestStreamAsync())
        {
            var buffer = new byte[bufferSize];
            int read;
            while ((read = await fileStream.ReadAsync(buffer, 0, buffer.Length, token)) > 0)
            {
                await ftpStream.WriteAsync(buffer, 0, read, token);
                var percent = 100m * ((decimal)fileStream.Position / fileStream.Length);
                progress.Report(percent);
            }
        }
    }
    var response = (FtpWebResponse)await request.GetResponseAsync();
    var success = (int)response.StatusCode >= 200 && (int)response.StatusCode < 300;
    response.Close();
    if (!success)
        throw new Exception(response.StatusDescription);
}
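A sketch of a possible call site (from an async method): Progress<T> marshals reports back to the calling context, and a CancellationTokenSource supplies the token. Paths and credentials are placeholders:

var cts = new CancellationTokenSource();
var progress = new Progress<decimal>(p => Console.WriteLine("{0:F1}%", p));

// SecureString built char-by-char purely for demonstration
var password = new SecureString();
foreach (char c in "password") password.AppendChar(c);

await FtpAsync(@"C:\local\path\file.zip",
    new Uri("ftp://ftp.example.com/remote/path/file.zip"),
    "username", password, progress, cts.Token);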
See BackgroundWorker; it lets you run a time-consuming task while the GUI stays responsive, and it also provides progress reporting and cancellation.

How to stream video content in ASP.NET?

I have the following code which downloads video content:
WebRequest wreq = (HttpWebRequest)WebRequest.Create(url);
using (HttpWebResponse wresp = (HttpWebResponse)wreq.GetResponse())
using (Stream mystream = wresp.GetResponseStream())
{
    using (BinaryReader reader = new BinaryReader(mystream))
    {
        int length = Convert.ToInt32(wresp.ContentLength);
        byte[] buffer = new byte[length];
        buffer = reader.ReadBytes(length);
        Response.Clear();
        Response.Buffer = false;
        Response.ContentType = "video/mp4";
        //Response.BinaryWrite(buffer);
        Response.OutputStream.Write(buffer, 0, buffer.Length);
        Response.End();
    }
}
But the problem is that the whole file downloads before being played. How can I make it stream and play as it's still downloading? Or is this up to the client/receiver application to manage?
You're reading the entire file into a single buffer, then sending the entire byte array at once.
You should read into a smaller buffer in a while loop.
For example:
byte[] buffer = new byte[4096];
while (true)
{
    int bytesRead = myStream.Read(buffer, 0, buffer.Length);
    if (bytesRead == 0) break;
    Response.OutputStream.Write(buffer, 0, bytesRead);
}
This is more convenient, especially if you need to stream a video from a file on your server, or even when the file is hosted on another server.
File on your server:
context.Response.BinaryWrite(File.ReadAllBytes(HttpContext.Current.Server.MapPath(_video.Location)));
File on an external server:
var wc = new WebClient();
context.Response.BinaryWrite(wc.DownloadData(new Uri("http://mysite/video.mp4")));
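Note that both one-liners above buffer the entire file in memory before anything is sent, which is costly for a large video. A sketch that forwards the external file in chunks instead, reusing the placeholder URL from above:

var req = WebRequest.Create("http://mysite/video.mp4");
using (var resp = req.GetResponse())
using (var source = resp.GetResponseStream())
{
    byte[] buffer = new byte[4096];
    int read;
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        // forward each chunk as it arrives instead of buffering the whole file
        context.Response.OutputStream.Write(buffer, 0, read);
    }
}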
Have you looked at Smooth Streaming?
Look at sample code here
