ASP.NET advanced file uploading - C#

When using a standard <input type="file" /> on an MVC 3 site, you can receive the file in your action method by declaring an input parameter of type HttpPostedFileBase and setting the form to enctype="multipart/form-data".
One problem with this approach is that the request does not complete, and is not handed off to your action method, until the entire contents of the file have been uploaded.
I would like to do some things to that file as it is being uploaded to the server. Basically, I want to receive the data asynchronously as it comes in and then handle it programmatically, byte by byte.
To accomplish this, I imagine you would need to handle that part of the request in an HttpModule or perhaps a custom HttpHandler. I am familiar with how those work, but I am not familiar with how to receive the upload data asynchronously as it comes in.
I know this is possible because I have worked with third-party components in the past that do this (normally so they can report upload progress, or cache the data to disk to avoid IIS/ASP.NET memory limitations). Unfortunately, all the components I have used are closed source, so I can't peek inside and see what they are doing.
I am not looking for code, but can someone point me in the right direction here?

Using a WCF service, you can send file streams to and from the service.
Here is the service-side receive code I use:
int chunkSize = 2048;
byte[] buffer = new byte[chunkSize];
using (System.IO.FileStream writeStream = new System.IO.FileStream(
    file.FullName, System.IO.FileMode.CreateNew, System.IO.FileAccess.Write))
{
    do
    {
        // read the next chunk from the incoming stream
        int bytesRead = request.FileByteStream.Read(buffer, 0, chunkSize);
        if (bytesRead == 0) break;

        // write the chunk to the output file
        writeStream.Write(buffer, 0, bytesRead);
    } while (true);
} // the using block disposes (and closes) writeStream
If that looks like what you want, check out the CodeProject article "File Transfer Progress"; it goes into a lot of detail, and my code is loosely based on it.
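For context, here is a minimal sketch of the contract shape this kind of receive code assumes; the FileByteStream member matches the snippet above, but the other names and the binding settings are my assumptions, not from the original:

[MessageContract]
public class UploadRequest
{
    [MessageHeader(MustUnderstand = true)]
    public string FileName;                 // metadata travels in headers

    [MessageBodyMember(Order = 1)]
    public System.IO.Stream FileByteStream; // the body is the raw stream read in the loop above
}

[ServiceContract]
public interface IFileTransferService
{
    [OperationContract]
    void UploadFile(UploadRequest request);
}

The binding also has to opt in to streaming (for example, basicHttpBinding with transferMode="Streamed" and a raised maxReceivedMessageSize); otherwise WCF buffers the entire message before your operation runs.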

Related

100s of concurrent users trying to download files (ASP.NET C# application)

I am trying to implement a file download feature in an ASP.NET application. The application would be used by, say, around 200 users concurrently to download various files.
It would be hosted on IIS 7. I do not want the application server to crash because of multiple concurrent requests.
I am assuming that by calling Context.Response.Flush() in a loop, I am flushing out all the file data read up to that point, so the application's memory usage stays uniform. What other optimizations can I make to the current code, or what other approach should be used in a scenario like this?
The requests would be for various files, and the file sizes can be anywhere between 100 KB and 10 MB.
My current code is like this:
FileStream inStr = null;
byte[] buffer = new byte[1024];
String fileName = @"C:\DwnldTest\test.doc";
long byteCount;
inStr = File.OpenRead(fileName);
Response.AddHeader("content-disposition", "attachment;filename=test.doc");
while ((byteCount = inStr.Read(buffer, 0, buffer.Length)) > 0)
{
    if (Context.Response.IsClientConnected)
    {
        Context.Response.ContentType = "application/msword";
        //Context.Response.BufferOutput = true;
        Context.Response.OutputStream.Write(buffer, 0, buffer.Length);
        Context.Response.Flush();
    }
}
You can use Response.TransmitFile to save server memory when sending files.
Response.ContentType = "application/pdf";
Response.AddHeader("content-disposition", "attachment; filename=testdoc.pdf");
Response.TransmitFile(@"e:\inet\www\docs\testdoc.pdf");
Response.End();
In your code example, you're not closing / disposing inStr. That could affect performance.
Another, simpler way to do this would be to use the built-in method:
WriteFile
It should already be optimized and will take care of opening/closing the file for you; a minimal sketch follows.
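For illustration, a sketch of that approach, reusing the path and content type from the question (everything here is the standard HttpResponse API):

Response.ContentType = "application/msword";
Response.AddHeader("content-disposition", "attachment; filename=test.doc");
Response.WriteFile(@"C:\DwnldTest\test.doc"); // opens, streams, and closes the file for you
Response.End();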
Maybe you want to use the FileSystemWatcher class to check whether the file was modified, and re-read it into memory only when such a change is detected; the rest of the time, just return the byte array already held in memory. I don't know whether HttpResponse.WriteFile is sensitive to such file modifications or always reads the file from the given path, but it also seems like a good option, since the framework provides it out of the box.
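A rough sketch of that caching idea, assuming a single watched folder (the cache, folder path, and helper names are illustrative, not from this answer):

static readonly System.Collections.Concurrent.ConcurrentDictionary<string, byte[]> cache =
    new System.Collections.Concurrent.ConcurrentDictionary<string, byte[]>();
static readonly FileSystemWatcher watcher = new FileSystemWatcher(@"C:\DwnldTest");

static void StartWatching()
{
    // drop the cached copy whenever the file changes on disk
    watcher.Changed += (s, e) => { byte[] ignored; cache.TryRemove(e.FullPath, out ignored); };
    watcher.EnableRaisingEvents = true;
}

static byte[] GetFileBytes(string path)
{
    // hits the disk only when no cached copy exists (i.e. after an invalidation)
    return cache.GetOrAdd(path, File.ReadAllBytes);
}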
Since you are sending an existing file to the client, consider using HttpResponse.TransmitFile (http://msdn.microsoft.com/en-us/library/12s31dhy.aspx).
Looking at the .NET code, it seems that this forwards the file writing to IIS instead of reading/writing it in the ASP.NET process. HttpResponse.WriteFile(string, false) and HttpResponse.Write(string) seem to do the same thing.
To verify that the file sending is relayed to IIS, look at the HttpResponse.Output property - it should be of type HttpWriter. The HttpWriter._buffers array should then contain a new HttpFileResponseElement.
Of course, you should always investigate whether caching is appropriate in your scenario, and test that it is actually being used.

Sending files over TCP / .NET SslStream is slow/not working

I'm writing a server/client application which works with SSL (over SslStream) and has to do many things (not only file receiving/sending). Currently it works like this: there is only one connection. I always send data from the client/server using SslStream.WriteLine() and receive it using SslStream.ReadLine(), because that way I can send all information over one connection, and I can send from all threads without corrupting the data.
Now I wanted to implement file sending and receiving. Like other things in my client/server apps, every message has a prefix (like cl_files or similar) and a base64-encoded content part (prefix and content are separated by |). I implemented the file sharing like this: the uploader sends the receiver a message with the total file size, and after that the uploader sends the base64-encoded parts of the file under the prefix r.
My problem is that the file transfer is really slow. I get around 20 KB/s from localhost to localhost. I also have another problem: if I increase the size of the base64-encoded parts of the file (which makes the transfer faster), the prefix r no longer reaches the receiver (so the data can't be identified).
How can I make it faster?
Any help will be greatly appreciated.
My (probably bad) client code:
// runs inside a worker thread
FileInfo x = new FileInfo(ThreadInfos.Path);
long size = x.Length; // total file size
long cursize = 0;
FileStream fs = new FileStream(ThreadInfos.Path, FileMode.Open);
int readblocks = 0;
while (cursize < size)
{
    byte[] buffer = new byte[4096];
    readblocks = fs.Read(buffer, 0, 4096);
    // sends the encoded data with the prefix r over SslStream.WriteLine
    ServerConnector.send("r", getBase64FromBytes(buffer));
    cursize = cursize + Convert.ToInt64(readblocks);
    ThreadInfos.wait.setvalue((cursize / size) * 100); // reports progress to the GUI
}
fs.Close();
For the Server:
case "r"://switch case for prefixes
if (isreceiving)
{
byte[] buffer = getBytesFromBase64(splited[1]);//splited ist the received Line over ReadLine splitted by the seperator "|"
rsize = rsize + buffer.LongLength;
writer.Write(buffer, 0, buffer.Length);//it writes the decoded data into the file
if (rsize == rtotalsize)//checks if file is completed
{
writer.Close();
}
}
break;
Your problem stems from the fact that you are performing what is essentially a binary operation through a text protocol and you are exacerbating that problem by doing it over an encrypted channel. I'm not going to re-invent this for you, but here are some options...
Consider converting to an HTTPS client/server model instead of reinventing the wheel. This will give you a well-defined model for PUT/GET operations on files.
If you can not (or will not) convert to HTTPS, consider other client/server libraries that provide a secure transport and well-defined protocol for binary data. For example, I often use protobuf-csharp-port and protobuf-csharp-rpc to provide a secure protocol and transport within our datacenter or local network.
If you are stuck with your transport being a raw SslStream, try using a well-defined and proven binary serialization framework like protobuf-csharp-port or protobuf-net to define your protocol.
Lastly, if you must continue with the framework you have, try some HTTP-like tricks: write a name/value pair as text that describes the raw binary content that follows, then send the bytes themselves unencoded (see the sketch below).
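For instance, a minimal sketch of length-prefixed framing over the existing stream, which avoids base64 entirely (the method names and the BinaryWriter/BinaryReader usage are illustrative, not from this answer; keep one writer/reader per connection, since disposing them would close the underlying SslStream):

static void SendFrame(BinaryWriter writer, string prefix, byte[] payload)
{
    writer.Write(prefix);         // length-prefixed string header
    writer.Write(payload.Length); // 4-byte payload length
    writer.Write(payload);        // raw bytes - no base64 expansion
    writer.Flush();
}

static byte[] ReceiveFrame(BinaryReader reader, out string prefix)
{
    prefix = reader.ReadString();    // reads the same length-prefixed header
    int length = reader.ReadInt32();
    return reader.ReadBytes(length); // blocks until the full payload arrives
}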
First of all, base64 over SSL will be slow anyway; SSL itself is slower than raw transport. File transfers are not done over base64 nowadays; the HTTP protocol is much more stable than anything else, and most HTTP libraries on all platforms are very mature. Base64 also takes more space than the actual data, plus the time to encode.
Also, the following line may be a problem:
ThreadInfos.wait.setvalue((cursize / size) * 100); // reports progress to the GUI
If this line blocks, it will slow you down on every 4 KB. Updating the UI for every 4 KB is also not right; unless the progress value differs from the previous one by a significant amount, there is no need to update the UI.
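As a small illustration of that advice, a sketch that only touches the GUI when the integer percentage actually changes, reusing the identifiers from the question:

int lastPercent = -1;
// ... inside the send loop ...
int percent = (int)(cursize * 100 / size);
if (percent != lastPercent) // skip redundant GUI updates
{
    lastPercent = percent;
    ThreadInfos.wait.setvalue(percent);
}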
I'd give gzip compression before/after the network a try. From my experience, it helps. Some code like this could help:
// leaveOpen: true keeps the underlying SslStream open after the GZipStream is disposed
using (GZipStream stream = new GZipStream(sslStream, CompressionMode.Compress, true))
{
    stream.Write(...);
    stream.Flush();
}
Warning: it may interfere with SSL if the Flush is not done, and it will need some tests... I didn't try to compile the code.
I think Akash Kava is right.
while (cursize < size)
{
    DateTime start = DateTime.Now;
    byte[] buffer = new byte[4096];
    readblocks = fs.Read(buffer, 0, 4096);
    ServerConnector.send("r", getBase64FromBytes(buffer));
    DateTime end = DateTime.Now;
    Console.WriteLine((end - start).TotalSeconds); // time spent reading and sending
    cursize = cursize + Convert.ToInt64(readblocks);
    ThreadInfos.wait.setvalue((cursize / size) * 100);
    end = DateTime.Now;
    Console.WriteLine((end - start).TotalSeconds); // time including the GUI update
}
By doing this you can find out where the bottleneck is.
Also, the way you are sending data packets to the server is not robust.
Is it possible to paste your implementation of
ThreadInfos.wait.setvalue((cursize / size) * 100);

Find Length of Stream object in WCF Client?

I have a WCF service which uploads a document using the Stream class.
Now, after the upload, I want to get the size of the document (the length of the stream) to update the FileSize file attribute.
But when I do this, WCF throws an exception saying:
Document Upload Exception: System.NotSupportedException: Specified method is not supported.
at System.ServiceModel.Dispatcher.StreamFormatter.MessageBodyStream.get_Length()
at eDMRMService.DocumentHandling.UploadDocument(UploadDocumentRequest request)
Can anyone help me solve this?
Now, after the upload, I want to get the size of the document (the length of the stream) to update the FileSize file attribute.
No, don't do that. If you are writing a file, then just write the file. At the simplest:
using(var file = File.Create(path)) {
source.CopyTo(file);
}
or before 4.0:
using(var file = File.Create(path)) {
byte[] buffer = new byte[8192];
int read;
while((read = source.Read(buffer, 0, buffer.Length)) > 0) {
file.Write(buffer, 0, read);
}
}
(which does not need to know the length in advance)
Note that some WCF options (full message security, etc.) require the entire message to be validated before processing, so they can never truly stream. So: if the size is huge, I suggest you instead use an API where the client splits the file and sends it in pieces (which you then reassemble at the server); a sketch follows.
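For illustration, the chunked API might look something like this; the interface, operation names, and chunk bookkeeping are hypothetical, not an existing WCF contract:

[ServiceContract]
public interface IChunkedUpload
{
    [OperationContract]
    Guid BeginUpload(string fileName);             // opens a server-side session

    [OperationContract]
    void UploadChunk(Guid uploadId, byte[] chunk); // appended in order at the server

    [OperationContract]
    long EndUpload(Guid uploadId);                 // returns the assembled file's length
}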
If the stream doesn't support seeking, you cannot find its length using Stream.Length.
The alternative is to copy the stream and accumulate the number of bytes as you go. This involves processing the whole stream first; if you don't want that, you should add a stream-length parameter to your WCF service interface.
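A minimal sketch of that counting-while-copying idea, building on the File.Create loop shown earlier (the total variable is the only addition):

long total = 0;
byte[] buffer = new byte[8192];
int read;
using (var file = File.Create(path))
{
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        file.Write(buffer, 0, read);
        total += read; // accumulate the length as the bytes stream past
    }
}
// total now holds the stream's length without ever calling source.Length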

How to copy one Stream object's values to a second Stream object in ASP.NET

In my project, users can upload files up to 1 GB. I want to copy the uploaded file's stream data to a second stream.
If I use this:
int i;
while ((i = fuVideo.FileContent.ReadByte()) != -1)
{
    strm.WriteByte((byte)i);
}
then it takes a very long time.
If I try to do this with a byte array, I would need an array size expressed as a long, which is not valid.
If someone has a better idea how to do this, please let me know.
--
Hi Khepri, thanks for your response. I tried Stream.CopyTo, but it takes a very long time to copy one stream object to the second.
I tried with an 8.02 MB file and it took 3 to 4 minutes.
The code I have added is:
Stream fs = fuVideo.FileContent; //fileInf.OpenRead();
Stream strm = ftp.GetRequestStream();
fs.CopyTo(strm);
If I am doing something wrong, please let me know.
Is this .NET 4.0?
If so, Stream.CopyTo is probably your best bet.
If not, and to give credit where credit is due, see the answer in this SO thread. If you're not on .NET 4.0, make sure to read the comments in that thread, as there are some alternative solutions (async stream reading/writing) that may be worth investigating if performance is at an absolute premium, which may be your case.
EDIT:
Based on the update: are you trying to copy the file to another remote destination? (Just guessing, based on GetRequestStream().) The time is going into the actual transfer of the file content to the destination. So when you do fs.CopyTo(strm), it has to move those bytes from the source stream to the remote server. That's where the time is coming from: you're literally doing a file upload of a huge file. CopyTo will block your processing until it completes.
I'd recommend spinning this kind of processing off to another task, or at least looking at the asynchronous option I listed. You can't really avoid this taking a long time; you're constrained by the file size and your available upload bandwidth.
I verified that when working locally, CopyTo is sub-second. I tested with a half-gig file, and a quick Stopwatch returned a processing time of 800 milliseconds.
If you are not on .NET 4.0, use this:
static void CopyTo(Stream fromStream, Stream destination, int bufferSize)
{
    int num;
    byte[] buffer = new byte[bufferSize];
    // read and write until the source stream is exhausted
    while ((num = fromStream.Read(buffer, 0, buffer.Length)) != 0)
    {
        destination.Write(buffer, 0, num);
    }
}

SOAP: getting progress of the upload request while it's uploading (C#)

I'm trying to upload a file through a SOAP request, and it works perfectly, but I couldn't get a progress for the uploaded portion of the request.
You could try sending the file up in "chunks", like 1 MB at a time, rather than sending it all up at once. That way, each time a chunk completes, you'll be able to update the progress; see the sketch below.
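A rough sketch of that chunking idea (the UploadChunk service call and the ReportProgress callback are hypothetical placeholders, not a real SOAP API):

const int ChunkSize = 1024 * 1024; // 1 MB per chunk
byte[] chunk = new byte[ChunkSize];
long sent = 0;
long total = new FileInfo(path).Length;

using (FileStream fs = File.OpenRead(path))
{
    int read;
    while ((read = fs.Read(chunk, 0, chunk.Length)) > 0)
    {
        service.UploadChunk(fileName, sent, chunk, read); // hypothetical web-service call
        sent += read;
        ReportProgress((double)sent * 100 / total);       // hypothetical progress callback
    }
}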
I can answer my own question now.
I'm not using SOAP anymore to upload my files in my solution; I'm using HttpWebRequest now.
1) Yes, I'm uploading my large files in chunks (each chunk is 1 MB).
2) Each chunk (1 MB) can report progress once per buffer (4 KB in my case).
So there is a big loop, foreach (chunk in file) { },
and inside the big loop there is another loop, since I'm using HttpWebRequest:
int bufferSize = 4096;
byte[] buffer = new byte[bufferSize];
Stream stm = request.GetRequestStream();
int bytesRead;
while (remainingOfChunkWithReq != 0)
{
    // read the next 4 KB from the chunk, then push it onto the request stream
    bytesRead = memoryStream.Read(buffer, 0, bufferSize);
    stm.Write(buffer, 0, bytesRead);
    remainingOfChunkWithReq = remainingOfChunkWithReq - bytesRead;
    //Send Progress
}
Then I continue to send the request and receive the response.
