C#: upload in real time from client to server to Amazon S3

Good morning, I have a desktop app that uploads files to a WCF service, and the WCF service then uploads them to Amazon S3.
This is my WCF method that receives the file and uploads it to S3:
public void UploadFile(RemoteFileInfo request)
{
    config = new AmazonS3Config();
    config.CommunicationProtocol = Protocol.HTTP;
    accessKeyID = "XXXXXXX";
    secretAccessKeyID = "YYYYYYYY";
    client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKeyID, secretAccessKeyID, config);
    int chunkSize = 2048;
    byte[] buffer = new byte[chunkSize];
    using (System.IO.MemoryStream writeStream = new System.IO.MemoryStream())
    {
        do
        {
            // read bytes from the input stream
            int bytesRead = request.FileByteStream.Read(buffer, 0, chunkSize);
            if (bytesRead == 0) break;
            // simulates a slow connection
            System.Threading.Thread.Sleep(3);
            // write bytes to the output stream
            writeStream.Write(buffer, 0, bytesRead);
        } while (true);
        // report end
        Console.WriteLine("Done!");
        // start the upload to S3
        PutObjectRequest fileRequest = new PutObjectRequest();
        fileRequest.WithInputStream(writeStream);
        fileRequest.Key = "testfile.pdf";
        fileRequest.WithBucketName("tempbucket");
        fileRequest.CannedACL = S3CannedACL.Private;
        fileRequest.StorageClass = S3StorageClass.Standard;
        client.PutObject(fileRequest);
        writeStream.Close();
    }
}
On the client I get the progress in real time while uploading the file to the WCF service, but reaching 100% complete doesn't mean the file has been uploaded to S3 yet. So I would like to know whether it's possible to be uploading the file to S3 while I'm still writing the stream, i.e. inside the using (System.IO.MemoryStream writeStream = new System.IO.MemoryStream()) block.
Is this possible? Any guidelines on how to do it?
Thanks in advance.

You can use the InputStream property of PutObjectRequest:
public void UploadFile(RemoteFileInfo request)
{
    config = new AmazonS3Config();
    config.CommunicationProtocol = Protocol.HTTP;
    accessKeyID = "XXXXXXX";
    secretAccessKeyID = "YYYYYYYY";
    client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKeyID, secretAccessKeyID, config);
    int chunkSize = 2048;
    byte[] buffer = new byte[chunkSize];
    PutObjectRequest fileRequest = new PutObjectRequest();
    fileRequest.Key = "testfile.pdf";
    fileRequest.WithBucketName("tempbucket");
    fileRequest.CannedACL = S3CannedACL.Private;
    fileRequest.StorageClass = S3StorageClass.Standard;
    using (fileRequest.InputStream = new System.IO.MemoryStream())
    {
        do
        {
            // read bytes from the input stream
            int bytesRead = request.FileByteStream.Read(buffer, 0, chunkSize);
            if (bytesRead == 0) break;
            // simulates a slow connection
            System.Threading.Thread.Sleep(3);
            // write bytes to the output stream
            fileRequest.InputStream.Write(buffer, 0, bytesRead);
        } while (true);
        // report end
        Console.WriteLine("Done!");
        // rewind so PutObject reads the stream from the beginning
        fileRequest.InputStream.Position = 0;
        client.PutObject(fileRequest);
    }
}
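Note that this still buffers the whole file in memory before PutObject is called, so the S3 transfer only starts after the WCF transfer finishes. To genuinely overlap the two, you could push each chunk to S3 as it arrives using S3's multipart upload API. Below is a minimal sketch, not code from the question: it assumes the newer AWSSDK.S3 request/response names, a configured IAmazonS3 client, and the same hard-coded bucket and key. S3 requires every part except the last to be at least 5 MB.
// Sketch only: stream an incoming WCF stream to S3 in 5 MB parts so the
// S3 upload overlaps the receive.
// Requires: using Amazon.S3; using Amazon.S3.Model;
//           using System.Collections.Generic; using System.IO;
private static void StreamToS3(IAmazonS3 client, Stream source)
{
    const int partSize = 5 * 1024 * 1024; // S3 minimum size for all parts but the last
    var init = client.InitiateMultipartUpload(new InitiateMultipartUploadRequest
    {
        BucketName = "tempbucket",
        Key = "testfile.pdf"
    });
    var etags = new List<PartETag>();
    byte[] buffer = new byte[partSize];
    int partNumber = 1;
    int filled;
    while ((filled = FillBuffer(source, buffer)) > 0)
    {
        using (var partStream = new MemoryStream(buffer, 0, filled))
        {
            var part = client.UploadPart(new UploadPartRequest
            {
                BucketName = "tempbucket",
                Key = "testfile.pdf",
                UploadId = init.UploadId,
                PartNumber = partNumber,
                InputStream = partStream
            });
            etags.Add(new PartETag(partNumber, part.ETag));
        }
        partNumber++;
    }
    client.CompleteMultipartUpload(new CompleteMultipartUploadRequest
    {
        BucketName = "tempbucket",
        Key = "testfile.pdf",
        UploadId = init.UploadId,
        PartETags = etags
    });
}
// reads until the buffer is full or the stream ends; returns the bytes read
private static int FillBuffer(Stream source, byte[] buffer)
{
    int total = 0, read;
    while (total < buffer.Length &&
           (read = source.Read(buffer, total, buffer.Length - total)) > 0)
        total += read;
    return total;
}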

I would recommend uploading the file to the WCF service as chunks instead of a stream. I did so and it works just fine. You also need to return the actual number of bytes written to Amazon, so that later on you can advance the progress bar based on that. I know this forces you to write a while loop in the client application (see the sketch after the contract below), but it lets you show progress with 100% accuracy for large files. Your WCF function should take a parameter like this:
[DataContract]
public class RemoteFileInfo
{
    [DataMember]
    public byte[] myChunk;
    [DataMember]
    public long myOffset;
    // other stuff you think you need to be sent each time.
}
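For illustration, here is a minimal client-side sketch of the while loop described above; the proxy.UploadChunk operation, its return value (total bytes the service has persisted so far), and ReportProgress are hypothetical names, not part of the original contract:
// Hypothetical client loop for the chunked contract above.
using (FileStream fs = File.OpenRead(@"C:\temp\testfile.pdf"))
{
    const int chunkSize = 64 * 1024;
    byte[] buffer = new byte[chunkSize];
    long offset = 0;
    int read;
    while ((read = fs.Read(buffer, 0, chunkSize)) > 0)
    {
        byte[] payload = new byte[read];
        Buffer.BlockCopy(buffer, 0, payload, 0, read); // trim the final, partial chunk
        var info = new RemoteFileInfo { myChunk = payload, myOffset = offset };
        long written = proxy.UploadChunk(info); // hypothetical: returns bytes written so far
        offset += read;
        ReportProgress(written, fs.Length);     // hypothetical: drive the progress bar off server acks
    }
}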

Related

WCF: ZIP file generation

I am uploading a file using a WCF service.
The client side code looks like this:
System.ServiceModel.EndpointAddress endPointAddress = new System.ServiceModel.EndpointAddress(address);
IFileStreamService proxy = new FileStreamServiceClient("FileTransfer", endPointAddress);
proxy.UploadFile(uploadReq); //upload the file
The server-side code includes:
public void UploadFile(FileUploadRequest uploadRequest)
{
    try
    {
        string targetFile = uploadRequest.getTargetFile();
        Stream sourceStream = uploadRequest.FileByStream;
        Log("Going to read stream from client");
        using (FileStream outfile = new FileStream(targetFile, FileMode.Create))
        {
            const int bufferSize = 65536; // 64K chunks
            byte[] buffer = new byte[bufferSize];
            int bytesRead = sourceStream.Read(buffer, 0, bufferSize);
            while (bytesRead > 0)
            {
                outfile.Write(buffer, 0, bytesRead);
                bytesRead = sourceStream.Read(buffer, 0, bufferSize);
            }
        }
    }
    finally // a try block needs a catch or finally to compile; close the client stream here
    {
        uploadRequest.FileByStream.Close();
    }
}
I am not doing any zipping on the client side, but on the server side a zip file is being created, named like this:
temp_c7dfbfd7-3495-43b6-a0ec-e46153cef72a.zip
The zip file is also corrupted. Can anybody point me to where things are going wrong?

Request stream fails to write

I have to upload a large file to the server with the following code snippet:
static async Task LordNoBugAsync(string token, string filePath, string uri)
{
    HttpWebRequest fileWebRequest = (HttpWebRequest)WebRequest.Create(uri);
    fileWebRequest.Method = "PATCH";
    fileWebRequest.AllowWriteStreamBuffering = false; // this line tells it to upload in chunks
    fileWebRequest.ContentType = "application/x-www-form-urlencoded";
    fileWebRequest.Headers["Authorization"] = "PHOENIX-TOKEN " + token;
    fileWebRequest.KeepAlive = false;
    fileWebRequest.Timeout = System.Threading.Timeout.Infinite;
    fileWebRequest.Proxy = null;
    using (FileStream fileStream = File.OpenRead(filePath))
    {
        fileWebRequest.ContentLength = fileStream.Length; // length is required to upload in chunks
        int bufferSize = 512000;
        byte[] buffer = new byte[bufferSize];
        int lastBytesRead = 0;
        int byteCount = 0;
        Stream requestStream = fileWebRequest.GetRequestStream();
        requestStream.WriteTimeout = System.Threading.Timeout.Infinite;
        while ((lastBytesRead = fileStream.Read(buffer, 0, bufferSize)) != 0)
        {
            if (lastBytesRead > 0)
            {
                await requestStream.WriteAsync(buffer, 0, lastBytesRead);
                // for some reason this didn't really write to the stream,
                // but the buffer does have content, >60MB
                byteCount += lastBytesRead; // count the bytes actually read, not the buffer size
            }
        }
        requestStream.Flush();
        try
        {
            requestStream.Close();
            requestStream.Dispose();
        }
        catch
        {
            Console.Write("Error");
        }
        try
        {
            fileStream.Close();
            fileStream.Dispose();
        }
        catch
        {
            Console.Write("Error");
        }
    }
    ...getting response parts...
}
In the code, I create an HttpWebRequest and push the content to the server with write-stream buffering disabled. The code works perfectly for any file under 60MB.
I tried a 70MB PDF. The buffer array has different content on each iteration, yet the request stream does not seem to get written. byteCount also reached 70M, showing the file is read properly.
Edit (more info): I set a breakpoint at requestStream.Close(). It clearly takes ~2 minutes for the request stream to be written for 60MB files, but only 2ms for 70MB files.
My calling code:
Task magic = LordNoBugAsync(token, nameofFile, path);
magic.Wait();
I am sure my calling code is correct (it works for files from 0B to 60MB).
Any advice or suggestion is much appreciated.
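For comparison, here is a minimal sketch of the same chunked PATCH upload using HttpClient, which streams a StreamContent without buffering the whole body once the content length is known. The URI, token header, content type, and buffer size come from the question; everything else is an assumption, not a diagnosis of the bug:
// Sketch only: the same upload via HttpClient/StreamContent.
// Requires: using System.Net.Http; using System.Net.Http.Headers;
//           using System.IO; using System.Threading.Tasks;
static async Task UploadAsync(string token, string filePath, string uri)
{
    using (var client = new HttpClient { Timeout = System.Threading.Timeout.InfiniteTimeSpan })
    using (FileStream fileStream = File.OpenRead(filePath))
    {
        var request = new HttpRequestMessage(new HttpMethod("PATCH"), uri)
        {
            Content = new StreamContent(fileStream, 512000) // 500KB read buffer, as in the question
        };
        request.Content.Headers.ContentType =
            new MediaTypeHeaderValue("application/x-www-form-urlencoded");
        request.Content.Headers.ContentLength = fileStream.Length;
        request.Headers.TryAddWithoutValidation("Authorization", "PHOENIX-TOKEN " + token);
        using (HttpResponseMessage response = await client.SendAsync(request))
        {
            response.EnsureSuccessStatusCode();
        }
    }
}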

Upload file from WinRT to WCF

Help again please. I managed to upload a file from ASP.NET to my WCF service and it works like a charm. Now I want to do the same thing from WinRT without success. My file upload service is based on this post http://www.seesharpdot.net/?p=214. From ASP.NET I upload the file using this code
string filePath = Server.MapPath("~/Files/Happy.jpg");
string fileName = "Happy.jpg";
ServiceReference1.FileMetaData metadata = new ServiceReference1.FileMetaData();
metadata.LocalFilename = fileName;
metadata.FileType = ".jpg";
fileStream = new FileInfo(filePath).OpenRead();
oService.UploadFile(metadata, fileStream);
byte[] buffer = new byte[2048];
int bytesRead = fileStream.Read(buffer, 0, 2048);
while (bytesRead > 0)
{
    fileStream.Write(buffer, 0, 2048);
    bytesRead = fileStream.Read(buffer, 0, 2048);
}
From WinRT I thought this would work, but it does not. No exception is thrown.
FileOpenPicker openPicker = new FileOpenPicker();
openPicker.ViewMode = PickerViewMode.Thumbnail;
openPicker.SuggestedStartLocation = PickerLocationId.PicturesLibrary;
openPicker.FileTypeFilter.Add(".jpg");
openPicker.FileTypeFilter.Add(".jpeg");
openPicker.FileTypeFilter.Add(".png");
StorageFile file = await openPicker.PickSingleFileAsync();
if (file != null)
{
    byte[] bytes = await GetByteFromFile(file);
    await App.ServiceInstance.UploadFileAsync(bytes);
}
// This is the method to convert the StorageFile to a byte[]
private async Task<byte[]> GetByteFromFile(StorageFile storageFile)
{
    var stream = await storageFile.OpenReadAsync();
    using (var dataReader = new DataReader(stream))
    {
        var bytes = new byte[stream.Size];
        await dataReader.LoadAsync((uint)stream.Size);
        dataReader.ReadBytes(bytes);
        return bytes;
    }
}
What is interesting is that my WCF service method only accepts a byte array (byte[]) as a parameter and ignores the MessageContract. Do I need to change my WCF service? How would you recommend I go about fixing this? Any help appreciated.
My WCF Service:
public void UploadFile(FileUploadMessage request)
{
    Stream fileStream = null;
    Stream outputStream = null;
    try
    {
        fileStream = request.FileByteStream;
        string rootPath = HttpContext.Current.Server.MapPath("~\\Files"); // ConfigurationManager.AppSettings["RootPath"].ToString();
        string newFileName = Path.Combine(rootPath, request.MetaData.LocalFileName);
        outputStream = new FileInfo(newFileName).OpenWrite();
        const int bufferSize = 1024;
        byte[] buffer = new byte[bufferSize];
        int bytesRead = fileStream.Read(buffer, 0, bufferSize);
        while (bytesRead > 0)
        {
            outputStream.Write(buffer, 0, bytesRead);
            bytesRead = fileStream.Read(buffer, 0, bufferSize);
        }
    }
    catch (IOException ex)
    {
        throw new FaultException<IOException>(ex, new FaultReason(ex.Message));
    }
    finally
    {
        if (fileStream != null)
        {
            fileStream.Close();
        }
        if (outputStream != null)
        {
            outputStream.Close();
        }
    }
}
I had to implement the same thing, but the WinRT generation of the proxy library is different from the one for desktop (console application).
I had to take MTOM out of the binding and leave the WCF service parameter as a Stream type.
This still allowed me to upload the document as required. However, on the service I named the file after its MD5 checksum value. The Windows 8 app then sent another message to the service, with the parameters being the MD5 checksum (calculated on the WinRT device) along with the file metadata. The WCF service then looked for the file with that MD5 checksum and renamed it.
So it's a two-step process, which I see as an immediate workaround and which I am happy with.
Happy to share the code for the MD5 checksum on the service and WinRT side if required.
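For reference, here is a minimal sketch of the service-side MD5 step described above (an illustration, not the poster's shared code); on WinRT, HashAlgorithmProvider from Windows.Security.Cryptography.Core plays the same role:
// Sketch only: compute an MD5 checksum string for a file on the service side.
// Requires: using System.IO; using System.Security.Cryptography;
static string ComputeMd5(string path)
{
    using (var md5 = MD5.Create())
    using (FileStream stream = File.OpenRead(path))
    {
        byte[] hash = md5.ComputeHash(stream);
        // hex string, e.g. "9e107d9d372bb6826bd81d3542a419d6"
        return BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
    }
}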

Asynchronous File IO stops after a few megabytes sent

I am trying to create an asynchronous TPL file server using sockets and NetworkStream. When testing it, my browser downloads a small HTML file (1.9 KB) just fine, and sometimes even the JavaScript or CSS files it links to, but it won't download much more from the HTML page, including Flash, images, etc. I receive no errors, including no connection errors. I can download a 96K image, but that's about the limit. I set Connection: Keep-Alive in all response headers.
Does anyone know why my output streaming seems to be stalling?
async Task WriteToStream(NetworkStream _networkStream, string filePath, int startingPoint = 0)
{
    using (FileStream sourceStream = new FileStream(filePath,
        FileMode.Open, FileAccess.Read, FileShare.Read,
        bufferSize: 4096, useAsync: true))
    {
        byte[] buffer = new byte[4096];
        int numRead;
        while ((numRead = await sourceStream.ReadAsync(buffer, 0, buffer.Length)) != 0)
        {
            _networkStream.Write(buffer, 0, numRead);
        }
    }
}
I also tried replacing this:
_networkStream.Write(buffer, 0, numRead);
with this:
await _networkStream.WriteAsync(buffer, 0, numRead);
and I still have the same problem.
I'm using sockets because I can't use HttpListener or TcpListener classes since I need to access incoming UDP and TCP requests.
I can call WriteToStream() with this simplified method:
private async void SendFileExample()
{
//This method is only for demonstration, so parameters are hardcoded.
// Get info and assemble header
string file = #"C:\www\webpage.html";
byte[] data = null;
string responseCode = "200 OK";
string contentType = "text/html";
long dataLength = 1901;
string serverName = "my Stack Overflow server is overflowing with...";
string header = string.Format("HTTP/1.1 {0}\r\n"
+ "Server: {1}\r\n"
+ "Content-Length: {2}\r\n"
+ "Content-Type: {3}\r\n"
+ "Connection: Keep-Alive\r\n"
+ "\r\n",
responseCode, serverName, dataLength, contentType);
var headerBytes = Encoding.ASCII.GetBytes(header);
//send header
await _networkStream.WriteAsync(headerBytes, 0, headerBytes.Length);
//send payload
await WriteToStream(_networkStream, file, 0);
//flush networkstream
await _networkStream.FlushAsync();
}
EDIT:
Here is what calls the listen loop:
_listenTask = Task.Factory.StartNew(() => ListenLoop());
Here is the loop that spools the requests, spawning a client each time:
private async void ListenLoop()
{
    for (;;)
    {
        // Wait for a connection
        var socket = await _tcpListener.AcceptSocketAsync();
        if (socket == null)
            break;
        // Got a new connection, create a client handler for it
        var client = new Client(socket, dbInfo, frmClient);
        // Create a task to handle the new connection
        Task.Factory.StartNew(client.Do);
    }
}
Connections are handled by this method:
public async void Do()
{
    byte[] buffer = new byte[4096];
    for (;;)
    {
        // Read a chunk of data
        int bytesRead = await _networkStream.ReadAsync(buffer, 0, buffer.Length);
        // If Read returns no data then the connection is closed.
        if (bytesRead == 0)
            return;
        // Write to the buffer and process the request
        _memoryStream.Seek(0, SeekOrigin.End);
        _memoryStream.Write(buffer, 0, bytesRead);
        bool done = ProcessHeader();
        if (done)
            break;
    }
}
ProcessHeader() mostly just gets metadata like MIME types, then passes the stream to the WriteToStream() method at the top of this post.

FileResult buffered to memory

I'm trying to return large files via a controller ActionResult and have implemented a custom FileResult class like the following.
public class StreamedFileResult : FileResult
{
    private string _FilePath;

    public StreamedFileResult(string filePath, string contentType)
        : base(contentType)
    {
        _FilePath = filePath;
    }

    protected override void WriteFile(System.Web.HttpResponseBase response)
    {
        using (FileStream fs = new FileStream(_FilePath, FileMode.Open, FileAccess.Read))
        {
            int bufferLength = 65536;
            byte[] buffer = new byte[bufferLength];
            int bytesRead = 0;
            while (true)
            {
                bytesRead = fs.Read(buffer, 0, bufferLength);
                if (bytesRead == 0)
                {
                    break;
                }
                response.OutputStream.Write(buffer, 0, bytesRead);
            }
        }
    }
}
However, the problem I am having is that the entire file appears to be buffered into memory. What would I need to do to prevent this?
You need to flush the response in order to prevent buffering. However, if you keep on flushing without setting Content-Length, the user will not see any progress. So, in order for users to see proper progress, IIS buffers the entire content, calculates the Content-Length, applies compression, and then sends the response. We have adopted the following procedure to deliver files to the client with high performance.
FileInfo path = new FileInfo(filePath);
// the user will not see progress if Content-Length is not specified
response.AddHeader("Content-Length", path.Length.ToString());
response.Flush(); // do not add any more headers after this...
byte[] buffer = new byte[4 * 1024]; // 4KB is a good size for a network chunk
using (FileStream fs = path.OpenRead())
{
    int count = 0;
    while ((count = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        if (!response.IsClientConnected)
        {
            // the network connection broke for some reason...
            break;
        }
        response.OutputStream.Write(buffer, 0, count);
        response.Flush(); // this will prevent buffering...
    }
}
You can change the buffer size, but 4KB is ideal, as the lower-level file system also reads in 4KB chunks.
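As a side note (my own addition, not from either answer), ASP.NET also lets you turn off output buffering up front via the response's BufferOutput property:
response.BufferOutput = false; // ask ASP.NET not to buffer the whole response before sending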
Akash Kava is partly right and partly wrong. You DO NOT need to add the Content-Length header or flush afterward. But you DO need to periodically flush response.OutputStream and then response. ASP.NET MVC (at least version 5) will automatically convert this into a "Transfer-Encoding: chunked" response.
byte[] buffer = new byte[4 * 1024]; // 4KB is a good size for a network chunk
using (FileStream fs = path.OpenRead())
{
    int count = 0;
    while ((count = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        if (!response.IsClientConnected)
        {
            // the network connection broke for some reason...
            break;
        }
        response.OutputStream.Write(buffer, 0, count);
        response.OutputStream.Flush();
        response.Flush(); // this will prevent buffering...
    }
}
I tested it and it works.
