WebClient, Zip, Isolated Storage in Windows 8 metro - c#

I have a fairly complex scenario that I'm trying to port from Windows Phone 7 to Windows 8.
I need to:
1. Download a Zip file from the internet
2. Unzip it to isolated storage
3. Read the unzipped XML files and images
Problems
In Windows Phone 7 I used WebClient, which is no longer available in Windows 8. I tried HttpClientHandler, but I am only able to download the ZIP file as a string, and I do not know how to save it to isolated storage.
I found the ZipArchive class, but it takes an IO.Stream and I am not really sure how to use it (if I had the file saved somewhere - point 1).

I'm just starting out with the new APIs as well (so this might be off a bit), but based on the documentation:
HttpClient (and its default handler, HttpClientHandler) returns a Task<HttpResponseMessage> from SendAsync.
The HttpResponseMessage has a property, Content, which is of type HttpContent.
HttpContent in turn has a method, ReadAsStreamAsync, which returns a Task<Stream> that you should be able to use (albeit indirectly) to pass to ZipArchive.
Or you can just use the HttpClient.GetStreamAsync method to get the stream (much simpler):
HttpClient client = new HttpClient();
Stream stream = await client.GetStreamAsync(uri);
If that doesn't work, you could also wrap the string you get now in a MemoryStream and pass it to ZipArchive, but that sounds unsafe because of possible encoding problems.
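Putting the pieces together, a minimal sketch of the whole flow might look like this, assuming a Windows 8 (WinRT) app where ApplicationData.Current.LocalFolder plays the role of isolated storage, and where uri points at the zip:

using System;
using System.IO;
using System.IO.Compression;
using System.Net.Http;
using Windows.Storage;

// Download the zip and extract every entry into local (isolated) storage.
// Note: ZipArchive buffers a non-seekable stream into memory internally.
var client = new HttpClient();
using (Stream zipStream = await client.GetStreamAsync(uri))
using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Read))
{
    StorageFolder folder = ApplicationData.Current.LocalFolder;
    foreach (ZipArchiveEntry entry in archive.Entries)
    {
        if (string.IsNullOrEmpty(entry.Name))
            continue; // directory entry, nothing to write

        StorageFile file = await folder.CreateFileAsync(
            entry.Name, CreationCollisionOption.ReplaceExisting);
        using (Stream entryStream = entry.Open())
        using (Stream fileStream = await file.OpenStreamForWriteAsync())
        {
            await entryStream.CopyToAsync(fileStream);
        }
    }
}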

Related

How to relay System.Net.ConnectStream to another request?

I have two cloud providers with their client SDKs, say SDK1 and SDK2, and I want to copy a file from one cloud storage to the other. These SDKs have upload and download APIs like this:
Response uploadAsync(Uri uploadLocation, Stream fileStream);
Stream downloadAsync(Uri downloadLocation);
Earlier I was copying the downloaded Stream to a MemoryStream and passing it to the upload API. That worked, but obviously it loads the entire file into memory, which is not good.
I cannot directly pass the downloaded Stream to the upload API, because somewhere it checks the Length of the Stream, and System.Net.ConnectStream, being non-seekable, throws an exception.
Any pointers on how we can use the downloaded Stream (which is of type System.Net.ConnectStream) in the upload API without storing the entire file?
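For illustration, the memory-buffered copy described above might look like this; the sdk1/sdk2 objects are placeholders for the real SDK clients:

// Buffered copy: works, but holds the whole file in memory,
// because MemoryStream is seekable and satisfies the Length check.
Stream download = sdk1.downloadAsync(downloadLocation);
using (var buffer = new MemoryStream())
{
    await download.CopyToAsync(buffer); // pulls the entire file into memory
    buffer.Position = 0;                // rewind before handing it to the upload
    Response response = sdk2.uploadAsync(uploadLocation, buffer);
}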

ASP.NET Core - How to upload a zip file if there does not exist a client-side UI for file selection

I have a unique scenario in which I'd like the end result to help me upload a zip file. Here is what is happening in my workflow:
Our user is given an application on their local machine. With a click of a button, it will copy files and a zip file to remote-machine-1.
On remote-machine-2, it is running a .NET Core web app.
On remote-machine-1, I'd like to ping an endpoint of the web app in order to upload the zip file to remote-machine-2. However, the caveat is that the user will not be able to specify where this zip file is - the location of the zip file is already known due to the structure of how the files and zip file are copied over in the first place.
So the question remains, with the code below - how do I pass in an IFormFile object when I call the endpoint localhost:5000/PublishTargetAsync?file=[???]? Or is there another workaround?
public async Task<bool> PublishTargetAsync(IFormFile file)
{
    if (file != null)
    {
        using (var fileStream = new FileStream(
            Path.Combine(_targetOutputDirectory.ToFileSystemPath(), file.Name),
            FileMode.Create))
        {
            await file.CopyToAsync(fileStream);
        }
    }
    return true;
}
A simple but non-optimized approach would be to use HttpClient and post the file contents as a base64-encoded string in JSON, using sample code similar to what is in my link. From there you could work your way back to using HttpWebRequests and a network stream, crafting the HTTP request by hand if necessary for performance, but the above approach should work for most small files. You'll have to modify your PublishTargetAsync endpoint to handle a post request with the right type.
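For example, the client side of that base64-over-JSON approach might look like this; the payload shape, endpoint URL, and use of System.Text.Json are assumptions, and the endpoint would need a matching model instead of IFormFile:

using System;
using System.IO;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

// Posts the zip from its known location as a base64 string inside JSON.
async Task PostZipAsync(string zipPath)
{
    var payload = new
    {
        FileName = Path.GetFileName(zipPath),
        ContentBase64 = Convert.ToBase64String(File.ReadAllBytes(zipPath))
    };

    using (var client = new HttpClient())
    {
        var content = new StringContent(
            JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json");
        var response = await client.PostAsync(
            "http://localhost:5000/PublishTargetAsync", content);
        response.EnsureSuccessStatusCode();
    }
}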

Streaming Data from Azure Blobs Back to Client

Using ASP.Net Web API, I am developing a service which (amongst other things) retrieves data from Azure, and returns it to the client.
One way of doing this would be to read the entire blob into a buffer, and then write that buffer to the response. However, I'd rather stream the contents, for better performance.
This is simple with the Azure API:
CloudBlobContainer container = BlobClient.GetContainerReference(containerName);
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
using (var buffer = new MemoryStream())
{
    await blob.DownloadToStreamAsync(buffer);
}
And elsewhere in the code, this is returned to the client:
HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.OK);
response.Content = new StreamContent(buffer);
But can I be certain that the MemoryStream won't be closed/disposed before the client finishes reading?
As long as you don't wrap your memory stream in a "using" statement you will be fine. If you do use "using" you end up with a weird race condition where it works sometimes and fails at other times.
I have code like yours in production and it works fine.
Only thing to be mindful of is that the whole blob is copied into memory before anything is sent to the client. This may cause memory pressures on your server and initial lag, depending on the size of the file.
If that is a concern, you have a couple of options.
One is to create a "lease" on the blob and give the user a URL to read it direct from blob storage for a limited time. That only works for low security scenarios though.
Alternatively you can use chunked transfer encoding. Basically, you read the file from blob storage in chunks and send it to the client in those chunks. That saves memory - but I have not been able to make it work async, so you are trading memory for threads. Which solution is right for you will depend on your specific circumstances.
(I have not got the code to hand, post a comment if you want it and I'll try to dig it out, even if it's a bit old).
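A rough sketch of that (synchronous) chunked approach, reusing the blob and response from above and assuming the classic Microsoft.WindowsAzure.Storage SDK plus Web API's PushStreamContent:

// Stream the blob in 4 MB range reads instead of buffering it whole;
// PushStreamContent writes each chunk to the response as it is read.
response.Content = new PushStreamContent((outputStream, httpContent, context) =>
{
    try
    {
        blob.FetchAttributes(); // populates blob.Properties.Length
        const int chunkSize = 4 * 1024 * 1024;
        long offset = 0;
        long remaining = blob.Properties.Length;
        while (remaining > 0)
        {
            long length = Math.Min(chunkSize, remaining);
            // Only one chunk is ever held in memory at a time
            blob.DownloadRangeToStream(outputStream, offset, length);
            offset += length;
            remaining -= length;
        }
    }
    finally
    {
        outputStream.Close(); // ends the chunked response
    }
});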

Transferring large files asynchronously with C# via api

I have a Windows Service that will be reading video files from local disk and posting them to a remote service via API.
The video files are over 2 GB in size, and I need to transfer them to another location through an HttpClient POST request.
There is no limitation on the API call, so even if a file is 10 GB I can still post it to the destination.
Right now I am reading the entire video file as byte[] fileContent and passing it to the function as
ByteArrayContent contentBody = new ByteArrayContent(fileContent);
It works for now, but it is not scalable (if multiple files are transferred at the same time, it can fill up memory), so I am looking for a solution where the transfer happens in chunks.
Question: Can I read big files into a buffer and transfer them over HTTP as I read from local disk?
You can use the PostAsync(Uri, HttpContent) method of HttpClient. In order to stream the contents of your local file, use the StreamContent subclass of HttpContent and supply a file reader stream. A brief example:
async Task PostBigFileAsync(Uri uri, string filename)
{
    using (var fileStream = File.OpenRead(filename))
    {
        var client = new HttpClient();
        // StreamContent reads from the file as it sends, so the whole
        // video is never buffered in memory at once
        var response = await client.PostAsync(uri, new StreamContent(fileStream));
    }
}

How to transfer a large Zip file (50MB) using a WCF Service through SOAP to any client?

I have a WCF Service that returns a byte array containing a Zip file (50 MB) to any client that requests it. If the Zip is very small (say 1 MB), the SOAP response comes back from WCF with the byte array embedded in it, but even then the response is very large for a 1 MB file. If I try to transfer the 50 MB file, the service hangs and throws an out-of-memory exception, because the SOAP response becomes huge.
What is the best option available with WCF / web service to transfer large files (mainly ZIP format) as I am sending back a byte array. Is there any good approach instead of that for sending back the file?
Is WCF / a web service the best way to transfer large files to any client, or is there a better option/technology available so that interoperability and scalability for 10,000 users can be achieved?
My code is below:
string pathfordownload = @"D:\New Folder.zip";
FileStream F2D = new FileStream(pathfordownload, FileMode.Open, FileAccess.Read);
BinaryReader binReader = new BinaryReader(F2D);
binReader.BaseStream.Position = 0;
// Reads the entire zip into memory - this is what blows up for large files
byte[] binFile = binReader.ReadBytes(Convert.ToInt32(binReader.BaseStream.Length));
binReader.Close();
return binFile;
A working piece/real piece of information would be really helpful, as I have been struggling with all the data available on Google and have had no good results for the last week.
You can transfer a Stream through WCF, which lets you send files of (almost) unlimited length.
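A minimal sketch of that streamed-contract approach; the service and file names are illustrative, and the binding on both sides would need transferMode="Streamed" (plus a raised maxReceivedMessageSize on the client):

using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IZipService
{
    // For streamed transfer, the Stream must be the only item
    // in the message body
    [OperationContract]
    Stream GetZip();
}

public class ZipService : IZipService
{
    public Stream GetZip()
    {
        // WCF reads from this stream as it writes the response,
        // so the 50 MB zip is never fully buffered in memory
        return File.OpenRead(@"D:\New Folder.zip");
    }
}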
I've faced the exact same problem. The out-of-memory error is inevitable because you are using byte arrays.
What we did is flush the data to the hard drive, so instead of being limited by your virtual memory, your capacity for concurrent transactions is limited only by HD space.
Then for the transfer, we just placed the file on the other computer. Of course, in our case it was a server-to-server file transfer. If you want to be decoupled from the peer, you can use a file download over HTTP.
So instead of responding with a file, your service could respond with an HTTP URL to the file location. Then, when the client has successfully downloaded it from the server with a standard HttpRequest or WebClient, it calls a method to delete the file. In SOAP that could be Delete(string url); in REST that would be a DELETE method on the resource.
I hope this makes sense to you. The most important part is to understand that in scalable software, especially if you are looking at 10,000 clients (concurrent?), you cannot rely on limited resources like memory streams or byte arrays. Rely instead on large and easily expandable resources like a hard drive partition, which could eventually be on a SAN that IT could grow as needed.
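Client side, the URL hand-off described above might look like this; serviceClient and its methods are hypothetical stand-ins for the generated SOAP proxy:

// Hypothetical client flow: the service returns a download URL, the
// client fetches the file over plain HTTP, then tells the service
// the file can be deleted.
string url = serviceClient.GetZipUrl();
using (var web = new System.Net.WebClient())
{
    web.DownloadFile(url, @"C:\downloads\archive.zip");
}
serviceClient.Delete(url);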
