Difference between downloading into a file or a stream for ZipArchive extraction - C#

I want to download a big file (2.5 GB) with the .NET WebClient.
Step 1
I can download the zip file in one of two ways:
persist it as a local file with the WebClient.DownloadFile method,
OR
read it into a stream with the WebClient.OpenRead method.
Step 2
.NET's ZipArchive accepts either a stream or a local file path.
I want to extract the .zip file and read the extracted folders/files.
Which approach (local file or stream) is better with regard to Step 2?
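
For reference, a minimal sketch of both options is below (the URL and local paths are hypothetical). One point that matters for Step 2: ZipArchive in Read mode wants a seekable stream, so a non-seekable network stream from OpenRead typically gets buffered in memory first, which is significant for a 2.5 GB archive.

using System;
using System.IO;
using System.IO.Compression;
using System.Net;

class ZipDownloadSketch
{
    static void Main()
    {
        const string url = "https://example.com/big.zip";   // hypothetical
        const string zipPath = @"C:\temp\big.zip";           // hypothetical
        const string extractDir = @"C:\temp\extracted";      // hypothetical

        using (var client = new WebClient())
        {
            // Option A: persist to a local file first, then extract from that file.
            client.DownloadFile(url, zipPath);
            ZipFile.ExtractToDirectory(zipPath, extractDir);

            // Option B: open a stream and hand it to ZipArchive directly.
            using (Stream networkStream = client.OpenRead(url))
            using (var archive = new ZipArchive(networkStream, ZipArchiveMode.Read))
            {
                foreach (ZipArchiveEntry entry in archive.Entries)
                {
                    Console.WriteLine(entry.FullName);
                }
            }
        }
    }
}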

Related

Download a Single file in Remote Zip file

I want to download a single file from a remote zip file that is in the cloud. The zip file is too large for me to download as a whole, so I have decided to look for a way to download only the single (XML) file that I need within the archive. I have tried and tested WebClient and WebRequest, but they download the whole zip file (which is also too large, so these techniques usually fail). I'm eyeing SharpZipLib, but I don't know how to use it. Is it the right library to use, or are there other available ones I can get and test? Thank you so much.

C# Get specific file Zip azure File Storage

I was wondering if it is possible to get a specific file in a ZIP from Azure File Storage without downloading and unzipping the whole ZIP.
The problem is that the zip file can be large (>1 GB), while the file I need inside the zip is just a few MB at most.
If it is possible, could you provide an example or link(s)?
Thank you
I was wondering if it is possible to get a specific file in a ZIP from Azure File Storage without downloading and unzipping the whole ZIP.
No. You would need to download the entire file from storage, unzip it and then extract the desired file. It is possible to keep the entire downloaded file in memory (in the form of a stream) and have a zip library work on that stream instead of saving the entire file to disk. But you have to read the entire file.
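
To illustrate the stream-based variant, here is a minimal sketch. The download step is left out because the exact call depends on which Azure storage SDK you use; wholeZip is assumed to already contain the complete downloaded archive, and entryName is the path of the file inside it.

using System.IO;
using System.IO.Compression;

static class SingleEntryExtractor
{
    // The whole zip still has to be downloaded/read, but only one entry is written out.
    public static void ExtractSingleEntry(Stream wholeZip, string entryName, string targetPath)
    {
        using (var archive = new ZipArchive(wholeZip, ZipArchiveMode.Read, leaveOpen: true))
        {
            ZipArchiveEntry entry = archive.GetEntry(entryName);
            if (entry == null)
                throw new FileNotFoundException("Entry not found in archive: " + entryName);

            using (Stream entryStream = entry.Open())
            using (FileStream output = File.Create(targetPath))
            {
                entryStream.CopyTo(output);
            }
        }
    }
}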

c# mimecontent to filestream

I am trying to download .eml files (message.MimeContent) from an Exchange server and upload them to a SharePoint document library. One option is saving the file to the local drive and then uploading it from a FileStream.
Is there a way I can convert the MIME content to a stream without saving it as a file first? This would make the code much nicer, avoiding all the intricacies of file operations, file name management, etc.
Assuming you are using EWS, the MIME content is actually a byte[]. See the Content property.
Without saving this byte[] to the file system, keep it in memory and use the SPFileCollection.Add method overload that takes a byte[] to upload the file to the SharePoint site.
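
A minimal sketch of that approach, assuming the EWS Managed API and the SharePoint server object model (service, itemId and web are assumed to be created elsewhere; the library name and file name are examples):

using Microsoft.Exchange.WebServices.Data;  // EWS Managed API
using Microsoft.SharePoint;                 // SharePoint server object model

static class EmlUploader
{
    // Loads a message's MIME content and uploads it to a document library
    // without ever touching the local file system.
    public static void UploadEml(ExchangeService service, ItemId itemId, SPWeb web)
    {
        // Bind to the message and explicitly request the MIME content.
        EmailMessage message = EmailMessage.Bind(
            service, itemId, new PropertySet(ItemSchema.MimeContent));

        byte[] emlBytes = message.MimeContent.Content;

        // Upload the byte[] directly to the library.
        SPFolder library = web.GetFolder("Shared Documents");
        library.Files.Add("message.eml", emlBytes, true);  // true = overwrite if it exists
    }
}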

Stream dynamic zip files with resume support

Suppose I have a list of MP3 files on my server, and I want a user to be able to download multiple files (whichever ones he wants). For this, what I want is to create the zip file dynamically while writing it to the output stream, using the DotNetZip or Ionic Zip libraries.
Well, that's not the perfect solution if the zip file gets large, because in that scenario the server doesn't support resumable downloads. So, to overcome this, I need to handle the zip file structure internally and provide resume support.
So, is there any (open source) library I can use to provide resumable dynamic zip file streams directly to the output stream? Or, if possible, I would be happy if someone could let me know the structure of a zip file, especially the header content + data content.
Once a download has started, you should not alter the ZIP file anymore, because a resume would then just result in a broken ZIP file. So make sure your dynamically created ZIP file stays available!
The issue of providing resume-functionality was solved in this article for .NET 1.1, and it is still valid and functional.
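
A minimal sketch of the "keep the ZIP stable" part of that advice, using System.IO.Compression rather than DotNetZip (the MP3 paths and the target zip path are assumptions): build the archive once to a file, and serve every later resume/range request from that same, unchanged file. The range-request handling itself is what the linked article covers.

using System.Collections.Generic;
using System.IO;
using System.IO.Compression;

static class ZipPackager
{
    // Builds the zip once on disk so that a resumed (range) download
    // always sees exactly the same bytes.
    public static string BuildZip(IEnumerable<string> mp3Paths, string zipPath)
    {
        using (FileStream zipFile = File.Create(zipPath))
        using (var archive = new ZipArchive(zipFile, ZipArchiveMode.Create))
        {
            foreach (string path in mp3Paths)
            {
                // Store each MP3 under its file name at the root of the archive.
                archive.CreateEntryFromFile(path, Path.GetFileName(path));
            }
        }
        return zipPath;  // serve this file with HTTP range support for resume
    }
}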

How do I extract a ZIP file on the fly with DotNetZip?

In my Windows Azure code I want to download a ZIP file from Blob Storage, unzip it on the fly, and store the unzipped contents on disk. This way I avoid first writing the zip file to disk and then reading it back when doing the extraction.
I'm trying to use DotNetZip for that.
The ZIP file is originally very big, so it is cut into pieces (not a multipart archive, but a plain ZIP archive, just cut into smaller files) and each piece is uploaded to Blob Storage. I know how to iterate through all the parts and open each blob when needed.
The Azure SDK has CloudBlob.OpenRead(), which returns a Stream descendant.
DotNetZip has a ZipInputStream class with a constructor that accepts a Stream.
How do I connect these pieces together so that I can download the ZIP file pieces one by one and get them extracted on the fly?
Create your own stream class that returns data from the corresponding CloudBlob according to its position.
Then use this class as the input stream for your ZipInputStream.
public sealed class ZipBlobStream : Stream
{
    ...
}
http://msdn.microsoft.com/library/system.io.stream.aspx
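
A minimal sketch of one such wrapper is below. It is a simplified, forward-only variant: instead of seeking by position, it simply concatenates the already-opened part streams (for example, the result of CloudBlob.OpenRead() for each piece), which is enough for the sequential reading that ZipInputStream does. The class and parameter names here are assumptions, not DotNetZip or Azure SDK types.

using System;
using System.Collections.Generic;
using System.IO;

// Exposes a sequence of part streams as one continuous, read-only stream.
public sealed class ConcatenatedBlobStream : Stream
{
    private readonly IEnumerator<Stream> _parts;
    private Stream _current;

    public ConcatenatedBlobStream(IEnumerable<Stream> parts)
    {
        _parts = parts.GetEnumerator();
        _current = _parts.MoveNext() ? _parts.Current : Stream.Null;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        while (true)
        {
            int read = _current.Read(buffer, offset, count);
            if (read > 0)
                return read;

            // Current part exhausted: move on to the next blob piece.
            _current.Dispose();
            if (!_parts.MoveNext())
                return 0;                    // end of the reassembled zip
            _current = _parts.Current;
        }
    }

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
}

With something like new ZipInputStream(new ConcatenatedBlobStream(parts)), DotNetZip can walk the entries while the pieces are being downloaded.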
