There is an Image array named "images" that holds 50 pictures. How can I download that array as a .zip file in ASP.NET?
Use the Image.Save method to save each image to a memory stream.
Use the ZipFile class or the Ionic.Zip library to create a zip containing these image streams.
Make a controller action, and return that zip file as a FileResult. You can use the Controller.File method to help you out.
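A minimal sketch of those three steps, assuming an ASP.NET MVC controller and the .NET 4.5+ System.IO.Compression classes (Ionic.Zip would work similarly); LoadImages() is a hypothetical stand-in for however the 50-element "images" array is obtained:

using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.IO.Compression;
using System.Web.Mvc;

public class ImagesController : Controller
{
    public ActionResult DownloadAll()
    {
        Image[] images = LoadImages();

        var zipStream = new MemoryStream();
        using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create, leaveOpen: true))
        {
            for (int i = 0; i < images.Length; i++)
            {
                // Step 1: save each image to a memory stream.
                using (var imageStream = new MemoryStream())
                {
                    images[i].Save(imageStream, ImageFormat.Png);
                    imageStream.Position = 0;

                    // Step 2: copy that stream into an entry of the zip archive.
                    ZipArchiveEntry entry = archive.CreateEntry("image" + i + ".png");
                    using (Stream entryStream = entry.Open())
                        imageStream.CopyTo(entryStream);
                }
            }
        }

        // Step 3: return the finished archive as a FileResult.
        zipStream.Position = 0;
        return File(zipStream, "application/zip", "images.zip");
    }

    // Hypothetical helper: however the "images" array is actually populated.
    private Image[] LoadImages() { return new Image[0]; }
}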
I want to download a big file (2.5 GB) with the .NET WebClient.
Step 1
Now I can download the zip file:
Persist it as a local file with the WebClient.DownloadFile method
OR
Persist it into a stream with the WebClient.OpenRead method
Step 2
.NET's ZipArchive accepts a stream, or a local file path via ZipFile.OpenRead.
I want to extract the .zip file and read the extracted folders/files.
What is the better approach for Step 2: a local file or a stream?
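Not an answer to which is better, just a minimal sketch of how each Step 1 option plugs into Step 2, assuming .NET 4.5+ and placeholder URL/paths; in practice you would pick one option, and for a 2.5 GB download it is worth checking how ZipArchive buffers a non-seekable stream before relying on option B:

using System.IO;
using System.IO.Compression;
using System.Net;

class Program
{
    static void Main()
    {
        var client = new WebClient();
        Directory.CreateDirectory(@"C:\temp\extracted");

        // Option A: persist to a local file, then extract from the path.
        client.DownloadFile("http://example.com/big.zip", @"C:\temp\big.zip");
        ZipFile.ExtractToDirectory(@"C:\temp\big.zip", @"C:\temp\extracted");

        // Option B: open a download stream and hand it to ZipArchive directly.
        using (Stream download = client.OpenRead("http://example.com/big.zip"))
        using (var archive = new ZipArchive(download, ZipArchiveMode.Read))
        {
            foreach (ZipArchiveEntry entry in archive.Entries)
            {
                if (entry.Name == "") continue; // skip directory entries
                string target = Path.Combine(@"C:\temp\extracted", entry.FullName);
                Directory.CreateDirectory(Path.GetDirectoryName(target));
                entry.ExtractToFile(target, overwrite: true);
            }
        }
    }
}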
My program handles zip files with encrypted headers; it decrypts the headers and shows the info. Now I want to view the pictures inside the zip file in a PictureBox, so I have to decompress the files into a MemoryStream.
I have all the bytes of the compressed files, which means: header, compressed data, and extra field length.
How can I decompress these bytes so I can view the file?
You should use the ZipArchive class to read the compressed data, since it appears you're reading valid zip files.
If you're using .NET 4 or older, you'll have to use a third-party library, like DotNetZip and its ZipInputStream class.
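A minimal sketch of that approach with ZipArchive, assuming the byte array already holds valid (decrypted) zip data; the .jpg filter and the PictureBox parameter are placeholders for your own UI:

using System;
using System.Drawing;
using System.IO;
using System.IO.Compression;
using System.Windows.Forms;

public static class ZipImageViewer
{
    // Decompresses the first .jpg entry into a MemoryStream and shows it.
    public static void ShowFirstImage(byte[] zipBytes, PictureBox target)
    {
        using (var zipStream = new MemoryStream(zipBytes))
        using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Read))
        {
            foreach (ZipArchiveEntry entry in archive.Entries)
            {
                if (!entry.FullName.EndsWith(".jpg", StringComparison.OrdinalIgnoreCase))
                    continue;

                // Image.FromStream needs its stream kept open for the image's
                // lifetime, so this MemoryStream is intentionally not disposed.
                var imageStream = new MemoryStream();
                using (Stream entryStream = entry.Open())
                    entryStream.CopyTo(imageStream);

                imageStream.Position = 0;
                target.Image = Image.FromStream(imageStream);
                return;
            }
        }
    }
}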
Using the FileUpload control, please explain the difference between the following two methods of uploading a file:
1. Using the FileUpload.SaveAs() method:
fileUploadControl.SaveAs(path);
2. Writing a byte array to disk from FileUpload.FileBytes using File.WriteAllBytes():
File.WriteAllBytes(path, fileUploadControl.FileBytes);
How would these compare when uploading large files?
These both have different purposes.
SaveAs writes the uploaded file directly to disk, while FileBytes gives you the file contents as a byte array that you then write out yourself with File.WriteAllBytes.
Your file upload control will receive the bytes only after the file has been uploaded by the client, so there will be no difference in upload speeds.
A byte array is a reference type, so passing it to other methods does not copy its contents; what does matter is that FileBytes materializes the entire uploaded file in memory as a single array.
I would use FileUpload.FileBytes when I want to access the bytes directly in memory and fileUploadControl.SaveAs whenever all I want to do is write the file to disk.
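For comparison, a small sketch of both approaches as helper methods (the FileUpload control and the path are whatever your page provides; you would normally use only one of these):

using System.IO;
using System.Web.UI.WebControls;

public static class UploadSaver
{
    // 1. Let the control write the posted file to disk itself.
    public static void SaveDirectly(FileUpload upload, string path)
    {
        upload.SaveAs(path);
    }

    // 2. Get the whole upload as a byte[] first, then write it; the entire
    //    file sits in memory as one array before it reaches the disk.
    public static void SaveViaBytes(FileUpload upload, string path)
    {
        byte[] bytes = upload.FileBytes;
        File.WriteAllBytes(path, bytes);
    }
}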
In my Windows Azure code I want to download a ZIP file from Blob Storage and unzip it on the fly and store the unzipped contents to the disk. This way I save on first writing the file to the disk and then reading it when doing the extraction.
I'm trying to use DotNetZip for that.
The ZIP file is originally very big, so it is cut into pieces (not a multipart archive, but a plain ZIP archive, just cut into smaller files) and each piece is uploaded into Blob Storage. I know how to iterate through all the parts and open each blob when needed.
Azure SDK has CloudBlob.OpenRead() that returns a Stream descendant.
DotNetZip has a ZipInputStream class with a constructor that accepts a Stream.
How do I connect these pieces together so that I can download the ZIP file pieces one by one and get them extracted on the fly?
Create your own stream class that will return data from the corresponding CloudBlob according to its position.
Then use this class as the input stream for your ZipInputStream.
public sealed class ZipBlobStream : Stream
{
    // ... override Read, Seek, Length, Position, etc. to serve data from the blobs
}
http://msdn.microsoft.com/library/system.io.stream.aspx
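Below is a sketch of one way to wire this together (not the exact ZipBlobStream above): a forward-only stream that lazily opens each piece with CloudBlob.OpenRead() and concatenates them, which DotNetZip's ZipInputStream can then read entry by entry. The piece ordering, the old StorageClient CloudBlob type, and the target directory are assumptions:

using System;
using System.Collections.Generic;
using System.IO;
using Ionic.Zip;
using Microsoft.WindowsAzure.StorageClient;

public sealed class ConcatenatedBlobStream : Stream
{
    private readonly IEnumerator<CloudBlob> _pieces; // pieces in their original order
    private Stream _current;                         // OpenRead() stream of the current piece
    private long _position;

    public ConcatenatedBlobStream(IEnumerable<CloudBlob> piecesInOrder)
    {
        _pieces = piecesInOrder.GetEnumerator();
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        while (true)
        {
            if (_current == null)
            {
                if (!_pieces.MoveNext()) return 0;      // no more pieces: end of stream
                _current = _pieces.Current.OpenRead();  // download the next piece lazily
            }

            int read = _current.Read(buffer, offset, count);
            if (read > 0) { _position += read; return read; }

            _current.Dispose();                         // piece exhausted, move on
            _current = null;
        }
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Position
    {
        get { return _position; }
        set { throw new NotSupportedException(); }
    }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            if (_current != null) _current.Dispose();
            _pieces.Dispose();
        }
        base.Dispose(disposing);
    }
}

// Usage: stream the pieces down and extract every entry to disk on the fly.
public static class ZipPieceExtractor
{
    public static void Extract(IEnumerable<CloudBlob> piecesInOrder, string targetDirectory)
    {
        using (var source = new ConcatenatedBlobStream(piecesInOrder))
        using (var zip = new ZipInputStream(source))
        {
            ZipEntry entry;
            while ((entry = zip.GetNextEntry()) != null)
            {
                if (entry.IsDirectory) continue;

                string path = Path.Combine(targetDirectory, entry.FileName);
                Directory.CreateDirectory(Path.GetDirectoryName(path));

                using (FileStream output = File.Create(path))
                    zip.CopyTo(output);   // reads the current entry's decompressed data
            }
        }
    }
}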
I have an app that downloads from FTP and receives the content as a MemoryStream or byte array; it gets the zip file already zipped. How can I use SharpZipLib with this input to unpack that content to a specific place on my hard drive?
Wrap the raw data in a MemoryStream and then pass this to the zip library instead of a FileStream.
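A minimal sketch with SharpZipLib's ZipInputStream, assuming the downloaded content is already in a byte array and the target directory is a placeholder:

using System.IO;
using ICSharpCode.SharpZipLib.Zip;

public static class ZipUnpacker
{
    public static void Unpack(byte[] zipBytes, string targetDirectory)
    {
        using (var source = new MemoryStream(zipBytes))  // instead of a FileStream
        using (var zip = new ZipInputStream(source))
        {
            ZipEntry entry;
            while ((entry = zip.GetNextEntry()) != null)
            {
                if (!entry.IsFile) continue;             // skip directory entries

                string path = Path.Combine(targetDirectory, entry.Name);
                Directory.CreateDirectory(Path.GetDirectoryName(path));

                using (FileStream output = File.Create(path))
                    zip.CopyTo(output);                  // current entry's decompressed data
            }
        }
    }
}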