send GZIP stream over WCF - c#

Below is my code.
I set the Content-Encoding header, write the file stream to a memory stream using gzip compression, and finally return the memory stream.
However, the Android, iOS, and web browser clients all receive corrupt copies of the stream. None of them can fully read the decompressed stream on the other side. Which vital part am I missing?
public Stream GetFileStream(String path, String basePath)
{
    FileInfo fi = new FileInfo(basePath + path);
    //WebOperationContext.Current.OutgoingResponse.ContentType = "application/x-gzip";
    WebOperationContext.Current.OutgoingResponse.Headers.Add("Content-Encoding", "gzip");
    MemoryStream ms = new MemoryStream();
    GZipStream CompressStream = new GZipStream(ms, CompressionMode.Compress);
    // Get the stream of the source file.
    FileStream inFile = fi.OpenRead();
    // Prevent compressing hidden and already compressed files.
    if ((File.GetAttributes(fi.FullName) & FileAttributes.Hidden)
            != FileAttributes.Hidden & fi.Extension != ".gz")
    {
        // Copy the source file into the compression stream.
        inFile.CopyTo(CompressStream);
        Log.d(String.Format("Compressed {0} from {1} to {2} bytes.",
            fi.Name, fi.Length.ToString(), ms.Length.ToString()));
    }
    ms.Position = 0;
    inFile.Close();
    return ms;
}

I'd strongly recommend sending a byte array instead. Then, on the client side, create a gzip stream from the received byte array.
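A minimal sketch of that approach (reusing the question's WCF context; the method name and omitted error handling are illustrative). The usual culprit for corrupt output in code like the above is that the GZipStream is never closed: it buffers data and only writes the final block and gzip footer on Dispose, so it must be disposed before the bytes are handed back:

public byte[] GetFileBytes(String path, String basePath)
{
    FileInfo fi = new FileInfo(basePath + path);
    WebOperationContext.Current.OutgoingResponse.Headers.Add("Content-Encoding", "gzip");
    using (MemoryStream ms = new MemoryStream())
    {
        using (GZipStream compressStream = new GZipStream(ms, CompressionMode.Compress))
        using (FileStream inFile = fi.OpenRead())
        {
            // Copy the source file into the compression stream.
            inFile.CopyTo(compressStream);
        } // Disposing the GZipStream flushes the final block and gzip footer.
        // ToArray is still valid after the MemoryStream has been closed.
        return ms.ToArray();
    }
}

On the client, a GZipStream in Decompress mode over a MemoryStream wrapping the received bytes will then read the data back.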

Related

GZipStream -- the decompressed file is missing data

I need to get a file from my server via FTP into a memory stream and then decompress it so I can further work with it.
I do the below, but the decompressed file is truncated every time.
I can see that the FTP part is working correctly (I checked that ms.Length equals the correct file size on the server, about 700KB).
res.Length is only about 400K, but it should be about 10MB. (I can also see from the Console.WriteLine(res) output that the file is truncated.)
I get a MemoryStream from my FTP code then...
var decompress = new GZipStream(ms, CompressionMode.Decompress);
using (var sr = new StreamReader(decompress))
{
    var res = sr.ReadToEnd();
    Console.WriteLine(res);
}
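No answer is shown here, but two things are worth checking (a sketch, assuming ms is the MemoryStream returned by the FTP code): rewind the stream to position 0 before decompressing, and dispose the decompressor. Note also that the GZipStream in older .NET Framework versions stops after the first member of a concatenated (multi-member) gzip file, which produces exactly this kind of silent truncation; more recent .NET versions read all members.

ms.Position = 0; // rewind: the FTP download leaves the stream at its end
using (var decompress = new GZipStream(ms, CompressionMode.Decompress))
using (var sr = new StreamReader(decompress))
{
    var res = sr.ReadToEnd();
    Console.WriteLine(res);
}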

Convert object to CSV and then compress without touching physical storage

Scenario
I have an object that I convert to a flat CSV, then compress and upload to a file store.
I could easily do this by following the below steps.
Convert object to CSV file.
Compress file
Upload file.
However
I do not want the penalty that comes with touching physical storage, so I would like to do all of this in memory.
Current Incorrect Implementation
Convert object to CSV byte array
Compress byte array
Upload byte array to file store
Problem
What I'm essentially doing is compressing a byte array and uploading that, which is obviously wrong. (When the compressed gzip file is uncompressed, it contains a byte array of the CSV and not the actual CSV itself.)
Is it possible to create a file like "file.csv" in memory and then compress that in memory, instead of compressing a byte array?
The problem I'm having is that it seems I can only name the file and specify its extension when saving to a physical location.
Code Example of Current Implementation
public byte[] Example(IEnumerable<object> data)
{
    // Convert object to CSV and write to byte array.
    byte[] bytes = null;
    using (var ms = new MemoryStream())
    {
        TextWriter writer = new StreamWriter(ms);
        var csv = new CsvWriter(writer);
        csv.WriteRecords(data);
        writer.Flush();
        ms.Position = 0;
        bytes = ms.ToArray();
    }
    // Compress byte array
    using (var compressedStream = new MemoryStream(bytes))
    using (var resultStream = new MemoryStream())
    using (var zipStream = new GZipStream(compressedStream, CompressionMode.Decompress))
    {
        zipStream.CopyTo(resultStream);
        zipStream.Close();
        var gzipByteArray = resultStream.ToArray();
        // Upload to AzureStorage
        new AzureHelper().UploadFromByteArray(gzipByteArray, 0, gzipByteArray.Length);
    }
}
Wrap the Stream you use for the upload in a GZipStream, write your CSV to that, and then you'll have uploaded the compressed CSV.
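A sketch of that, reusing the question's CsvWriter and AzureHelper (both assumed from the question): write through a StreamWriter into a GZipStream that wraps the output MemoryStream, and dispose the writers before reading the bytes so the gzip footer gets written:

public byte[] Example(IEnumerable<object> data)
{
    using (var output = new MemoryStream())
    {
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        using (var writer = new StreamWriter(gzip))
        {
            var csv = new CsvWriter(writer);
            csv.WriteRecords(data);
        } // Disposing flushes the CSV text and finalizes the gzip stream.
        var gzipByteArray = output.ToArray();
        new AzureHelper().UploadFromByteArray(gzipByteArray, 0, gzipByteArray.Length);
        return gzipByteArray;
    }
}

Note that gzip compresses a single stream of bytes and carries no directory structure, so there is no in-memory "file.csv" to create: the name is simply whatever you call the uploaded blob (e.g. file.csv.gz).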

DotNetZip fails with "stream does not support seek operations"

I am using DotNetZip in C# to unzip from a stream as follows:
public static void unzipFromStream(Stream stream, string outdir)
{
    // omit try/catch block
    using (ZipFile zip = ZipFile.Read(stream))
    {
        foreach (ZipEntry e in zip)
        {
            e.Extract(outdir, ExtractExistingFileAction.OverwriteSilently);
        }
    }
}
stream is obtained using
WebClient client = new WebClient();
Stream fs = client.OpenRead(url);
However, I got the following exception:
exception during extracting zip from stream System.NotSupportedException: This stream does not support seek operations.
at System.Net.ConnectStream.get_Position()
at Ionic.Zip.ZipFile.Read(Stream zipStream, TextWriter statusMessageWriter, Encoding encoding, EventHandler`1 readProgress)
On the server side (ASP.NET MVC 4), returning FilePathResult or FileStreamResult both caused this exception.
Should I obtain the stream differently on the client side? Or how can I make the server return a "seekable" stream? Thanks!
You'll have to download the data to a file or to memory, and then create a FileStream or a MemoryStream, or some other stream type that supports seeking. For example:
WebClient client = new WebClient();
client.DownloadFile(url, filename);
using (var fs = File.OpenRead(filename))
{
    unzipFromStream(fs, outdir);
}
File.Delete(filename);
Or, if the data will fit into memory:
byte[] data = client.DownloadData(url);
using (var fs = new MemoryStream(data))
{
    unzipFromStream(fs, outdir);
}
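Alternatively, if you'd rather keep reading from the WebClient stream, you can buffer it into a seekable MemoryStream yourself (a sketch equivalent to the DownloadData approach above):

using (Stream response = client.OpenRead(url))
using (var ms = new MemoryStream())
{
    response.CopyTo(ms);  // buffer the non-seekable HTTP stream in memory
    ms.Position = 0;      // rewind so ZipFile.Read can seek
    unzipFromStream(ms, outdir);
}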

unable to save dynamically created MemoryStream with rebex sftp

I'm using a StreamWriter to generate a dynamic file and holding it in a MemoryStream. Everything appears to be fine until I go to save the file using Rebex SFTP.
The example they give on their site works fine:
// upload a text using a MemoryStream
string message = "Hello from Rebex FTP for .NET!";
byte[] data = System.Text.Encoding.Default.GetBytes(message);
System.IO.MemoryStream ms = new System.IO.MemoryStream(data);
client.PutFile(ms, "message.txt");
However the code below does not:
using (var stream = new MemoryStream())
{
    using (var writer = new StreamWriter(stream))
    {
        writer.AutoFlush = true;
        writer.Write("test");
    }
    client.PutFile(stream, "test.txt");
}
The file "test.txt" is saved, however it is empty. Do I need to do more than just enable AutoFlush for this to work?
After writing to the MemoryStream, the stream is positioned at the end. The PutFile method reads from the current position to the end. That's exactly 0 bytes.
You need to position the stream at the beginning before passing it to PutFile:
...
}
stream.Seek(0, SeekOrigin.Begin);
client.PutFile(stream, "test.txt");
You may also need to prevent the StreamWriter from disposing the MemoryStream:
var writer = new StreamWriter(stream);
writer.Write("test");
writer.Flush();
stream.Seek(0, SeekOrigin.Begin);
client.PutFile(stream, "test.txt");
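On .NET 4.5 or later, the StreamWriter constructor that takes a leaveOpen flag handles this more tidily (a sketch; the encoding and buffer size are illustrative, and a using System.Text directive is assumed for Encoding):

using (var stream = new MemoryStream())
{
    using (var writer = new StreamWriter(stream, Encoding.UTF8, 1024, leaveOpen: true))
    {
        writer.Write("test");
    } // writer is flushed and disposed, but the MemoryStream stays open
    stream.Seek(0, SeekOrigin.Begin);
    client.PutFile(stream, "test.txt");
}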

FileUpload to FileStream

I am in the process of sending a file along with an HttpWebRequest. The file will come from a FileUpload UI. I need to convert the FileUpload to a file stream so I can send the stream along with the HttpWebRequest. How do I convert the FileUpload to a FileStream?
Since FileUpload.PostedFile.InputStream gives me a Stream, I used the following code to convert it to a byte array:
public static byte[] ReadFully(Stream input)
{
    // Note: input.Length requires a seekable stream; for non-seekable
    // streams, use a fixed-size buffer (e.g. the commented line below).
    byte[] buffer = new byte[input.Length];
    //byte[] buffer = new byte[16 * 1024];
    using (MemoryStream ms = new MemoryStream())
    {
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, read);
        }
        return ms.ToArray();
    }
}
Might be better to pipe the input stream directly to the output stream:
inputStream.CopyTo(outputStream);
This way, you are not caching the entire file in memory before re-transmission. For example, here is how you would write it to a FileStream:
FileUpload fu; // Get the FileUpload object.
using (FileStream fs = File.OpenWrite("file.dat"))
{
    fu.PostedFile.InputStream.CopyTo(fs);
    fs.Flush();
}
If you wanted to write it directly to another web request, you could do the following:
FileUpload fu; // Get the FileUpload object for the current connection here.
HttpWebRequest hr; // Set up your outgoing connection here.
using (Stream s = hr.GetRequestStream())
{
    fu.PostedFile.InputStream.CopyTo(s);
    s.Flush();
}
That will be more efficient, as you will be directly streaming the input file to the destination host, without first caching in memory or on disk.
You can't convert a FileUpload into a FileStream. You can, however, get a MemoryStream from that FileUpload's PostedFile property. You can then use that MemoryStream to fill your HttpWebRequest.
You can put a FileUpload file directly into a MemoryStream by using FileBytes (simplified from Tech Jerk's answer):
using (MemoryStream ms = new MemoryStream(FileUpload1.FileBytes))
{
    // do stuff
}
Or, if you do not need a MemoryStream:
byte[] bin = FileUpload1.FileBytes;
