After fileStream.CopyTo(memoryStream), memoryStream is null - c#

So, I have a function that takes a BitmapImage; I need to save it to isolated storage and convert it to Base64 (for sending to the server). However, copying from fileStream to memoryStream is not successful.
public void SetImage(BitmapImage bitmap)
{
    if (isoFiles.FileExists(Settings.FILE_AVATAR_JPG))
        isoFiles.DeleteFile(Settings.FILE_AVATAR_JPG);

    var fileStream = isoFiles.CreateFile(Settings.FILE_AVATAR_JPG);
    var wb = new WriteableBitmap(bitmap);
    wb.SaveJpeg(fileStream, 120, 120, 0, 85); // file is saved

    var memoryStream = new MemoryStream();
    fileStream.CopyTo(memoryStream);          // here, memoryStream ends up with no data
    byte[] result = memoryStream.ToArray();
    fileStream.Close();

    var base64 = Convert.ToBase64String(result);
}

Stream.CopyTo copies from the current position of fileStream, which SaveJpeg() has advanced to the end of the stream, so you need to reset it first:
var memoryStream = new MemoryStream();
fileStream.Position = 0;
fileStream.CopyTo(memoryStream);
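
For completeness, a minimal sketch of the corrected method (same isoFiles store and Settings.FILE_AVATAR_JPG constant as above, with disposal added):

public void SetImage(BitmapImage bitmap)
{
    if (isoFiles.FileExists(Settings.FILE_AVATAR_JPG))
        isoFiles.DeleteFile(Settings.FILE_AVATAR_JPG);

    using (var fileStream = isoFiles.CreateFile(Settings.FILE_AVATAR_JPG))
    using (var memoryStream = new MemoryStream())
    {
        var wb = new WriteableBitmap(bitmap);
        wb.SaveJpeg(fileStream, 120, 120, 0, 85); // writes the JPEG and leaves Position at the end

        fileStream.Position = 0;                  // rewind before copying
        fileStream.CopyTo(memoryStream);

        var base64 = Convert.ToBase64String(memoryStream.ToArray());
        // ... send base64 to the server
    }
}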

Related

GZipStream does not decompress data correctly

I'm trying to compress data with GZipStream. The code is quite straightforward:
// Serialize
var ms = new MemoryStream();
ProtoBuf.Serializer.Serialize(ms, result);
ms.Seek(0, SeekOrigin.Begin);
// Compress
var ms2 = new MemoryStream();
GZipStream zipStream = new GZipStream(ms2, CompressionMode.Compress);
ms.CopyTo(zipStream);
zipStream.Flush();
// Test
ms2.Seek(0, SeekOrigin.Begin);
var ms3 = new MemoryStream();
var unzipStream = new GZipStream(ms2, CompressionMode.Decompress);
unzipStream.CopyTo(ms3);
System.Diagnostics.Debug.WriteLine($"{ms.Length} =? {ms3.Length}");
Results should be equal, but I'm getting:
244480 =? 191481
Is GZipStream unable to decompress a stream it compressed itself, or am I doing something wrong?
From the docs of GZipStream.Flush:
The current implementation of this method does not flush the internal buffer. The internal buffer is flushed when the object is disposed.
That fits with not enough data being written to ms2. Try wrapping zipStream in a using block instead, so disposing it flushes the remaining compressed data:
var ms2 = new MemoryStream();
using (GZipStream zipStream = new GZipStream(ms2, CompressionMode.Compress, leaveOpen: true))
{
    ms.CopyTo(zipStream);
}
// leaveOpen: true keeps ms2 usable for the Seek and decompress test that follows
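
A minimal round-trip sketch with those changes (using sample bytes in place of the ProtoBuf payload):

using System;
using System.IO;
using System.IO.Compression;

var original = new byte[244480];
new Random(1).NextBytes(original);                 // stand-in for the serialized data

// Compress: dispose the GZipStream so its internal buffer is flushed.
var ms2 = new MemoryStream();
using (var zipStream = new GZipStream(ms2, CompressionMode.Compress, leaveOpen: true))
{
    zipStream.Write(original, 0, original.Length);
}

// Decompress: rewind the compressed stream first.
ms2.Seek(0, SeekOrigin.Begin);
var ms3 = new MemoryStream();
using (var unzipStream = new GZipStream(ms2, CompressionMode.Decompress))
{
    unzipStream.CopyTo(ms3);
}

Console.WriteLine($"{original.Length} =? {ms3.Length}"); // the two lengths now match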

Write bytes to End Of File PNG format

I have a resource file containing a PNG image. I want to append a byte array to the end of the PNG data (hidden bytes), but when the file is saved from the stream the appended bytes are removed. How can I resolve this?
public void Method()
{
    using (var reader = new ResourceReader("My.resource"))
    {
        var iter = reader.GetEnumerator();
        var ms = new MemoryStream();
        var writer = new System.Resources.ResourceWriter(ms);
        while (iter.MoveNext())
        {
            var key = iter.Key as string;
            var value = iter.Value as Bitmap;

            MemoryStream stream = new MemoryStream();
            value.Save(stream, value.RawFormat);      // write the image data
            byte[] b = new byte[] { 0x11, 0x22, 0x33 };
            stream.Write(b, 0, b.Length);             // append the hidden bytes

            Image bitmap = Image.FromStream(stream);  // re-decode the stream
            writer.AddResource(key, bitmap);
        }
        writer.Generate();
        File.WriteAllBytes("My1.resource", ms.ToArray());
    }
}
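
A likely cause: Image.FromStream decodes the PNG, and ResourceWriter re-encodes that Image when it serializes the resource, so anything appended after the image data is discarded. A hedged sketch of one workaround, storing the raw bytes (PNG plus the appended trailer) as a byte[] resource instead of a re-encoded Image (the method name is just for illustration):

using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Resources;

public void MethodWithTrailer()
{
    using (var reader = new ResourceReader("My.resource"))
    using (var ms = new MemoryStream())
    {
        var writer = new ResourceWriter(ms);
        var iter = reader.GetEnumerator();
        while (iter.MoveNext())
        {
            var key = (string)iter.Key;
            var value = (Bitmap)iter.Value;

            using (var stream = new MemoryStream())
            {
                value.Save(stream, ImageFormat.Png);       // the PNG bytes
                byte[] b = { 0x11, 0x22, 0x33 };
                stream.Write(b, 0, b.Length);              // the hidden trailer stays in the buffer

                writer.AddResource(key, stream.ToArray()); // store raw bytes, no re-encode
            }
        }
        writer.Generate();
        File.WriteAllBytes("My1.resource", ms.ToArray());
    }
}

Whatever reads the resource back then receives a byte[] (a complete PNG followed by the trailer) rather than a Bitmap, and can strip or inspect the appended bytes itself.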

Deflate stream not reading

DeflateStream.Read is not working: I'm trying to read back from a compressed memory stream, but the byte array passed to Read stays empty.
var memoryStream = new MemoryStream();
var writeStream = new DeflateStream(memoryStream, CompressionLevel.Optimal, true);
var readStream = new DeflateStream(memoryStream, CompressionMode.Decompress, true);

var serializedPayloadBytes = Serialize(new Payload { Message = "Payload" });
var serializedHeaderBytes = Serialize(new PayloadHeader { Length = serializedPayloadBytes.Length });
var headerSize = serializedHeaderBytes.Length;

var package = new byte[serializedHeaderBytes.Length + serializedPayloadBytes.Length];
Buffer.BlockCopy(serializedHeaderBytes, 0, package, 0, serializedHeaderBytes.Length);
Buffer.BlockCopy(serializedPayloadBytes, 0, package, serializedHeaderBytes.Length, serializedPayloadBytes.Length);

writeStream.Write(package, 0, package.Length);
writeStream.Flush();
writeStream.Close();

var arr = new byte[headerSize];
readStream.Read(arr, 0, headerSize);
arr is always empty (all bytes are zero), even though memoryStream.ToArray() shows the stream contains data.
The MemoryStream's Position is at the end of the stream after writing. You have to set it back to 0 if you want to read from the same stream after writing to it:
memoryStream.Position = 0;
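
A minimal sketch of the fix in isolation (assuming an arbitrary byte[] payload in place of the serialized header and payload):

using System.IO;
using System.IO.Compression;

var payload = new byte[] { 1, 2, 3, 4, 5 };

var memoryStream = new MemoryStream();
using (var writeStream = new DeflateStream(memoryStream, CompressionLevel.Optimal, leaveOpen: true))
{
    writeStream.Write(payload, 0, payload.Length);
}   // disposing the DeflateStream pushes the remaining compressed bytes into memoryStream

memoryStream.Position = 0;   // rewind before decompressing from the same stream

var arr = new byte[payload.Length];
using (var readStream = new DeflateStream(memoryStream, CompressionMode.Decompress, leaveOpen: true))
{
    readStream.Read(arr, 0, arr.Length);   // arr now contains 1, 2, 3, 4, 5
    // for larger payloads, loop until Read returns 0
}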

Error decompressing gzipstream -- The magic number in GZip header is not correct

I'm using the C# System.IO classes (.NET Framework 4.0) to compress an image picked from the file system with a FileDialog and insert it into a SQL Server database as a varbinary(max) column. The problem: when I pull the data out of the database and attempt to decompress it, I get the error in the title, along with the additional message "make sure you are passing in a GZip stream".
The code to get the file:
OpenFileDialog dlgOpen = new OpenFileDialog();
if (dlgOpen.ShowDialog() == DialogResult.OK)
{
    FileStream fs = File.OpenRead(dlgOpen.FileName);
    byte[] picbyte1 = new byte[fs.Length];
    byte[] picbyte = Compress(picbyte1);
    fs.Read(picbyte, 0, System.Convert.ToInt32(picbyte.Length));
    String ImageName = dlgOpen.FileName;
    //String bs64OfBytes = Convert.ToBase64String(picbyte);
    fs.Close();
    //additional code inserts into database
    ....
}
The compress method:
private static byte[] Compress(byte[] data)
{
    var output = new MemoryStream();
    using (var gzip = new GZipStream(output, CompressionMode.Compress, true))
    {
        gzip.Write(data, 0, data.Length);
        gzip.Close();
    }
    return output.ToArray();
}
The decompress method:
private static byte[] Decompress(byte[] data)
{
    var output = new MemoryStream();
    var input = new MemoryStream();
    input.Write(data, 0, data.Length);
    input.Position = 0;
    using (var gzip = new GZipStream(input, CompressionMode.Decompress, true))
    {
        var buff = new byte[64]; // also used 32
        var read = gzip.Read(buff, 0, buff.Length); // error occurs here
        while (read > 0)
        {
            output.Write(buff, 0, read);
            read = gzip.Read(buff, 0, buff.Length);
        }
        gzip.Close();
    }
    return output.ToArray();
}
You need to insert one line and remove another:
FileStream fs = File.OpenRead(dlgOpen.FileName);
byte[] picbyte1 = new byte[fs.Length];
fs.Read(picbyte1, 0, (int)fs.Length); // <-- Add this one
byte[] picbyte = Compress(picbyte1);
// fs.Read(picbyte, 0, System.Convert.ToInt32(picbyte.Length)); // <-- And remove this one
// ...
You are reading the image in your code, but something is in the wrong order:
// Original, but incorrect, sequence
FileStream fs = File.OpenRead(dlgOpen.FileName); // Open the file
byte[] picbyte1 = new byte[fs.Length]; // Allocate the array
byte[] picbyte = Compress(picbyte1); // Compress the allocated array, but it has no contents yet...
fs.Read(picbyte, 0, System.Convert.ToInt32(picbyte.Length)); // ...then read the file into the already-compressed bytes
So what gets stored is the first part of the original file, not the compressed data (although the number of saved bytes matches the length of the compressed array). If you send this to the DB and read it back, the decompressor does not find its magic number.
As an improvement, you can change these lines:
FileStream fs = File.OpenRead(dlgOpen.FileName);
byte[] picbyte1 = new byte[fs.Length];
fs.Read(picbyte1, 0, (int)fs.Length); // line that I suggested to add
to the single line:
byte[] picbyte1 = File.ReadAllBytes(dlgOpen.FileName);
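
Putting it together, a sketch of the corrected file-reading block (reusing the Compress and Decompress methods above; the database code is unchanged):

OpenFileDialog dlgOpen = new OpenFileDialog();
if (dlgOpen.ShowDialog() == DialogResult.OK)
{
    byte[] original = File.ReadAllBytes(dlgOpen.FileName);
    byte[] picbyte = Compress(original);          // gzip'd bytes for the varbinary(max) column

    // Optional sanity check before touching the database:
    byte[] roundTrip = Decompress(picbyte);
    System.Diagnostics.Debug.Assert(roundTrip.Length == original.Length);

    String ImageName = dlgOpen.FileName;
    // additional code inserts picbyte into the database
}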

SharpZipLib - zero bytes

I am trying to use SharpZipLib, so I copied the snippet below from their samples:
// Compresses the supplied memory stream, naming it as zipEntryName, into a zip,
// which is returned as a memory stream or a byte array.
//
public MemoryStream CreateToMemoryStream(MemoryStream memStreamIn, string zipEntryName)
{
    MemoryStream outputMemStream = new MemoryStream();
    ZipOutputStream zipStream = new ZipOutputStream(outputMemStream);

    zipStream.SetLevel(3); // 0-9, 9 being the highest level of compression

    ZipEntry newEntry = new ZipEntry(zipEntryName);
    newEntry.DateTime = DateTime.Now;

    zipStream.PutNextEntry(newEntry);
    StreamUtils.Copy(memStreamIn, zipStream, new byte[4096]);
    zipStream.CloseEntry();

    zipStream.IsStreamOwner = false; // False stops the Close also closing the underlying stream.
    zipStream.Close();               // Must finish the ZipOutputStream before using outputMemStream.

    outputMemStream.Position = 0;
    return outputMemStream;

    // Alternative outputs:
    // ToArray is the cleaner and easiest to use correctly, with the penalty of duplicating allocated memory.
    //   byte[] byteArrayOut = outputMemStream.ToArray();
    // GetBuffer returns the raw buffer, so you need to account for the true length yourself.
    //   byte[] byteArrayOut = outputMemStream.GetBuffer();
    //   long len = outputMemStream.Length;
}
I copy-pasted that function and called it this way:
using (MemoryStream ms = new MemoryStream())
using (FileStream file = new FileStream(@"c:\file.jpg", FileMode.Open, FileAccess.Read))
{
    byte[] bytes = new byte[file.Length];
    file.Read(bytes, 0, (int)file.Length);
    ms.Write(bytes, 0, (int)file.Length);

    var result = SharpZip.CreateToMemoryStream(ms, "file.jpg");
    result.WriteTo(new FileStream(@"c:\myzip.zip", FileMode.Create, System.IO.FileAccess.Write));
}
The myzip.zip is successfully created, but the file.jpg inside it has zero bytes.
Any ideas? Thanks a lot.
It is necessary to "rewind" the input MemoryStream after writing the file data into it, because the copy starts from the stream's current position:
using (MemoryStream ms = new MemoryStream())
using (FileStream file = File.OpenRead(@"input file path"))
{
    byte[] bytes = new byte[file.Length];
    file.Read(bytes, 0, (int)file.Length);
    ms.Write(bytes, 0, (int)file.Length);
    ms.Position = 0; // "Rewind" the stream to the beginning.

    var result = SharpZip.CreateToMemoryStream(ms, "file.jpg");
    using (var outputStream = File.Create(@"output file path"))
    {
        result.WriteTo(outputStream);
    }
}
Alternative (slightly simplified) version of the implementation:
var bytes = File.ReadAllBytes(@"input file path");
using (MemoryStream ms = new MemoryStream(bytes))
{
    var result = SharpZip.CreateToMemoryStream(ms, "file.jpg");
    using (var outputStream = File.Create(@"output file path"))
    {
        result.WriteTo(outputStream);
    }
}
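
As a quick sanity check (a sketch, assuming SharpZipLib's ZipFile class is available), you can list the entries of the written archive and confirm the entry size is no longer zero:

var zip = new ICSharpCode.SharpZipLib.Zip.ZipFile(@"output file path");
foreach (ICSharpCode.SharpZipLib.Zip.ZipEntry entry in zip)
{
    System.Console.WriteLine($"{entry.Name}: {entry.Size} bytes"); // should no longer be 0
}
zip.Close();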
