Why is ZipEntry.Size always -1? - C#

I'm reading a ZipInputStream from a stream. There are 10 ZipEntries, but the size of all of them is -1! I can't figure out why, because there is data, so it must be > 0. Here's my code:
var zipInputStream = new ZipInputStream(new MemoryStream(reports));
ZipEntry zipEntry;
while ((zipEntry = zipInputStream.GetNextEntry()) != null)
{
var fileName = Path.GetFileName(zipEntry.Name);
if (String.IsNullOrEmpty(fileName)) continue;
var identifier = fileName.Split('.')[1];
var buffer = new byte[zipEntry.Size];
zipInputStream.Read(buffer, 0, buffer.Length);
var report = encoding.GetString(buffer);
...
}
On the line var buffer = new byte[zipEntry.Size] I get an OverflowException. When I check zipEntry.Size, it's always -1. If I write var buffer = new byte[4096], for example, it works, but that's not correct. Any thoughts, please? Thanks in advance!

Here, 0 would indicate "no data"; -1 indicates that it doesn't know the size of the data. Your best bet, then, is to read to the end of that entry. Perhaps:
MemoryStream ms = new MemoryStream();
while ((zipEntry = zipInputStream.GetNextEntry()) != null)
{
var fileName = Path.GetFileName(zipEntry.Name);
if (String.IsNullOrEmpty(fileName)) continue;
var identifier = fileName.Split('.')[1];
ms.SetLength(0); // reset between iterations, but let it re-use the memory
zipInputStream.CopyTo(ms);
var report = encoding.GetString(ms.GetBuffer(), 0, (int)ms.Length);
}
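If you would rather keep a per-entry byte[], here is a small sketch (reusing the zipInputStream, zipEntry and encoding variables from the question) that trusts Size only when the archive actually reports it, and falls back to buffering otherwise:
byte[] entryBytes;
if (zipEntry.Size >= 0)
{
    // size is known up front: read exactly that many bytes
    entryBytes = new byte[zipEntry.Size];
    int offset = 0, read;
    while (offset < entryBytes.Length &&
           (read = zipInputStream.Read(entryBytes, offset, entryBytes.Length - offset)) > 0)
    {
        offset += read;
    }
}
else
{
    // size unknown (-1): read the entry to its end
    using var entryMs = new MemoryStream();
    zipInputStream.CopyTo(entryMs);
    entryBytes = entryMs.ToArray();
}
var report = encoding.GetString(entryBytes);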

Related

C# gRPC file streaming, original file smaller than the streamed one

I am having some problems with setting up a request-stream type gRPC architecture. The code below is just for testing purposes and it is missing various validation checks, but the main issue is that the original file is always smaller than the received one.
Could the cause here be encoding? It doesn't matter what the file type is, the end result is always that the file sizes are different.
Protobuf interface:
syntax = "proto3";
package FileTransfer;
option csharp_namespace = "FileTransferProto";
service FileTransferService {
rpc DownloadFile(FileRequest) returns (stream ChunkMsg);
}
message ChunkMsg {
string FileName = 1;
int64 FileSize = 2;
bytes Chunk = 3;
}
message FileRequest {
string FilePath = 1;
}
Server side (sending):
public override async Task DownloadFile(FileRequest request, IServerStreamWriter<ChunkMsg> responseStream, ServerCallContext context)
{
string filePath = request.FilePath;
if (!File.Exists(filePath)) { return; }
FileInfo fileInfo = new FileInfo(filePath);
ChunkMsg chunk = new ChunkMsg();
chunk.FileName = Path.GetFileName(filePath);
chunk.FileSize = fileInfo.Length;
int fileChunkSize = 64 * 1024;
byte[] fileByteArray = File.ReadAllBytes(filePath);
byte[] fileChunk = new byte[fileChunkSize];
int fileOffset = 0;
while (fileOffset < fileByteArray.Length && !context.CancellationToken.IsCancellationRequested)
{
int length = Math.Min(fileChunkSize, fileByteArray.Length - fileOffset);
Buffer.BlockCopy(fileByteArray, fileOffset, fileChunk, 0, length);
fileOffset += length;
ByteString byteString = ByteString.CopyFrom(fileChunk);
chunk.Chunk = byteString;
await responseStream.WriteAsync(chunk).ConfigureAwait(false);
}
}
Client side (receiving):
public static async Task GetFile(string filePath)
{
var channel = Grpc.Net.Client.GrpcChannel.ForAddress("https://localhost:5001/", new GrpcChannelOptions
{
MaxReceiveMessageSize = 5 * 1024 * 1024, // 5 MB
MaxSendMessageSize = 5 * 1024 * 1024, // 5 MB
});
var client = new FileTransferProto.FileTransferService.FileTransferServiceClient(channel);
var request = new FileRequest { FilePath = filePath };
string tempFileName = $"temp_{DateTime.UtcNow.ToString("yyyyMMdd_HHmmss")}.tmp";
string finalFileName = tempFileName;
using (var call = client.DownloadFile(request))
{
await using (Stream fs = File.OpenWrite(tempFileName))
{
await foreach (ChunkMsg chunkMsg in call.ResponseStream.ReadAllAsync().ConfigureAwait(false))
{
Int64 totalSize = chunkMsg.FileSize;
string tempFinalFilePath = chunkMsg.FileName;
if (!string.IsNullOrEmpty(tempFinalFilePath))
{
finalFileName = chunkMsg.FileName;
}
fs.Write(chunkMsg.Chunk.ToByteArray());
}
}
}
if (finalFileName != tempFileName)
{
File.Move(tempFileName, finalFileName);
}
}
To add to Marc's answer, I feel like you can simplify your code a little bit.
using var fs = File.Open(filePath, System.IO.FileMode.Open);
int bytesRead;
var buffer = new byte[fileChunkSize];
while ((bytesRead = await fs.ReadAsync(buffer)) > 0)
{
await responseStream.WriteAsync(new ChunkMsg
{
// Here the correct number of bytes must be sent which is starting from
// index 0 up to the number of read bytes from the file stream.
// If you solely pass 'buffer' here, the same bug would be present.
Chunk = ByteString.CopyFrom(buffer[0..bytesRead]),
});
}
I've used the array range operator from C# 8.0, which makes this cleaner; alternatively, you can use the overload of ByteString.CopyFrom that takes an offset and a count of how many bytes to include.
In your write loop, the chunk you actually send is for the oversized buffer, not accounting for length. This means that the last segment includes some garbage and is oversized. The received payload will be oversized by this same amount. So: make sure you account for length when constructing the chunk to send.
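A minimal way to apply that, reusing the variables from the question's server loop, is to build each chunk only from the bytes that belong to it, for example:
int length = Math.Min(fileChunkSize, fileByteArray.Length - fileOffset);
// copy exactly 'length' bytes for this chunk, so the final chunk is not
// padded out to the full 64 KB buffer
chunk.Chunk = ByteString.CopyFrom(fileByteArray, fileOffset, length);
fileOffset += length;
await responseStream.WriteAsync(chunk).ConfigureAwait(false);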
I tested the code and modified it to transfer the correct size.
The complete code is available at the following URL: https://github.com/lisa3907/grpc.fileTransfer
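Note that the modified code below assumes an extra ChunkSize field on the ChunkMsg message, which is not in the original proto; something like:
message ChunkMsg {
  string FileName = 1;
  int64 FileSize = 2;
  bytes Chunk = 3;
  int32 ChunkSize = 4; // number of valid bytes in Chunk (assumed addition)
}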
Server-side code:
while (_offset < _file_bytes.Length)
{
if (context.CancellationToken.IsCancellationRequested)
break;
var _length = Math.Min(_chunk_size, _file_bytes.Length - _offset);
Buffer.BlockCopy(_file_bytes, _offset, _file_chunk, 0, _length);
_offset += _length;
_chunk.ChunkSize = _length;
_chunk.Chunk = ByteString.CopyFrom(_file_chunk);
await responseStream.WriteAsync(_chunk).ConfigureAwait(false);
}
Client-side code:
await foreach (var _chunk in _call.ResponseStream.ReadAllAsync().ConfigureAwait(false))
{
var _total_size = _chunk.FileSize;
if (!String.IsNullOrEmpty(_chunk.FileName))
{
_final_file = _chunk.FileName;
}
if (_chunk.Chunk.Length == _chunk.ChunkSize)
_fs.Write(_chunk.Chunk.ToByteArray());
else
{
_fs.Write(_chunk.Chunk.ToByteArray(), 0, _chunk.ChunkSize);
Console.WriteLine($"final chunk size: {_chunk.ChunkSize}");
}
}

Fixed read-only folders

I am trying to copy or move folders and files in my C# application, but the folders are Read-Only and that cannot be disabled: when I try, it's enabled again. I tried many solutions, but none worked... Yes, I am an administrator with all rights. I tried disabling Read-only from a script too, but that doesn't work.
const int CopyBufferSize = 64 * 1024;
public void CopyFile(string source, string destination)
{
//File.Copy(source, destination);
//Stopwatch swTotal = Stopwatch.StartNew();
using (var outputFile = File.Create(destination))
{
using (var inputFile = File.OpenRead(source))
{
// we need two buffers so we can ping-pong
var buffer1 = new byte[CopyBufferSize];
var buffer2 = new byte[CopyBufferSize];
var inputBuffer = buffer1;
int bytesRead;
IAsyncResult writeResult = null;
while ((bytesRead = inputFile.Read(inputBuffer, 0, CopyBufferSize)) != 0)
{
// Wait for pending write
if (writeResult != null)
{
writeResult.AsyncWaitHandle.WaitOne();
outputFile.EndWrite(writeResult);
writeResult = null;
}
// Assign the output buffer
var outputBuffer = inputBuffer;
// and swap input buffers
inputBuffer = (inputBuffer == buffer1) ? buffer2 : buffer1;
// begin asynchronous write
writeResult = outputFile.BeginWrite(outputBuffer, 0, bytesRead, null, null);
}
if (writeResult != null)
{
writeResult.AsyncWaitHandle.WaitOne();
outputFile.EndWrite(writeResult);
}
}
}
//swTotal.Stop();
//Console.WriteLine("Total time: {0:N4} seconds.", swTotal.Elapsed.TotalSeconds);
}
I tried with
File.Copy(source, destination);
too.
Thank you.
More details:
var fileName = "sourceFile.txt";
var source = Path.Combine(Environment.CurrentDirectory, fileName);
var destination = Path.Combine(destinationFolder, fileName);
File.Copy(source, destination);
OR
File.Copy(#"someDirectory\someFile.txt", #"otherDirectory\someFile.txt");

Deflate stream not reading

DeflateStream.Read is not working: I'm trying to read from a compressed memory stream, but the byte array argument of Read remains empty.
var memoryStream = new MemoryStream();
var writeStream = new DeflateStream(memoryStream, CompressionLevel.Optimal, true);
var readStream = new DeflateStream(memoryStream, CompressionMode.Decompress, true);
var serializedPayloadBytes = Serialize(new Payload { Message = "Payload" });
var serializedHeaderBytes = Serialize(new PayloadHeader { Length = serializedPayloadBytes.Length });
var headerSize = serializedHeaderBytes.Length;
var package = new byte[serializedHeaderBytes.Length + serializedPayloadBytes.Length];
Buffer.BlockCopy(serializedHeaderBytes, 0, package, 0, serializedHeaderBytes.Length);
Buffer.BlockCopy(serializedPayloadBytes, 0, package, serializedHeaderBytes.Length, serializedHeaderBytes.Length);
writeStream.Write(package, 0, package.Length);
writeStream.Flush();
writeStream.Close();
var arr = new byte[serializedHeaderBytes.Length];
readStream.Read(arr, 0, headerSize);
arr is always empty (all bytes are zero), while memoryStream.ToArray() does contain data.
The MemoryStream's Position is at the end of the stream after writing to it. You have to set it back to 0 if you want to read from the same stream after the write:
memoryStream.Position = 0;
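In the context of the code above (same variable names), the reset goes after closing the compression stream and before the first read from the decompression stream:
writeStream.Flush();
writeStream.Close();        // finishes writing the deflate data into memoryStream
memoryStream.Position = 0;  // rewind so the decompressor starts reading at the beginning
var arr = new byte[headerSize];
var bytesRead = readStream.Read(arr, 0, headerSize);
// note: Read may return fewer bytes than requested; loop if you need exactly headerSize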

Why does StringWriter.ToString return `System.Byte[]` and not the data?

The UnZipFile method writes the data from inputStream to outputWriter.
Why does sr.ToString() return System.Byte[] and not the data?
using (var sr = new StringWriter())
{
UnZipFile(response.GetResponseStream(), sr);
var content = sr.ToString();
}
public static void UnZipFile(Stream inputStream, TextWriter outputWriter)
{
using (var zipStream = new ZipInputStream(inputStream))
{
ZipEntry currentEntry;
if ((currentEntry = zipStream.GetNextEntry()) != null)
{
var size = 2048;
var data = new byte[size];
while (true)
{
size = zipStream.Read(data, 0, size);
if (size > 0)
{
outputWriter.Write(data);
}
else
{
break;
}
}
}
}
}
The problem is on the line:
outputWriter.Write(data);
StringWriter.Write has no overload expecting a byte[]. Therefore, Write(Object) is called instead. And according to MSDN:
Writes the text representation of an object to the text string or stream by calling the ToString method on that object.
Calling ToString on a byte array returns "System.Byte[]", which explains how that string ends up in your StringWriter.
The reason is simple:
data is of type byte[]. There is no overload for byte[] on StringWriter, so it uses the overload for object, which then calls ToString() on the boxed byte array and simply prints the type.
Your code is equivalent to this:
outputWriter.Write(data.ToString());
theateist,
Looking at the other answers here, I have to agree that the reason "ToString()" returns System.Byte[] is because that is what you are putting into it: everything put into the StringWriter has its own "ToString" method called on it (i.e. byte[].ToString() = "System.Byte[]"). In fact, the whole idea is that the StringWriter is only meant for writing into a string "buffer" (StringBuilder), so in theory, if your file were large enough (bigger than 2048 bytes), your output would be "System.Byte[]System.Byte[]" (etc.). Try this instead to unzip into a memory stream and then read from that stream; it may give a better understanding of what you are looking at. (Code not tested, just an example.)
using (Stream ms = new MemoryStream())
{
UnZipFile(response.GetResponseStream(), ms);
string content;
ms.Position = 0;
using(StreamReader s = new StreamReader(ms))
{
content = s.ReadToEnd();
}
}
public static void UnZipFile(Stream inputStream, Stream outputWriter)
{
using (var zipStream = new ZipInputStream(inputStream))
{
ZipEntry currentEntry;
if ((currentEntry = zipStream.GetNextEntry()) != null)
{
int size = 2048;
byte[] data = new byte[size];
while (true)
{
size = zipStream.Read(data, 0, size);
if (size > 0)
{
outputWriter.Write(data, 0, size);
}
else
{
break;
}
}
}
}
}
Another idea would actually be to use the encoding to get the string:
public string UnZipFile(Stream inputStream)
{
    string tmp = null;
    using (ZipInputStream zipStream = new ZipInputStream(inputStream))
    {
        ZipEntry currentEntry;
        if ((currentEntry = zipStream.GetNextEntry()) != null)
        {
            using (MemoryStream ms = new MemoryStream())
            {
                int size = 2048;
                byte[] data = new byte[size];
                while (true)
                {
                    if ((size = zipStream.Read(data, 0, size)) > 0)
                        ms.Write(data, 0, size);
                    else
                        break;
                }
                tmp = Encoding.Default.GetString(ms.ToArray());
            }
        }
    }
    return tmp;
}
Or, as one last idea, you could change your original code to use
outputWriter.Write(Encoding.Default.GetString(data, 0, size));
Instead of
outputWriter.Write(data);
By the way, please avoid the var keyword in posts; maybe it's just my pet peeve, but code is less readable when utilizing implicit types.
StringWriter.Write: MSDN
StringWriter.ToString: MSDN

FileResult with MemoryStream gives an empty result... what's the problem?

I'm generating ics files (iCalendar or RFC 2445 or however you call them) using a library that serializes the iCal contents into a MemoryStream, or actually any type of stream.
Here's my chunk of code:
public ActionResult iCal(int id) {
MyApp.Event kiEvt = evR.Get(id);
// Create a new iCalendar
iCalendar iCal = new iCalendar();
// Create the event, and add it to the iCalendar
DDay.iCal.Components.Event evt = iCal.Create<DDay.iCal.Components.Event>();
// Set information about the event
evt.Start = kiEvt.event_date;
evt.End = evt.Start.AddHours(kiEvt.event_duration); // This also sets the duration
evt.Description = kiEvt.description;
evt.Location = kiEvt.place;
evt.Summary = kiEvt.title;
// Serialize (save) the iCalendar
iCalendarSerializer serializer = new iCalendarSerializer(iCal);
System.IO.MemoryStream fs = new System.IO.MemoryStream();
serializer.Serialize(fs, System.Text.Encoding.UTF8);
return File(fs, "text/calendar", "MyApp.wyd."+kiEvt.id+".ics");
}
My problem is that fs contains some content, but the controller returns an empty file - with the proper MIME type and filename. I'm most probably missing something with the stream handling but can't figure out what.
Can anybody help me out here? Thanks in advance.
Just a guess: Do you need to Seek back to the start of the stream before you return it?
fs.Seek(0, SeekOrigin.Begin);
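In the action above, that would go right after serializing and before handing the stream to File(...):
serializer.Serialize(fs, System.Text.Encoding.UTF8);
fs.Seek(0, SeekOrigin.Begin); // rewind so the FileResult reads from the start instead of the end
return File(fs, "text/calendar", "MyApp.wyd." + kiEvt.id + ".ics");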
iCalendar iCal = new iCalendar();
foreach (CalendarItem item in _db.CalendarItems.Where(r => r.Start > DateTime.Now && r.Active == true && r.CalendarID == ID).ToList())
{
Event evt = new Event();
evt.Start = new iCalDateTime(item.Start);
evt.End = new iCalDateTime(item.End);
evt.Summary = "Some title";
evt.IsAllDay = false;
evt.Duration = (item.End - item.Start).Duration();
iCal.Events.Add(evt);
}
// Create a serialization context and serializer factory.
// These will be used to build the serializer for our object.
ISerializationContext ctx = new SerializationContext();
ISerializerFactory factory = new DDay.iCal.Serialization.iCalendar.SerializerFactory();
// Get a serializer for our object
IStringSerializer serializer = factory.Build(iCal.GetType(), ctx) as IStringSerializer;
if (serializer == null) return Content("");
string output = serializer.SerializeToString(iCal);
var contentType = "text/calendar";
var bytes = Encoding.UTF8.GetBytes(output);
var result = new FileContentResult(bytes, contentType);
result.FileDownloadName = "FileName.ics";
return result;
