I am currently working on GZIP decompression of HTTP traffic.
My server receives some data, and I'm cropping it and saving it in binary mode.
I've made a little script to download the gzip response from Stack Overflow and saved it to a .gz file.
Works fine!
But the "gzip" I receive from my FortiGate firewall ends up being corrupted.
Corrupted and working file here: https://gofile.io/d/j520Nr
The buffer is the corrupted file, and I'm not sure why.
Both files are extremely different (at least as far as I can tell), but the GZIP header is definitely present!
Can someone maybe compare these two files and tell me why they are that different?
Or maybe even show me how to fix it?
This is the HTML page both gzip files contain: What is the best way to parse html in C#?
My corrupted file is around 2 KB larger!
I would be happy about every step in the right direction; maybe it is something that can be fixed really easily!
The following code should show you my workflow. "ReadAll" is pretty slow but reads everything from the stream; it will be optimized, of course (or maybe it is the cause of the wrong gzip stream?).
public static byte[] ReadAll(NetworkStream stream, int buffer)
{
    byte[] data = new byte[buffer];
    using MemoryStream ms = new MemoryStream();
    int numBytesRead;
    while ((numBytesRead = stream.Read(data, 0, data.Length)) > 0)
    {
        ms.Write(data, 0, numBytesRead);
    }
    return ms.ToArray();
}
private bool Handled = false;

/// <summary>
/// Handles a client and passes matches to the parser for further investigation
/// </summary>
/// <param name="obj"></param>
private void HandleClient(object obj)
{
    TcpClient client = (TcpClient)obj;
    Out.Log(LogLevel.Verbose, $"Client {client.Client.RemoteEndPoint} connected");
    Data = null; // Resets data after each received stream
    // Get a stream object for reading and writing
    NetworkStream stream = client.GetStream();
    //MemoryStream memory = new MemoryStream();
    // Wait to receive all the data sent by the client.
    if (stream.CanRead)
    {
        Out.Log(LogLevel.Debug, "Can read stream");
        StringBuilder c_completeMessage = new StringBuilder();
        if (!Handled)
        {
            Out.Log(LogLevel.Warning, "Handling first and last client.");
            Handled = true;
            int breakPoint = 0;
            byte[] res = ReadAll(stream, 1024);
            // Scan for the GZIP magic number (0x1F 0x8B)
            for (int i = 0; i < res.Length - 1; i++)
            {
                if (res[i].Equals(31) && res[i + 1].Equals(139))
                {
                    breakPoint = i;
                    Out.Log(LogLevel.Error, GZIP_MAGIC + $" found. Magic Number of GZIP at :{breakPoint}:");
                    break;
                }
            }
            byte[] res2 = res.SubArray(breakPoint, res.Length - breakPoint - 7); // (7 for offset linebreaks, eol, etc.)
            res2.WriteToFile(@"C:\Users\--\Temporary\Buffer_ReadFully_cropped.gz");
        }
    }
}
As mentioned before, chunking and the buffer size played a big role here.
Remember, ICAP uses chunking, so you have to respond to the previous package with a CONTINUE; otherwise you will only receive the first X bytes from the server.
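For reference, answering the preview with a CONTINUE might look roughly like this on the NetworkStream (assuming System.Net.Sockets and System.Text are imported). The response line follows RFC 3507, but treat this as a sketch, since the exact headers your FortiGate expects may differ:
private static void SendIcapContinue(NetworkStream stream)
{
    // Tell the ICAP client to send the rest of the message after the preview chunk.
    byte[] response = Encoding.ASCII.GetBytes("ICAP/1.0 100 Continue\r\n\r\n");
    stream.Write(response, 0, response.Length);
    stream.Flush();
}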
I have created a small application using UnrealEngine 4.10 (UE4). Within that application, I am grabbing the colorBuffer via ReadPixels. I am then compressing the colorBuffer to PNG. Finally, the compressed colorBuffer is sent via TCP to another application. The UE4 application is written in c++, not that that should matter. The compressed colorBuffer is being sent every "tick" of the application - essentially 30 times a second (thereabouts).
My client, the one receiving the streamed PNG is written in c# and does the following:
Connect to the server
If connected, get the stream (memory stream)
Read the memory stream into a byte array
Convert the byte array to an image
Client implementation:
private void Timer_Tick(object sender, EventArgs e)
{
    var connected = tcp.IsConnected();
    if (connected)
    {
        var stream = tcp.GetStream(); // simply returns client.GetStream();
        int BYTES_TO_READ = 16;
        var buffer = new byte[BYTES_TO_READ];
        var totalBytesRead = 0;
        int bytesRead;
        do
        {
            // You have to do this in a loop because there's no
            // guarantee that all the bytes you need will be ready when
            // you call.
            bytesRead = stream.Read(buffer, totalBytesRead,
                BYTES_TO_READ - totalBytesRead);
            totalBytesRead += bytesRead;
        } while (totalBytesRead < BYTES_TO_READ);
        Image x = byteArrayToImage(buffer);
    }
}

public Image byteArrayToImage(byte[] byteArrayIn)
{
    var converter = new ImageConverter();
    Image img = (Image)converter.ConvertFrom(byteArrayIn);
    return img;
}
The problem is that Image img = (Image)converter.ConvertFrom(byteArrayIn); throws an ArgumentException, telling me "Parameter is not valid".
The data being sent looks like this:
My byteArrayIn and buffer look like this:
I have also tried both:
Image.FromStream(stream); and Image.FromStream(new MemoryStream(bytes));
Image.FromStream(stream); causes it to read forever... and Image.FromStream(new MemoryStream(bytes)); results in the same exception as mentioned above.
Some questions:
What size should I set BYTES_TO_READ to? I set it to 16 because when I check the size of the byte array being sent in the UE4 application (dataSize in the first image), it says the length is 16... Not too sure what to set this to.
Is the process that I have followed correct?
What am I doing wrong?
UPDATE
@RonBeyer asked if I could verify that the data sent from the server matches what is received. I have tried to do that and here is what I can say:
The data sent, as far as I can tell looks like this (sorry for formatting):
The data being received, looks like this:
var stream = tcp.GetStream();
int BYTES_TO_READ = 512;
var buffer = new byte[BYTES_TO_READ];
Int32 bytes = stream.Read(buffer, 0, buffer.Length);
var responseData = System.Text.Encoding.ASCII.GetString(buffer, 0, bytes);
//responseData looks like this (has been formatted for easier reading)
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
If I try to take a single line from the responseData and put that into an image:
var stringdata = "?PNG\r\n\u001a\n\0\0\0\rIHDR";
var data = System.Text.Encoding.ASCII.GetBytes(stringdata);
var ms = new MemoryStream(data);
Image img = Image.FromStream(ms);
data has a length of 16... the same length as the dataSize variable on the server. However, I again get the exception "Parameter is not valid".
UPDATE 2
@Darcara has helped by suggesting that what I was actually receiving was the header of the PNG file and that I needed to send the size of the image first. I have now done that and made progress:
for (TArray<class FSocket*>::TIterator ClientIt(Clients); ClientIt; ++ClientIt)
{
    FSocket *Client = *ClientIt;
    int32 SendSize = 2 * x * y;
    Client->SetNonBlocking(true);
    Client->SetSendBufferSize(SendSize, SendSize);
    Client->Send(data, SendSize, bytesSent);
}
With this, I am now getting the image on the first go, however, subsequent attempts fail with the same "Parameter is not valid". Upon inspection, I have noticed that the stream now appears to be missing the header... "?PNG\r\n\u001a\n\0\0\0\rIHDR". I came to this conclusion when I converted the buffer to a string using Encoding.ASCII.GetString(buffer, 0, bytes);
Any idea why the header is now only being sent to first time and never again? What can I do to fix it?
First of all, thank you to @Darcara and @RonBeyer for your help.
I now have a solution:
Server:
for (TArray<class FSocket*>::TIterator ClientIt(Clients); ClientIt; ++ClientIt)
{
    FSocket *Client = *ClientIt;
    int32 SendSize = x * y; // Image is 512 x 512
    Client->SetSendBufferSize(SendSize, SendSize);
    Client->Send(data, SendSize, bytesSent);
}
The first issue was that the size of the image needed to be correct:
int32 SendSize = 2 * x * y;
The line above is wrong. The image is 512 by 512 and so SendSize should be x * y where x & y are both 512.
The other issue was how I was handling the stream client side.
Client:
var connected = tcp.IsConnected();
if (connected)
{
    var stream = tcp.GetStream();
    var BYTES_TO_READ = (512 * 512)^2; // note: ^ is XOR in C#, so this is 262146, not (512*512) squared
    var buffer = new byte[BYTES_TO_READ];
    var bytes = stream.Read(buffer, 0, BYTES_TO_READ);
    Image returnImage = Image.FromStream(new MemoryStream(buffer));
    // Apply the image to a picture box. Note, this is running on a separate thread.
    UpdateImageViewerBackgroundWorker.ReportProgress(0, returnImage);
}
The var BYTES_TO_READ = (512 * 512)^2; now allocates a buffer that is large enough (note that ^ is the XOR operator in C#, not exponentiation, so this evaluates to 262,146 bytes rather than (512 * 512) squared).
I now have Unreal Engine 4 streaming its frames.
You are only reading the first 16 bytes of the stream. I'm guessing that is not intentional.
If the stream ends/connection closes after the image is transferred, use stream.CopyTo to copy it into a MemoryStream. Image.FromStream(stream) might also work
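For that first case, a minimal sketch might look like this (assuming System.Drawing, System.IO and System.Net.Sockets are imported, and that the sender closes the connection after each image; ReadSingleImage is just an illustrative name):
static Image ReadSingleImage(NetworkStream stream)
{
    var ms = new MemoryStream();
    stream.CopyTo(ms); // returns once the sender closes the connection
    ms.Position = 0;
    return Image.FromStream(ms); // keep ms alive for the lifetime of the Image (GDI+ requirement)
}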
If the stream does not end, you need to know the size of the transferred object beforehand, so you can copy it read-by-read into another array / memory stream or directly to disk. In that case a much higher read buffer should be used (default is 8192 I think). This is a lot more complicated though.
To manually read from the stream, you need to prepend your data with the size. A simple Int32 should suffice. Your client code might look something like this:
var stream = tcp.GetStream();

// this is our temporary read buffer
int BYTES_TO_READ = 8196;
var buffer = new byte[BYTES_TO_READ];
int bytesRead;

// read size of data object
stream.Read(buffer, 0, 4); // read 4 bytes into the beginning of the empty buffer
// TODO: check that we actually received 4 bytes.
var totalBytesExpected = BitConverter.ToInt32(buffer, 0);

// this will be the stream we will save our received bytes to
// could also be a file stream
var imageStream = new MemoryStream(totalBytesExpected);
var totalBytesRead = 0;
do
{
    // read as much as the buffer can hold or the remaining bytes
    bytesRead = stream.Read(buffer, 0, Math.Min(BYTES_TO_READ, totalBytesExpected - totalBytesRead));
    totalBytesRead += bytesRead;
    // write bytes to image stream
    imageStream.Write(buffer, 0, bytesRead);
} while (totalBytesRead < totalBytesExpected);
I glossed over a lot of error handling here, but that should give you the general idea.
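On the sending side, the same framing has to be written before the payload. Here is an illustrative C# sketch (the UE4 server would do the equivalent with FSocket::Send; SendFrame is a made-up helper name):
static void SendFrame(NetworkStream stream, byte[] payload)
{
    // 4-byte length prefix; BitConverter uses the platform's endianness (little-endian on x86/x64),
    // which matches the BitConverter.ToInt32 call on the receiving side.
    byte[] lengthPrefix = BitConverter.GetBytes(payload.Length);
    stream.Write(lengthPrefix, 0, lengthPrefix.Length);
    stream.Write(payload, 0, payload.Length);
}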
If you want to transfer more complex objects, look into proper protocols like Google Protocol Buffers or MessagePack.
I am currently testing several decompression libraries for a project I'm involved with to decompress http file streams on the fly. I have tried two very promising libraries and found an issue that seems to appear in both of them.
This is what I am doing:
video.avi compressed to video.zip on HTTP server test.com/video.zip (~20MB)
HttpWebRequest to read stream from the server
Write HttpWebRequest ResponseStream data into MemoryStream
Let decompression library read from MemoryStream
Read decompressed file stream while it's being downloaded by HttpWebRequest
The whole idea works fine: I'm able to decompress the compressed video and stream it directly into VLC's stdin, and it's rendered just fine. However, I have to use a read buffer of one byte on the decompression library. Any buffer larger than one byte causes the uncompressed data stream to be cut off. For a test I wrote the decompressed stream into a file and compared it with the original video.avi, and some data is simply skipped by the decompression. When streaming this broken data into VLC it causes a lot of video artifacts, and the playback speed is also greatly reduced.
If I knew the size of what is available to read, I could trim my buffer accordingly, but no library makes this information public, so all I can do is read the data with a one-byte buffer. Maybe my approach is wrong? Or maybe I'm overlooking something?
Here's an example code (requires VLC):
ICSharpCode.SharpZLib (http://icsharpcode.github.io/SharpZipLib/)
static void Main(string[] args)
{
    // Initialise VLC
    Process vlc = new Process()
    {
        StartInfo =
        {
            FileName = @"C:\Program Files\VideoLAN\vlc.exe", // Adjust as required to test the code
            RedirectStandardInput = true,
            UseShellExecute = false,
            Arguments = "-"
        }
    };
    vlc.Start();
    Stream outStream = vlc.StandardInput.BaseStream;

    // Get source stream
    HttpWebRequest stream = (HttpWebRequest)WebRequest.Create("http://codefreak.net/~daniel/apps/stream60s-large.zip");
    Stream compressedVideoStream = stream.GetResponse().GetResponseStream();

    // Create local decompression loop
    MemoryStream compressedLoopback = new MemoryStream();
    ZipInputStream zipStream = new ZipInputStream(compressedLoopback);
    ZipEntry currentEntry = null;

    byte[] videoStreamBuffer = new byte[8192]; // 8 KB read buffer
    int read = 0;
    long totalRead = 0;
    while ((read = compressedVideoStream.Read(videoStreamBuffer, 0, videoStreamBuffer.Length)) > 0)
    {
        // Write compressed video stream into compressed loopback without affecting current read position
        long previousPosition = compressedLoopback.Position; // Store current read position
        compressedLoopback.Position = totalRead; // Jump to last write position
        totalRead += read; // Increase last write position by current read size
        compressedLoopback.Write(videoStreamBuffer, 0, read); // Write data into loopback
        compressedLoopback.Position = previousPosition; // Restore reading position

        // If not already, move to first entry
        if (currentEntry == null)
            currentEntry = zipStream.GetNextEntry();

        byte[] outputBuffer = new byte[1]; // Decompression read buffer, this is the bad one!
        int zipRead = 0;
        while ((zipRead = zipStream.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
            outStream.Write(outputBuffer, 0, outputBuffer.Length); // Write directly to VLC stdin
    }
}
SharpCompress (https://github.com/adamhathcock/sharpcompress)
static void Main(string[] args)
{
    // Initialise VLC
    Process vlc = new Process()
    {
        StartInfo =
        {
            FileName = @"C:\Program Files\VideoLAN\vlc.exe", // Adjust as required to test the code
            RedirectStandardInput = true,
            UseShellExecute = false,
            Arguments = "-"
        }
    };
    vlc.Start();
    Stream outStream = vlc.StandardInput.BaseStream;

    // Get source stream
    HttpWebRequest stream = (HttpWebRequest)WebRequest.Create("http://codefreak.net/~daniel/apps/stream60s-large.zip");
    Stream compressedVideoStream = stream.GetResponse().GetResponseStream();

    // Create local decompression loop
    MemoryStream compressedLoopback = new MemoryStream();
    ZipReader zipStream = null;
    EntryStream currentEntry = null;

    byte[] videoStreamBuffer = new byte[8192]; // 8 KB read buffer
    int read = 0;
    long totalRead = 0;
    while ((read = compressedVideoStream.Read(videoStreamBuffer, 0, videoStreamBuffer.Length)) > 0)
    {
        // Write compressed video stream into compressed loopback without affecting current read position
        long previousPosition = compressedLoopback.Position; // Store current read position
        compressedLoopback.Position = totalRead; // Jump to last write position
        totalRead += read; // Increase last write position by current read size
        compressedLoopback.Write(videoStreamBuffer, 0, read); // Write data into loopback
        compressedLoopback.Position = previousPosition; // Restore reading position

        // Open stream after writing to it because otherwise it will not be able to identify the compression type
        if (zipStream == null)
            zipStream = (ZipReader)ReaderFactory.Open(compressedLoopback); // Cast to ZipReader, as we know the type

        // If not already, move to first entry
        if (currentEntry == null)
        {
            zipStream.MoveToNextEntry();
            currentEntry = zipStream.OpenEntryStream();
        }

        byte[] outputBuffer = new byte[1]; // Decompression read buffer, this is the bad one!
        int zipRead = 0;
        while ((zipRead = currentEntry.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
            outStream.Write(outputBuffer, 0, outputBuffer.Length); // Write directly to VLC stdin
    }
}
To test this code I recommend setting the output buffer for SharpZipLib to 2 bytes and for SharpCompress to 8 bytes. You will see the artifacts, and also that the playback speed of the video is wrong; the seek time should always be aligned with the number counting up in the video.
I haven't really found any good explanation of why a larger outputBuffer for reading from the decompression library causes these problems, or a way to solve this other than using the tiniest possible buffer.
So my question is: what am I doing wrong, or is this a general issue when reading compressed files from streams? How can I increase the outputBuffer while still reading the correct data?
Any help is greatly appreciated!
Regards,
Gachl
You need to write only how many bytes you read. Writing the entire buffer size will add additional bytes (whatever happened to be in the buffer before). zipStream.Read is not required to read as many bytes as you request.
while ((zipRead = zipStream.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
    outStream.Write(outputBuffer, 0, zipRead); // Write directly to VLC stdin
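With that change, the decompression buffer no longer has to be a single byte; for example (8192 here is an arbitrary size, not something the library requires):
byte[] outputBuffer = new byte[8192]; // buffer size is now arbitrary
int zipRead = 0;
while ((zipRead = zipStream.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
    outStream.Write(outputBuffer, 0, zipRead); // write only the bytes actually read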
Edit: Solution is at bottom of post
I am trying my luck with reading binary files. Since I don't want to rely on byte[] AllBytes = File.ReadAllBytes(myPath), because the binary file might be rather big, I want to read small portions of the same size (which fits nicely with the file format to read) in a loop, using what I would call a "buffer".
public void ReadStream(MemoryStream ContentStream)
{
    byte[] buffer = new byte[sizePerHour];
    for (int hours = 0; hours < NumberHours; hours++)
    {
        int t = ContentStream.Read(buffer, 0, sizePerHour);
        SecondsToAdd = BitConverter.ToUInt32(buffer, 0);
        // further processing of my byte[] buffer
    }
}
My stream contains all the bytes I want, which is a good thing. When I enter the loop several things cease to work.
My int t is 0, although I would presume that ContentStream.Read() would copy data from the stream into my byte array, but that isn't the case.
I tried buffer = ContentStream.GetBuffer(), but that results in my buffer containing the whole stream, a behaviour I wanted to avoid by reading into a buffer in the first place.
Also, resetting the stream to position 0 before reading did not help, nor did specifying an offset for my Stream.Read(), which means I am lost.
Can anyone point me to reading small portions of a stream to a byte[]? Maybe with some code?
Thanks in advance
Edit:
Pointing me to the right direction was the answer, that .Read() returns 0 if the end of stream is reached. I modified my code to the following:
public void ReadStream(MemoryStream ContentStream)
{
    byte[] buffer = new byte[sizePerHour];
    ContentStream.Seek(0, SeekOrigin.Begin); // Added this line
    for (int hours = 0; hours < NumberHours; hours++)
    {
        int t = ContentStream.Read(buffer, 0, sizePerHour);
        SecondsToAdd = BitConverter.ToUInt32(buffer, 0);
        // further processing of my byte[] buffer
    }
}
And everything works like a charm. I initially reset the stream to its origin on every iteration over hours and passed an offset to Read. Moving the "set to beginning" part outside my loop and leaving the offset at 0 did the trick.
Read returns zero if the end of the stream is reached. Are you sure that your memory stream has the content you expect? I've tried the following and it works as expected:
// Create the source of the memory stream.
UInt32[] source = {42, 4711};
List<byte> sourceBuffer = new List<byte>();
Array.ForEach(source, v => sourceBuffer.AddRange(BitConverter.GetBytes(v)));

// Read the stream.
using (MemoryStream contentStream = new MemoryStream(sourceBuffer.ToArray()))
{
    byte[] buffer = new byte[sizeof (UInt32)];
    int t;
    do
    {
        t = contentStream.Read(buffer, 0, buffer.Length);
        if (t > 0)
        {
            UInt32 value = BitConverter.ToUInt32(buffer, 0);
        }
    } while (t > 0);
}
I am new to the C# world. I am using it for fast deployment of a solution to capture a live feed which comes in this form (curly brackets for clarity): {abcde}{CompressedMessage}, where {abcde} constitutes 5 characters indicating the length of the compressed message. The CompressedMessage is compressed using XCeedZip.dll and needs to be uncompressed using the DLL's Uncompress method. The Uncompress method returns an integer value indicating success or failure (of various sorts, e.g. no-license failure, uncompression failure, etc.). I am receiving failure 1003; see http://doc.xceedsoft.com/products/XceedZip/ for a reference of the return values of the Uncompress method.
while (true)
{
    byte[] receiveByte = new byte[1000];
    sock.Receive(receiveByte);
    string strData = System.Text.Encoding.ASCII.GetString(receiveByte, 0, receiveByte.Length);
    string cMesLen = strData.Substring(0, 5); // length of compressed message
    string compressedMessageStr = strData.Substring(5, strData.Length - 5);
    byte[] compressedBytes = System.Text.Encoding.ASCII.GetBytes(compressedMessageStr);

    // instantiating XceedCompression object
    XceedZipLib.XceedCompression obXCC = new XceedZipLib.XceedCompression();
    obXCC.License("blah");

    // uncompress method reference http://doc.xceedsoft.com/products/XceedZip/
    // Visual Studio displays the Uncompress method signature as Uncompress(ref object vaSource, out object vaUncompressed, bool bEndOfData)
    object oDest;
    object oSource = (object)compressedBytes;
    int status = (int)obXCC.Uncompress(ref oSource, out oDest, true);
    Console.WriteLine(status); // prints 1003 http://doc.xceedsoft.com/products/XceedZip/
}
So basically my question boils down to the invocation of the Uncompress method and the correct way of passing the parameters. I am in unfamiliar territory in the .NET world, so I won't be surprised if the question is really simplistic.
Thanks for replies.
UPDATE
I am now doing the following:
int iter = 1;
int bufSize = 1024;
byte[] receiveByte = new byte[bufSize];
while (true)
{
    sock.Receive(receiveByte);

    // fetch compressed message length
    int cMesLen = Convert.ToInt32(System.Text.Encoding.ASCII.GetString(receiveByte, 0, 5));
    byte[] cMessageByte = new byte[cMesLen];
    if (iter == 1)
    {
        if (cMesLen < bufSize)
        {
            for (int i = 5; i < 5 + cMesLen; ++i)
            {
                cMessageByte[i - 5] = receiveByte[i];
            }
        }
    }

    XceedZipLib.XceedCompression obXCC = new XceedZipLib.XceedCompression();
    obXCC.License("blah");
    object oDest;
    object oSource = (object)cMessageByte;
    int status = (int)obXCC.Uncompress(ref oSource, out oDest, true);
    if (iter == 1)
    {
        byte[] testByte = objectToByteArray(oDest);
        Console.WriteLine(System.Text.Encoding.ASCII.GetString(testByte, 0, testByte.Length));
    }
}

private byte[] objectToByteArray(Object obj)
{
    if (obj == null)
    {
        return null;
    }
    BinaryFormatter bf = new BinaryFormatter();
    MemoryStream ms = new MemoryStream();
    bf.Serialize(ms, obj);
    return ms.ToArray();
}
The problem is that the testByte WriteLine command prints out gibberish. Any suggestions on how to move forward on this? The status value from Uncompress is good and equal to 0 now.
The first mistake, always, is not looking at the return value of Receive; you have no idea how much data you just read, nor whether it constitutes an entire message.
It seems likely to me that you have corrupted the message payload by treating the entire data as ASCII. Rather than doing a GetString over the entire buffer, you should call GetString on only the first 5 bytes.
Correct process:
keep calling Receive (buffering the data, or increasing the offset and decreasing the count) until you have at least 5 bytes
process these 5 bytes to get the payload length
keep calling Receive (buffering the data, or increasing the offset and decreasing the count) until you have at least the payload length
process the payload without ever converting to/from ASCII
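A rough sketch of that process (ReceiveExactly is a hypothetical helper, not part of any library):
// Keeps calling Receive until exactly `count` bytes have arrived.
private static byte[] ReceiveExactly(Socket sock, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = sock.Receive(buffer, offset, count - offset, SocketFlags.None);
        if (read == 0)
            throw new InvalidOperationException("Connection closed before the full message arrived");
        offset += read;
    }
    return buffer;
}

// Usage: only the 5-character header is ASCII; the payload stays raw bytes.
// int messageLength = int.Parse(Encoding.ASCII.GetString(ReceiveExactly(sock, 5)));
// byte[] compressedBytes = ReceiveExactly(sock, messageLength);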
I once again need your help figuring out this problem of mine... It's already been a day and I can't seem to find out why this is happening in my code and output.
OK, so basically I am trying to implement Valve's RCON protocol in C#. So far I am getting the expected output given the code and sample usage below:
Usage:
RconExec(socket, "cvarlist");
Code:
private string RconExec(Socket sock, string command)
{
    if (!sock.Connected) throw new Exception("Not connected");

    //sock.DontFragment = true;
    sock.ReceiveTimeout = 10000;
    sock.SendTimeout = 10000;
    //sock.Blocking = true;

    Debug.WriteLine("Executing RCON Command: " + command);
    byte[] rconCmdPacket = GetRconCmdPacket(command);
    sock.Send(rconCmdPacket); // Send the request packet
    sock.Send(GetRconCmdPacket("echo END")); // This is the last response to be received from the server to indicate the end of the receiving process

    RconPacket rconCmdResponsePacket = null;
    string data = null;
    StringBuilder cmdResponse = new StringBuilder();
    RconPacket packet = null;
    int totalBytesRead = 0;

    do
    {
        byte[] buffer = new byte[4]; // Allocate buffer for the packet size field
        int bytesReceived = sock.Receive(buffer); // Read the first 4 bytes to determine the packet size
        int packetSize = BitConverter.ToInt32(buffer, 0); // Get the packet size

        // Now proceed with the rest of the data
        byte[] responseBuffer = new byte[packetSize];

        // Receive more data from server
        int bytesRead = sock.Receive(responseBuffer);

        // Parse the packet by wrapping it in the RconPacket class
        packet = new RconPacket(responseBuffer);
        totalBytesRead += packet.String1.Length;
        string response = packet.String1;
        cmdResponse.Append(packet.String1);
        Debug.WriteLine(response);
        Thread.Sleep(50);
    } while (!packet.String1.Substring(0, 3).Equals("END"));

    Debug.WriteLine("DONE..Exited the Loop");
    Debug.WriteLine("Bytes Read: " + totalBytesRead + ", Buffer Length: " + cmdResponse.Length);
    sock.Disconnect(true);
    return "";
}
The Problem:
This is not yet the final code, as I am just testing the output in the Debug window. There are a couple of issues occurring if I modify the code to its actual state.
Removing Thread.Sleep(50)
If I remove Thread.Sleep(50), the output doesn't complete and the code ends up throwing an exception. I noticed the 'END' termination string is sent by the server prematurely. This string was expected to be sent by the server only when the whole list completes.
I tested this numerous times and the same thing happens; if I don't remove the line, the list completes and the function exits the loop properly.
If I remove Debug.WriteLine(response); from within the loop and instead output the string using Debug.WriteLine(cmdResponse.ToString()); outside the loop, only partial list data is displayed. If I compare the actual bytes read in the loop with the length of the StringBuilder instance, they're just the same. Click here for the output generated.
Why is this happening given the two scenarios mentioned above?
You are not considering that Socket.Receive could very well read fewer bytes than the length of the supplied buffer. The return value tells you the number of bytes that was actually read. I see that you are properly storing this value in a variable, but I cannot see any code that uses it.
You should be prepared to make several calls to Receive to retrieve the entire package. In particular when you receive the package data.
I'm not sure that this is the reason for your problem. But it could be, since a short delay on the client side could be enough to fill the network buffers so that the entire package is read in a single call.
Try using the following code to retrieve package data:
int bufferPos = 0;
while (bufferPos < responseBuffer.Length)
{
    bufferPos += socket.Receive(responseBuffer, bufferPos, responseBuffer.Length - bufferPos, SocketFlags.None);
}
Note: You should also support the case when the first call to Receive (the one where you receive the package's data length) doesn't return 4 bytes.
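A sketch of reading the size field with the same pattern (not tested against your protocol; it simply loops until all 4 bytes have arrived):
// Read exactly 4 bytes for the packet size field before parsing it.
byte[] sizeBuffer = new byte[4];
int sizePos = 0;
while (sizePos < sizeBuffer.Length)
{
    int read = sock.Receive(sizeBuffer, sizePos, sizeBuffer.Length - sizePos, SocketFlags.None);
    if (read == 0)
        throw new Exception("Connection closed while reading the packet size");
    sizePos += read;
}
int packetSize = BitConverter.ToInt32(sizeBuffer, 0);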