Handling large byte streams when receiving from a serial port in C#

I am new to serial ports. My current project is to extract data from a machine. I'm receiving data via the DataReceived event, and the machine sends raw bytes.
My problem is that the first wave of bytes seems to be converted to a string correctly, but in the second batch of bytes I get garbage data.
(Screenshot of the output from the multi-currency reader machine omitted.)
I think the garbage data is the serial numbers.
Here is my code in the DataReceived handler:
private void _serialPort_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    while (serialPort.BytesToRead > 0)
    {
        // Initialize a buffer to hold the received data
        byte[] buffer = new byte[serialPort.ReadBufferSize];
        // There is no accurate way to know how many bytes were read
        // other than checking the return value of the Read method
        int bytesRead = serialPort.Read(buffer, 0, buffer.Length);
        string asd = System.Text.Encoding.ASCII.GetString(buffer, 0, bytesRead);
        // For this example, assume the received data is ASCII
        tString += Encoding.ASCII.GetString(buffer, 0, bytesRead);
        temp += System.Text.Encoding.Unicode.GetString(buffer, 0, bytesRead);
        temp2 += System.Text.Encoding.UTF32.GetString(buffer, 0, bytesRead);
        System.IO.File.WriteAllText(@"C:\OutputTextFiles\WriteLines.txt", tString);
        System.IO.File.WriteAllText(@"C:\OutputTextFiles\WriteLines2.txt", temp);
        System.IO.File.WriteAllText(@"C:\OutputTextFiles\WriteLines3.txt", temp2);
    }
}
I'm writing the output to a txt file.
I hope someone can help me with my problem. Any tips or suggestions on data handling, especially bytes?

Without knowing the value of serialPort.ReadBufferSize, I can only suspect that your buffer boundary is splitting the encoded bytes of your string. A character can be made of one or more bytes.
The trick is to read all of the bytes before decoding the string.
Try this example program:
var encoding = System.Text.Encoding.Unicode;
var message = "I am new in serial port. Currently my project is to extract data from machine.";
using (var ms = new MemoryStream(encoding.GetBytes(message)))
{
    var bytes = new List<byte>();
    var buffer = new byte[23];
    var bytesRead = ms.Read(buffer, 0, buffer.Length);
    while (bytesRead > 0)
    {
        Console.WriteLine(encoding.GetString(buffer, 0, bytesRead));
        bytes.AddRange(buffer.Take(bytesRead));
        bytesRead = ms.Read(buffer, 0, buffer.Length);
    }
    Console.WriteLine(encoding.GetString(bytes.ToArray(), 0, bytes.Count));
}
It will output the following:
I am new in�
猀攀爀椀愀氀 瀀漀爀琀�
. Currently�
洀礀 瀀爀漀樀攀挀琀 �
is to extra�
琀 搀愀琀愀 昀爀漀洀�
machine.
I am new in serial port. Currently my project is to extract data from machine.
The final line is correct because it uses all of the bytes to decode. The previous lines have errors because I've used a buffer size of 23 which breaks the string encoding.
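Alternatively, when you can't buffer everything before decoding, System.Text.Decoder handles this incrementally: it remembers a trailing partial character between calls, so each chunk decodes cleanly even when a multi-byte character is split across reads. A standalone sketch of the same scenario (same message and deliberately misaligned 23-byte buffer as above):

```csharp
using System;
using System.IO;
using System.Text;

class DecoderDemo
{
    public static void Main()
    {
        var encoding = Encoding.Unicode;
        var message = "I am new in serial port. Currently my project is to extract data from machine.";
        // A Decoder is stateful: it buffers incomplete byte sequences
        // between GetChars calls instead of emitting garbage.
        var decoder = encoding.GetDecoder();
        var result = new StringBuilder();

        using (var ms = new MemoryStream(encoding.GetBytes(message)))
        {
            var buffer = new byte[23]; // deliberately misaligned chunk size
            int bytesRead;
            while ((bytesRead = ms.Read(buffer, 0, buffer.Length)) > 0)
            {
                var chars = new char[decoder.GetCharCount(buffer, 0, bytesRead)];
                int charCount = decoder.GetChars(buffer, 0, bytesRead, chars, 0);
                result.Append(chars, 0, charCount);
            }
        }

        Console.WriteLine(result); // the original sentence, intact
    }
}
```

This is the same Decoder pattern the SslStream question further down uses for its greeting, applied to the serial-port case.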

Related

How do I keep reading data from my Socket to append my buffer

So I'm trying to send an image from my client to my server, and I'm sending the entire image as a whole, meaning I'm not splitting it up into chunks; I'm just sending the entire byte array as-is.
CLIENT
private void SendImage(byte[] opcode, byte[] length, byte[] payload)
{
    var packet = new byte[payload.Length + length.Length + 1];
    //Set the opcode
    Array.Copy(opcode, 0, packet, 0, 1);
    //Set the length
    Array.Copy(length, 0, packet, 1, length.Length);
    //Set the payload
    Array.Copy(payload, 0, packet, 5, payload.Length);
    _clientSocket.Send(packet);
}
This sends just fine. I'm using the opcode 0x15, which the server interprets as "there is an image incoming". The length is payload.Length, which I obtained with
var length = BitConverter.GetBytes(myImage.Length);
so it occupies 4 bytes.
So my packet structure looks like this: OpCode (1 byte), Length (4 bytes), Payload (image bytes).
This method works just fine; it sends without any issues whatsoever. I only included the code for it so that the next part makes sense.
SERVER
private byte[] _buffer = new byte[1024];

private void ReceiveCallback(IAsyncResult ar)
{
    var client = (Socket)ar.AsyncState;
    int received = client.EndReceive(ar);
    //Temporary buffer
    var dataBuf = new byte[received];
    Array.Copy(_buffer, dataBuf, received);
    switch (dataBuf[0])
    {
        //Received image
        case 0x15:
            //Read the packet header and check the length of the payload
            var length = BitConverter.ToInt32(dataBuf.Skip(1).Take(4).ToArray(), 0);
            //This will hold the bytes for the image
            var imageBuffer = new byte[length];
            //First incoming packet payload (image bytes)
            var imgData = dataBuf.Skip(5).ToArray();
            //Copy that into the buffer
            Array.Copy(imgData, 0, imageBuffer, 0, imgData.Length);
            var pos = imgData.Length;
            //Keep reading bytes from the incoming stream
            while (dataBuf.Length > 0)
            {
                imgData = new byte[1024];
                client.Receive(imgData);
                Array.Copy(imgData, 0, imageBuffer, pos, dataBuf.Length);
                pos += 1024;
            }
            //This takes the bytes and creates a bitmap from it
            AnotherViewModel.SetImage(imageBuffer);
            break;
        default:
            Debug.WriteLine("Wat");
            break;
    }
    client.BeginReceive(_buffer, 0, _buffer.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), client);
}
The issue I'm facing is that I don't know how to keep appending the incoming data to the imageBuffer, because I want to receive the full image and not just the first 4 bytes, and with my while loop I'm currently getting this exception
Destination array was not long enough. Check destIndex and length, and
the array's lower bounds.
Over at this line Array.Copy(imgData, 0, imageBuffer, pos, dataBuf.Length);
How do I properly read the entire image that the client sends to the server?
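One way to fix the loop (a sketch, not the poster's code: the helper name is mine) is a small extension method that keeps calling Receive until exactly the requested number of bytes has landed in the buffer, advancing the offset by however many bytes each call actually returned:

```csharp
using System;
using System.Net.Sockets;

static class SocketExtensions
{
    // Blocks until exactly `count` bytes have been read into `buffer`
    // starting at `offset`; throws if the connection closes early.
    public static void ReceiveExactly(this Socket socket, byte[] buffer, int offset, int count)
    {
        int total = 0;
        while (total < count)
        {
            // Receive returns how many bytes actually arrived,
            // which may be fewer than asked for.
            int received = socket.Receive(buffer, offset + total, count - total, SocketFlags.None);
            if (received == 0)
                throw new SocketException((int)SocketError.ConnectionReset);
            total += received;
        }
    }
}
```

With something like that in place, the image branch could replace the whole while loop with client.ReceiveExactly(imageBuffer, pos, length - pos);.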

Socket data is not being received (tested with Hercules)

I'm sending data to a Lector device.
Normally I receive data from the device when I send the command from Hercules.
Hercules returns "sRA eExtIn1 0 0 0".
The code below blocks at the stream.Read() line.
How can I get the data from the device?
string responseData = null;
using (TcpClient client = new TcpClient("10.1.13.102", 2111))
{
    using (NetworkStream stream = client.GetStream())
    {
        byte[] sentData = System.Text.Encoding.ASCII.GetBytes("<STX>sRN eExtIn1<ETX>");
        stream.Write(sentData, 0, sentData.Length);
        byte[] buffer = new byte[32];
        int bytes;
        if (client.Connected)
        {
            while ((bytes = stream.Read(buffer, 0, buffer.Length)) != 0)
            {
                for (int i = 0; i < bytes; i++)
                {
                    responseData += (char)buffer[i];
                }
            }
        }
    }
}
The mistake you're making, and the other answer is also making, is assuming that stream.Read won't return until it has read 32 bytes. That is incorrect.
https://msdn.microsoft.com/en-us/library/system.net.sockets.networkstream.read(v=vs.110).aspx
This method reads data into the buffer parameter and returns the
number of bytes successfully read. If no data is available for
reading, the Read method returns 0. The Read operation reads as much
data as is available, up to the number of bytes specified by the size
parameter.
It will return when there is no data available to read or 32 bytes have been read, whichever comes first. So if, for example, the client is slow or the network very busy, the response may not have arrived yet when you call stream.Read. Consequently, there will be nothing to read so it will return 0 and you will exit, failing to read the data. In fact, you may have to call stream.Read any number of times to get the full 32 bytes if the network is very saturated and data is arriving a few bytes at a time (not likely with such a small packet, but you have to code it that way).
So your code needs to look like this (note the read loop now accumulates bytes until the full 32 have arrived or the stream ends):
using (TcpClient client = new TcpClient("10.1.13.102", 2111))
{
    using (NetworkStream stream = client.GetStream())
    {
        byte[] sentData = System.Text.Encoding.ASCII.GetBytes("<STX>sRN eExtIn1<ETX>");
        stream.Write(sentData, 0, sentData.Length);
        byte[] buffer = new byte[32];
        int bytes;
        if (client.Connected)
        {
            int bytesRead = 0;
            while (bytesRead < buffer.Length
                && (bytes = stream.Read(buffer, 0, buffer.Length)) != 0)
            {
                for (int i = 0; i < bytes; i++)
                {
                    responseData += (char)buffer[i];
                }
                bytesRead += bytes;
            }
        }
    }
}
Thanks, everybody.
I found the solution to my question.
The <STX> and <ETX> framing characters should be sent as raw bytes, like below:
byte[] byt = System.Text.Encoding.ASCII.GetBytes("sRN DItype");
stream.Write(STX, 0, 1);
stream.Write(byt, 0, byt.Length);
stream.Write(ETX, 0, 1);
stream.Read(buffer, 0, buffer.Length);

C# sending a request over SSL via TCP doesn't work

Here is my code to send a TCP request over SSL and read the answer from the server, but it doesn't work. Debugging and trying to catch exceptions didn't help.
public void SendRequest(string request)
{
    byte[] buffer = new byte[2048];
    int bytes = -1;
    sslStream.Write(Encoding.ASCII.GetBytes(request));
    bytes = sslStream.Read(buffer, 0, buffer.Length);
    Console.WriteLine(Encoding.ASCII.GetString(buffer, 0, bytes));
}
I took that from a Stack Overflow answer, so I'm surprised it doesn't work. Here is my code that receives the greeting from the server (it works properly):
byte[] buffer = new byte[2048];
StringBuilder messageData = new StringBuilder();
int bytes = 0;
bytes = sslStream.Read(buffer, 0, buffer.Length);
Decoder decoder = Encoding.UTF8.GetDecoder();
char[] chars = new char[decoder.GetCharCount(buffer, 0, bytes)];
decoder.GetChars(buffer, 0, bytes, chars, 0);
messageData.Append(chars);
Console.Write(messageData.ToString());
If the sslStream.Read(buffer, 0, buffer.Length); is hanging, it's because the server hasn't sent a response.
Taking a look at your code on github (linked from your other question), it looks like you are using Console.ReadLine() to read commands and then write them to your network stream.
What is happening is that ReadLine() strips off the "\r\n" in the string that it returns, so what you'll need to do when sending it to the server is to add back the "\r\n" so that the server knows that the command is finished (it waits until it gets an EOL sequence before it will respond).
What you could do is in SendRequest(), you could do:
sslStream.Write(Encoding.ASCII.GetBytes(request + "\r\n"));

Screenshot not transferring fully

My program currently sends a screenshot from one computer to the other, but on the receiving side the image comes out broken (screenshot omitted):
Bitmap bmpScreenShot = new Bitmap(screenWidth, screenHeight);
Graphics gfx = Graphics.FromImage((Image)bmpScreenShot);
gfx.CopyFromScreen(0, 0, 0, 0, new Size(screenWidth, screenHeight));
bmpScreenShot.Save("pic.jpg", ImageFormat.Jpeg);
MemoryStream ms = new MemoryStream();
bmpScreenShot.Save(ms, ImageFormat.Jpeg);
bmpbyte = ms.ToArray();
bmpScreenShot.Dispose();
ms.Close();
///////////////////////////
Send_Text("" + screenHeight);
textBox3.Text += ("\r\nSending Hight=" + screenHeight);
Send_Text("" + screenWidth);
textBox3.Text += ("\r\nSending Width=" + screenWidth);
System.Threading.Thread.Sleep(200);
Send_Text("" + bmpbyte.Length);
textBox3.Text += ("\r\nSending size of:" + bmpbyte.Length);
textBox3.Text += "\r\nTransmiting the Screenshot";
stm.Write(bmpbyte, 0, bmpbyte.Length);
textBox3.Text += "\r\nSent IMG :)";
The code above is the client side of it.
I'm sending the size of the pic (height, width, and array length) and it transfers properly,
but the server, as I said, has trouble getting the full pic.
textBox2.Text += "\r\n Getting h";
byte[] buffer = new byte[320];
s.Receive(buffer);
string str = Encoding.UTF8.GetString(buffer, 0, buffer.Length);
Array.Clear(buffer, 0, buffer.Length);
int h = int.Parse(str);
textBox2.Text += "="+h+"\r\n Getting w";
s.Receive(buffer);
str = Encoding.UTF8.GetString(buffer, 0, buffer.Length);
Array.Clear(buffer, 0, buffer.Length);
int w = int.Parse(str);
textBox2.Text += "="+w+"\r\n Getting p";
s.Receive(buffer);
str = Encoding.UTF8.GetString(buffer, 0, buffer.Length);
Array.Clear(buffer, 0, buffer.Length);
int p = int.Parse(str);
textBox2.Text += "=" + p;
byte[] asd = new byte[p];
Bitmap cc = new Bitmap(h, w);
s.Receive(asd);
MemoryStream ms = new MemoryStream(asd);
Image bmp = Image.FromStream(ms);
bmp.Save("End.jpg", ImageFormat.Jpeg);
pictureBox1.Image = bmp;
Please note that all of the undefined things are defined somewhere in the code,
and the textbox lines are just for the user interface.
One more thing: I'm using localhost / the local network, and it's all TCP.
You're not telling us how you send the information (over network? serial? TCP?), but from your code I can see one thing:
You do receive the actual number of bytes to wait for, but you're not waiting for the actual number of bytes.
byte[] asd = new byte[p];
s.Receive(asd);
This does create a byte array large enough to keep all the bytes, but does s.Receive(asd) really receive all the bytes?
After you edited your question, let me clarify one thing: TCP communication can be fragmented. Just because you send 4000 bytes in one go does not guarantee the receiver to receive 4000 bytes in one go. He probably won't. That's why the Receive method returns the actual number of bytes received.
So what your receiver needs to do is this (pseudo code):
int totalBytesRead = 0;
do
{
    int bytesRead = s.Receive(buffer);
    totalBytesRead += bytesRead;
    if (bytesRead == 0)
    {
        // Stream was closed - no more bytes
        break;
    }
    else
    {
        // Write bytesRead bytes from buffer to memory stream
    }
}
while (totalBytesRead < expectedNumberOfBytes);
if (totalBytesRead < expectedNumberOfBytes)
    throw new Exception("Premature end of transmission");
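The same loop in real C# might look like this. It is a sketch with names of my own choosing, written against Stream so it also applies to a NetworkStream wrapped around the socket, rather than the raw Socket API of the pseudocode:

```csharp
using System;
using System.IO;

static class StreamReceiver
{
    // Reads exactly `expected` bytes from the stream, throwing if it
    // ends early. Works for NetworkStream, SslStream, MemoryStream, etc.
    public static byte[] ReadExactly(Stream stream, int expected)
    {
        var result = new byte[expected];
        int totalBytesRead = 0;
        while (totalBytesRead < expected)
        {
            // Read returns however many bytes were available, up to the count asked for.
            int bytesRead = stream.Read(result, totalBytesRead, expected - totalBytesRead);
            if (bytesRead == 0)
                throw new EndOfStreamException("Premature end of transmission");
            totalBytesRead += bytesRead;
        }
        return result;
    }
}
```

On the receiving side you could then call it with the expected payload length from the header and hand the returned array to Image.FromStream.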
Actually, thinking things through again and looking at your code I noticed that you're actually sending JPEG bytes to the receiver. It's highly improbable that an uninitialized buffer on the receiver's side should be valid JPEG.
So while all the above is still true, I now doubt that the receiver is actually the one doing things wrong. It seems now that saving the bytes to the memory stream doesn't work properly.
I see from your code you're saving the image both to a file and to a memory stream. Does the file contain a valid picture? If not you need to look for the cause of the screenshot not being created properly. If the picture is valid, you could try two other things:
Read the JPEG bytes from the file instead of creating a memory stream
Don't dispose of the memory stream before sending the bytes to the client, but after that
There are questions on SO that indicate that timing seems to be an issue when saving images to a memory stream and getting the bytes from the stream. Sometimes it won't work.

sslStream.Read problem: all bytes read are 0

TcpClient client = new TcpClient("69.147.112.160", 443);
SslStream sslStream = new SslStream(client.GetStream(), false,
    ValidateServerCertificate, null);
try
{
    sslStream.AuthenticateAsClient("mail.yahoo.com");
}
catch (AuthenticationException e)
{
    return;
}
byte[] message = Encoding.UTF8.GetBytes(".<EOF>");
sslStream.Write(message);
sslStream.Flush();
byte[] buffer = new byte[4096];
int bytes2 = -1;
do
{
    // Just after the line below, ALL buffer bytes are zero!
    bytes2 = sslStream.Read(buffer, 0, 4096);
    m_sockClient.Send(buffer, bytes2, 0);
} while (bytes2 != 0);
All bytes in buffer that have not been filled in by the Read call will be zero; this is standard C#.
If every last byte in there is zero, only two things can be responsible:
You read real null bytes from the stream (unlikely)
Read does not read anything (in which case it returns 0 -- you should definitely be checking the return value)
bytes2 = sslStream.Read(buffer, 0, 4096); reads up to 4096 bytes into buffer, not exactly 4096 bytes. It blocks until at least one byte is read and returns the number of bytes read. So after the method call, buffer will have the same content as before the method call (e.g., filled with nulls), except for the first bytes2 bytes, which are the bytes received from the server.
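The "up to" semantics are easy to see in isolation with a MemoryStream, which implements the same Stream.Read contract (a standalone sketch):

```csharp
using System;
using System.IO;
using System.Text;

class ReadDemo
{
    public static void Main()
    {
        var ms = new MemoryStream(Encoding.ASCII.GetBytes("hello"));
        var buffer = new byte[4096];

        // Asks for up to 4096 bytes, but only 5 are available.
        int n = ms.Read(buffer, 0, buffer.Length);

        Console.WriteLine(n);          // 5, not 4096
        Console.WriteLine(buffer[10]); // 0 - bytes past n are untouched

        // Only the first n bytes are meaningful:
        Console.WriteLine(Encoding.ASCII.GetString(buffer, 0, n)); // hello
    }
}
```

The same rule applies to SslStream: always use the return value, never buffer.Length, when interpreting what was read.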
