Read characters from serial port in C#

Hello, I am using the Read() method to read 10 characters, say 0123456789, from a serial port. The characters are sent by a PIC microcontroller.
Here is my code:
serialPort1.PortName = "com4";
serialPort1.BaudRate = 9600;
serialPort1.Open();
char[] result = new char[10];
serialPort1.Read(result, 0, result.Length);
string s = new string(result);
MessageBox.Show(s);
serialPort1.Close();
When I run the code, a message box shows up but displays only the first character: "0" alone appears in the message box.
Where have I gone wrong?

What you are doing wrong is not paying attention to the return value of Read(), which tells you how many bytes were actually read.
Serial ports are very slow devices; at a typical baud rate of 9600 it takes about a millisecond to transfer one byte. That's an enormous amount of time for a modern processor, which can easily execute several million instructions in a millisecond. The Read() method returns as soon as some bytes are available; you only get all 10 of them if you make your program artificially slow so that the driver gets enough time to receive all of them.
A simple fix is to keep calling Read() until you got them all:
char[] result = new char[10];
for (int len = 0; len < result.Length; )
{
    // Read() returns how many characters were actually read; keep calling until we have all 10.
    len += serialPort1.Read(result, len, result.Length - len);
}
Another common solution is to send a unique character to indicate the end of the data. A line feed ('\n') is a very good choice for that. Now it becomes much simpler:
string result = serialPort1.ReadLine();
This also supports arbitrary response lengths. Just make sure that the data itself doesn't contain a line feed.
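A minimal sketch of that approach, assuming the PIC terminates each response with '\n' (the properties used are standard SerialPort members):
// Sketch only: assumes the device appends '\n' to every response.
serialPort1.NewLine = "\n";        // the terminator ReadLine() waits for
serialPort1.ReadTimeout = 2000;    // in ms, so a silent device can't block forever
try
{
    string s = serialPort1.ReadLine();   // blocks until a full line arrives (or timeout)
    MessageBox.Show(s);
}
catch (TimeoutException)
{
    MessageBox.Show("No complete line received within 2 seconds.");
}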

Related

UART communication bug in STM32 and a C# app

Kindly bear with me for this confusing question; it is as hard to describe as it is involved and tiresome. Read it and you'll know why.
I've been chasing this issue for over a month now without much progress. I'm using an STM32 (an STM32F103C8 mounted on a Blue Pill board) to communicate with a C# app through an FT232R serial-USB converter. The complete communication protocol is a bit complex, so I'm writing here a simplified version of the code that describes my problem quite accurately.
STM32 does the following.
In the initial setup,
Serial.begin at 2000000 (yes, it's very high, but I've analyzed it with an oscilloscope and the signal is very healthy; impedance matching and clock timing are very accurate).
Waits for a command from the C# end to enter the loop
In the loop, it does the following.
TX a byte buffer of length N on the serial port. Packet structure is 0xAA, N bytes, 1 byte checksum.
repeat the loop
And on the C# side (Pseudo code),
new Thread(() => { while (true) { IOTick(); Thread.Sleep(30); } }).Start();
IOTick() is defined as:
{
    while (SerialPortObject.BytesToRead > 1)
    {
        header = read();
        if (header != 0xAA) continue;
        byte[] buffer = new byte[N + 1];
        receivedBytes = readBytes(buffer, N + 1, Timeout = 500 ms); // receivedBytes is never less than N + 1 for timeouts greater than 120
        // Use the N = 16 bytes; check the checksum byte. Doesn't take too much CPU time.
        // Send a packet-received software event.
    }
}
readBytes is defined as
int readBytes(byte[] buffer, int count, int timeout)
{
    var st = DateTime.Now;
    for (int i = 0; i < count; i++)
    {
        var b_ = read(timeout);
        if (b_ == -1)
            return i;
        buffer[i] = (byte)b_;
        timeout -= (int)(DateTime.Now - st).TotalMilliseconds;
    }
    return count;
}
int buffer2ReadIndex = 0;
byte[] buffer2 = new byte[0];

int read(int timeout)
{
    DateTime start = DateTime.Now;
    if (buffer2.Length == 0)
    {
        while (SerialPortObject.BytesToRead <= 0)
        {
            if ((DateTime.Now - start).TotalMilliseconds > timeout)
                return -1;
            System.Threading.Thread.Sleep(30);
        }
        buffer2 = new byte[SerialPortObject.BytesToRead];
        SerialPortObject.Read(buffer2, 0, buffer2.Length);
    }
    if (buffer2.Length > 0)
    {
        var b = buffer2[buffer2ReadIndex];
        buffer2ReadIndex++;
        if (buffer2ReadIndex >= buffer2.Length)
        {
            buffer2ReadIndex = 0;
            buffer2 = new byte[0];
        }
        return b;
    }
    return -1;
}
Now, everything works as expected: the packet-received software event is triggered no later than every ~30 ms (the Windows tick time). The problem starts when I have to wait between packet transmissions on the STM side. At first I suspected that the I2C work I do between packets was causing some hardware or software conflict that corrupted the serial data. But then I noticed that the same thing happens if I merely introduce a 1-millisecond delay with Arduino's delay() between packet transmissions. Almost 1K packets should now be received every second, yet roughly 1 packet in 10 following a successfully received header is either not delivered completely or arrives with a corrupted checksum, causing the C# app to lose track of the packet header. Re-acquiring the header obviously requires flushing some bytes, losing some packets along the way. Even that wouldn't be too bad for an app that can afford 5% packet loss; strangely, though, when this anomaly occurs, the packet-received software event stalls for more than 1 second after every couple hundred consecutive events.
I'm completely blind here. I even tried 115200 baud; it shows the same kind of loss, with a slightly lower loss ratio. It should be noted that at 9600 baud the issue doesn't happen. That is the only hint I have right now.
It looks like I've found an answer.
After digging deep into the SerialPort class and its base stream, and after some documentation reading and benchmarking, here is what I've observed:
SerialPort.BytesToRead updates are not uniform, and the DataReceived event seems to follow it. When bytes are coming in at ~200 kHz (baud = 2 Mbps), it is updated almost instantaneously (or within 30 ms in the worst case). When they are coming in at ~20 kHz or slower (evenly spaced in time by a microcontroller), SerialPort.BytesToRead can take up to 400 ms to update, and this happens only after a dozen or so 30 ms updates.
Observing this, I can say that SerialPort.BytesToRead is updated under two conditions: some amount of time has passed since the data arrived (and this time is not constrained to 30 ms), or the data is coming in too fast.
This is strange behavior. No data is lost while the anomaly is occurring. Unsurprisingly, about 0.06% of bytes are lost when working at full bandwidth (200 kB/s at a baud rate of 2 Mbps).
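If you want to sidestep BytesToRead polling altogether, one option is to block on the port's underlying BaseStream, which returns as soon as at least one byte is buffered. A sketch, using SerialPortObject from the question and assuming your packet scan can run over raw chunks:
// Sketch: blocking reads on SerialPort.BaseStream instead of polling BytesToRead.
// BaseStream.Read() returns as soon as at least one byte is available,
// so no Thread.Sleep(30) loop is needed.
byte[] chunk = new byte[4096];
var assembly = new System.Collections.Generic.List<byte>();
while (true)
{
    int got = SerialPortObject.BaseStream.Read(chunk, 0, chunk.Length);
    if (got <= 0) break;                           // stream closed
    for (int i = 0; i < got; i++) assembly.Add(chunk[i]);
    // Scan 'assembly' for 0xAA + N payload bytes + 1 checksum byte,
    // consume complete packets from the front, and keep the remainder.
}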

Server's socket blocking if no newline detected but client's socket not

While setting up a TCP server-client connection, I realized that the server's receive function hangs if the client does not send a '\n', but the client does not block if the server doesn't. I searched for an explanation without finding a proper answer, so I came here to ask for your help.
I am using the same function to exchange data for both server and client, but I don't know why it works for one and doesn't for the other...
Here is my function in C#:
public bool sendToClient(int i, string msg)
{
    try
    {
        clientSockets.ElementAt(i).mSocket.Send(Encoding.ASCII.GetBytes(msg));
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Data.ToString());
        return false;
    }
    return true;
}

private string getMessageFromConnection(Socket s)
{
    byte[] buff;
    string msg = "";
    int k;
    do
    {
        buff = new byte[100];
        k = s.Receive(buff, 100, SocketFlags.None);
        msg += Encoding.ASCII.GetString(buff, 0, k);
    } while (k >= 100);
    return msg;
}
The sockets are simple SOCK_STREAM ones, and clientSockets is a list of Client objects, each containing a client's info, including its socket.
I understand that one solution would be to detect a particular character that ends the message, but I would like to know the reason behind this behavior, because I also had this issue using C.
Thanks in advance.
Your while loop continues only as long as you're reading exactly 100 bytes, and it seems that you intend to use that to detect the end of a message.
This will fail if the message is exactly 100 bytes, or any multiple of 100 bytes (in which case it will append a subsequent message to it).
But even worse, there is no guarantee that the socket will return 100 bytes, even if there is data still on its way. Receive does not wait until the underlying buffer has reached 100 bytes; it returns whatever it has available at that point.
You're going to have to either include a header that indicates the message length, or have a terminator character that indicates the end of the message.
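For illustration, a minimal sketch of the length-header option (ReceiveExactly is a made-up helper; the usual System.Net, System.Net.Sockets, and System.Text usings are assumed):
// Sketch of length-prefixed framing: a 4-byte big-endian length header,
// followed by exactly that many payload bytes.
private static void ReceiveExactly(Socket s, byte[] buffer, int count)
{
    int offset = 0;
    while (offset < count)
    {
        int read = s.Receive(buffer, offset, count - offset, SocketFlags.None);
        if (read == 0) throw new SocketException();   // peer closed the connection
        offset += read;
    }
}

private static string GetMessageFromConnection(Socket s)
{
    byte[] header = new byte[4];
    ReceiveExactly(s, header, 4);                     // the length prefix first
    int length = IPAddress.NetworkToHostOrder(BitConverter.ToInt32(header, 0));
    byte[] payload = new byte[length];
    ReceiveExactly(s, payload, length);               // then exactly 'length' bytes
    return Encoding.ASCII.GetString(payload);
}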

RFID doesn't always answer properly

I have an RFID card reader connected to my PC on a serial port. It uses RS485, so I need to switch between send and receive states. The communication frames contain a header and a CRC (CRC16-CCITT, XMODEM).
After every write to the port I wait for the answer, then compute the CRC; if it fails, I request the frame again. If everything is correct, I process it.
It works fine with the "simple" commands (request firmware version, enable/disable antenna, etc.).
With the important commands (logging into the reader's interface, configuring it, etc.) I see the following: occasionally the answer arrives correctly, with a maximum delay of 5 seconds, but in most cases I don't get anything in the buffer. I can wait for minutes, but nothing arrives.
Conclusion: if I get an answer, it happens in the first seconds; if I don't, I can wait forever and it won't happen.
My question is: could it be the hardware's fault, or am I missing something in my software?
Here is the send & receive part of my code:
int size;
bool msg_ok = false;
do
{
    int max_attemps = 50;
    port.DtrEnable = true;
    port.RtsEnable = false;
    port.Write(fullMsg, 0, fullMsg.Length);
    port.DtrEnable = false;
    port.RtsEnable = true;
    do
    {
        Thread.Sleep(200);
        size = port.BytesToRead;
    } while (size <= 3 && max_attemps-- > 0);
    if (size > 3)
    {
        answer = new byte[size];
        port.Read(answer, 0, size);
        int end = answer.Length - 1;   // trim zeros after the end
        while (answer[end] == 0)
            --end;
        int start = 0;
        while (answer[start] == 0)     // trim zeros before the header
            ++start;
        trimmed = new byte[(end - start) + 1];
        Array.Copy(answer, start, trimmed, 0, (end - start) + 1);
        checkSum = crc.ComputeChecksumBytes(trimmed, trimmed.Length);   // calculate the CRC
        if (checkSum[0] == trimmed[trimmed.Length - 1] && checkSum[1] == trimmed[trimmed.Length - 2])
        {
            msg_ok = true;   // if still false at the end, restart this whole block and request again; if true, send the answer for processing
        }
    }
    else
    {
        Console.WriteLine("Timed out.");
    }
} while (!msg_ok);
When data is sent over a serial port, the operating system buffers the data as it arrives. If you query the data when only some of it has arrived, you will get a partial packet. You need to keep reading until you receive the full packet before you start trying to decode it. Otherwise your decode will fail on the first half of the packet, fail on the second half, and then sit waiting for another message that will never come.
The best approach for using a serial port is to subscribe to the DataReceived event, because this means you are called by the port if and when data arrives. This avoids having to sleep to try to get around the timing issues. You will still sometimes need to stitch several chunks of received data together to form a valid packet however, so you should write your code to keep reading and appending into a receive buffer until it recognises a valid, complete packet.
You also shouldn't need to flip the handshaking bits unless the device on the other end of the serial line is very unusual - just send your data and wait for the reply. By changing the low level states on the port manually you are likely to introduce transmission problems into the system.
Try starting with the example code on the DataReceived event page (above) and you should have more reliable results.
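For illustration, a minimal sketch of that pattern (TryExtractPacket and ProcessAnswer are hypothetical stand-ins for your header search, CRC check, and existing handling):
// Sketch: accumulate incoming bytes, then peel complete packets off the front.
private readonly List<byte> rxBuffer = new List<byte>();

private void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    int n = port.BytesToRead;
    byte[] chunk = new byte[n];
    port.Read(chunk, 0, n);
    rxBuffer.AddRange(chunk);

    byte[] packet;
    while (TryExtractPacket(rxBuffer, out packet))   // hypothetical: finds the header, verifies the CRC
    {
        ProcessAnswer(packet);                       // hypothetical: your existing processing
    }
}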

Sending Array of Bytes from Client to server?

I have a TCP-based client-server application. I'm able to send and receive strings, but I don't know how to send an array of bytes instead.
I'm using the following function to send a string from the client to the server:
static void Send(string msg)
{
    try
    {
        StreamWriter writer = new StreamWriter(client.GetStream());
        writer.WriteLine(msg);
        writer.Flush();
    }
    catch
    {
    }
}
Communication example
Client sends a string:
Send("CONNECTED| 84.56.32.14")
Server receives a string:
void clientConnection_ReceivedEvent(Connection client, String Message)
{
    string[] cut = Message.Split('|');
    switch (cut[0])
    {
        case "CONNECTED":
            Invoke(new _AddClient(AddClient), client, null);
            break;
        case "STATUS":
            Invoke(new _Status(Status), client, cut[1]);
            break;
    }
}
I need some help to modify the functions above in order to send and receive an array of bytes in addition to strings. I want to make a call like this:
Send("CONNECTED | 15.21.21.32", myByteArray);
Just use Stream - no need for a writer here. Basic sending is simple:
stream.Write(data, 0, data.Length);
However, you probably need to think about "framing", i.e. how the receiver knows where each sub-message starts and ends. With strings this is often a special character (maybe a newline), but that is rarely possible in raw binary. A common approach is to precede the message with the number of bytes to follow, in a pre-defined format (a network-byte-order fixed 4-byte unsigned integer, for example).
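For example, the send side of that length-prefix scheme might look like this sketch (stream and data as above):
// Sketch: prefix each message with its length as a 4-byte big-endian integer.
byte[] prefix = BitConverter.GetBytes(IPAddress.HostToNetworkOrder(data.Length));
stream.Write(prefix, 0, prefix.Length);   // the header: how many bytes follow
stream.Write(data, 0, data.Length);       // the payload itself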
Reading: again, use the Stream Read method, but understand that you always need to check the return value; just because you say "read at most 20 bytes" doesn't mean you get that many, even if more is coming - you could read 3, 3, 3, 11 bytes, for example (unlikely, but you see what I mean). For example, to read exactly 20 bytes:
var buffer = new byte[20];
int count = 20, read, offset = 0;
while (count > 0 && (read = source.Read(buffer, offset, count)) > 0)
{
    offset += read;
    count -= read;
}
if (count != 0) throw new EndOfStreamException();
Since you seem new to networking you might want to use WCF or another framework. I've just written an article about my own framework: http://blog.gauffin.org/2012/05/griffin-networking-a-somewhat-performant-networking-library-for-net
You need to use a header for your packets as Mark suggested, since TCP gives you a stream and not packets; i.e. there is no 1:1 relation between send and receive operations.
This is the same problem I'm having. I only code the client, and the server accepts byte arrays as proper data. The messages start with an ASCII STX character, followed by a bunch of bytes of any values except the STX and ETX characters, and end with an ETX ASCII character. In C I could do this in my sleep, but I'm learning C# on the job. I don't understand why you would send bunches of double-byte Unicode characters when single-byte ASCII codes work just as well; it wastes double the bandwidth.
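For what it's worth, a sketch of pulling one such frame off a stream (assuming, as described, that the payload never contains the STX or ETX bytes):
// Sketch: read one STX ... ETX frame from a stream.
const byte STX = 0x02, ETX = 0x03;

static byte[] ReadFrame(Stream stream)
{
    int b;
    while ((b = stream.ReadByte()) != STX)        // skip bytes until start-of-frame
        if (b == -1) throw new EndOfStreamException();

    var payload = new List<byte>();
    while ((b = stream.ReadByte()) != ETX)        // collect payload until end-of-frame
    {
        if (b == -1) throw new EndOfStreamException();
        payload.Add((byte)b);
    }
    return payload.ToArray();
}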

Problem with serial port data receive in C#

I have a problem with a C# program.
Through the serial port I receive a large string of about 110 characters.
That part works OK, but when I add code to split the string up, I receive an error after a few rows.
Here is the error I get:
An unhandled exception of type 'System.ArgumentOutOfRangeException' occurred in mscorlib.dll
Additional information: Index and length must refer to a location within the string.
Here is the code:
private void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    if (!comport.IsOpen) return;
    if (CurrentDataMode == DataMode.Text)
    {
        // Read all the data waiting in the buffer
        string data = comport.ReadExisting();
        Log(LogMsgType.Incoming, data);
        string ziua = data.Substring(0, 8);
        string ora = data.Substring(8, 8);
        string linie = data.Substring(18, 1);
        string interior = data.Substring(22, 3);
        string durata1 = data.Substring(26, 4);
        string durata2 = data.Substring(30, 8);
        string nrtel = data.Substring(38, 10);
        string tipapel = data.Substring(75, 1);
        string acct = data.Substring(76, 5);
    }
    else
    {
        int bytes = comport.BytesToRead;
        byte[] buffer = new byte[bytes];
        comport.Read(buffer, 0, bytes);
        Log(LogMsgType.Incoming, ByteArrayToHexString(buffer));
    }
}
EDIT:
I've tested every substring and all of them are OK.
The string length is 112, so it can't be too short.
This error appears after a few lines of 112 characters... about one and a half.
This is typical behavior for a serial port; they are very slow. When the DataReceived event fires, you'd typically get only one or two characters. Notably, the code works well when you debug, because single-stepping through it gives the serial port lots of time to receive additional characters. But it will go kaboom as soon as you run without a debugger, because the string isn't long enough.
You'll need to modify the code by appending the string you receive to a string variable at class scope. Only parse the string after you've received all the characters you expected. You'll need some way to know that you've received the full response. Most typically serial devices will terminate the string with a special character. Often a line-feed.
If that's the case then you can make it easy by setting the SerialPort.NewLine property to that terminator and calling ReadLine() instead of ReadExisting().
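A minimal sketch of that, assuming each 112-character record really is terminated with a line feed (comport_DataReceived is a hypothetical handler name):
// Sketch: let SerialPort split the stream into complete records.
// During setup (once): comport.NewLine = "\n";
private void comport_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    string data = comport.ReadLine();   // returns one complete terminated record
    if (data.Length < 81) return;       // guard: Substring(76, 5) needs at least 81 characters
    string ziua = data.Substring(0, 8);
    // ... the remaining Substring calls from the question ...
}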
The length of "data" is probably too short for one of the calls to "Substring". Check the length of the string that you expect before accessing parts of it that may not exist.
You don't check whether you have enough data before processing it. The SerialPort.ReadExisting method just
Reads all immediately available bytes, based on the encoding, in both the stream and the input buffer of the SerialPort object.
Your device may simply not have had time to send all the data yet. So you need to rewrite your logic to concatenate the incoming data and process it only after receiving enough of it.
The exception is telling you that, at some point, Substring is being given parameters that exceed the length of the string, which likely means you aren't getting the data you expect from the serial port. Try inserting a breakpoint at the first call to Substring and check the contents of data to make sure the device you are reading from isn't sending some kind of error code or something other than what your code expects.
You should verify the length of your string before your start splitting it up. Put a conditional in there to handle the case where the string is less than what you expect, and then see if the errors persist.
Try making a length check before each variable assignment, like this:
string acct = (data.Length >= 81) ? data.Substring(76, 5) : string.Empty;
data.Length could be shorter than the end of your substring (76 + 5 = 81).
