I'm working on a program in VS2010 C#. It has a GUI that is used to interact with an Arduino over the serial port.
The issue I'm having is sending a byte value larger than 128 (or thereabouts?) from the Arduino to the program. I get an integer value on the Arduino, break it into highByte and lowByte, send each one, then reassemble them on the other side.
If I send 600, it sends a highByte of 2 and a lowByte of 88, and it reassembles to 600 properly via bitshifting highByte << 8.
If I try to send 700, which should be a lowByte of 188 and a highByte of 2, the 188 shows up in C# as 63. Why?
A byte should be unsigned on both the Arduino and in C#, so I'm not sure what is going wrong.
Arduino code (relevant parts; 0x43 signals to C# which data packet it is receiving):
byte bytesToSend[3] = {0x43, byte(88), byte(2)}; // 600 broken down to high and low bytes
Serial.write(bytesToSend, 3); // send three bytes
Serial.println(); //send line break to terminate transmission
byte bytesToSend[3] = {0x43, byte(188), byte(2)}; // 700 broken down to high and low bytes
Serial.write(bytesToSend, 3); // send three bytes
Serial.println(); //send line break to terminate transmission
C# code (relevant parts; I may have missed a syntax detail or two since I cut/trimmed and pasted):
string inString = "";
inString = port.ReadLine(); // read a line of data from the serial port
inString = inString.Trim(); //remove newline
byte[] buf = new byte[15]; // reserve space for incoming data
buf = System.Text.Encoding.ASCII.GetBytes(inString); //convert string to byte array I've tried a block copy here, but it didn't work either...
Console.Write("Data received: H: {0}, L: {1}. =", buf[2], buf[1]); //display high and low bytes
Console.WriteLine(Convert.ToUInt32((buf[2] << 8) + buf[1])); //display combined value
And this is what I get in the serial monitor where it writes out the values:
Data received: H: 2, L: 88. = 600
Data received: H: 2, L: 63. = 575
The low byte value gets changed or mis-interpreted from 188 to 63 somewhere in the process. What is causing this and how can I fix it? It seems to work fine when the byte value is below 128, but not when it is above.
I think the problem could be on your C# side. You should debug this by printing the string you are reading right after port.ReadLine(), to see what you are actually receiving.
I would also suggest using the C# Read(Byte[], Int32, Int32) method, so that your data is read into a byte array (an array of unsigned values). ReadLine() reads the data into a string (an array of chars), which forces an encoding conversion.
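For example, a minimal sketch of that approach (assuming the same packet layout of 0x43 followed by the low and high bytes, and that the trailing line break still needs to be discarded; ReadExactly is just an illustrative helper, not a built-in method):
// requires: using System.IO.Ports;
// SerialPort.Read may return fewer bytes than requested, so loop until the buffer is full.
static byte[] ReadExactly(SerialPort port, int count)
{
    byte[] buf = new byte[count];
    int offset = 0;
    while (offset < count)
        offset += port.Read(buf, offset, count - offset);
    return buf;
}
// Usage: read the 3-byte packet and reassemble the value without any string encoding.
byte[] packet = ReadExactly(port, 3);
port.ReadLine(); // consume the trailing line break sent by Serial.println()
if (packet[0] == 0x43)
{
    int value = packet[1] + (packet[2] << 8); // low byte + (high byte << 8)
    Console.WriteLine(value);                 // prints 700 for bytes 188 and 2
}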
Your encoding is wrong. Change the line from:
buf = System.Text.Encoding.ASCII.GetBytes(inString);
to
buf = System.Text.Encoding.GetEncoding("Windows-1252").GetBytes(inString);
Better yet, when you instantiate your SerialPort object, just set its Encoding property to this encoding.
...
SerialPort port = new SerialPort();
System.Text.Encoding encoder = System.Text.Encoding.GetEncoding("Windows-1252");
port.Encoding = encoder;
...
Remember that ASCII is 7-bit, so any byte value greater than decimal 127 gets replaced with '?' (63) during the conversion, which is exactly the value you are seeing. The 1252 encoding is 8-bit and is great for binary data. The table shown at MSDN shows the full symbol support for the encoding.
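Putting that together with the receive code from the question, a sketch of the fixed path might look like this (the same 0x43 / low byte / high byte packet layout is assumed):
port.Encoding = System.Text.Encoding.GetEncoding("Windows-1252"); // keep byte values above 127 intact
string inString = port.ReadLine().Trim();                          // e.g. 0x43, 188, 2 for the value 700
byte[] buf = port.Encoding.GetBytes(inString);                     // convert back with the SAME 8-bit encoding
int value = buf[1] + (buf[2] << 8);                                // low byte + (high byte << 8)
Console.WriteLine(value);                                          // 700 instead of 575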
Why read a full string in C# - which forces you to deal with encodings - and post-process it, instead of parsing the bytes as they arrive?
System.IO.BinaryReader bin_port = new System.IO.BinaryReader(port.BaseStream); // SerialPort is not a Stream, so wrap its BaseStream
int b;
int data16;
b = bin_port.ReadByte();
switch (b) {
case 0x43: // 0x43 header: a 16-bit value follows
    data16 = bin_port.ReadUInt16(); // little-endian, matching the low-byte-first order sent by the Arduino
    while (bin_port.ReadByte() != 0x0a); // discard everything up to and including the LF terminator
    break;
}
Related
I am receiving data from a CNC machine every 5 seconds. The length of the data is 66 bytes, and every two bytes have a special meaning according to the guide that I have. The device sends the data over a socket to a specific IP and port. I have been told that I should read the data as hex instead of ASCII.
This line of code:
string data = Encoding.ASCII.GetString(data.buffer,0,66);
returns this:
"\0\u0004\0\u0001\0\0\0\0\0\0\0\0\0\0\0\0\0\r\0\r\0\0\0\0\0\0:a\u0002#\0?\0`\u001b?\u0015U\0\0\0\0\u0001\u0010\0\u0018\0\0\u000f\a\0\0\0\0\0\0\0\0\0\0\0\0\0\0u/"
and of course it is not useful to me.
I did try to convert the byte array to a hex string with this code:
StringBuilder sb = new StringBuilder();
foreach (byte b in buffer)
sb.Append(b.ToString("X2"));
string hexString = sb.ToString();
And got this result:
00040001000000000000000000020000000000000000000000003A9D023F00A000601B841555000000000110001800000F070000000000000000000000000000752F
And when I try to convert this result to a string, no success, nothing meaningful.
GOAL
What I am trying to achieve is to read the incoming socket data as hex and use every two bytes as a word to match a value. For example, the first 2 bytes should match either 0 or 1. With what I have, it returns ? (a question mark).
Thank you.
I have been told that I should read the data as hex instead of ascii
My gut feeling is that this statement has been misquoted or misunderstood. There is no value in processing binary data as a hex string representation, just as there is no value in converting it to ASCII... The only sane way to process binary data is in binary, unless you have a meaningful way to convert it.
You mention you need word (2-byte) groupings; you could just convert this to an array of short, or ushort, depending on your needs:
var bytes = new byte[66];
var shortArray = new short[bytes.Length / 2];
Buffer.BlockCopy(bytes, 0, shortArray, 0, bytes.Length);
or
for (int i = 0; i < shortArray.Length; i++)
shortArray[i] = BitConverter.ToInt16(bytes[(i*2)..(i*2+2)]);
Disclaimer: this is just an example; be very careful of the endianness of your data. There are other ways to do this.
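For example, if the machine sends each word most-significant-byte first (an assumption here; check the device guide), BinaryPrimitives reads a big-endian word correctly regardless of the architecture you run on:
using System.Buffers.Binary; // .NET Core 2.1+ / .NET Standard 2.1

var words = new ushort[bytes.Length / 2];
for (int i = 0; i < words.Length; i++)
    words[i] = BinaryPrimitives.ReadUInt16BigEndian(bytes.AsSpan(i * 2, 2)); // big-endian, host-independent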
So I am trying to send some bytes with hex values to set up my microcontroller on the other side of a serial port. The thing is, I am not quite sure how to properly do it and in what format to send them. For example, I need to send two bytes whose hex values are 57 and B0. When I try to send them as a char array and read them back, I always get the ASCII values of those hex characters, like 53 and then 55 for the value 57. So I wanted to format the hex value as a byte and send both of them at the same time from a byte array, but I am not getting anything when reading the response. After formatting it to a byte, the MessageBox shows its decimal value, and I don't know if it is supposed to be like that. I am providing my code below.
Info_Byte_Dec += Protocol_Set + Protocol_Unit + Protocol_Write + Protocol_Settings; //stores decimal value
Data_Byte_Dec = Mode * Protocol_Mode_Offset + ODR * Protocol_ODR_Offset + Scale; //stores decimal value
Info_Byte_Hex = Info_Byte_Dec.ToString("X"); //convert to hex value
Data_Byte_Hex = Data_Byte_Dec.ToString("X"); //convert to hex value
string Merged = $"{Info_Byte_Hex} {Data_Byte_Hex}";
MessageBox.Show("Merged1: " + Merged);
byte[] Mergedbytes = Merged.Split(' ').Select(s => Convert.ToByte(s, 16)).ToArray();
MessageBox.Show("Merged2: " + Mergedbytes[0] + Mergedbytes[1]);
port.Write(Mergedbytes, 0, 2);
I am not sure whether I should just send the decimal value 87, format it as the hex value 57, or even as 0x57.
Usually in the microcontroller world, when you use hex you mean actual bytes; the hex is just a convenient notation for writing binary byte values, and it usually never means sending the ASCII hex characters. Having said that, there are no rules, and sometimes people do use actual ASCII hex.
In your case, if you are using 0x57 on the STM32, it likely means you are using a byte literal and not an ASCII representation. You would have to do extra work to turn 0x57 into a string.
So that means in C# you should just send bytes, not the ASCII hex string like you are at the moment.
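In other words, a minimal sketch of the sending side (assuming Info_Byte_Dec and Data_Byte_Dec each already fit in a single byte):
// 0x57 in source code is just notation for the numeric value 87; no string conversion is needed.
byte[] payload = { (byte)Info_Byte_Dec, (byte)Data_Byte_Dec };
port.Write(payload, 0, payload.Length); // Write(byte[], int, int) sends the raw bytes, no text encoding involved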
I am trying to read a decimal value from a serial port which sends me 1000, but my function is receiving only 232. I am able to receive values correctly up to 127. It also receives all string values correctly.
private void mySerialPort_DataReceived(object sender, System.IO.Ports.SerialDataReceivedEventArgs e)
{
int bytes = 255;
byte[] buffer = new byte[bytes];
int i = 0;
do
{
bytes = mySerialPort.BytesToRead;
mySerialPort.Read(buffer, i, bytes);
i++;
Thread.Sleep(100);
} while (mySerialPort.BytesToRead != 0);
rxInt = BitConverter.ToInt32(buffer, 0);
this.Invoke(new EventHandler(displayDecimal));
}
First, let's explain that strange result:
The value 1000 is too big to fit in a byte. The hex value of decimal 1000 is 0x3E8. Since the serial port can only send a byte at a time, the lower part, 0xE8, gets transmitted. The decimal value of 0xE8 is ... 232. So the seemingly weird transformation from 1000 to 232 is completely explained.
What to do about it? The bottom line is that you cannot stuff any value larger than 255 into a C# Byte structure (or -128 to 127 for SByte). You didn't mention what is sending the data, so I can't advise on that, but you need to break your data up into bytes for serial transmission, or send it as a string (which you've already commented works). There is a lot of help here on SO and on the internet at large on how to do that.
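As a sketch of one common scheme (an assumption, since the sender isn't shown): transmit the value as two bytes, low byte first, and put it back together on the C# side:
// Sender: transmit (value & 0xFF) followed by (value >> 8).
// Receiver (C#), assuming exactly two bytes per value, low byte first:
byte[] buffer = new byte[2];
int read = 0;
while (read < 2)
    read += mySerialPort.Read(buffer, read, 2 - read); // Read may return fewer bytes than requested
int rxInt = buffer[0] | (buffer[1] << 8);              // 0xE8 | (0x03 << 8) = 1000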
I have an application that plays Pcap files, and I am trying to add a function that wraps my packets with a PPPoE layer.
Almost everything is done, except for large packets: I don't yet understand how to set the new length after adding the PPPoE layer.
For example, this packet:
As you can see, this packet's length is 972 bytes (03 cc), and all I want is to convert it to decimal. After looking at this packet's byte[] in my code, I can see that this value is converted into 3 and 204 in my packet byte[], so my question is: how does this calculation work?
Those two bytes represent a short (System.Int16) in big-endian notation (most significant byte first).
You can follow two approaches to get the decimal value of those two bytes. One is with the BitConverter class, the other is by doing the calculation yourself.
BitConverter
// the bytes
var bytes = new byte[] {3, 204};
// are the bytes little endian?
var littleEndian = false; // no
// What architecure is the BitConverter running on?
if (BitConverter.IsLittleEndian != littleEndian)
{
// reverse the bytes if endianess mismatch
bytes = bytes.Reverse().ToArray();
}
// convert
var value = BitConverter.ToInt16( bytes , 0);
value.Dump(); // or Console.WriteLine(value); --> 972
Calculate it yourself
Treat the two bytes as base-256 digits:
// the bytes
var bytes2 = new byte[] {3, 204};
// [0] * 256 + [1]
var value2 = bytes2[0] * 256 + bytes2[1]; // 3 * 256 + 204
value2.Dump(); // 972
I am reading a line from an MCU via the serial port. The line consists of 14 characters terminated by "OK". The characters are converted to int and then processed. The problem is when a value becomes larger than 128: for values larger than 128, the converted int remains at 63. Here is the code:
serialPort1.DiscardInBuffer();
serialPort1.DiscardOutBuffer();
serialPort1.Write("d");//request line from mcu
Thread.Sleep(100);
string line = serialPort1.ReadLine();
int p1_low = line[0];
int p1_high = line[1]*256;
int p1 = p1_low + (p1_high);
label1.Text = "Input Sensor: " + p1_low;
p1_low varies much more often than p1_high and sticks to the value 63 when it is larger than 128. Where could the problem be?
Change the encoding to
SerialPort1.Encoding = System.Text.Encoding.GetEncoding(28591)
The default encoding, as you have discovered, replaces byte values > 127 with a '?' (ASCII 63, which is exactly the value you are seeing). Encoding 28591 preserves byte values greater than 127.
You do not need the Thread.Sleep, as .ReadLine blocks.
It sounds like you are set to use 7 data bits. Change the data bits config value to 8, so you can get all 256 values.
Reference: http://msdn.microsoft.com/en-us/library/system.io.ports.serialport.databits.aspx
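For reference, a configuration sketch (the port name and baud rate are placeholders):
SerialPort serialPort1 = new SerialPort("COM1", 9600);
serialPort1.DataBits = 8;            // 8 data bits so all 256 byte values come through
serialPort1.Parity = Parity.None;
serialPort1.StopBits = StopBits.One;
serialPort1.Open();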
Use the method Write(Byte[], Int32, Int32), because a byte is a numerical value (0-255, or 0x00-0xFF, or 0b00000000-0b11111111). A char or a string goes through an encoding, but a byte does not.
Make sure NOT to use Write(Char[], Int32, Int32).