Serial reading, only first 128 values taken into account - C#

I am reading a line from an MCU via the serial port. The line consists of 14 characters terminated by "OK". The characters are converted to int and then processed. The problem appears when a value becomes larger than 128: for those values the converted int sticks at 63. Here is the code:
serialPort1.DiscardInBuffer();
serialPort1.DiscardOutBuffer();
serialPort1.Write("d");//request line from mcu
Thread.Sleep(100);
string line = serialPort1.ReadLine();
int p1_low = line[0];
int p1_high = line[1]*256;
int p1 = p1_low + (p1_high);
label1.Text = "Input Sensor: " + p1_low;
p1_low varies much more often than p1_high, and it sticks at the value 63 whenever the real value is larger than 128. Where could the problem be?

Change the encoding to
SerialPort1.Encoding = System.Text.Encoding.GetEncoding(28591)
The default encoding (ASCIIEncoding), as you have discovered, replaces byte values > 127 with '?', whose character code is 63; that is exactly the value you are seeing. Encoding 28591 (ISO-8859-1, Latin-1) maps all 256 byte values straight through, so byte values greater than 127 are preserved.
You do not need the thread sleep as .ReadLine blocks.
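Putting the whole answer together, a minimal sketch (the port name, baud rate, and Open call are assumptions, not from the question):
// Requires using System.IO.Ports; and using System.Text;
// "COM3" and 9600 are placeholders
serialPort1 = new SerialPort("COM3", 9600);
serialPort1.Encoding = Encoding.GetEncoding(28591); // Latin-1 preserves bytes 128-255
serialPort1.Open();
serialPort1.Write("d");               // request a line from the MCU
string line = serialPort1.ReadLine(); // blocks until NewLine arrives; no Sleep needed
int p1 = line[0] + line[1] * 256;     // low byte + high byte * 256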

It sounds like you are set to use 7 data bits. Change the data bits config value to 8, so you can get all 256 values.
Reference: http://msdn.microsoft.com/en-us/library/system.io.ports.serialport.databits.aspx
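A sketch of that change (the parity and stop-bit values are assumptions; match them to the MCU's UART settings):
serialPort1.DataBits = 8;            // 7 data bits would strip the top bit of every byte
serialPort1.Parity = Parity.None;    // assumed
serialPort1.StopBits = StopBits.One; // assumed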

Use the method
Write(Byte[], Int32, Int32), because a byte is a plain numeric value (0-255, 0x00-0xFF, or 0b00000000-0b11111111). Chars and strings pass through the port's text encoding; raw bytes do not.
Make sure
NOT to use Write(Char[], Int32, Int32)
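For example, the question's Write("d") could be replaced with a byte-oriented write like this (0x64 is simply the ASCII code of 'd'):
byte[] request = { 0x64 }; // 'd' as a raw byte; bypasses the port's text encoder entirely
serialPort1.Write(request, 0, request.Length);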

Related

C# : How can I encode GUIDs to 11 character ids?

Like https://www.youtube.com/watch?v={id}
{id}: 11 characters
I tried to use Convert.ToBase64String like this:
string encoded = Convert.ToBase64String(guid.ToByteArray())
    .Replace("/", "_")
    .Replace("+", "-")
    .Replace("=", "");
but the GUID is only reduced to 22 characters.
How can I encode GUIDs to 11 character ids? (Or at least to fewer than 22 characters?)
11 characters, even assuming you could use all 8 bits per character in a URL (hint: you can't), would only allow 88 bits.
A UUID/GUID is 128 bits. Therefore, the conversion you propose is not possible without losing data.
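To make the arithmetic concrete, a quick sketch (this only restates the bit counts above):
// A GUID is 16 bytes = 128 bits. Base64 carries 6 bits per character,
// so the minimum length is ceil(128 / 6) = 22 characters, exactly what the question observed.
string encoded = Convert.ToBase64String(Guid.NewGuid().ToByteArray()).TrimEnd('=');
Console.WriteLine(encoded.Length); // 22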
This is an off-topic answer, but it might give you an ID with only 11 characters.
In C#, a long value has 64 bits; encoded with Base64, that is 12 characters, including 1 padding =. If we trim the padding =, we are left with 11 characters.
One crazy idea: we could combine a Unix epoch timestamp and a per-millisecond counter into a single long value. DateTimeOffset.ToUnixTimeMilliseconds() returns the epoch as a long, but the first 2 of its 8 bytes (in big-endian order) are always 0, because otherwise the timestamp would be greater than the maximum DateTime value. That gives us 2 bytes in which to place a ushort counter.
So, in total, as long as we do not generate more than 65536 IDs per millisecond, we can have a unique ID:
// Requires using System; and using System.Linq;
// This is the counter for the current millisecond; it should reset each new millisecond
ushort currentCounter = 123;
var epoch = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
// epoch is a 64-bit long, so GetBytes returns 8 bytes
var epochBytes = BitConverter.GetBytes(epoch);
if (BitConverter.IsLittleEndian)
{
    // Use big endian so the always-zero bytes come first
    epochBytes = epochBytes.Reverse().ToArray();
}
// The first two bytes are always 0; if they were not, the timestamp would lie
// beyond DateTime.MaxValue, which is not possible
var counterBytes = BitConverter.GetBytes(currentCounter);
if (BitConverter.IsLittleEndian)
{
    // Use big endian here as well
    counterBytes = counterBytes.Reverse().ToArray();
}
// Copy the counter into the first 2 (always-zero) bytes of the epoch bytes
Array.Copy(counterBytes, 0, epochBytes, 0, 2);
// Encode the byte array and trim the padding '='
var shortUid = Convert.ToBase64String(epochBytes).TrimEnd('=');
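Note that the result can still contain '+' and '/', so for URLs you would also want the Replace substitutions from the question. Decoding is just the reverse (a sketch, assuming the encoding above; requires using System.Linq;):
var bytes = Convert.FromBase64String(shortUid + "=");  // restore the trimmed padding
ushort counter = (ushort)((bytes[0] << 8) | bytes[1]); // big-endian counter from the first 2 bytes
bytes[0] = 0; bytes[1] = 0;                            // zero them out to recover the timestamp
if (BitConverter.IsLittleEndian) bytes = bytes.Reverse().ToArray();
long epochMs = BitConverter.ToInt64(bytes, 0);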

How to send byte array over Serial Port to STM32?

So I am trying to send some bytes with hex values to set up my microcontroller on the other side of the serial port. The thing is, I am not quite sure how to do it properly or in what format to send them. For example, I need to send two bytes whose hex values are 57 and B0. When I send them as a char array and read them back, I always get the ASCII codes of the characters, e.g. 53 and then 55 for the two digits of "57". So I wanted to format the hex values as bytes and send both of them at once from a byte array, but then I get nothing back when reading the response. After converting to byte, the MessageBox shows its decimal value, and I don't know if it is supposed to be like that. I am providing my code below.
Info_Byte_Dec += Protocol_Set + Protocol_Unit + Protocol_Write + Protocol_Settings; //stores decimal value
Data_Byte_Dec = Mode * Protocol_Mode_Offset + ODR * Protocol_ODR_Offset + Scale; //stores decimal value
Info_Byte_Hex = Info_Byte_Dec.ToString("X"); //convert to hex value
Data_Byte_Hex = Data_Byte_Dec.ToString("X"); //convert to hex value
string Merged = $"{Info_Byte_Hex} {Data_Byte_Hex}";
MessageBox.Show("Merged1: " + Merged);
byte[] Mergedbytes = Merged.Split(' ').Select(s => Convert.ToByte(s, 16)).ToArray();
MessageBox.Show("Merged2: " + Mergedbytes[0] + Mergedbytes[1]);
port.Write(Mergedbytes, 0, 2);
I am not sure whether I should just send the decimal value 87, format it as the hex string "57", or send it as the byte 0x57.
Usually, in the microcontroller world, when you use hex you mean actual bytes; the hex is just a convenient notation for writing binary byte values, and it almost never means "send the ASCII characters of the hex digits". Having said that, there are no rules, and sometimes people really do use ASCII hex.
In your case, if you are using 0x57 on the STM32, it most likely is a byte literal and not an ASCII representation; you would have to do extra work to turn 0x57 into a string.
So in C#, just send raw bytes, not the ASCII hex you are sending at the moment.
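A minimal sketch of that, using the two values from the question (the port is assumed to be open and configured for 8 data bits):
byte[] frame = { 0x57, 0xB0 }; // raw bytes on the wire, not the text "57 B0"
port.Write(frame, 0, frame.Length);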

Sending Ints as bytes from arduino to C# program fails

I'm working on a program in VS2010 C#. It has a GUI that is used to interact with an Arduino over the serial port.
The issue that I'm having is sending a byte value larger than 128 (???) from the Arduino to the program. I get an integer value on the Arduino, break it into highByte and lowByte, send each one, then reassemble on the other side.
If I send 600, it sends a highByte of 2 and a lowByte of 88, and it reassembles to 600 properly by shifting the highByte left by 8.
If I try to send 700, which should be 188 and 2, then I see the 188 show up in C# as 63. Why???
A byte should be unsigned on both the Arduino and in C#, so I'm not sure what is going wrong.
Arduino code (relevant parts): (0x43 signals to C# which data packet it is receiving)
byte bytesToSend[3] = {0x43, byte(88), byte(2)}; // 600 broken down to high and low bytes
Serial.write(bytesToSend, 3); // send three bytes
Serial.println(); //send line break to terminate transmission
byte bytesToSend[3] = {0x43, byte(188), byte(2)}; // 700 broken down to high and low bytes
Serial.write(bytesToSend, 3); // send three bytes
Serial.println(); //send line break to terminate transmission
C# code: (relevant parts - I may have missed a bit of syntax since I cut/trimmed and pasted...)
string inString = "";
inString = port.ReadLine(); // read a line of data from the serial port
inString = inString.Trim(); //remove newline
byte[] buf = new byte[15]; // reserve space for incoming data
buf = System.Text.Encoding.ASCII.GetBytes(inString); // convert string to byte array (I've also tried a block copy here, but it didn't work either...)
Console.Write("Data received: H: {0}, L: {1}. =", buf[2], buf[1]); // display high and low bytes
Console.WriteLine(Convert.ToUInt32((buf[2] << 8) + buf[1])); // display combined value
And this is what I get in the serial monitor where it writes out the values:
Data received: H: 2, L: 88. = 600
Data received: H: 2, L: 63. = 575
The low byte value gets changed or mis-interpreted from 188 to 63 somewhere in the process. What is causing this and how can I fix it? It seems to work fine when the byte value is below 128, but not when it is above.
I think the problem is in your C#-side code. You should debug it by printing the string you read right after port.ReadLine(), to see exactly what you are receiving.
I would also suggest using C#'s Read(Byte[], Int32, Int32), so that your data is read into a byte array (unsigned 8-bit values). ReadLine() reads the data into a string (an array of chars), which drags the port's character encoding into the picture, as shown in the sketch below.
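A sketch of that byte-oriented read (the 5-byte frame layout comes from the Arduino code above, assuming Serial.println() appends "\r\n"):
byte[] buf = new byte[5]; // 0x43, low byte, high byte, '\r', '\n'
int got = 0;
while (got < buf.Length)
    got += port.Read(buf, got, buf.Length - got); // Read may return fewer bytes than requested
if (buf[0] == 0x43)
{
    int value = (buf[2] << 8) + buf[1]; // reassemble high/low; no text encoding involved
    Console.WriteLine(value);           // 600 and 700 both arrive intact
}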
Your encoding is wrong. Change the line from:
buf = System.Text.Encoding.ASCII.GetBytes(inString);
to
buf = System.Text.Encoding.GetEncoding("Windows-1252").GetBytes(inString);
Better yet, when you instantiate your SerialPort object, just set its Encoding property to this encoding:
...
SerialPort port = new SerialPort();
System.Text.Encoding encoder = System.Text.Encoding.GetEncoding("Windows-1252");
port.Encoding = encoder;
...
Remember that ASCII is 7-bit, so it will truncate values greater than decimal 127. The 1252 encoding is 8-bit and is great for binary data. The table shown at MSDN shows the full symbol support for the encoding.
Why read a full string in C# at all - which forces you to deal with encodings - and post-process it, instead of parsing the bytes as they arrive?
System.IO.BinaryReader bin_port = new System.IO.BinaryReader(port.BaseStream); // BinaryReader wraps a Stream, so use the port's BaseStream
int b;
int data16;
b = bin_port.ReadByte();
switch (b)
{
    case 0x43: // integer packet
        data16 = bin_port.ReadUInt16();       // read the 16-bit value directly
        while (bin_port.ReadByte() != 0x0a) ; // discard all bytes until LF
        break;
}
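As a side note, BinaryReader.ReadUInt16 reads the low byte first (little-endian), which matches the order the Arduino sketch above sends the two bytes in, so no manual reassembly is needed.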

How to create byte[] with length 16 using FromBase64String [duplicate]

This question already has an answer here: Calculate actual data size from Base64 encoded string length.
I have a requirement to create a byte[] of length 16 (a byte array holding 128 bits, to be used as a key in AES encryption).
The following is a valid string:
"AAECAwQFBgcICQoLDA0ODw=="
What is the rule that determines whether a Base64 string will decode to exactly 128 bits? Or is trial and error the only way to create such strings?
CODE
static void Main(string[] args)
{
    string firstString = "AAECAwQFBgcICQoLDA0ODw=="; // string length = 24
    string secondString = "ABCDEFGHIJKLMNOPQRSTUVWX"; // string length = 24
    int test = secondString.Length;
    byte[] firstByteArray = Convert.FromBase64String(firstString);
    byte[] secondByteArray = Convert.FromBase64String(secondString);
    int firstLength = firstByteArray.Length;
    int secondLength = secondByteArray.Length;
    Console.WriteLine("First Length: " + firstLength);
    Console.WriteLine("Second Length: " + secondLength);
    Console.ReadLine();
}
Findings:
For 256 bits we need 256/6 = 42.67 chars, rounded up to 43 chars. [Add one = to make the length divisible by 4.]
For 512 bits we need 512/6 = 85.33 chars, rounded up to 86 chars. [Add == to make the length divisible by 4.]
For 128 bits we need 128/6 = 21.33 chars, rounded up to 22 chars. [Add == to make the length divisible by 4.]
A Base64 string for 16 bytes will always be 24 characters long and end with == as padding.
(At least when it is decodable by the .NET method. The padding is not always included in all uses of Base64 strings, but the .NET implementation requires it.)
In Base64 encoding, '=' is a special symbol added to the end of the string to indicate that the final group of characters does not carry a full 3 bytes of original data.
Each char carries 6 bits of the original data, so to produce whole 8-bit bytes the string length has to be divisible by 4 without remainder (4 chars * 6 bits = 3 bytes * 8 bits = 24 bits). When the resulting Base64 string is shorter than 4n, '=' characters are added at the end to make it valid.
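A small sketch of that arithmetic, which is also the answer to the linked duplicate (the variable names are illustrative):
// Decoded size of a padded Base64 string: 3 bytes per 4 chars, minus padding
string s = "AAECAwQFBgcICQoLDA0ODw=="; // 24 chars, ends with "=="
int padding = s.EndsWith("==") ? 2 : s.EndsWith("=") ? 1 : 0;
int byteCount = 3 * (s.Length / 4) - padding; // 3 * 6 - 2 = 16 bytes = 128 bits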
Update
The last char before '==' encodes only 2 bits of information, so substituting all 64 possible Base64 chars in that position yields only 4 distinct keys. In other words, by generating strings in the format "bbbbbbbbbbbbbbbbbbbbbb==" (where 'b' is any valid Base64 character) you will get 15 duplicate keys for each unique key.
You can use PadRight() to pad the end of the string with a filler character that you strip off again after decrypting.

Equivalent of sprintf in C#?

Is there something similar to sprintf() in C#?
I would for instance like to convert an integer to a 2-byte byte-array.
Something like:
int number = 17;
byte[] s = sprintf("%2c", number);
string s = string.Format("{0:00}", number)
The first 0 means "the first argument" (i.e. number); the 00 after the colon is the format specifier (2 numeric digits).
However, note that .NET strings are UTF-16, so a 2-character string is 4 bytes, not 2
(edit: question changed from string to byte[])
To get the bytes, use Encoding:
byte[] raw = Encoding.UTF8.GetBytes(s);
(obviously different encodings may give different results; UTF8 will give 2 bytes for this data)
Actually, a shorter version of the first bit is:
string s = number.ToString("00");
But the string.Format version is more flexible.
EDIT: I'm assuming that you want to convert the value of an integer to a byte array, not the value converted to a string first and then to a byte array (see Marc's answer for the latter).
To convert an int to a byte array you can use:
byte[] array = BitConverter.GetBytes(17);
but that will give you an array of 4 bytes and not 2 (since an int is 32 bits.)
To get an array of 2 bytes you should use:
byte[] array = BitConverter.GetBytes((short)17);
If you just want to convert the value 17 to two characters then use:
string result = string.Format("{0:00}", 17);
But as Marc pointed out, the result will consume 4 bytes, since each character in .NET is 2 bytes (UTF-16). (Including the two bytes that hold the string length, it will be 6 bytes.)
It turned out that what I really wanted was this:
short number = 17;
System.IO.BinaryWriter writer = new System.IO.BinaryWriter(stream); // stream is any writable Stream
writer.Write(number);
writer.Flush();
The key here is the Write method of the BinaryWriter class. It has 18 overloads, converting different types to bytes which it writes to the stream. In my case I have to make sure the number I want to write is kept in a short, which makes Write emit exactly 2 bytes.
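A self-contained sketch of the same idea (the MemoryStream is illustrative; any writable stream, such as SerialPort.BaseStream, works the same way):
short number = 17;
byte[] bytes;
using (var stream = new System.IO.MemoryStream())
using (var writer = new System.IO.BinaryWriter(stream))
{
    writer.Write(number);     // the short overload writes exactly 2 bytes
    writer.Flush();
    bytes = stream.ToArray(); // { 17, 0 }; BinaryWriter always writes little-endian
}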
