How to convert a byte to a char, e.g. 1 -> '1'? - c#

How to convert a byte to a char? I don't mean an ASCII representation.
I have a variable of type byte and want it as a character.
I want just the following conversions from byte to char:
0 ->'0'
1 ->'1'
2 ->'2'
3 ->'3'
4 ->'4'
5 ->'5'
6 ->'6'
7 ->'7'
8 ->'8'
9 ->'9'
(char)1 and Convert.ToChar(1) do not work. They result in an unprintable control character, because the value 1 is interpreted as a character code rather than the digit '1'.

Use ToString() on the number:
one.ToString(); // one.ToString()[0] - first char - '1'
two.ToString(); // two.ToString()[0] - first char - '2'
Note that you can't really convert a byte to a single char in general:
a char holds one character, while a byte can hold a value up to three digits long (255)!
If you want to use LINQ and you're sure the byte won't exceed one digit (i.e. is less than 10), you can use this:
number.ToString().Single();

Simply using variable.ToString() should be enough. If you want to get fancy, add the ASCII code of '0' to the variable before converting:
Convert.ToChar(variable + Convert.ToByte('0'));

Use this for conversion.
(char)(mybyte + 48);
where mybyte is 0, 1, and so on.
OR
Convert.ToChar(1 + 48); // specific case for 1
While others have given solutions, I'll tell you why your (char)1 and Convert.ToChar(1) are not working.
When you convert the byte 1 to char, that 1 is taken as an ASCII code.
But the ASCII code of the character '1' is not 1.
Add 48 to it, because the ASCII code of '1' is 1 + 48 = 49. Similar cases for 0, 2 and so on.
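Putting the approaches from these answers together, here is a minimal runnable sketch (the variable names are illustrative, not from the answers above):
byte digit = 7;
char c1 = (char)(digit + 48);   // '7' - adds the ASCII code of '0' (48)
char c2 = (char)(digit + '0');  // '7' - the same thing, written more readably
char c3 = digit.ToString()[0];  // '7' - via string conversion
Console.WriteLine($"{c1} {c2} {c3}"); // prints: 7 7 7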

Assume you have a variable byte x;
Just use (char)(x + '0')

Use Convert.ToString() to perform this.

Related

How to convert from string to 16-bit unsigned integer in python?

I'm currently working on some encoding and decoding of strings in Python. I was supposed to convert some code from C# to Python, but I encountered a problem:
So now I have a string that looks like this: 21-20-89-00-67-00-45-78
The code is supposed to eliminate the - between the numbers, pack every 2 digits into 1 group, then convert each group into a byte. In C#, it was done like this:
var value = "21-20-89-00-67-00-45-78";
var valueNoDash = value.Replace("-", null);
for (var i = 0; i < DataSizeInByte; i++)
{
    // convert every 2 digits into 1 byte
    Data[i] = Convert.ToByte(valueNoDash.Substring(i * 2, 2), 16);
}
The above code represents Step 1: remove - from the string; Step 2: use the Substring method to divide the string into groups of 2 digits; Step 3: use Convert.ToByte with base 16 to convert each 2-digit hex group into a byte. The results in Data are:
33
32
137
0
103
0
69
120
So far I have no problem with this C# code, but when I try to do the same in Python, I cannot get the same result. My Python code is as below:
from textwrap import wrap
import struct
values = "21-20-89-00-67-00-45-78"
values_no_dash = values.replace('-', '')
values_grouped = wrap(values_no_dash, 2)
values_list = []
for value in values_grouped:
    values_list.append(struct.pack('i', int(value)))
In Python, it gives me a list of bytes objects shown in hex, as below:
b'\x15\x00\x00\x00'
b'\x14\x00\x00\x00'
b'Y\x00\x00\x00'
b'\x00\x00\x00\x00'
b'C\x00\x00\x00'
b'\x00\x00\x00\x00'
b'-\x00\x00\x00'
b'N\x00\x00\x00'
These are bytes objects, but when I convert them to decimal I get exactly the same values as in the original string: 21, 20, 89, 0, 67, 0, 45, 78.
That means I did not successfully convert to 16-bit unsigned integers, right? How can I do this in Python? I've tried using str.encode(), but the result is still different. How can I achieve what the C# code does in Python?
Thanks, and I'd appreciate it if anyone can help!
I think this is the solution you're looking for:
values = "21-20-89-00-67-00-45-78"
values_no_dash_grouped = values.split('-')  # deletes dashes and groups numbers simultaneously
for value in values_no_dash_grouped:
    print(int(value, 16))  # converts each number from base 16 to base 10 and prints it
Hope it helps!
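For reference, here is a self-contained version of the C# side (DataSizeInByte and the Data declaration are assumed values, since the question doesn't define them), which produces the expected output:
var value = "21-20-89-00-67-00-45-78";
var valueNoDash = value.Replace("-", null);
const int DataSizeInByte = 8;          // assumed: 8 two-digit groups
var Data = new byte[DataSizeInByte];   // assumed declaration
for (var i = 0; i < DataSizeInByte; i++)
{
    Data[i] = Convert.ToByte(valueNoDash.Substring(i * 2, 2), 16);
}
Console.WriteLine(string.Join(" ", Data)); // 33 32 137 0 103 0 69 120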

Conversion from Base 64 error

I'm trying to convert from a Base64 string. First I tried this:
string a = "BTQmJiI6JzFkZ2ZhY";
byte[] b = Convert.FromBase64String(a);
string c = System.Text.Encoding.ASCII.GetString(b);
Then I got the exception: System.FormatException was caught, Message=Invalid length for a Base-64 char array.
So after googling, I tried this:
string a1 = "BTQmJiI6JzFkZ2ZhY";
int mod4 = a1.Length % 4;
if (mod4 > 0)
{
    a1 += new string('=', 4 - mod4);
}
byte[] b1 = Convert.FromBase64String(a1);
string c1 = System.Text.Encoding.ASCII.GetString(b1);
Here I got the exception - System.FormatException was caught Message=Invalid character in a Base-64 string.
Is there any invalid character in "BTQmJiI6JzFkZ2ZhY"? Or is it the length issue?
EDIT: I first decrypt the input string using the below code:
string sourstr, deststr, strchar;
int strlen;
decimal ascvalue, ConvValue;
deststr = "";
sourstr = "InputString";
strlen = sourstr.Length;
for (int intI = 0; intI <= strlen - 1; intI++)
{
    strchar = sourstr.Substring(intI, 1);
    ascvalue = (decimal)strchar[0];
    ConvValue = (decimal)((int)ascvalue ^ 85);
    if ((char)ConvValue.ToString().Length == 0)
    {
        deststr = deststr + strchar;
    }
    else
    {
        deststr = deststr + (char)ConvValue;
    }
}
This output deststr is passed to below code
Convert.ToBase64String(System.Text.Encoding.ASCII.GetBytes(deststr));
This is where I got "BTQmJiI6JzFkZ2ZhY"
You cannot get such a base64 string by encoding a whole number of bytes. During encoding, every 3 bytes are represented as 4 characters, because 3 bytes is 24 bits, and each base64 character encodes 6 bits (2^6 = 64), so 4 of them is also 24 bits. If the number of bytes to encode is not divisible by 3, you have some bytes left over: either 1 or 2.
If you have 2 bytes left, that's 16 bits, and you need at least 3 characters to encode it (2 characters is only 12 bits, which is not enough). So in that case you encode them with 3 characters and apply "=" padding.
If you have 1 byte left, that's 8 bits. You need at least 2 characters for that. You encode it with 2 characters and apply "==" padding.
Note that there is no way to encode something with just one character (and for that reason there is no "===" padding).
Your string can be divided into 4-character blocks: "BTQm", "JiI6", "JzFk", "Z2Zh", "Y". The first 4 blocks each represent 3 bytes, but what does "Y" represent? Who knows. You could say it represents 1 byte in the range 0-63, but from the above you can see that's not how base64 works, so to interpret it that way you would have to do it yourself.
From the above you can also see that you cannot get a base64 string of length 17 (without padding). You can get 16, 18, 19, or 20, but never 17.
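A small illustrative sketch of these rules (the byte values are arbitrary zeros; only the lengths and padding matter):
for (int n = 12; n <= 15; n++)
{
    string s = Convert.ToBase64String(new byte[n]);
    Console.WriteLine($"{n} bytes -> {s.Length} chars: {s}");
}
// 12 bytes -> 16 chars (no padding)
// 13 bytes -> 20 chars, ending in "=="
// 14 bytes -> 20 chars, ending in "="
// 15 bytes -> 20 chars (no padding)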
Are you sure you took all the characters from the base64 output?
Appending "==" at the end of the string will make your first approach work without any problems, although there is a strange character at the beginning of the output. So the next question is: are you sure it is ASCII encoding?

Bit shifting with hex in Python

I am trying to understand how to perform bit shift operations in Python. Coming from C#, I find it doesn't work the same way.
The C# code is;
var plain=0xabcdef0000000; // plaintext
var key=0xf0f0f0f0f123456; // encryption key
var L = plain;
var R = plain>>32;
The output is;
000abcdef0000000 00000000000abcde
What is the equivalent in Python? I have tried:
plain = 0xabcdef0000000
key = 0xf0f0f0f0f123456
print plain
left = plain
right = plain >> 32
print hex(left)
print hex(right)
However, it doesn't work; the output is different in Python. The zero padding is missing. Any help would be appreciated!
The hex() function does not pad numbers with leading zeros, because Python integers are unbounded. C# integers have a fixed size (64 bits in this case), so they have an upper bound and can therefore be padded out. This doesn't mean those extra padding zeros carry any meaning; the integer value is the same.
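For comparison, the fixed-width C# output shown in the question presumably comes from formatting the 64-bit values to 16 hex digits, something like this (a guess at the formatting call, since the question doesn't show it):
long plain = 0xabcdef0000000;
long L = plain;
long R = plain >> 32;
Console.WriteLine("{0:x16} {1:x16}", L, R); // 000abcdef0000000 00000000000abcde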
You'll have to explicitly add those zeros, using the format() function to produce the output:
print format(left, '#018x')
print format(right, '#018x')
The # tells format() to include the 0x prefix, and the leading 0 before the field width asks format() to pad the output:
>>> print format(left, '#018x')
0x000abcdef0000000
>>> print format(right, '#018x')
0x0000000000abcde
Note that the width includes the 0x prefix; there are 16 hex digits in that output, representing 64 bits of data.
If you wanted to use a dynamic width based on the number of characters used in key, then calculate that from int.bit_length(); every 4 bits produce a hex character:
format(right, '#0{}x'.format((key.bit_length() + 3) // 4 + 2))
Demo:
>>> (key.bit_length() + 3) // 4 + 2
17
>>> print format(right, '#0{}x'.format((key.bit_length() + 3) // 4 + 2))
0x0000000000abcde
But note that even the key is only 60 bits long, and C# would pad that value with a leading 0 as well.
I see no problem with what you tried:
>>> hex(0xabcdef0000000)
'0xabcdef0000000'
>>> hex(0xabcdef0000000 >> 32)
'0xabcde'
In [83]: plain=0xabcdef0000000
In [84]: plain>>32
Out[84]: 703710
In [85]: plain
Out[85]: 3022415462400000
In [87]: hex(plain)
Out[87]: '0xabcdef0000000'
if
In [134]: left = plain
In [135]: right = plain >> 32
Then
In [140]: '{:0x}'.format(left)
Out[140]: 'abcdef0000000'
In [143]: '{:018x}'.format(right)
Out[143]: '0000000000000abcde'

Array of chars in hex format to integer?

I have an API which returns a byte[] over the network which represents information about a device.
It is in the format 15ab1234cd\r\n, where the first 2 characters are a hex representation of the amount of data in the message.
I am aware I can convert this to a string via ASCIIEncoding.ASCII.GetString and then use Convert.ToInt32(string.Substring(0, 2), 16). However, the data stays a byte array throughout the life of the program, and I don't want to convert it to a string just to get the packet length.
Any suggestions for converting an array of hex characters to an int in C#?
There is no .NET-provided function that does it. Converting the first 2 bytes to a string with Encoding.GetString is very readable (though possibly not the most performant):
var hexValue = ASCIIEncoding.ASCII.GetString(byteData, 0, 2);
var intValue = Convert.ToInt32(hexValue, 16);
You can also easily write the conversion yourself (map the '0'-'9' and 'a'-'f' / 'A'-'F' ranges to their corresponding integer values and combine them).
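A minimal sketch of that hand-written mapping (the names are mine, not from the answer; byteData is the buffer from the question):
static int HexNibble(byte c)
{
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    throw new ArgumentException("not a hex character");
}
// first two bytes of the packet: high nibble * 16 + low nibble
int packetLength = HexNibble(byteData[0]) * 16 + HexNibble(byteData[1]);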
Here is a one-statement conversion, strictly for entertainment purposes. The resulting lambda (everything before ((byte)'0', (byte)'A') in the sample) takes 2 byte arguments, assumes they are ASCII characters, and converts them into an integer:
((Func<Func<char,int>, Func<byte, byte, int>>)
(charToInt=> (c, c1)=>
charToInt(char.ToUpper((char)c)) * 16 + charToInt(char.ToUpper((char)c1))))
((Func<char, int>)(
c => c >= '0' && c <='9' ? c-'0' : c >='A' && c <= 'F' ? c - 'A' + 10 : 0))
((byte)'0',(byte)'A')
If you know the first two values are valid hexadecimal characters (0-9, A-F, a-f), it is possible to convert them to a hex value using logical operators:
int GetIntFromHexBytes(byte[] s, int start, int length)
{
    int ret = 0;
    for (int i = start; i < start + length; i++)
    {
        ret <<= 4;
        ret |= (byte)((s[i] & 0x0f) + ((s[i] & 0x40) >> 6) * 9);
    }
    return ret;
}
(This works because s[i] & 0x0f returns the 4 least significant bits, which range from 0-9 for the characters '0'-'9' and from 1-6 for both capital and lowercase letters 'a'-'f' and 'A'-'F'. s[i] & 0x40 is 0 for numeric characters and 0x40 for alphabetic characters; shifting that right by six bits yields 0 for numeric characters and 1 for alphabetic ones. Multiplying by 9 then adds a bias of 9 for alphabetic characters, mapping A-F and a-f from 1-6 to 10-15.)
Given the byte array:
byte[] b = { (byte)'7', (byte)'f', (byte)'1', (byte)'c' };
Calling GetIntFromHexBytes(b, 0, 2) will return 127 (0x7f), the first two bytes of the array, as required.
As a caution: this approach does no bounds checking. A check can be added in the loop if needed to ensure that the input bytes are valid hex characters.
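One way to add that check (my sketch, not part of the original answer) is to validate each byte before using it:
int GetIntFromHexBytesChecked(byte[] s, int start, int length)
{
    int ret = 0;
    for (int i = start; i < start + length; i++)
    {
        bool isHex = (s[i] >= '0' && s[i] <= '9')
                  || (s[i] >= 'a' && s[i] <= 'f')
                  || (s[i] >= 'A' && s[i] <= 'F');
        if (!isHex)
            throw new ArgumentException("byte at index " + i + " is not a hex character");
        ret <<= 4;
        ret |= (byte)((s[i] & 0x0f) + ((s[i] & 0x40) >> 6) * 9);
    }
    return ret;
}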

How to create byte[] with length 16 using FromBase64String [duplicate]

This question already has an answer here:
Calculate actual data size from Base64 encoded string length
(1 answer)
Closed 10 years ago.
I have a requirement to create a byte[] with length 16 (a byte array of 128 bits, to be used as a key in AES encryption).
Following is a valid string
"AAECAwQFBgcICQoLDA0ODw=="
What is the algorithm that determines whether a string will decode to 128 bits? Or is trial and error the only way to create such 128-bit strings?
CODE
static void Main(string[] args)
{
    string firstString = "AAECAwQFBgcICQoLDA0ODw=="; // String Length = 24
    string secondString = "ABCDEFGHIJKLMNOPQRSTUVWX"; // String Length = 24
    int test = secondString.Length;
    byte[] firstByteArray = Convert.FromBase64String(firstString);
    byte[] secondByteArray = Convert.FromBase64String(secondString);
    int firstLength = firstByteArray.Length;
    int secondLength = secondByteArray.Length;
    Console.WriteLine("First Length: " + firstLength.ToString());
    Console.WriteLine("Second Length: " + secondLength.ToString());
    Console.ReadLine();
}
Findings:
For 256 bits, we need 256/6 = 42.67 chars, rounded up to 43 chars. [To make the length divisible by 4, add =]
For 512 bits, we need 512/6 = 85.33 chars, rounded up to 86 chars. [To make the length divisible by 4, add ==]
For 128 bits, we need 128/6 = 21.33 chars, rounded up to 22 chars. [To make the length divisible by 4, add ==]
A base64 string for 16 bytes will always be 24 characters long and have == at the end as padding.
(At least when it's decodable using the .NET method. The padding is not always included in all uses of base64 strings, but the .NET implementation requires it.)
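A quick sketch to verify this, using the valid string from the question:
string key = "AAECAwQFBgcICQoLDA0ODw==";
byte[] bytes = Convert.FromBase64String(key);
Console.WriteLine(bytes.Length); // 16
Console.WriteLine(key.Length);   // 24; the string ends in "=="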
In base64 encoding, '=' is a special symbol added to the end of the base64 string to indicate that there is no data for those characters in the original value.
Each character encodes 6 bits of the original data, so to represent whole 8-bit bytes the string length has to be divisible by 4 without remainder (6 bits * 4 = 8 bits * 3 = 24 bits). When the resulting base64 string would otherwise be shorter than a multiple of 4, '=' characters are added at the end to make it valid.
Update
The last character before '==' encodes only 2 bits of information, so replacing it with all possible base64 characters will give you only 4 different keys out of the 64 combinations. In other words, by generating strings in the format "bbbbbbbbbbbbbbbbbbbbbb==" (where 'b' is any valid base64 character), you'll get 15 duplicate keys for each unique key.
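If the goal is simply a 128-bit AES key, a more direct route than crafting base64 strings by hand is to generate the 16 random bytes first and then encode them. A hedged sketch (RNGCryptoServiceProvider is the API of this question's era; newer code would use RandomNumberGenerator):
using (var rng = new System.Security.Cryptography.RNGCryptoServiceProvider())
{
    byte[] key = new byte[16]; // 128 bits
    rng.GetBytes(key);
    string encoded = Convert.ToBase64String(key);
    Console.WriteLine(encoded);        // always 24 chars, ending in "=="
    Console.WriteLine(encoded.Length); // 24
}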
You can use PadRight() to pad the string with a character that you later remove once it is decrypted.
