If I have int x = 24, how can I convert that into a 2-byte array where the first byte stores the value for 2 (50) and the second byte stores the value for 4 (52)?
System.Text.Encoding.ASCII.GetBytes(x.ToString());
The easiest way is to convert to a string first, then convert that to bytes.
byte[] bytes = System.Text.Encoding.ASCII.GetBytes(x.ToString());
You can use the division and modulo operators:
byte[] data = new byte[] { (byte)(48 + x / 10), (byte)(48 + x % 10) };
int x_int = 24;
string x_string = x_int.ToString();
var x_bytes = (from x in x_string select Convert.ToByte(x)).ToArray();
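For reference, a quick sanity check of the ASCII approach (just a minimal console sketch):
int x = 24;
byte[] bytes = System.Text.Encoding.ASCII.GetBytes(x.ToString());
Console.WriteLine(string.Join(", ", bytes)); // prints 50, 52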
I have this code:
string result = "";
foreach(char item in texte)
{
result += Convert.ToString(item, 2).PadLeft(8, '0');
}
So I have a string named result which is the binary representation of a word like 'bonjour'.
For texte = "bonjour" I get result = "01100010011011110110111001101010011011110111010101110010", which is a string of '0' and '1' characters.
And when I do
Console.WriteLine(result[0])
I get 0, which is what I expected, but if I do
Console.WriteLine((int)result[0])
or
Console.WriteLine(Convert.ToInt32(result[0]))
I get 48!
I don't want 48, I want 0 or 1 as an integer.
Could you help me please?
You can just subtract 48 from it!
Console.WriteLine(result[0] - 48);
because the digit characters '0' through '9' are encoded as 48 through 57.
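Equivalently, you can subtract the character literal '0' instead of the magic number 48; a small sketch of the same idea:
// '0' has code point 48, so this is the same subtraction without the magic number
Console.WriteLine(result[0] - '0');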
If you want to access each bit by index, I suggest using a BitArray instead:
var bytes = Encoding.ASCII.GetBytes("someString");
var bitArray = new BitArray(bytes);
// now you can access the first bit like so:
bool firstBit = bitArray.Get(0); // this returns a bool
int firstBitValue = bitArray.Get(0) ? 1 : 0; // this gives you a 1 or 0
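Note that BitArray indexes bits least-significant-bit first within each byte, so the indexing order differs from the left-to-right binary string built above; a small sketch to illustrate (assuming using System.Collections):
var oneByte = new BitArray(new byte[] { 0x62 }); // 'b' = 01100010 in binary
Console.WriteLine(oneByte.Get(0) ? 1 : 0); // 0 - the least significant bit comes first
Console.WriteLine(oneByte.Get(1) ? 1 : 0); // 1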
string a = "23jlfdsa890123kl21";
byte[] data = System.Text.Encoding.Default.GetBytes(a);
StringBuilder result = new StringBuilder(data.Length * 8);
foreach (byte b in data)
{
result.Append(Convert.ToString(b, 2).PadLeft(8, '0'));
}
You can try this code; result will hold the 8-bit binary representation of each byte of a.
Just do this:
Console.WriteLine(Convert.ToInt32(Convert.ToString(result[0])));
You're expecting it to behave the same as Convert.ToInt32(string input), but you're actually invoking Convert.ToInt32(char input), and if you check the docs, they explicitly state that it returns the Unicode code point (in this case the same as the ASCII value).
http://msdn.microsoft.com/en-us/library/ww9t2871(v=vs.110).aspx
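A quick illustration of the difference between the two overloads (a minimal sketch):
Console.WriteLine(Convert.ToInt32('0')); // 48 - char overload returns the code point
Console.WriteLine(Convert.ToInt32("0")); // 0 - string overload parses the number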
I have an array of integer 1s and 0s (which possibly needs to be converted to a byte type?). I have used an online ASCII-to-binary generator to get the equivalent binary of this 6-letter sequence:
abcdef should equal 011000010110001001100011011001000110010101100110 in binary. So in C#, my array is [0,1,1,0,0,0,0...], built by:
int[] innerArr = new int[48];
for (int i = 0; i < 48; i++) {
int innerIsWhite = color.val[0] > 200 ? 0 : 1;
innerArr[i] = innerIsWhite;
}
I want to take this array, and convert it into abcdef (and be able to do the opposite).
How do I do this? Is there a better way to store these ones and zeros?
Try using Linq and Convert:
source = "abcdef";
// 011000010110001001100011011001000110010101100110
string encoded = string.Concat(source
.Select(c => Convert.ToString(c, 2).PadLeft(8, '0')));
// If we want an array
byte[] encodedArray = encoded
.Select(c => (byte) (c - '0'))
.ToArray();
// string from array
string encodedFromArray = string.Concat(encodedArray);
// abcdef
string decoded = string.Concat(Enumerable
.Range(0, encoded.Length / 8)
.Select(i => (char) Convert.ToByte(encoded.Substring(i * 8, 8), 2)));
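A quick round-trip check of the snippet above (a minimal sketch, assuming using System.Linq in a console app):
Console.WriteLine(encoded == encodedFromArray); // True
Console.WriteLine(decoded); // abcdef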
If your input is a bit string, you can use a method like the one below to convert it into a character string:
public static string GetStringFromAsciiBitString(string bitString) {
var asciiiByteData = new byte[bitString.Length / 8];
for (int i = 0, j = 0; i < asciiiByteData.Length; ++i, j+= 8)
asciiiByteData[i] = Convert.ToByte(bitString.Substring(j, 8), 2);
return Encoding.ASCII.GetString(asciiiByteData);
}
The above code simply uses the Convert.ToByte method, asking it to do a base-2 string-to-byte conversion. Then, using Encoding.ASCII.GetString, you get the string representation from the byte array.
In my code, I presume your bit string is clean (its length is a multiple of 8 and it contains only 0s and 1s); in production-grade code you will have to sanitize your input.
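For example, feeding it the bit string from the question (a small sketch, assuming the method above is in scope):
string bits = "011000010110001001100011011001000110010101100110";
Console.WriteLine(GetStringFromAsciiBitString(bits)); // abcdef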
I have a byte array:
newMsg.DATA = new byte[64];
How can I convert it into binary values and then write them to a text file with comma separation? The commas should be between the individual binary digits, not between the bytes,
like 1,1,1,0,0,0,1,1,1,1,1,0,0,0,0,0.......
Here is an example that uses LINQ:
byte[] arr = new byte[] { 11, 55, 255, 188, 99, 22, 31, 43, 25, 122 };
string[] result = arr.Select(x => string.Join(",", Convert.ToString(x, 2)
.PadLeft(8, '0').ToCharArray())).ToArray();
System.IO.File.WriteAllLines(@"D:\myFile.txt", result);
Every number in byte[] arr is converted to its binary representation with Convert.ToString(x, 2), and commas are inserted between the bits with string.Join(",", ...). At the end you can write all the elements of result to a text file using System.IO.File.WriteAllLines.
The example above gives you this kind of output in a txt file:
0,0,0,0,1,0,1,1
0,0,1,1,0,1,1,1
1,1,1,1,1,1,1,1
...
Explanation of Convert.ToString(value, baseValue):
The first parameter value represents the number you want to convert to a string
and the second parameter baseValue represents which type of conversion you want to perform.
Possible baseValues are: 2, 8, 10 and 16.
BaseValue = 2 - represents a conversion to a binary number representation.
BaseValue = 8 - represents a conversion to a octal number representation.
BaseValue = 10 - represents a conversion to a decimal number representation.
BaseValue = 16 - represents a conversion to a hexadecimal number representation.
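For example (a small sketch):
Console.WriteLine(Convert.ToString(255, 2)); // 11111111
Console.WriteLine(Convert.ToString(255, 8)); // 377
Console.WriteLine(Convert.ToString(255, 10)); // 255
Console.WriteLine(Convert.ToString(255, 16)); // ff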
I think this will help you. C# provides built-in functionality to do this with Convert.ToString(value, base); here base can be 2 (binary), 8 (octal) or 16 (hexadecimal).
byte[] data = new byte[64];
// 2nd parameter 2 is the base (binary); data[data.Length] would be out of range, so use the last valid index
string a = Convert.ToString(data[data.Length - 1], 2);
StringBuilder sb = new StringBuilder();
foreach(char ch in a.ToCharArray())
{
sb.Append(ch+",");
}
// This is to remove last extra ,
string ans = sb.ToString().Remove(sb.Length - 1, 1);
This should get you going:
var bytes = new byte[] { 128, 255, 2 };
var stringBuilder = new StringBuilder();
for (var index = 0; index < bytes.Length; index++)
{
var binary = Convert.ToString(bytes[index], 2).PadLeft(8, '0');
var str = string.Join(",", binary.ToCharArray());
stringBuilder.Append(str);
if (index != bytes.Length -1) stringBuilder.Append(",");
}
Console.WriteLine(stringBuilder);
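For the sample bytes { 128, 255, 2 } above, this prints 1,0,0,0,0,0,0,0,1,1,1,1,1,1,1,1,0,0,0,0,0,0,1,0.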
I need to create a byte array with hex and int values.
For example:
int value1 = 13;
int value2 = 31;
byte[] mixedbytes = new byte[] {0x09, (byte)value1, (byte)value2};
Problem: The 31 is converted to 0x1F. It should be 0x31. I've tried converting the int values to strings and back to bytes, but that didn't solve the problem. The integers never have more than two digits.
Try this:
int value1 = 0x13;
int value2 = 0x31;
byte[] mixedbytes = new byte[] { 0x09, (byte)value1, (byte)value2 };
Also, you don't seem to understand the conversion between decimal and hex: 31 in decimal is 1F in hex, so expecting it to be 31 in hex is mistaken. For a better understanding of the conversion between decimal and hex, please have a look here: http://www.wikihow.com/Convert-from-Decimal-to-Hexadecimal
I think you can try this method:
string i = "10";
var b = Convert.ToByte(i, 16);
With this method, "10" will be stored as 0x10.
This format is commonly known as Binary Coded Decimal (BCD). The idea is that the nibbles in the byte each contain a single decimal digit.
In C#, you can do this conversion very easily:
var number = 31;
var bcd = (number / 10) * 16 + (number % 10);
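Applied to the values from the question, a sketch like the one below should produce the bytes 0x09, 0x13 and 0x31 (assuming non-negative inputs of at most two digits, as stated):
int value1 = 13;
int value2 = 31;
byte[] mixedbytes = new byte[]
{
0x09,
(byte)((value1 / 10) * 16 + (value1 % 10)), // 0x13
(byte)((value2 / 10) * 16 + (value2 % 10)), // 0x31
};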
Let's say I have an array of bytes
byte[] byteArr = new byte[] { 1, 2, 3, 4, 5 };
I want to convert this array into a regular numeric uint variable, so the result will be
uint result = 12345;
So far all the examples I've seen were with bytes, but I don't need bytes, I need a numeric value.
Thanks...
It sounds like you want something like:
uint result = 0;
foreach (var digit in array)
{
result = result * 10 + digit;
}
Or more fancily, using LINQ:
uint result = array.Aggregate((uint) 0, (curr, digit) => curr * 10 + digit);
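For the byteArr from the question, both versions produce 12345; a quick check of the LINQ form (assuming using System.Linq):
byte[] byteArr = new byte[] { 1, 2, 3, 4, 5 };
uint result = byteArr.Aggregate(0u, (curr, digit) => curr * 10 + digit);
Console.WriteLine(result); // 12345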