string value1, value2;
int length1, length2;
System.Collections.BitArray bitValue1 = new System.Collections.BitArray(length1);
System.Collections.BitArray bitValue2 = new System.Collections.BitArray(length2);
I'm looking for the fastest way to convert each string to a BitArray with a defined length for each string (the string should be trimmed if it is longer than the defined length; if it is shorter, the remaining bits are filled with false), then put the two results together and write them to a binary file.
Edit :
@dtb: a simple example: value1 = "A", value2 = "B", length1 = 8 and length2 = 16, and the result will be 010000010000000001000010.
The first 8 bits are from "A" and the next 16 bits from "B".
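A minimal sketch of one way to produce exactly that output, assuming ASCII encoding, most-significant-bit-first ordering, and left-padding with false bits as in the example (ToBitString is a hypothetical helper name, not from the question):
using System;
using System.IO;
using System.Text;

// Hypothetical helper: ASCII-encode the string, then trim or left-pad the
// resulting bit string (MSB first) to exactly bitLength '0'/'1' characters.
static string ToBitString(string value, int bitLength)
{
    byte[] bytes = Encoding.ASCII.GetBytes(value);
    var sb = new StringBuilder();
    foreach (byte b in bytes)
        sb.Append(Convert.ToString(b, 2).PadLeft(8, '0'));
    string bits = sb.ToString();
    return bits.Length > bitLength
        ? bits.Substring(0, bitLength)   // trim if too long
        : bits.PadLeft(bitLength, '0');  // pad with false (0) bits
}

// "A" over 8 bits + "B" over 16 bits -> "010000010000000001000010"
string combined = ToBitString("A", 8) + ToBitString("B", 16);

// Pack the bit string into bytes (MSB first) and write it to a binary file.
byte[] packed = new byte[(combined.Length + 7) / 8];
for (int i = 0; i < combined.Length; i++)
    if (combined[i] == '1')
        packed[i / 8] |= (byte)(0x80 >> (i % 8));
File.WriteAllBytes("output.bin", packed);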
//Source string
string value1 = "t";
//Length in bits
int length1 = 2;
//Convert the text to an array of ASCII bytes
byte[] bytes = System.Text.Encoding.ASCII.GetBytes(value1);
//Create a temp BitArray from the bytes
System.Collections.BitArray tempBits = new System.Collections.BitArray(bytes);
//Create the output BitArray setting the maximum length
System.Collections.BitArray bitValue1 = new System.Collections.BitArray(length1);
//Loop through the temp array
for (int i = 0; i < tempBits.Length; i++)
{
    //If we're outside of the range of the output array exit
    if (i >= length1) break;
    //Otherwise copy the value from the temp to the output
    bitValue1.Set(i, tempBits.Get(i));
}
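One thing worth noting as an aside (not part of the answer above): BitArray(byte[]) indexes bits least-significant-bit first within each byte, so for 't' (0x74 = 01110100) the two bits copied here are the low-order bits, both false:
var check = new System.Collections.BitArray(new byte[] { 0x74 });
Console.WriteLine(check.Get(0)); // False (lowest-order bit of 0x74)
Console.WriteLine(check.Get(1)); // False
Console.WriteLine(check.Get(2)); // True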
And I'm going to keep saying it: this assumes ASCII characters, so anything above ASCII 127 (such as the é in résumé) will be mangled and most likely come back as ASCII 63, the question mark.
When converting a string to something else you need to consider what encoding you want to use. Here's a version that uses UTF-8
byte[] utf8Bytes = System.Text.Encoding.UTF8.GetBytes(value1);
Edit
Hmm... saw that you're looking for a BitArray and not a ByteArray, this won't help you probably.
Since this is not a very clear question, I'll give it a shot nonetheless:
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static void RunSnippet()
{
    string s = "123";
    byte[] b = System.Text.Encoding.ASCII.GetBytes(s);
    System.Collections.BitArray bArr = new System.Collections.BitArray(b);

    Console.WriteLine("bArr.Count = {0}", bArr.Count);
    // Print each bit in the array
    for (int i = 0; i < bArr.Count; i++)
        Console.WriteLine(bArr.Get(i).ToString());

    // Serialize the BitArray to a binary file
    BinaryFormatter bf = new BinaryFormatter();
    using (FileStream fStream = new FileStream("test.bin", FileMode.CreateNew))
    {
        bf.Serialize(fStream, bArr);
        Console.WriteLine("Serialized to test.bin");
    }
    Console.ReadLine();
}
Is that what you are trying to achieve?
Hope this helps,
Best regards,
Tom.
Related
I have a jpg image that I can convert into a sequence of binary numbers but I can't recover it afterward. It ends up corrupted.
using System;
using System.IO;
using System.Net.Mime;
using System.Text;

namespace ConsoleApp1
{
    class Program
    {
        static void Main(string[] args)
        {
            string inputFilename = "test.jpg";
            byte[] fileBytes = File.ReadAllBytes(inputFilename);
            var test = ToBinaryString(fileBytes);
            byte[] bytes = Encoding.ASCII.GetBytes(test);
            string fileresult = Convert.ToBase64String(bytes);
            byte[] fileresult2 = Convert.FromBase64String(fileresult);
            File.WriteAllBytes("C:/Users/Florian/RiderProjects/ConsoleApp1/ConsoleApp1/bin/Debug/net5.0/test7.txt", fileresult2);
            string text = System.IO.File.ReadAllText("C:/Users/Florian/RiderProjects/ConsoleApp1/ConsoleApp1/bin/Debug/net5.0/test7.txt");
            var result = Convert.ToBase64String(FromBinaryString(text));
            File.WriteAllBytes("C:/Users/Florian/RiderProjects/ConsoleApp1/ConsoleApp1/bin/Debug/net5.0/test8.jpg", FromBinaryString(test));
            Console.WriteLine(test);
        }

        public static string ToBinaryString(byte[] array)
        {
            var s = new StringBuilder();
            foreach (byte b in array)
                s.Append(Convert.ToString(b, 2));
            return s.ToString();
        }

        public static byte[] FromBinaryString(string s)
        {
            int count = s.Length / 8;
            var b = new byte[count];
            for (int i = 0; i < count; i++)
                b[i] = Convert.ToByte(s.Substring(i * 8, 8), 2);
            return b;
        }
    }
}
I don't see what can corrupt my file.
Excuse the organization of the code; it's just a quick test I threw together.
There are some odd things in the code such as uselessly converting to base64 and back, but let's go right to the core of the problem: converting bytes to a binary string
public static string ToBinaryString(byte[] array)
{
    var s = new StringBuilder();
    foreach (byte b in array)
        s.Append(Convert.ToString(b, 2));
    return s.ToString();
}
On the face of it, it seems reasonable. Bytes go in, a binary string comes out. But here are some examples that I hope will convince you that there is an issue with this conversion:
Console.WriteLine(ToBinaryString(new byte[]{1, 1}));
Console.WriteLine(ToBinaryString(new byte[]{3}));
( https://ideone.com/y6dB1Z )
These both come out as "11". That's bad, it means the conversion is not reversible. The reason it goes wrong is that the most-significant zeroes are dropped, and then the next byte "shifts into" that position. Every byte should be represented by exactly 8 bits, just as your decoding function expects. For example:
public static string ToBinaryString(byte[] array)
{
    var s = new StringBuilder();
    foreach (byte b in array)
        s.Append(Convert.ToString(b, 2).PadLeft(8, '0'));
    return s.ToString();
}
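A quick sanity check of the padded version, reusing FromBinaryString from the question, shows the bytes now survive the round trip:
byte[] original = { 1, 1, 3 };
string bits = ToBinaryString(original);            // "000000010000000100000011"
byte[] roundTripped = FromBinaryString(bits);      // { 1, 1, 3 } again
Console.WriteLine(string.Join(",", roundTripped)); // 1,1,3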
By the way, you shouldn't ever actually use something like this (except when experimenting, which looks like what you're doing, so that's OK): it inflates the file size by a factor of 8 and doesn't accomplish anything useful. If you're going to decode the JPG, for example, do that directly on the bytes of the file, not on "the file as a binary string". Even Huffman codes should not be handled that way.
I have this code:
string result = "";
foreach(char item in texte)
{
result += Convert.ToString(item, 2).PadLeft(8, '0');
}
So I have a string named result which is the conversion of a word like "bonjour" into binary.
For texte = "bonjour" I have result = "01100010011011110110111001101010011011110111010101110010" (a string of '0' and '1' characters).
And when I do
Console.WriteLine(result[0])
I obtain 0, which is what I expected, but if I do
Console.WriteLine((int)result[0])
or
Console.WriteLine(Convert.ToInt32(result[0]))
I obtain 48!
I don't want 48, I want 0 or 1 as an integer.
Could you help me please?
You can just subtract 48 from it!
Console.WriteLine(result[0] - 48);
because the digit characters '0' through '9' are encoded as 48 through 57.
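Equivalently, subtracting the character literal '0' is a little more self-documenting than the magic number 48:
Console.WriteLine(result[0] - '0'); // '0' has code 48, so this prints 0 or 1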
If you want to access each bit by index, I suggest using a BitArray instead:
var bytes = Encoding.ASCII.GetBytes("someString");
var bitArray = new BitArray(bytes);
// now you can access the first bit like so:
bool firstBit = bitArray.Get(0);             // returns a bool
int firstBitAsInt = bitArray.Get(0) ? 1 : 0; // gives you a 1 or 0
string a = "23jlfdsa890123kl21";
byte[] data = System.Text.Encoding.Default.GetBytes(a);
StringBuilder result = new StringBuilder(data.Length * 8);
foreach (byte b in data)
{
result.Append(Convert.ToString(b, 2).PadLeft(8, '0'));
}
you can try this code.
Just do this:
Console.WriteLine(Convert.ToInt32(Convert.ToString(result[0])));
You're expecting it to behave the same as Convert.ToInt32(string input) but actually you're invoking Convert.ToInt32(char input) and if you check the docs, they explicitly state it will return the unicode value (in this case the same as the ASCII value).
http://msdn.microsoft.com/en-us/library/ww9t2871(v=vs.110).aspx
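A small illustration of the difference between the two overloads:
Console.WriteLine(Convert.ToInt32("0")); // 0  (the string overload parses the text as a number)
Console.WriteLine(Convert.ToInt32('0')); // 48 (the char overload returns the character's code point)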
I'm writing a little decoder. The user enters a string (hex) that the program should decode.
My problem is that the int value is not the same as the input, so I don't know how to proceed after reading the binary data. Am I missing something about how to read binary?
string input = "0802";
byte[] arr = Encoding.Default.GetBytes(input);
using (MemoryStream stream = new MemoryStream(arr))
{
    using (BinaryReader reader = new BinaryReader(stream))
    {
        int a = reader.ReadInt32();
        Console.WriteLine(a);
        //output: 842020912
    }
}
This is correct. You read 4 bytes from the string so they are interpreted as 4 bytes of your int.
If you check the hex value of your "incorrect" number 842020912 it will give you 0x32303830 and reading every byte as ASCII gives "2080".
The order is reversed as you are reading the value as little-endian.
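A short check of that explanation (BinaryReader.ReadInt32 always reads little-endian, and BitConverter follows the machine's byte order, which is little-endian on x86/x64):
byte[] ascii = Encoding.ASCII.GetBytes("0802"); // { 0x30, 0x38, 0x30, 0x32 }
int value = BitConverter.ToInt32(ascii, 0);     // 842020912
Console.WriteLine(value.ToString("X8"));        // 32303830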
You are doing it the hard way. I'm parsing into bytes here, but you can modify this for Int16 or Int32 very easily:
string input = "0802";
List<byte> bytes = new List<byte>();
for (int i = 0; i < input.Length; i += 2)
{
    bytes.Add(byte.Parse(input.Substring(i, 2), System.Globalization.NumberStyles.HexNumber));
}
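For the input above, the loop yields the two bytes the decoder presumably wants:
foreach (byte b in bytes)
    Console.Write(b.ToString("X2") + " "); // prints: 08 02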
I have a random integer value which I need to represent as a String of its byte values. For example:
int value = 32;
String strValue = getStringByteArray(value);
Console.WriteLine(strValue); // should write: " \0\0\0"
If value = 11 then getStringByteArray(value) should return "\v\0\0\0".
If value = 13 then getStringByteArray(value) should return "\r\0\0\0".
And so on.
Any idea on how to implement the method getStringByteArray(int value) in C#?
UPDATE
This is the code that receives the data from the C# NamedPipe Server:
bool CFilePipe::ReadString(int m_handle, string &value)
{
   //--- check for data
   if(WaitForRead(sizeof(int)))
   {
      ResetLastError();
      int size = FileReadInteger(m_handle);
      if(GetLastError() == 0)
      {
         //--- check for data
         if(WaitForRead(size))
         {
            value = FileReadString(m_handle, size);
            return(size == StringLen(value));
         }
      }
   }
   //--- failure
   return(false);
}
Don't take this approach at all. You should be writing to a binary stream of some description, writing the binary data for the length of the packet/message followed by the message itself. For example:
BinaryWriter writer = new BinaryWriter(stream);
byte[] data = Encoding.UTF8.GetBytes(text);
writer.Write(data.Length);
writer.Write(data);
Then at the other end, you'd use:
BinaryReader reader = new BinaryReader(stream);
int length = reader.ReadInt32();
byte[] data = reader.ReadBytes(length);
string text = Encoding.UTF8.GetString(data);
No need to treat binary data as text at all.
Well, first of all you should get the bytes from the integer. You can do that with BitConverter:
var bytes = BitConverter.GetBytes(value);
Next, here are three variants. First, if you want the result in binary format, just take all your bytes and write them as they are:
var str = string.Concat(bytes.Select(b => Convert.ToString(b, 2)));
Second variant: if you want to convert your byte array to a hexadecimal string:
var hex = BitConverter.ToString(bytes).Replace("-", "");
Third variant: your representation ("\v\0\0\0") is simply each byte converted to a char. Use this:
var s = bytes.Aggregate(string.Empty, (current, t) => current + Convert.ToChar(t));
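For the question's value = 11, the three variants produce the following (a quick check, assuming a little-endian machine and using System.Linq):
int value = 11;
var bytes = BitConverter.GetBytes(value);                                              // { 11, 0, 0, 0 }
Console.WriteLine(string.Concat(bytes.Select(b => Convert.ToString(b, 2))));            // 1011000 (no per-byte padding)
Console.WriteLine(BitConverter.ToString(bytes).Replace("-", ""));                       // 0B000000
Console.WriteLine(bytes.Aggregate(string.Empty, (cur, t) => cur + Convert.ToChar(t)));  // "\v\0\0\0" (vertical tab + three NULs)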
This should help with that.
class Program
{
    static void Main(string[] args)
    {
        Random rand = new Random();
        int number = rand.Next(1, 1000);
        byte[] intBytes = BitConverter.GetBytes(number);
        string answer = "";
        for (int i = 0; i < intBytes.Length; i++)
        {
            answer += intBytes[i] + @"\";
        }
        Console.WriteLine(answer);
        Console.WriteLine(number);
        Console.ReadKey();
    }
}
Obviously, you should implement two steps to achieve the goal:
Extract the bytes from the integer in the appropriate order (little-endian or big-endian, it's up to you to decide), using bit arithmetic.
Merge the extracted bytes into a string using the format you need.
Possible implementation:
using System;
using System.Text;

public class Test
{
    public static void Main()
    {
        Int32 value = 5152;
        byte[] bytes = new byte[4];
        for (int i = 0; i < 4; i++)
        {
            bytes[i] = (byte)((value >> (i * 8)) & 0xFF);
        }
        StringBuilder result = new StringBuilder();
        for (int i = 0; i < 4; i++)
        {
            result.Append("\\" + bytes[i].ToString("X2"));
        }
        Console.WriteLine(result);
    }
}
Ideone snippet: http://ideone.com/wLloo1
I think you are saying that you want to convert each byte into a character literal, using escape sequences for the non printable characters.
After converting the integer to 4 bytes, cast to char. Then use Char.IsControl() to identify the non-printing characters. Use the printable char directly, and use a lookup table to find the corresponding escape sequence for each non-printable char.
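A minimal sketch of that approach, assuming only the common C# escape sequences need special handling (the escape table and the \xNN fallback are my own choices, extend them as needed):
using System;
using System.Collections.Generic;
using System.Text;

static string GetStringByteArray(int value)
{
    // Lookup table for the usual escape sequences of control characters
    var escapes = new Dictionary<char, string>
    {
        { '\0', @"\0" }, { '\a', @"\a" }, { '\b', @"\b" }, { '\t', @"\t" },
        { '\n', @"\n" }, { '\v', @"\v" }, { '\f', @"\f" }, { '\r', @"\r" },
    };
    var sb = new StringBuilder();
    foreach (byte b in BitConverter.GetBytes(value))     // little-endian byte order
    {
        char c = (char)b;
        if (escapes.TryGetValue(c, out string esc)) sb.Append(esc);
        else if (char.IsControl(c)) sb.Append(@"\x" + b.ToString("X2")); // no standard escape
        else sb.Append(c);                                // printable character as-is
    }
    return sb.ToString();
}

// GetStringByteArray(32) -> " \0\0\0", GetStringByteArray(11) -> "\v\0\0\0"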
Assume that I have a string containing a hex value. For example:
string command = "0xABCD1234";
How can I convert that string into another string (for example, string codedString = ...) such that the new string's ASCII-encoded representation has the same binary content as the original string describes?
The reason I need to do this is because I have a library from a hardware manufacturer that can transmit data from their piece of hardware to another piece of hardware over SPI. Their functions take strings as input, but when I try to send "AA" I expect the SPI to transmit the binary 10101010; instead it transmits the ASCII representation of the characters "AA", which is 0100000101000001.
Also, this hex string is going to be 32 hex characters long (that is, 256-bits long).
string command = "AA";
int num = int.Parse(command,NumberStyles.HexNumber);
string bits = Convert.ToString(num,2); // <-- 10101010
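Note that the real command is 32 hex characters (256 bits), which will not fit in an int, so int.Parse overflows there; parsing the hex two characters at a time avoids that. A small sketch (the command value below is made up):
using System;
using System.Globalization;
using System.Text;

string fullCommand = "00112233445566778899AABBCCDDEEFF"; // hypothetical 32-character command
var bitString = new StringBuilder();
for (int i = 0; i < fullCommand.Length; i += 2)
{
    byte b = byte.Parse(fullCommand.Substring(i, 2), NumberStyles.HexNumber);
    bitString.Append(Convert.ToString(b, 2).PadLeft(8, '0'));
}
Console.WriteLine(bitString.ToString());                  // 256 '0'/'1' characters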
I think I understand what you need... here is the main code part. asciiStringWithTheRightBytes is what you would send as your command.
var command = "ABCD1234";
var byteCommand = GetBytesFromHexString(command);
var asciiStringWithTheRightBytes = Encoding.ASCII.GetString(byteCommand);
And the subroutines it uses are here...
static byte[] GetBytesFromHexString(string str)
{
    // Two hex characters form one byte
    byte[] bytes = new byte[str.Length / 2];
    for (var i = 0; i < bytes.Length; i++)
        bytes[i] = (byte)((HexToInt(str[2 * i]) << 4) | HexToInt(str[2 * i + 1]));
    return bytes;
}
static byte HexToInt(char hexChar)
{
    hexChar = char.ToUpper(hexChar); // may not be necessary
    return (byte)((int)hexChar < (int)'A'
        ? ((int)hexChar - (int)'0')
        : 10 + ((int)hexChar - (int)'A'));
}