C# - Convert Bytes Representing Char Numbers to Int16

If I have a byte array representing a number read from a file, how can the byte array be converted to an Int16/short?
byte[] bytes = new byte[]{ 45, 49, 54, 50 }; // Byte array representing "-162" from a text file
short value = 0; // How to convert to -162 as a short here?
I tried using BitConverter.ToInt16(bytes, 0), but the value is not correct.
Edit: Looking for a solution that does NOT use string conversion.

This function performs some validations that you may be able to exclude. You could simplify it if you know your input array will always contain at least one element and that the value will be a valid Int16.
const byte Negative = (byte)'-';
const byte Zero = (byte)'0';
static Int16 BytesToInt16(byte[] bytes)
{
    if (null == bytes || bytes.Length == 0)
        return 0;
    int result = 0;
    bool isNegative = bytes[0] == Negative;
    int index = isNegative ? 1 : 0;
    for (; index < bytes.Length; index++)
    {
        result = 10 * result + (bytes[index] - Zero);
    }
    if (isNegative)
        result *= -1;
    if (result < Int16.MinValue)
        return Int16.MinValue;
    if (result > Int16.MaxValue)
        return Int16.MaxValue;
    return (Int16)result;
}
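For the sample input above, usage looks like this (the expected result is shown in the comment):
byte[] bytes = new byte[] { 45, 49, 54, 50 }; // ASCII codes for '-', '1', '6', '2'
short value = BytesToInt16(bytes); // -162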

Like willaien said, you want to first convert your bytes to a string.
byte[] bytes = new byte[]{ 45,49,54,50 };
string numberString = Encoding.UTF8.GetString(bytes);
short value = Int16.Parse(numberString);
If you're not sure that your string can be parsed, I recommend using Int16.TryParse:
byte[] bytes = new byte[]{ 45,49,54,50 };
string numberString = Encoding.UTF8.GetString(bytes);
short value;
if (!Int16.TryParse(numberString, out value))
{
    // Parsing failed
}
else
{
    // Parsing worked, `value` now contains your value.
}

Related

C# How to fix loss of data during file to binary to string to binary conversion

I read a file as binary, convert to hex string, convert back to binary, and write to a new file.
I expect a duplicate, but get a corrupted file.
I have been trying different ways to convert the binary into the hex string but can't seem to retain the entire file.
byte[] binary1 = File.ReadAllBytes(@"....Input.jpg");
string hexString = "";
int counter1 = 0;
foreach (byte b in binary1)
{
    counter1++;
    hexString += Convert.ToString(b, 16);
}
List<byte> bytelist = new List<byte>();
int counter2 = 0;
for (int i = 0; i < hexString.Length / 2; i++)
{
    counter2++;
    string ch = hexString.Substring(i * 2, 2);
    bytelist.Add(Convert.ToByte(ch, 16));
}
byte[] binary2 = bytelist.ToArray();
File.WriteAllBytes(@"....Output.jpg", binary2);
Counter 1 and counter 2 should be the same count, but counter 2 is always about 10% smaller.
I want to get a duplicate file output that I could have transferred around via that string value.
Converting byte 10 will give a single char, not 2 characters, while your convert-back method (logically) builds on 2 chars per byte.
this case works
byte[] binary1 = new byte[] { 100 }; // convert will result in "64"
and this case fails
byte[] binary1 = new byte[] { 10 }; // convert will result in "a"
I quick-fixed your code by padding with a "0" in the case of a single char.
So the working code is:
byte[] binary1 = new byte[] { 100 };
string hexString = "";
int counter1 = 0;
foreach (byte b in binary1)
{
    counter1++;
    var s = Convert.ToString(b, 16);
    // new
    if (s.Length < 2)
    {
        hexString += "0";
    }
    // end new
    hexString += s;
}
List<byte> bytelist = new List<byte>();
int counter2 = 0;
for (int i = 0; i < hexString.Length / 2; i++)
{
    counter2++;
    string ch = hexString.Substring(i * 2, 2);
    var item = Convert.ToByte(ch, 16);
    bytelist.Add(item);
}
byte[] binary2 = bytelist.ToArray();
Please note, your code could use some refactoring, e.g. don't concatenate strings in a loop (use a StringBuilder) and maybe check the Single Responsibility Principle.
Update, got it fixed, but there are better solutions here: How do you convert a byte array to a hexadecimal string, and vice versa?
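For reference, a minimal sketch of that kind of solution (formatting each byte with "x2" guarantees two hex chars per byte; assumes using System.Linq, and on .NET 5+ Convert.ToHexString / Convert.FromHexString do the same job):
string hexString = string.Concat(binary1.Select(b => b.ToString("x2")));
byte[] binary2 = Enumerable.Range(0, hexString.Length / 2)
    .Select(i => Convert.ToByte(hexString.Substring(i * 2, 2), 16))
    .ToArray();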

Problems with converting char array to string

I have a function in a small application that I'm writing to break a recycled one-time pad cypher. Having used VB.NET for most of my career I thought it would be interesting to implement the app in C#. However, I have encountered a problem due to my present unfamiliarity with C#.
The function takes in two strings (of binary digits), converts these strings to char arrays, and then performs an XOR on them and places the result in a third char array.
This is fine until I try to convert the third char array to a string. Instead of the string looking like "11001101" etc, I get the following result: " \0\0 \0 " i.e. the "1"s are being represented by spaces and the "0"s by "\0".
My code is as follows:
public string calcXor(string a, string b)
{
    char[] charAArray = a.ToCharArray();
    char[] charBArray = b.ToCharArray();
    int len = 0;
    // Set length to be the length of the shorter string
    if (a.Length > b.Length)
        len = b.Length - 1;
    else
        len = a.Length - 1;
    char[] result = new char[len];
    for (int i = 0; i < len; i++)
    {
        result[i] = (char)(charAArray[i] ^ charBArray[i]);
    }
    return new string(result);
}
Your problem is in the line
result[i] = (char)(charAArray[i] ^ charBArray[i]);
that should be
// (Char) 1 is not '1'!
result[i] = (char)((charAArray[i] ^ charBArray[i]) + '0');
A more compact solution is to use a StringBuilder, not arrays:
public string calcXor(String a, String b) {
    int len = (a.Length < b.Length) ? a.Length : b.Length;
    StringBuilder Sb = new StringBuilder();
    for (int i = 0; i < len; ++i)
        // Sb.Append(CharToBinary(a[i] ^ b[i])); // <- If you want 0's and 1's
        Sb.Append(a[i] ^ b[i]); // <- Just int, not in binary format as in your solution
    return Sb.ToString();
}
public static String CharToBinary(int value, Boolean useUnicode = false) {
    int size = useUnicode ? 16 : 8;
    StringBuilder Sb = new StringBuilder(size);
    Sb.Length = size;
    for (int i = size - 1; i >= 0; --i) {
        Sb[i] = value % 2 == 0 ? '0' : '1';
        value /= 2;
    }
    return Sb.ToString();
}
Your solution just computes the xor's (e.g. "65") and puts them in a row (e.g. "65728..."); if you want a 0's and 1's representation, you should use formatting.
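If you do want the 0's and 1's, a one-line alternative to CharToBinary is the framework's own base-2 formatting (a sketch, padded to 8 bits):
Sb.Append(Convert.ToString(a[i] ^ b[i], 2).PadLeft(8, '0'));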
Have a look at the ASCII table. Character code 0 is the null character \0, which is why you see \0 where you expect '0'. You could try ToString().
Have you tried using binary / byte[]? It seems like the fastest way to me.
public string calcXor(string a, string b)
{
    // String to binary
    byte[] ab = ConvertToBinary(a);
    byte[] bb = ConvertToBinary(b);
    // XOR byte by byte (XOR is not defined on the strings themselves)
    int len = Math.Min(ab.Length, bb.Length);
    byte[] cb = new byte[len];
    for (int i = 0; i < len; i++)
    {
        cb[i] = (byte)(ab[i] ^ bb[i]);
    }
    // cb.ToString() would only return the type name, so format the bytes instead
    return BitConverter.ToString(cb);
}
public static byte[] ConvertToBinary(string str)
{
    System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();
    return encoding.GetBytes(str);
}
I just wanted to add that the solution I eventually chose is as follows:
// Parameter binary is a bit string
public void someroutine(String binary)
{
    var data = GetBytesFromBinaryString(binary);
    var text = Encoding.ASCII.GetString(data);
}
public Byte[] GetBytesFromBinaryString(String binary)
{
    var list = new List<Byte>();
    for (int i = 0; i < binary.Length; i += 8)
    {
        String t = binary.Substring(i, 8);
        list.Add(Convert.ToByte(t, 2));
    }
    return list.ToArray();
}
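A quick usage check of that helper (the bit string must be a multiple of 8 characters long; the values here are just an illustration):
var data = GetBytesFromBinaryString("0100100001101001"); // 0x48, 0x69
var text = Encoding.ASCII.GetString(data); // "Hi"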

How to convert a byte array (MD5 hash) into a string (36 chars)?

I've got a byte array that was created using a hash function. I would like to convert this array into a string. So far so good, it will give me hexadecimal string.
Now I would like to use something different than hexadecimal characters, I would like to encode the byte array with these 36 characters: [a-z][0-9].
How would I go about this?
Edit: the reason I would to do this, is because I would like to have a smaller string, than a hexadecimal string.
I adapted my arbitrary-length base conversion function from this answer to C#:
static string BaseConvert(string number, int fromBase, int toBase)
{
    var digits = "0123456789abcdefghijklmnopqrstuvwxyz";
    var length = number.Length;
    var result = string.Empty;
    var nibbles = number.Select(c => digits.IndexOf(c)).ToList();
    int newlen;
    do {
        var value = 0;
        newlen = 0;
        for (var i = 0; i < length; ++i) {
            value = value * fromBase + nibbles[i];
            if (value >= toBase) {
                if (newlen == nibbles.Count) {
                    nibbles.Add(0);
                }
                nibbles[newlen++] = value / toBase;
                value %= toBase;
            }
            else if (newlen > 0) {
                if (newlen == nibbles.Count) {
                    nibbles.Add(0);
                }
                nibbles[newlen++] = 0;
            }
        }
        length = newlen;
        result = digits[value] + result;
    }
    while (newlen != 0);
    return result;
}
As it's coming from PHP it might not be too idiomatic C#; there are also no parameter validity checks. However, you can feed it a hex-encoded string and it will work just fine with
var result = BaseConvert(hexEncoded, 16, 36);
It's not exactly what you asked for, but encoding the byte[] into hex is trivial.
See it in action.
Earlier tonight I came across a codereview question revolving around the same algorithm being discussed here. See: https://codereview.stackexchange.com/questions/14084/base-36-encoding-of-a-byte-array/
I provided an improved implementation of one of its earlier answers (both use BigInteger). See: https://codereview.stackexchange.com/a/20014/20654. The solution takes a byte[] and returns a Base36 string. Both the original and mine include simple benchmark information.
For completeness, the following is the method to decode a byte[] from a string. I'll include the encode function from the link above as well. See the text after this code block for some simple benchmark info for decoding.
const int kByteBitCount = 8; // number of bits in a byte
// constants that we use in FromBase36String and ToBase36String
const string kBase36Digits = "0123456789abcdefghijklmnopqrstuvwxyz";
static readonly double kBase36CharsLengthDivisor = Math.Log(kBase36Digits.Length, 2);
static readonly BigInteger kBigInt36 = new BigInteger(36);

// assumes the input 'chars' is in big-endian ordering, MSB->LSB
static byte[] FromBase36String(string chars)
{
    var bi = new BigInteger();
    for (int x = 0; x < chars.Length; x++)
    {
        int i = kBase36Digits.IndexOf(chars[x]);
        if (i < 0) return null; // invalid character
        bi *= kBigInt36;
        bi += i;
    }
    return bi.ToByteArray();
}

// characters returned are in big-endian ordering, MSB->LSB
static string ToBase36String(byte[] bytes)
{
    // Estimate the result's length so we don't waste time realloc'ing
    int result_length = (int)
        Math.Ceiling(bytes.Length * kByteBitCount / kBase36CharsLengthDivisor);
    // We use a List so we don't have to CopyTo a StringBuilder's characters
    // to a char[], only to then Array.Reverse it later
    var result = new System.Collections.Generic.List<char>(result_length);
    var dividend = new BigInteger(bytes);
    // IsZero's computation is less complex than evaluating "dividend > 0"
    // which invokes BigInteger.CompareTo(BigInteger)
    while (!dividend.IsZero)
    {
        BigInteger remainder;
        dividend = BigInteger.DivRem(dividend, kBigInt36, out remainder);
        int digit_index = Math.Abs((int)remainder);
        result.Add(kBase36Digits[digit_index]);
    }
    // orientate the characters in big-endian ordering
    result.Reverse();
    // ToArray will also trim the excess chars used in length prediction
    return new string(result.ToArray());
}
"A test 1234. Made slightly larger!" encodes to Base64 as "165kkoorqxin775ct82ist5ysteekll7kaqlcnnu6mfe7ag7e63b5"
To decode that Base36 string 1,000,000 times takes 12.6558909 seconds on my machine (I used the same build and machine conditions as provided in my answer on codereview)
You mentioned that you were dealing with a byte[] for the MD5 hash, rather than a hexadecimal string representation of it, so I think this solution provides the least overhead for you.
If you want a shorter string and can accept [a-zA-Z0-9] and + and / then look at Convert.ToBase64String
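A sketch of that route for an MD5 hash: the 16-byte digest becomes 24 Base64 chars instead of 32 hex chars (the input string here is just an example):
using (var md5 = System.Security.Cryptography.MD5.Create())
{
    byte[] hash = md5.ComputeHash(Encoding.UTF8.GetBytes("some input"));
    string shorter = Convert.ToBase64String(hash); // 24 chars vs. 32 for hex
}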
Using BigInteger (needs the System.Numerics reference):
const string chars = "0123456789abcdefghijklmnopqrstuvwxyz";

// The result is padded with chars[0] to make the string length
// (int)Math.Ceiling(bytes.Length * 8 / Math.Log(chars.Length, 2))
// (so that for any value [0...0]-[255...255] of bytes the resulting
// string will have the same length)
public static string ToBaseN(byte[] bytes, string chars, bool littleEndian = true, int len = -1)
{
    if (bytes.Length == 0 || len == 0)
    {
        return String.Empty;
    }
    // BigInteger saves the sign in the last byte: > 7F negative,
    // <= 7F positive.
    // If we have a "negative" number, we will prepend a 0 byte.
    byte[] bytes2;
    if (littleEndian)
    {
        if (bytes[bytes.Length - 1] <= 0x7F)
        {
            bytes2 = bytes;
        }
        else
        {
            // Note that Array.Resize doesn't modify the original array,
            // but creates a copy and sets the passed reference to the
            // new array
            bytes2 = bytes;
            Array.Resize(ref bytes2, bytes.Length + 1);
        }
    }
    else
    {
        bytes2 = new byte[bytes[0] > 0x7F ? bytes.Length + 1 : bytes.Length];
        // We copy and reverse the array
        for (int i = bytes.Length - 1, j = 0; i >= 0; i--, j++)
        {
            bytes2[j] = bytes[i];
        }
    }
    BigInteger bi = new BigInteger(bytes2);
    // A little optimization: we will do many divisions based on
    // chars.Length.
    BigInteger length = chars.Length;
    // We pre-calc the length of the string. We know the bits of
    // "information" of a byte are 8. Using Log2 we calc the bits of
    // information of our new base.
    if (len == -1)
    {
        len = (int)Math.Ceiling(bytes.Length * 8 / Math.Log(chars.Length, 2));
    }
    // We will build our string on a char[]
    var chs = new char[len];
    int chsIndex = 0;
    while (bi > 0)
    {
        BigInteger remainder;
        bi = BigInteger.DivRem(bi, length, out remainder);
        chs[littleEndian ? chsIndex : len - chsIndex - 1] = chars[(int)remainder];
        chsIndex++;
        // If the output is full but there are still digits to write,
        // the requested length is too small for this value
        if (chsIndex == len)
        {
            if (bi > 0)
            {
                throw new OverflowException();
            }
        }
    }
    // We append the zeros that we skipped at the beginning
    if (littleEndian)
    {
        while (chsIndex < len)
        {
            chs[chsIndex] = chars[0];
            chsIndex++;
        }
    }
    else
    {
        while (chsIndex < len)
        {
            chs[len - chsIndex - 1] = chars[0];
            chsIndex++;
        }
    }
    return new string(chs);
}

public static byte[] FromBaseN(string str, string chars, bool littleEndian = true, int len = -1)
{
    if (str.Length == 0 || len == 0)
    {
        return new byte[0];
    }
    // This should be the maximum length of the byte[] array. It's
    // the opposite of the one used in ToBaseN.
    // Note that it can be passed as a parameter
    if (len == -1)
    {
        len = (int)Math.Ceiling(str.Length * Math.Log(chars.Length, 2) / 8);
    }
    BigInteger bi = BigInteger.Zero;
    BigInteger length2 = chars.Length;
    BigInteger mult = BigInteger.One;
    for (int j = 0; j < str.Length; j++)
    {
        int ix = chars.IndexOf(littleEndian ? str[j] : str[str.Length - j - 1]);
        // We didn't find the character
        if (ix == -1)
        {
            throw new ArgumentOutOfRangeException();
        }
        bi += ix * mult;
        mult *= length2;
    }
    var bytes = bi.ToByteArray();
    int len2 = bytes.Length;
    // BigInteger adds a 0 byte for positive numbers whose
    // last byte is > 0x7F
    if (len2 >= 2 && bytes[len2 - 1] == 0)
    {
        len2--;
    }
    int len3 = Math.Min(len, len2);
    byte[] bytes2;
    if (littleEndian)
    {
        if (len == bytes.Length)
        {
            bytes2 = bytes;
        }
        else
        {
            bytes2 = new byte[len];
            Array.Copy(bytes, bytes2, len3);
        }
    }
    else
    {
        bytes2 = new byte[len];
        for (int i = 0; i < len3; i++)
        {
            bytes2[len - i - 1] = bytes[i];
        }
    }
    for (int i = len3; i < len2; i++)
    {
        if (bytes[i] != 0)
        {
            throw new OverflowException();
        }
    }
    return bytes2;
}
Be aware that they are REALLY slow! REALLY REALLY slow! (2 minutes for 100k). To speed them up you would probably need to rewrite the division/mod operations so that they work directly on a buffer, instead of recreating the scratch pads each time as BigInteger does. And it would still be SLOW. The problem is that the time needed to encode the first byte is O(n) where n is the length of the byte array (this is because the whole array needs to be divided by 36). Unless you want to work with blocks of 5 bytes and lose some bits: each symbol of Base36 carries around 5.169925001 bits, so 8 of these symbols would carry 41.35940001 bits, very near 40 bits (5 bytes).
Note that these methods can work both in little-endian mode and in big-endian mode. The endianness of the input and of the output is the same. Both methods accept a len parameter. You can use it to trim excess 0 (zeroes). Note that if you try to make an output too much small to contain the input, an OverflowException will be thrown.
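A usage sketch under those rules (the values are illustrative; chars is the base-36 alphabet defined above, and passing len to FromBaseN trims the excess zero byte):
byte[] data = { 0x3f, 0x2a, 0x91, 0x00 };
string encoded = ToBaseN(data, chars); // 7 base-36 digits for 4 bytes
byte[] decoded = FromBaseN(encoded, chars, len: 4); // the same 4 bytes back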
System.Text.Encoding enc = System.Text.Encoding.ASCII;
string myString = enc.GetString(myByteArray);
You can play with what encoding you need:
System.Text.ASCIIEncoding,
System.Text.UnicodeEncoding,
System.Text.UTF7Encoding,
System.Text.UTF8Encoding
To match the requirements [a-z][0-9] you can use this:
Byte[] bytes = new Byte[] { 200, 180, 34 };
string result = String.Join("a", bytes.Select(x => x.ToString()).ToArray());
You will have string representation of bytes with char separator. To convert back you will need to split, and convert the string[] to byte[] using the same approach with .Select().
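A round-trip sketch of that idea (byte.ToString() only ever emits the digits 0-9, so the letter separator can't collide; assumes using System.Linq):
Byte[] bytes = new Byte[] { 200, 180, 34 };
string result = String.Join("a", bytes.Select(x => x.ToString()).ToArray()); // "200a180a34"
Byte[] back = result.Split('a').Select(s => Convert.ToByte(s)).ToArray();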
Usually a power of 2 is used - that way one character maps to a fixed number of bits. An alphabet of 32 characters, for instance, would map to 5 bits per character. The only challenge in that case is how to deserialize variable-length strings.
For base 36 you could treat the data as a large number, and then:
divide by 36
add the remainder as character to your result
repeat until the division results in 0
Easier said than done perhaps.
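A minimal sketch of that divide-by-36 loop using BigInteger (assumes System.Numerics and System.Linq; the appended zero byte keeps BigInteger from treating the data as negative):
static string ToBase36(byte[] data)
{
    var digits = "0123456789abcdefghijklmnopqrstuvwxyz";
    var n = new System.Numerics.BigInteger(data.Concat(new byte[] { 0 }).ToArray());
    var sb = new System.Text.StringBuilder();
    do
    {
        sb.Insert(0, digits[(int)(n % 36)]); // the remainder becomes the next character
        n /= 36; // repeat until the division results in 0
    } while (n > 0);
    return sb.ToString();
}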
You can use modulo.
This example encodes your byte array to a string of [0-9][a-z].
Change it if you want.
public string byteToString(byte[] byteArr)
{
    int i;
    char[] charArr = new char[byteArr.Length];
    for (i = 0; i < byteArr.Length; i++)
    {
        int byt = byteArr[i] % 36; // 36 = number of available characters
        if (byt < 10)
        {
            charArr[i] = (char)(byt + 48); // if % result is a digit
        }
        else
        {
            charArr[i] = (char)(byt + 87); // if % result is a letter
        }
    }
    return new String(charArr);
}
If you don't want to lose data for decoding, you can use this example:
public string byteToString(byte[] byteArr)
{
    int i;
    char[] charArr = new char[byteArr.Length * 2];
    for (i = 0; i < byteArr.Length; i++)
    {
        charArr[2 * i] = (char)((int)byteArr[i] / 36 + 48);
        int byt = byteArr[i] % 36; // 36 = number of available characters
        if (byt < 10)
        {
            charArr[2 * i + 1] = (char)(byt + 48); // if % result is a digit
        }
        else
        {
            charArr[2 * i + 1] = (char)(byt + 87); // if % result is a letter
        }
    }
    return new String(charArr);
}
And now you have a string of double length, where each even-indexed char is the quotient by 36 and each odd-indexed char is the remainder. For example: 200 = 36*5 + 20 => "5k".
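A matching decoder for that scheme might look like this (a sketch; stringToByte is a made-up name that just inverts the mapping above):
public byte[] stringToByte(string str)
{
    byte[] byteArr = new byte[str.Length / 2];
    for (int i = 0; i < byteArr.Length; i++)
    {
        int high = str[2 * i] - 48; // the quotient by 36
        char c = str[2 * i + 1];
        int low = c <= '9' ? c - 48 : c - 87; // the remainder
        byteArr[i] = (byte)(36 * high + low);
    }
    return byteArr;
}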

Convert an integer to a byte[] of specific length

I'm trying to create a function (C#) that will take 2 integers (a value to become a byte[], a value to set the length of the array to) and return a byte[] representing the value. Right now, I have a function which only returns byte[]s of a length of 4 (I'm presuming 32-bit).
For instance, something like InttoByteArray(0x01, 2) should return a byte[] of {0x00, 0x01}.
Does anyone have a solution to this?
You could use the following:
static public byte[] ToByteArray(object anyValue, int length)
{
    if (length > 0)
    {
        int rawsize = Marshal.SizeOf(anyValue);
        IntPtr buffer = Marshal.AllocHGlobal(rawsize);
        Marshal.StructureToPtr(anyValue, buffer, false);
        byte[] rawdatas = new byte[rawsize * length];
        Marshal.Copy(buffer, rawdatas, rawsize * (length - 1), rawsize);
        Marshal.FreeHGlobal(buffer);
        return rawdatas;
    }
    return new byte[0];
}
Some test cases are:
byte x = 45;
byte[] x_bytes = ToByteArray(x, 1);
int y = 234;
byte[] y_bytes = ToByteArray(y, 5);
int z = 234;
byte[] z_bytes = ToByteArray(z, 0);
This will create an array of whatever size the type is that you pass in. If you want to only return byte arrays, it should be pretty easy to change. Right now it's in a more generic form.
To get what you want in your example you could do this:
int a = 0x01;
byte[] a_bytes = ToByteArray(Convert.ToByte(a), 2);
You can use the BitConverter utility class for this. Though I don't think it allows you to specify the length of the array when you're converting an int. But you can always truncate the result.
http://msdn.microsoft.com/en-us/library/de8fssa4.aspx
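A sketch of the truncate-or-pad approach built on BitConverter (IntToBytes is a hypothetical name; GetBytes returns machine byte order, little-endian on most platforms, so the array is reversed to get big-endian output like {0x00, 0x01} for value 0x01 and length 2):
static byte[] IntToBytes(int value, int length)
{
    byte[] full = BitConverter.GetBytes(value); // 4 bytes for an int
    Array.Reverse(full); // now big-endian: most significant byte first
    byte[] result = new byte[length];
    // keep the low-order bytes, right-aligned; shorter lengths truncate, longer ones zero-pad
    int count = Math.Min(length, full.Length);
    Array.Copy(full, full.Length - count, result, length - count, count);
    return result;
}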
Take your current algorithm and chop off bytes from the array if the length specified is less than 4, or pad it with zeroes if it's more than 4. Sounds like you already have it solved to me.
You'd want some loop like:
for (int i = arrayLen - 1; i >= 0; i--) {
    resultArray[i] = (byte)((theInt >> (i * 8)) & 0xff);
}
byte[] IntToByteArray(int number, int bytes)
{
    if (bytes > 4 || bytes < 0)
    {
        throw new ArgumentOutOfRangeException("bytes");
    }
    byte[] result = new byte[bytes];
    for (int i = bytes - 1; i >= 0; i--)
    {
        result[i] = (byte)((number >> (8 * i)) & 0xFF);
    }
    return result;
}
It fills the result array from right to left, so result[0] ends up holding the least significant byte and result[bytes - 1] the most significant.
byte byte1 = (byte)((mut & 0xFF) ^ (mut3 & 0xFF));
byte byte2 = (byte)((mut1 & 0xFF) ^ (mut2 & 0xFF));
quoted from
C#: Cannot convert from ulong to byte

BitConverter.ToString() in reverse? [duplicate]

This question already has answers here:
How do you convert a byte array to a hexadecimal string, and vice versa?
(53 answers)
Closed 8 years ago.
I have an array of bytes that I would like to store as a string. I can do this as follows:
byte[] array = new byte[] { 0x01, 0x02, 0x03, 0x04 };
string s = System.BitConverter.ToString(array);
// Result: s = "01-02-03-04"
So far so good. Does anyone know how I get this back to an array? There is no overload of BitConverter.GetBytes() that takes a string, and it seems like a nasty workaround to break the string into an array of strings and then convert each of them.
The array in question may be of variable length, probably about 20 bytes.
Not a built-in method, but an implementation. (It could be done without the split, though.)
String[] arr = str.Split('-');
byte[] array = new byte[arr.Length];
for (int i = 0; i < arr.Length; i++)
    array[i] = Convert.ToByte(arr[i], 16);
Method without Split: (Makes many assumptions about string format)
int length = (s.Length + 1) / 3;
byte[] arr1 = new byte[length];
for (int i = 0; i < length; i++)
    arr1[i] = Convert.ToByte(s.Substring(3 * i, 2), 16);
And one more method, without either split or substrings. You may get shot if you commit this to source control though. I take no responsibility for such health problems.
int length = (s.Length + 1) / 3;
byte[] arr1 = new byte[length];
for (int i = 0; i < length; i++)
{
    char sixteen = s[3 * i];
    if (sixteen > '9') sixteen = (char)(sixteen - 'A' + 10);
    else sixteen -= '0';
    char ones = s[3 * i + 1];
    if (ones > '9') ones = (char)(ones - 'A' + 10);
    else ones -= '0';
    arr1[i] = (byte)(16 * sixteen + ones);
}
(basically implementing base16 conversion on two chars)
You can parse the string yourself:
byte[] data = new byte[(s.Length + 1) / 3];
for (int i = 0; i < data.Length; i++)
{
    data[i] = (byte)(
        "0123456789ABCDEF".IndexOf(s[i * 3]) * 16 +
        "0123456789ABCDEF".IndexOf(s[i * 3 + 1])
    );
}
The neatest solution though, I believe, is using extensions:
byte[] data = s.Split('-').Select(b => Convert.ToByte(b, 16)).ToArray();
If you don't need that specific format, try using Base64, like this:
var bytes = new byte[] { 0x12, 0x34, 0x56 };
var base64 = Convert.ToBase64String(bytes);
bytes = Convert.FromBase64String(base64);
Base64 will also be substantially shorter.
If you need to use that format, this obviously won't help.
byte[] data = Array.ConvertAll<string, byte>(str.Split('-'), s => Convert.ToByte(s, 16));
I believe the following will solve this robustly.
public static byte[] HexStringToBytes(string s)
{
    const string HEX_CHARS = "0123456789ABCDEF";
    if (s.Length == 0)
        return new byte[0];
    if ((s.Length + 1) % 3 != 0)
        throw new FormatException();
    byte[] bytes = new byte[(s.Length + 1) / 3];
    int state = 0; // 0 = expect first digit, 1 = expect second digit, 2 = expect hyphen
    int currentByte = 0;
    int x;
    int value = 0;
    foreach (char c in s)
    {
        switch (state)
        {
            case 0:
                x = HEX_CHARS.IndexOf(Char.ToUpperInvariant(c));
                if (x == -1)
                    throw new FormatException();
                value = x << 4;
                state = 1;
                break;
            case 1:
                x = HEX_CHARS.IndexOf(Char.ToUpperInvariant(c));
                if (x == -1)
                    throw new FormatException();
                bytes[currentByte++] = (byte)(value + x);
                state = 2;
                break;
            case 2:
                if (c != '-')
                    throw new FormatException();
                state = 0;
                break;
        }
    }
    return bytes;
}
it seems like a nasty workaround to break the string into an array of strings and then convert each of them.
I don't think there's another way... the format produced by BitConverter.ToString is quite specific, so if there is no existing method to parse it back to a byte[], I guess you have to do it yourself
the ToString method is not really intended as a conversion, but rather to provide a human-readable format for debugging, easy printout, etc.
I'd rethink the byte[] - String - byte[] requirement and would probably prefer SLaks' Base64 solution
