How to get a string from a byte array? - C#

I'm creating my own DNS server and host blocker, and I want to get the host name from a DNS request message byte[].
DNS message hex dump:
e07901000001000000000000057961686f6f03636f6d0000010001
.y...........yahoo.com.....
code:
using System;
using System.Text;

public class Program
{
    public static void Main()
    {
        string b64 = "4HkBAAABAAAAAAAABXlhaG9vA2NvbQAAAQAB";
        int pad = b64.Length % 4;
        if (pad > 0)
        {
            b64 += new string('=', 4 - pad);
        }
        byte[] decoded = Convert.FromBase64String(b64);

        int start = 13;
        int end = start;
        while (decoded[end] != 0)
        {
            end++;
        }
        int hostLength = end - start;
        byte[] byteHost = new byte[hostLength];
        Array.Copy(decoded, start, byteHost, 0, hostLength);
        string host = Encoding.Default.GetString(byteHost);
        Console.WriteLine(host); // yahoo♥com
    }
}
The questions:
Is my method above for getting the host name correct/efficient?
Why do I get a weird character in place of the dot (yahoo♥com)?
Changing to Encoding.ASCII or Encoding.UTF8 has no effect.

There's no need for the second array; Encoding.GetString allows you to pass in an offset and count, so: GetString(decoded, start, hostLength)
Never use Encoding.Default; that is badly named - it should be called Encoding.Wrong :) Find out what encoding the data is in (probably UTF-8 or ASCII), and use that
You should be able to use IndexOf to find the terminating '\0'; also consider what your code should do if it doesn't find one
As for the unusual character: the data contains an 03 byte where you would expect the .; check the DNS protocol specification to see if this is expected. 03 is ETX (end of text). Beyond that: I don't know.
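Putting the first three suggestions together, a minimal sketch (reusing the asker's decoded array and start offset, assumed to be set up as in the question) might look like this:
int start = 13;
int end = Array.IndexOf(decoded, (byte)0, start); // find the terminating 0
if (end < 0)
{
    // no terminator: the message is truncated or malformed, so bail out
    throw new FormatException("Unterminated QNAME in DNS message");
}
string host = Encoding.ASCII.GetString(decoded, start, end - start); // ASCII per the comment above; no second array needed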

Found the answer: 03 is not ETX but the length of the next label. Let's look at the example:
00 05 79 61 68 6F 6F 03 63 6F 6D
.  .  y  a  h  o  o  .  c  o  m
05 is the length of "yahoo" and 03 is the length of "com".
A valid host or domain name contains only ASCII in the range 44-127, i.e. [a-z0-9-\.]; a domain like bücher.nu is converted to xn--bcher-kva.nu, so I replace bytes like 03, 0C, 09 (anything under 44) with a dot (.).
Thanks to @Marc Gravell for GetString(decoded, start, hostLength).
/*
I0sBAAABAAAAAAAABmMtcmluZwZtc2VkZ2UDbmV0AAABAAE
ldgBAAABAAAAAAAABWZwLXZwCWF6dXJlZWRnZQNuZXQAAAEAAQ
4HkBAAABAAAAAAAABXlhaG9vA2NvbQAAAQAB
*/
string b64 = "4VoBAAABAAAAAAAAIGYyNWIzNjgyMGUyNDljNGQxY2I0YzQzNGUxNjc5YTljA2Nsbwxmb290cHJpbnRkbnMDY29tAAABAAE";
int pad = b64.Length % 4;
if (pad > 0)
{
    b64 += new string('=', 4 - pad);
}
byte[] decoded = Convert.FromBase64String(b64);

int start = 13;
int end = start;
while (decoded[end] != 0)
{
    if (decoded[end] < 44)
        decoded[end] = 0x2e;
    end++;
}
int hostLength = end - start;
string host = Encoding.ASCII.GetString(decoded, start, hostLength);
Console.WriteLine(host);
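Alternatively (not from the original answer, just a sketch of the same idea): since each label is length-prefixed, you can walk the lengths directly instead of rewriting bytes below 44. This assumes an uncompressed QNAME starting right after the 12-byte DNS header:
var labels = new System.Collections.Generic.List<string>();
int pos = 12;                        // first length byte of the QNAME
while (decoded[pos] != 0)
{
    int len = decoded[pos];          // label length, e.g. 05 or 03
    labels.Add(Encoding.ASCII.GetString(decoded, pos + 1, len));
    pos += len + 1;
}
string host2 = string.Join(".", labels); // "yahoo.com"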
Edit: micro-optimization, benchmarked with 1e9 (1 billion) iterations.
Building the string with Convert.ToChar() finished in 00:00:04:
for (int i = 0; i < 1e9; i++)
{
    while (decoded[end] != 0)
    {
        if (decoded[end] < 44)
            decoded[end] = 0x2e;
        host += Convert.ToChar(decoded[end]);
        end++;
    }
}
vs. Encoding.ASCII.GetString(), which finished in 00:03:20 (200 seconds):
for (int i = 0; i < 1e9; i++)
{
    while (decoded[end] != 0)
    {
        if (decoded[end] < 44)
            decoded[end] = 0x2e;
        end++;
    }
    int hostLength = end - start;
    string host = Encoding.ASCII.GetString(decoded, start, hostLength);
}

Related

How to Parse received Hex bytes into readable string

As the title says, I've been working on reading a MiFare Classic card.
I'm using the MiFare v1.1.3 library from NuGet,
and it returns a byte array, which I parse into a readable hex string by looping through it.
Here's the code snippet:
int sector = 1;
int block = 0;
int size = 16;
var data = await localCard.GetData(sector, block, size);
string hexString = "";
for (int i = 0; i < data.Length; i++)
{
hexString += data[i].ToString("X2") + " ";
}
// hexString returns 84 3D 17 B0 1E 08 04 00 02 63 B5 F6 B9 BE 77 1D
Now, how can I parse it properly?
I've tried parsing it as ASCII, ANSI, Int, Int64, Base64, and Long,
and none of them matched the data it's supposed to contain.
EDIT:
The expected output: 1206058
HEX String returned: 84 3D 17 B0 1E 08 04 00 02 63 B5 F6 B9 BE 77 1D
I've checked the source code;
it looks like neither the Task<byte[]> GetData nor the Task SetData method has any special logic to transform the data. Data is just saved (and read) as byte[].
I suppose you have to contact the author/company that wrote the data you are trying to read.
The expected output (1206058)
looks strange, since you are reading 16 bytes (size = 16) and expecting only 7 characters back.
Is it possible that the block or sector values are incorrect?
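As a hedged sanity check (plain BitConverter and bit shifting, nothing from the MiFare library): if the value were stored as a binary integer in the leading bytes, you could compare a couple of interpretations against the expected 1206058; neither appears to match for the bytes shown, which again points at an application-specific encoding.
byte[] data = { 0x84, 0x3D, 0x17, 0xB0, 0x1E, 0x08, 0x04, 0x00,
                0x02, 0x63, 0xB5, 0xF6, 0xB9, 0xBE, 0x77, 0x1D };
uint le = BitConverter.ToUInt32(data, 0);                                        // little-endian reading of the first 4 bytes
uint be = (uint)((data[0] << 24) | (data[1] << 16) | (data[2] << 8) | data[3]);  // big-endian reading of the first 4 bytes
Console.WriteLine($"{le} / {be}"); // compare against the expected 1206058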
I have written a simple program to solve your problem. Perhaps this is what you want to achieve:
// The original byte data array; some random data
byte[] data = { 0, 1, 2, 3, 4, 85, 128, 255 };

// Byte data -> hex string
StringBuilder hexString = new StringBuilder();
foreach (byte item in data)
{
    hexString.Append($"{item.ToString("X2")} ");
}
Console.WriteLine(hexString.ToString().Trim());

// Hex string -> list of bytes
string[] hexArray = hexString.ToString().Trim().Split(' ');
List<byte> dataList = new List<byte>();
foreach (string item in hexArray)
{
    dataList.Add(byte.Parse(item, System.Globalization.NumberStyles.HexNumber));
}
dataList.ForEach(b => Console.Write($"{b} "));
Console.WriteLine();
If this is not the right solution, please provide more info about your problem.
If var data is potentially a string, you can reverse it from hex like this:
// To hex
byte[] plainBytes = Encoding.ASCII.GetBytes("MiFare v1.1.3");
string hexString = "";
for (int i = 0; i < plainBytes.Length; i++)
    hexString += plainBytes[i].ToString("X2") + " ";
Console.WriteLine(hexString); // Result: "4D 69 46 61 72 65 20 76 31 2E 31 2E 33"

// From hex
hexString = hexString.Replace(" ", ""); // Remove whitespace to get "4D69466172652076312E312E33"
byte[] hexBytes = new byte[hexString.Length / 2];
for (int i = 0; i < hexString.Length / 2; i++)
    hexBytes[i] = Convert.ToByte(hexString.Substring(2 * i, 2), 16);
string plainString = Encoding.ASCII.GetString(hexBytes);
Console.WriteLine(plainString); // Result: "MiFare v1.1.3"
You probably just need to pick the correct Encoding.

Encode UINT64 to a float

I have code in C# that converts a UInt64 to a float; the entered value is, for example, 4537294320117481472. The code that does the work is in the first block, the second block shows the relevant functions, and the answers are at the bottom.
byte[] rawParameterData = new byte[8];
Console.Write("Enter Value: ");
string rawDataString = Console.ReadLine();
UInt64 rawParameterInteger = UInt64.Parse(rawDataString);
rawParameterData = ConvertFromUInt64(rawParameterInteger);
float convertedParameterData = ConvertToFloat(rawParameterData, 0);
rawParameterData now equals a byte array of [62,247,181,37,0,0,0,0]
convertedParameterData now equals 0.4838039
public static byte[] ConvertFromUInt64(UInt64 data)
{
    var databuf = BitConverter.GetBytes(data);
    return SwapBytes(databuf, 8); // DIS is big-endian; need to convert to little-endian: least significant byte is at lower byte location.
}

static float ConvertToFloat(byte[] data, int offset)
{
    var databuf = CopyData(data, offset, 4);
    return BitConverter.ToSingle(databuf, 0);
}

static byte[] SwapBytes(byte[] srcbuf, int datalength)
{
    var destbuf = new byte[srcbuf.Length];
    for (var i = 0; i < datalength; i++)
        destbuf[datalength - 1 - i] = srcbuf[i];
    return destbuf;
}
It seems that the code is relying on the BitConverter.ToSingle(databuf, 0) function that is part of C#.
Can this be done in Python? Thanks.
In Python, this is as simple as:
import struct
a = 4537294320117481472
b = struct.pack('Q', a)
f = struct.unpack('ff', b)
print(f) # (0.0, 0.4838038980960846)
https://docs.python.org/3/library/struct.html
import struct

# We only care about the upper 4 of the 8 bytes, so we shift right
# by (bits per byte * number of bytes to drop); that leaves just
# databuf in our significantBits variable.
# So we are converting from 3E F7 B5 25 00 00 00 00 to 3E F7 B5 25.
bitsPerByte = 8
significantBits = 4537294320117481472 >> (bitsPerByte * 4)

# Reinterpret the integer bits as a float.
# Taken from https://stackoverflow.com/a/14431225/825093
def bitsToFloat(b):
    s = struct.pack('>l', b)
    return struct.unpack('>f', s)[0]

result = bitsToFloat(significantBits)
print(result)  # 0.483803898096
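For comparison, the same shift trick expressed in C# (a sketch, assuming a little-endian host, which is what BitConverter uses on typical x86/x64 machines):
ulong raw = 4537294320117481472UL;
uint upperBits = (uint)(raw >> 32);                                   // 0x3EF7B525
float f = BitConverter.ToSingle(BitConverter.GetBytes(upperBits), 0); // ~0.4838039
Console.WriteLine(f);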

C# Reading String inside a binary file

I have some trouble understanding how to read files in formats other than plain text. I know that inside a given file there is some information stored as strings. I managed to write the hex codes to a text file, which helps me a lot with another part of the process, because I know that after certain combinations of hex codes there may be a string written in the file.
For instance, I have this batch of hex codes.
00 39 AF 32 DD 24 BA 09 07 06 03 DB
I know that when the hex codes AF 32 appear, the next piece of information should be a string, for instance: "Invoice Number 223232".
Any help or reference will be appreciated.
Kind regards,
static void Main(string[] args)
{
    StreamWriter writer = new StreamWriter("output.txt", true);
    FileStream fs = new FileStream("File", FileMode.Open);
    int hexIn;
    String hex;
    for (int i = 0; (hexIn = fs.ReadByte()) != -1; i++)
    {
        writer.Write(hexIn + " ");
        hex = string.Format("{0:X2}", hexIn);
        writer.Write(hex + " ");
    }
}
The sample code you have looks like you are trying to read a binary file, not a hex-encoded text file.
If the source file is binary (which is ideal), you would read it byte by byte and run it through a state machine to know when to expect a string. You would have to know how long the string is. In the sample below I am assuming a null-terminated C-style string. For Pascal-style strings you would read a length prefix; for fixed-width strings just keep track of the expected number of characters.
bool done = false;
int state = 0;
StringBuilder result = new StringBuilder();
while (!done)
{
    int byteValue = fs.ReadByte();
    if (byteValue == -1)
        done = true;
    else
    {
        switch (state)
        {
            case 0: // looking for 0xAF
                if (byteValue == 0xAF)
                    state = 1;
                break;
            case 1: // looking for 0x32
                if (byteValue == 0x32)
                    state = 2;
                else
                    state = 0;
                break;
            case 2: // start reading string
                if (byteValue == 0) // end of C-style string
                {
                    // Do something with result.ToString()
                    result.Clear();
                    state = 0; // go back to looking for more strings
                }
                else
                {
                    result.Append((char)byteValue); // assuming 8-bit ASCII string
                }
                break;
        }
    }
}
If you are reading a hex-encoded text file, it would be more difficult, as you would have to read hex nibbles at a time and reconstruct the bytes, but the state machine approach would be similar.
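For example, a minimal sketch of that reconstruction (assumes using System.IO and System.Linq, and a text file containing whitespace-separated two-digit hex values; note the Main above writes a mixed decimal/hex stream, so its output would need cleaning first):
string hexText = File.ReadAllText("output.txt");
byte[] bytes = hexText
    .Split(new[] { ' ', '\t', '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries)
    .Select(h => Convert.ToByte(h, 16))
    .ToArray();
// 'bytes' can now be fed through the same AF 32 state machine shown above.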

RC4 Encryption non-alphanumeric wrong

Background: I'm trying to convert Mike Shaffer's VB RC4 encryption to C# (https://web.archive.org/web/20210927195845/https://www.4guysfromrolla.com/articles/091802-1.3.aspx). See a previous question of mine at Converting Mike Shaffer's RC4Encryption to C#.
It seems my encryption is not working.
Using the demo page at: https://web.archive.org/web/20000303125329/http://www.4guysfromrolla.com:80/demos/rc4test.asp, with password of "abc":
Plain text: og;|Q{Fe should result in
A2 FA E2 55 09 A4 AB 16
However, my code is generating the 5th char as 9, instead of 09:
A2 FA E2 55 9 A4 AB 16
Another example - Plain text: cl**z!Ss should result in
AE F1 F3 03 22 FE BE 00
However, my code is generating:
AE F1 F3 3 22 FE BE 0
It seems it's only a problem with certain non-alphanumeric characters.
Here's my code:
private static string EnDeCrypt(string text)
{
    int i = 0;
    int j = 0;
    string cipher = "";

    // Call our method to initialize the arrays used here.
    RC4Initialize(password);

    // Set up a for loop. Again, we use the Length property
    // of our String instead of the Len() function
    for (int a = 1; a <= text.Length; a++)
    {
        // Initialize an integer variable we will use in this loop
        int itmp = 0;

        // Like the RC4Initialize method, we need to use the %
        // operator in place of Mod
        i = (i + 1) % 256;
        j = (j + sbox[i]) % 256;
        itmp = sbox[i];
        sbox[i] = sbox[j];
        sbox[j] = itmp;

        int k = sbox[(sbox[i] + sbox[j]) % 256];

        // Again, since the return type of String.Substring is a
        // string, we need to convert it to a char using
        // String.ToCharArray() and specifying that we want the
        // first value, [0].
        char ctmp = text.Substring(a - 1, 1).ToCharArray()[0];
        itmp = ctmp; // there's an implicit conversion from char to int
        int cipherby = itmp ^ k;
        cipher += (char)cipherby; // just cast cipherby to a char
    }

    // Return the value of cipher as the return value of our method
    return cipher;
}

public static string ConvertAsciiToHex(string input)
{
    return string.Join(string.Empty, input.Select(c => Convert.ToInt32(c).ToString("X")).ToArray());
}

public static string Encrypt(string text)
{
    return ConvertAsciiToHex(EnDeCrypt(text));
}
Here's how I get my encrypted result:
var encryptedResult = RC4Encrypt.Encrypt(valuetoencrypt);
The output is correct (leading zeros don't change the value), your code is simply not padding values that fit into a single hex digit (such as 9 or 3 or 0). Use .ToString("X2") instead of .ToString("X").
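In context, that means only the ConvertAsciiToHex helper from the question needs to change:
public static string ConvertAsciiToHex(string input)
{
    // "X2" pads single-digit values with a leading zero (9 -> "09", 0 -> "00")
    return string.Join(string.Empty, input.Select(c => Convert.ToInt32(c).ToString("X2")).ToArray());
}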

How to convert a byte array (MD5 hash) into a string (36 chars)?

I've got a byte array that was created using a hash function. I would like to convert this array into a string. So far so good; it gives me a hexadecimal string.
Now I would like to use something different than hexadecimal characters, I would like to encode the byte array with these 36 characters: [a-z][0-9].
How would I go about?
Edit: the reason I want to do this is that I would like a shorter string than a hexadecimal one.
I adapted my arbitrary-length base conversion function from this answer to C#:
static string BaseConvert(string number, int fromBase, int toBase)
{
    var digits = "0123456789abcdefghijklmnopqrstuvwxyz";
    var length = number.Length;
    var result = string.Empty;
    var nibbles = number.Select(c => digits.IndexOf(c)).ToList();
    int newlen;
    do
    {
        var value = 0;
        newlen = 0;
        for (var i = 0; i < length; ++i)
        {
            value = value * fromBase + nibbles[i];
            if (value >= toBase)
            {
                if (newlen == nibbles.Count)
                {
                    nibbles.Add(0);
                }
                nibbles[newlen++] = value / toBase;
                value %= toBase;
            }
            else if (newlen > 0)
            {
                if (newlen == nibbles.Count)
                {
                    nibbles.Add(0);
                }
                nibbles[newlen++] = 0;
            }
        }
        length = newlen;
        result = digits[value] + result;
    }
    while (newlen != 0);
    return result;
}
As it's coming from PHP it might not be too idiomatic C#, there are also no parameter validity checks. However, you can feed it a hex-encoded string and it will work just fine with
var result = BaseConvert(hexEncoded, 16, 36);
It's not exactly what you asked for, but encoding the byte[] into hex is trivial.
See it in action.
Earlier tonight I came across a codereview question revolving around the same algorithm being discussed here. See: https://codereview.stackexchange.com/questions/14084/base-36-encoding-of-a-byte-array/
I provided an improved implementation of one of its earlier answers (both use BigInteger). See: https://codereview.stackexchange.com/a/20014/20654. The solution takes a byte[] and returns a Base36 string. Both the original and mine include simple benchmark information.
For completeness, the following is the method to decode a byte[] from a string. I'll include the encode function from the link above as well. See the text after this code block for some simple benchmark info for decoding.
const int kByteBitCount = 8; // number of bits in a byte

// constants that we use in FromBase36String and ToBase36String
const string kBase36Digits = "0123456789abcdefghijklmnopqrstuvwxyz";
static readonly double kBase36CharsLengthDivisor = Math.Log(kBase36Digits.Length, 2);
static readonly BigInteger kBigInt36 = new BigInteger(36);

// assumes the input 'chars' is in big-endian ordering, MSB->LSB
static byte[] FromBase36String(string chars)
{
    var bi = new BigInteger();
    for (int x = 0; x < chars.Length; x++)
    {
        int i = kBase36Digits.IndexOf(chars[x]);
        if (i < 0) return null; // invalid character
        bi *= kBigInt36;
        bi += i;
    }
    return bi.ToByteArray();
}

// characters returned are in big-endian ordering, MSB->LSB
static string ToBase36String(byte[] bytes)
{
    // Estimate the result's length so we don't waste time realloc'ing
    int result_length = (int)
        Math.Ceiling(bytes.Length * kByteBitCount / kBase36CharsLengthDivisor);
    // We use a List so we don't have to CopyTo a StringBuilder's characters
    // to a char[], only to then Array.Reverse it later
    var result = new System.Collections.Generic.List<char>(result_length);

    var dividend = new BigInteger(bytes);
    // IsZero's computation is less complex than evaluating "dividend > 0"
    // which invokes BigInteger.CompareTo(BigInteger)
    while (!dividend.IsZero)
    {
        BigInteger remainder;
        dividend = BigInteger.DivRem(dividend, kBigInt36, out remainder);
        int digit_index = Math.Abs((int)remainder);
        result.Add(kBase36Digits[digit_index]);
    }

    // orientate the characters in big-endian ordering
    result.Reverse();
    // ToArray will also trim the excess chars used in length prediction
    return new string(result.ToArray());
}
"A test 1234. Made slightly larger!" encodes to Base64 as "165kkoorqxin775ct82ist5ysteekll7kaqlcnnu6mfe7ag7e63b5"
To decode that Base36 string 1,000,000 times takes 12.6558909 seconds on my machine (I used the same build and machine conditions as provided in my answer on codereview)
You mentioned that you were dealing with a byte[] for the MD5 hash, rather than a hexadecimal string representation of it, so I think this solution provide the least overhead for you.
If you want a shorter string and can accept [a-zA-Z0-9] and + and / then look at Convert.ToBase64String
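For example, assuming hashBytes holds the MD5 bytes, a 16-byte hash yields a 24-character string (including the trailing == padding):
string b64 = Convert.ToBase64String(hashBytes);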
Using BigInteger (needs the System.Numerics reference):
const string chars = "0123456789abcdefghijklmnopqrstuvwxyz";

// The result is padded with chars[0] to make the string length
// (int)Math.Ceiling(bytes.Length * 8 / Math.Log(chars.Length, 2))
// (so that for any value [0...0]-[255...255] of bytes the resulting
// string will have same length)
public static string ToBaseN(byte[] bytes, string chars, bool littleEndian = true, int len = -1)
{
    if (bytes.Length == 0 || len == 0)
    {
        return String.Empty;
    }

    // BigInteger saves in the last byte the sign. > 7F negative,
    // <= 7F positive.
    // If we have a "negative" number, we will prepend a 0 byte.
    byte[] bytes2;
    if (littleEndian)
    {
        if (bytes[bytes.Length - 1] <= 0x7F)
        {
            bytes2 = bytes;
        }
        else
        {
            // Note that Array.Resize doesn't modify the original array,
            // but creates a copy and sets the passed reference to the
            // new array
            bytes2 = bytes;
            Array.Resize(ref bytes2, bytes.Length + 1);
        }
    }
    else
    {
        bytes2 = new byte[bytes[0] > 0x7F ? bytes.Length + 1 : bytes.Length];

        // We copy and reverse the array
        for (int i = bytes.Length - 1, j = 0; i >= 0; i--, j++)
        {
            bytes2[j] = bytes[i];
        }
    }

    BigInteger bi = new BigInteger(bytes2);

    // A little optimization. We will do many divisions based on
    // chars.Length .
    BigInteger length = chars.Length;

    // We pre-calc the length of the string. We know the bits of
    // "information" of a byte are 8. Using Log2 we calc the bits of
    // information of our new base.
    if (len == -1)
    {
        len = (int)Math.Ceiling(bytes.Length * 8 / Math.Log(chars.Length, 2));
    }

    // We will build our string on a char[]
    var chs = new char[len];
    int chsIndex = 0;

    while (bi > 0)
    {
        BigInteger remainder;
        bi = BigInteger.DivRem(bi, length, out remainder);
        chs[littleEndian ? chsIndex : len - chsIndex - 1] = chars[(int)remainder];
        chsIndex++;

        if (chsIndex >= len) // output buffer full but value not fully consumed
        {
            if (bi > 0)
            {
                throw new OverflowException();
            }
        }
    }

    // We append the zeros that we skipped at the beginning
    if (littleEndian)
    {
        while (chsIndex < len)
        {
            chs[chsIndex] = chars[0];
            chsIndex++;
        }
    }
    else
    {
        while (chsIndex < len)
        {
            chs[len - chsIndex - 1] = chars[0];
            chsIndex++;
        }
    }

    return new string(chs);
}
public static byte[] FromBaseN(string str, string chars, bool littleEndian = true, int len = -1)
{
    if (str.Length == 0 || len == 0)
    {
        return new byte[0];
    }

    // This should be the maximum length of the byte[] array. It's
    // the opposite of the one used in ToBaseN.
    // Note that it can be passed as a parameter
    if (len == -1)
    {
        len = (int)Math.Ceiling(str.Length * Math.Log(chars.Length, 2) / 8);
    }

    BigInteger bi = BigInteger.Zero;
    BigInteger length2 = chars.Length;
    BigInteger mult = BigInteger.One;

    for (int j = 0; j < str.Length; j++)
    {
        int ix = chars.IndexOf(littleEndian ? str[j] : str[str.Length - j - 1]);

        // We didn't find the character
        if (ix == -1)
        {
            throw new ArgumentOutOfRangeException();
        }

        bi += ix * mult;
        mult *= length2;
    }

    var bytes = bi.ToByteArray();
    int len2 = bytes.Length;

    // BigInteger adds a 0 byte for positive numbers that have the
    // last byte > 0x7F
    if (len2 >= 2 && bytes[len2 - 1] == 0)
    {
        len2--;
    }

    int len3 = Math.Min(len, len2);

    byte[] bytes2;
    if (littleEndian)
    {
        if (len == bytes.Length)
        {
            bytes2 = bytes;
        }
        else
        {
            bytes2 = new byte[len];
            Array.Copy(bytes, bytes2, len3);
        }
    }
    else
    {
        bytes2 = new byte[len];
        for (int i = 0; i < len3; i++)
        {
            bytes2[len - i - 1] = bytes[i];
        }
    }

    for (int i = len3; i < len2; i++)
    {
        if (bytes[i] != 0)
        {
            throw new OverflowException();
        }
    }

    return bytes2;
}
Be aware that they are REALLY slow! REALLY REALLY slow! (2 minutes for 100k). To speed them up you would probably need to rewrite the division/mod operations so that they work directly on a buffer, instead of recreating the scratch pads each time as BigInteger does. And it would still be SLOW. The problem is that the time needed to encode the first byte is O(n), where n is the length of the byte array (because the whole array needs to be divided by 36). Unless you want to work with blocks of 5 bytes and lose some bits: each Base36 symbol carries around 5.169925001 bits, so 8 of these symbols carry 41.35940001 bits, very near 40 bits.
Note that these methods can work both in little-endian mode and in big-endian mode. The endianness of the input and of the output is the same. Both methods accept a len parameter. You can use it to trim excess zeroes. Note that if you try to make an output too small to contain the input, an OverflowException will be thrown.
System.Text.Encoding enc = System.Text.Encoding.ASCII;
string myString = enc.GetString(myByteArray);
You can play with what encoding you need:
System.Text.ASCIIEncoding,
System.Text.UnicodeEncoding,
System.Text.UTF7Encoding,
System.Text.UTF8Encoding
To match the requirements [a-z][0-9] you can use this:
Byte[] bytes = new Byte[] { 200, 180, 34 };
string result = String.Join("a", bytes.Select(x => x.ToString()).ToArray());
You will have a string representation of the bytes with a character separator. To convert back (as sketched below), split the string and convert the string[] to byte[] using the same .Select() approach.
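For instance (a sketch, assuming result holds the 'a'-separated string from above):
byte[] roundTripped = result.Split('a').Select(byte.Parse).ToArray();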
Usually a power of 2 is used - that way one character maps to a fixed number of bits. An alphabet of 32 characters, for instance, would map to 5 bits per character. The only challenge in that case is how to deserialize variable-length strings.
For base 36 you could treat the data as a large number, and then:
divide by 36
add the remainder as a character to your result
repeat until the division results in 0
Easier said than done perhaps; see the sketch below.
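Here is a minimal sketch of that divide-by-36 loop using BigInteger (assumes the System.Numerics reference; the appended zero byte keeps BigInteger from treating the little-endian value as negative):
using System;
using System.Linq;
using System.Numerics;
using System.Security.Cryptography;
using System.Text;

class Base36Sketch
{
    static void Main()
    {
        byte[] hash = MD5.Create().ComputeHash(Encoding.UTF8.GetBytes("hello"));
        var value = new BigInteger(hash.Concat(new byte[] { 0 }).ToArray()); // force non-negative
        const string digits = "0123456789abcdefghijklmnopqrstuvwxyz";
        var sb = new StringBuilder();
        do
        {
            value = BigInteger.DivRem(value, 36, out BigInteger remainder);
            sb.Insert(0, digits[(int)remainder]);
        } while (!value.IsZero);
        Console.WriteLine(sb.ToString()); // base-36 digits, most significant first
    }
}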
You can use modulo.
This example encodes your byte array to a string of [0-9][a-z].
Change it if you want.
public string byteToString(byte[] byteArr)
{
    int i;
    char[] charArr = new char[byteArr.Length];
    for (i = 0; i < byteArr.Length; i++)
    {
        int byt = byteArr[i] % 36; // 36 = number of available characters
        if (byt < 10)
        {
            charArr[i] = (char)(byt + 48); // if % result is a digit
        }
        else
        {
            charArr[i] = (char)(byt + 87); // if % result is a letter
        }
    }
    return new String(charArr);
}
If you don't want to lose data for decoding, you can use this example:
public string byteToString(byte[] byteArr)
{
    int i;
    char[] charArr = new char[byteArr.Length * 2];
    for (i = 0; i < byteArr.Length; i++)
    {
        charArr[2 * i] = (char)((int)byteArr[i] / 36 + 48);
        int byt = byteArr[i] % 36; // 36 = number of available characters
        if (byt < 10)
        {
            charArr[2 * i + 1] = (char)(byt + 48); // if % result is a digit
        }
        else
        {
            charArr[2 * i + 1] = (char)(byt + 87); // if % result is a letter
        }
    }
    return new String(charArr);
}
Now you have a string twice as long, where the first character of each pair encodes the multiple of 36 (the quotient) and the second the remainder. For example: 200 = 36*5 + 20 => "5k".
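A matching decoder for that two-characters-per-byte scheme might look like this (a sketch, not part of the original answer):
public byte[] stringToByte(string s)
{
    byte[] result = new byte[s.Length / 2];
    for (int i = 0; i < result.Length; i++)
    {
        int quotient = s[2 * i] - 48;               // first char: value / 36
        char c = s[2 * i + 1];
        int remainder = c <= '9' ? c - 48 : c - 87; // second char: value % 36
        result[i] = (byte)(quotient * 36 + remainder);
    }
    return result;
}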
