How to create byte[] with length 16 using FromBase64String [duplicate] - c#

This question already has an answer here:
Calculate actual data size from Base64 encoded string length
(1 answer)
Closed 10 years ago.
I have a requirement to create a byte[] with length 16 (a byte array of 128 bits, to be used as the key in AES encryption).
The following is a valid string:
"AAECAwQFBgcICQoLDA0ODw=="
What is the algorithm that determines whether a string will decode to 128 bits? Or is trial and error the only way to create such 128-bit strings?
CODE
static void Main(string[] args)
{
    string firstString = "AAECAwQFBgcICQoLDA0ODw==";  // String length = 24
    string secondString = "ABCDEFGHIJKLMNOPQRSTUVWX"; // String length = 24
    int test = secondString.Length;

    byte[] firstByteArray = Convert.FromBase64String(firstString);
    byte[] secondByteArray = Convert.FromBase64String(secondString);

    int firstLength = firstByteArray.Length;
    int secondLength = secondByteArray.Length;

    Console.WriteLine("First Length: " + firstLength.ToString());
    Console.WriteLine("Second Length: " + secondLength.ToString());
    Console.ReadLine();
}
Findings:
For 256 bit, we need 256/6 = 42.67 chars. That is rounded up to 43 chars. [To make the length divisible by 4, add =]
For 512 bit, we need 512/6 = 85.33 chars. That is rounded up to 86 chars. [To make the length divisible by 4, add ==]
For 128 bit, we need 128/6 = 21.33 chars. That is rounded up to 22 chars. [To make the length divisible by 4, add ==]
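The same arithmetic can be run from the byte side: every group of 3 bytes (or part of one) becomes 4 Base64 characters. A minimal sketch of that calculation (nothing framework-specific beyond standard padded Base64, which is what Convert.To/FromBase64String uses):

// Sketch: padded Base64 length for a given key size in bytes.
static int Base64LengthForBytes(int byteCount)
{
    // Every group of 3 bytes (or part of one) becomes 4 characters.
    return ((byteCount + 2) / 3) * 4;
}

// Base64LengthForBytes(16) == 24   (128-bit key)
// Base64LengthForBytes(32) == 44   (256-bit key)
// Base64LengthForBytes(64) == 88   (512-bit key)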

A base64 string for 16 bytes will always be 24 characters and have == at the end, as padding.
(At least when it's decodable using the .NET method. The padding is not always included in all uses of base64 strings, but the .NET implementation requires it.)

In Base64 encoding, '=' is a special symbol added to the end of the Base64 string to indicate that there is no data for those characters in the original value.
Each character encodes 6 bits of the original data, so to represent whole 8-bit bytes the string length has to be divisible by 4 without remainder (4 characters * 6 bits = 3 bytes * 8 bits = 24 bits). When the resulting Base64 string is shorter than 4n, '=' characters are added at the end to make it valid.
Update
The last character before '==' encodes only 2 bits of information, so replacing it with every possible Base64 character will give you only 4 different keys out of the 64 combinations. In other words, by generating strings in the format "bbbbbbbbbbbbbbbbbbbbbb==" (where 'b' is a valid Base64 character) you'll get 15 duplicate keys for each unique key.
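If the goal is simply a valid 128-bit AES key, a simpler route than composing the Base64 text by hand is to generate 16 random bytes and encode them. A sketch of that idea (assumes .NET Core 2.1 or later for RandomNumberGenerator.Fill):

using System;
using System.Security.Cryptography;

// Sketch: generate a random 128-bit key and its 24-character Base64 form.
byte[] key = new byte[16];                            // 16 bytes = 128 bits
RandomNumberGenerator.Fill(key);                      // cryptographically strong randomness
string keyAsBase64 = Convert.ToBase64String(key);     // always 24 chars, ending in "=="
byte[] roundTripped = Convert.FromBase64String(keyAsBase64); // length 16 again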

You can use PadRight() to pad the string with a character that you later remove once it has been decrypted.

Related

C# : How can I encode GUIDs to 11 character ids?

Like https://www.youtube.com/watch?v={id}
{id}: 11 characters
I tried to use Convert.ToBase64String, like this:
string encoded = Convert.ToBase64String(guid.ToByteArray())
    .Replace("/", "_")
    .Replace("+", "-")
    .Replace("=", "");
but the GUID is only reduced to 22 characters.
How can I encode GUIDs to 11-character ids (or at least fewer than 22 characters)?
11 characters, even assuming you could use all 8 bits per character in a URL (hint: you can't), would only allow 88 bits.
A UUID/GUID is 128 bits. Therefore, the conversion you propose is not possible without losing data.
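The arithmetic behind that answer, written out (purely illustrative, not tied to any particular API):

// 11 characters at 6 usable bits each (a Base64-style alphabet) is 66 bits;
// even a theoretical 8 bits per character only reaches 88 bits.
int guidBits      = 16 * 8;   // a GUID is 16 bytes = 128 bits
int bitsIn11Chars = 11 * 6;   // 66  -> far too few
int bitsIn22Chars = 22 * 6;   // 132 -> the first character count that reaches 128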
This is an off-topic answer, but it might give you an ID with only 11 characters.
In C# a long value has 64 bits which, encoded with Base64, becomes 12 characters including 1 padding '='. If we trim the padding '=', we are left with 11 characters.
One crazy idea here is to use a combination of the Unix epoch and a per-epoch counter to form a long value. The Unix epoch from C#'s DateTimeOffset.ToUnixTimeMilliseconds() is a long, but the first 2 of its 8 bytes are always 0, because otherwise the date-time value would be greater than the maximum date-time value. That gives us 2 bytes to place a ushort counter in.
So, in total, as long as the number of IDs generated does not exceed 65536 per millisecond, we can have a unique ID:
// This is the counter for the current epoch. The counter should reset in the next millisecond.
ushort currentCounter = 123;

var epoch = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();

// Because epoch is a 64-bit long, we get 8 bytes here.
var epochBytes = BitConverter.GetBytes(epoch);
if (BitConverter.IsLittleEndian)
{
    // Use big endian
    epochBytes = epochBytes.Reverse().ToArray(); // Reverse() needs System.Linq
}

// The first two bytes are always 0, because if they were not, DateTime.UtcNow would be
// greater than DateTime.MaxValue, which is not possible.
var counterBytes = BitConverter.GetBytes(currentCounter);
if (BitConverter.IsLittleEndian)
{
    // Use big endian
    counterBytes = counterBytes.Reverse().ToArray();
}

// Copy the counter bytes into the first 2 bytes of the epoch bytes.
Array.Copy(counterBytes, 0, epochBytes, 0, 2);

// Encode the byte array and trim the padding '='.
var shortUid = Convert.ToBase64String(epochBytes).TrimEnd('=');
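Decoding is symmetric. A sketch of the reverse step under the same big-endian layout (the variable names here are illustrative, not from the original answer):

// Sketch: recover the counter and epoch milliseconds from the 11-character id.
// Re-append the single '=' that was trimmed, then undo the byte layout above.
byte[] decoded = Convert.FromBase64String(shortUid + "=");       // back to 8 bytes

ushort recoveredCounter = (ushort)((decoded[0] << 8) | decoded[1]);

// Zero the counter bytes, then read the remaining bytes as the epoch value.
decoded[0] = 0;
decoded[1] = 0;
if (BitConverter.IsLittleEndian)
{
    decoded = decoded.Reverse().ToArray();                        // Reverse() needs System.Linq
}
long recoveredEpochMs = BitConverter.ToInt64(decoded, 0);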

Conversion from Base 64 error

I'm trying to convert from a Base64 string. First I tried this:
string a = "BTQmJiI6JzFkZ2ZhY";
byte[] b = Convert.FromBase64String(a);
string c = System.Text.Encoding.ASCII.GetString(b);
Then got the exception - System.FormatException was caught Message=Invalid length for a Base-64 char array.
So after googling, I tried this:
string a1 = "BTQmJiI6JzFkZ2ZhY";
int mod4 = a1.Length % 4;
if (mod4 > 0)
{
    a1 += new string('=', 4 - mod4);
}
byte[] b1 = Convert.FromBase64String(a1);
string c1 = System.Text.Encoding.ASCII.GetString(b1);
Here I got the exception - System.FormatException was caught Message=Invalid character in a Base-64 string.
Is there any invalid character in "BTQmJiI6JzFkZ2ZhY"? Or is it the length issue?
EDIT: I first decrypt the input string using the code below:
string sourstr, deststr, strchar;
int strlen;
decimal ascvalue, ConvValue;
deststr = "";
sourstr = "InputString";
strlen = sourstr.Length;
for (int intI = 0; intI <= strlen - 1; intI++)
{
    // XOR each character's code with 85 to "decrypt" it
    strchar = sourstr.Substring(intI, 1);
    ascvalue = (decimal)strchar[0];
    ConvValue = (decimal)((int)ascvalue ^ 85);
    if ((char)ConvValue.ToString().Length == 0)
    {
        deststr = deststr + strchar;
    }
    else
    {
        deststr = deststr + (char)ConvValue;
    }
}
This output deststr is then passed to the code below:
Convert.ToBase64String(System.Text.Encoding.ASCII.GetBytes(deststr));
This is where I got "BTQmJiI6JzFkZ2ZhY"
You cannot get such a base64 string by encoding a whole number of bytes. While encoding, every 3 bytes are represented as 4 characters, because 3 bytes is 24 bits and each base64 character is 6 bits (2^6 = 64), so 4 of them is also 24 bits. If the number of bytes to encode is not divisible by 3, you have some bytes left over: either 2 or 1.
If you have 2 bytes left, that's 16 bits, and you need at least 3 characters to encode them (2 characters is just 12 bits, not enough). So if 2 bytes are left, you encode them with 3 characters and apply "=" padding.
If you have 1 byte left, that's 8 bits. You need at least 2 characters for that. You encode it with 2 characters and apply "==" padding.
Note that there is no way to encode something to just one character (and for that reason there is no "===" padding).
Your string can be divided into 4-character blocks: "BTQm", "JiI6", "JzFk", "Z2Zh", "Y". The first 4 blocks each represent 3 bytes, but what does "Y" represent? Who knows. You could say it represents 1 byte in the range 0-63, but from the above you can see that's not how base64 works, so to interpret it like that you would have to do it yourself.
From the above you can see that you cannot get a base64 string with length 17 (without padding). You can get 16, 18, 19, or 20, but never 17.
Are you sure you took all the characters from the base64 output?
Appending "==" at the end of the string will make your first approach work without any problems, although there is a strange character at the beginning of the output. So the next question is: are you sure it is ASCII encoding?

How to convert 2 Guids into string of max 50 characters length (2 way conversion)

I have an interesting problem - I need to convert 2 (randomly) generated Guids into a string. Here are the constraints:
string of max 50 characters length
only numbers and small letters can be used (0123456789abcdefghijklmnopqrstuvwxyz)
the algorithm has to be 2-way - I need to be able to decode the encoded string back into the same 2 separate Guids
I've browsed a lot looking for a toBase36 conversion but so far no luck with Guid.
Any ideas? (C#)
First of all, you're in luck: 36^50 is around 2^258.5, so you can store the information in a 50-character base-36 string. I wonder, though, why anybody would have to use base-36 for this.
You need to treat each GUID as a 128-bit number, then combine them into a 256-bit number, which you then convert to a base-36 'number'. Converting back is doing the same in reverse.
Guid.ToByteArray will convert a GUID to a 16-byte array. Do it for both GUIDs and you have a 32-byte array (which is 256 bits). Construct a BigInteger from that array (there's a constructor that takes a byte array), and then just convert that number to base-36.
To convert a number to base-36, do something like this (I assume everything is positive):
const string digits = "0123456789abcdefghijklmnopqrstuvwxyz";
string ConvertToBase36(BigInteger number)
{
    string result = "";
    while (number > 0)
    {
        char digit = digits[(int)(number % 36)];
        result = digit + result;   // prepend so the most significant digit comes first
        number /= 36;
    }
    return result;
}
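Putting the pieces together, a hedged sketch of the full round trip (BigInteger lives in System.Numerics; the helper names are illustrative, and the extra zero byte keeps BigInteger from reading the value as negative):

using System;
using System.Linq;
using System.Numerics;

static string EncodeTwoGuids(Guid a, Guid b)
{
    // 32 bytes of GUID data plus a trailing 0 byte, because BigInteger's
    // byte-array constructor is little-endian and sign-aware.
    byte[] bytes = a.ToByteArray().Concat(b.ToByteArray()).Concat(new byte[] { 0 }).ToArray();
    BigInteger number = new BigInteger(bytes);

    const string digits = "0123456789abcdefghijklmnopqrstuvwxyz";
    string result = "";
    while (number > 0)
    {
        result = digits[(int)(number % 36)] + result;
        number /= 36;
    }
    return result;   // at most 50 characters, since 36^50 > 2^256
}

static (Guid, Guid) DecodeTwoGuids(string encoded)
{
    const string digits = "0123456789abcdefghijklmnopqrstuvwxyz";
    BigInteger number = 0;
    foreach (char c in encoded)
    {
        number = number * 36 + digits.IndexOf(c);
    }

    // Pad back to 32 bytes in case leading zero bytes were dropped.
    byte[] bytes = number.ToByteArray().Concat(Enumerable.Repeat((byte)0, 32)).Take(32).ToArray();
    return (new Guid(bytes.Take(16).ToArray()), new Guid(bytes.Skip(16).ToArray()));
}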

Truncating a byte array vs. substringing the Encoded string coming out of SHA-256

I am not familiar with hashing algorithms and the risks associated with using them, and therefore have a question on the answer below that I received to a previous question...
Based on the comment that the hash value must, when encoded to ASCII, fit within 16 ASCII characters, the solution is first to choose some cryptographic hash function (the SHA-2 family includes SHA-256, SHA-384, and SHA-512),
then to truncate the output of the chosen hash function to 96 bits (12 bytes) - that is, keep the first 12 bytes of the hash function output and discard the remaining bytes,
then to base-64-encode the truncated output to 16 ASCII characters (128 bits),
yielding effectively a 96-bit-strong cryptographic hash.
If I substring the base-64-encoded string to 16 characters, is that fundamentally different from keeping the first 12 bytes of the hash function output and then base-64-encoding them? If so, could someone please explain, and provide example code for truncating the byte array?
I tested the substring of the full hash value against 36,000+ distinct values and had no collisions. The code below is my current implementation.
Thanks for any help (and clarity) you can provide.
public static byte[] CreateSha256Hash(string data)
{
    byte[] dataToHash = (new UnicodeEncoding()).GetBytes(data);
    SHA256 shaM = new SHA256Managed();
    byte[] hashedData = shaM.ComputeHash(dataToHash);
    return hashedData;
}

public override void InputBuffer_ProcessInputRow(InputBufferBuffer Row)
{
    byte[] hashedData = CreateSha256Hash(Row.HashString);
    string s = Convert.ToBase64String(hashedData, Base64FormattingOptions.None);
    Row.HashValue = s.Substring(0, 16);
}
Original post: http://stackoverflow.com/questions/4340471/is-there-a-hash-algorithm-that-produces-a-hash-size-of-64-bits-in-c
No, there is no difference. However, it's easier to just get the base64 string of the first 12 bytes of the array, instead of truncating the array:
public override void InputBuffer_ProcessInputRow(InputBufferBuffer Row)
{
    byte[] hashedData = CreateSha256Hash(Row.HashString);
    Row.HashValue = Convert.ToBase64String(hashedData, 0, 12);
}
The base 64 encoding simply puts 6 bits in each character, so 3 bytes (24 bits) go into 4 characters. As long as you split the data at a 3-byte boundary, it's the same as splitting the string at the corresponding 4-character boundary.
If you try to split the data between those boundaries, the base64 string will be padded with filler data up to the next boundary, so the result would not be the same.
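A quick sketch to convince yourself of the equivalence (assuming the same CreateSha256Hash helper shown above):

// Both expressions produce the same 16 characters, because 12 bytes align
// exactly with 16 Base64 characters (12 / 3 * 4 == 16).
byte[] hash = CreateSha256Hash("some input");

string viaSubstring  = Convert.ToBase64String(hash).Substring(0, 16);
string viaTruncation = Convert.ToBase64String(hash, 0, 12);

Console.WriteLine(viaSubstring == viaTruncation);   // True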
Truncating the array is as easy as adding Take(12) (from System.Linq) here:
Change
byte[] hashedData = CreateSha256Hash(Row.HashString);
to:
byte[] hashedData = CreateSha256Hash(Row.HashString).Take(12).ToArray();

Equivalent of sprintf in C#?

Is there something similar to sprintf() in C#?
I would for instance like to convert an integer to a 2-byte byte-array.
Something like:
int number = 17;
byte[] s = sprintf("%2c", number);
string s = string.Format("{0:00}", number)
The first 0 means "the first argument" (i.e. number); the 00 after the colon is the format specifier (2 numeric digits).
However, note that .NET strings are UTF-16, so a 2-character string is 4 bytes, not 2.
(edit: question changed from string to byte[])
To get the bytes, use Encoding:
byte[] raw = Encoding.UTF8.GetBytes(s);
(obviously different encodings may give different results; UTF8 will give 2 bytes for this data)
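A tiny sketch of those byte counts (purely illustrative):

using System.Text;

string s = string.Format("{0:00}", 17);                   // "17", two characters
int utf16Bytes = Encoding.Unicode.GetBytes(s).Length;     // 4: UTF-16 uses 2 bytes per char here
int utf8Bytes  = Encoding.UTF8.GetBytes(s).Length;        // 2: these digits are 1 byte each in UTF-8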
Actually, a shorter version of the first bit is:
string s = number.ToString("00");
But the string.Format version is more flexible.
EDIT: I'm assuming that you want to convert the value of an integer to a byte array and not the value converted to a string first and then to a byte array (check marc's answer for the latter.)
To convert an int to a byte array you can use:
byte[] array = BitConverter.GetBytes(17);
but that will give you an array of 4 bytes and not 2 (since an int is 32 bits.)
To get an array of 2 bytes you should use:
byte[] array = BitConverter.GetBytes((short)17);
If you just want to convert the value 17 to two characters then use:
string result = string.Format("{0:00}", 17);
But as Marc pointed out, the result will consume 4 bytes, since each character in .NET is 2 bytes (UTF-16) (and including the two bytes that hold the string length, it will be 6 bytes).
It turned out that what I really wanted was this:
short number = 17;
System.IO.BinaryWriter writer = new System.IO.BinaryWriter(stream); // stream is an existing System.IO.Stream
writer.Write(number);
writer.Flush();
The key here is the Write function of the BinaryWriter class. It has 18 overloads, converting different types to a byte array which it writes to the stream. In my case I have to make sure the number I want to write is kept in a short variable; this makes the Write function write 2 bytes.
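A self-contained sketch of that approach, using a MemoryStream so the two bytes can be inspected (the stream choice here is just for illustration):

using System;
using System.IO;

short number = 17;
using (var stream = new MemoryStream())
using (var writer = new BinaryWriter(stream))
{
    writer.Write(number);        // the short overload writes exactly 2 bytes (little-endian)
    writer.Flush();

    byte[] bytes = stream.ToArray();
    Console.WriteLine(bytes.Length);                    // 2
    Console.WriteLine(BitConverter.ToInt16(bytes, 0));  // 17 on a little-endian machine
}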
