C# AES CFB compatibility with a 3rd-party C implementation

I have a 3rd party AES library for C (from Lantronix). I wrapped their API from within C#'s managed code as shown below, and it works:
[DllImport("cbx_enc.dll", CharSet = CharSet.Ansi, CallingConvention = CallingConvention.Cdecl)]
static extern unsafe void VC_blockEncrypt(char* iv, char* key, int length, char* text, int RDkeyLen);
/// <summary>
/// Managed Encrypt Wrapper
/// </summary>
/// <param name="buffer">provides the plain text and receives the same length cipher text</param>
static readonly string key = "abcdef0123456789";
static readonly string iv = "0123456789ABCDEF";
public static void Encrypt(ref byte[] buffer)
{
var keyPtr = Marshal.StringToHGlobalAnsi(key);
var ivPtr = Marshal.StringToHGlobalAnsi(iv);
byte[] temp = new byte[16];
Marshal.Copy(ivPtr, temp, 0, 16);
int index = 0;
for (int i = 0; i < buffer.Length; i++)
{
if (index == 0)
{
Marshal.Copy(temp, 0, ivPtr, 16);
unsafe
{
VC_blockEncrypt((char*) ivPtr, (char*) keyPtr, 0, (char*) ivPtr, 128);
}
Marshal.Copy(ivPtr, temp, 0, 16);
index = 16;
}
temp[16 - index] ^= buffer[i];
buffer[i] = temp[16 - index];
index--;
}
Marshal.FreeHGlobal(ivPtr);
Marshal.FreeHGlobal(keyPtr);
}
Now, when I wrote my own version using System.Security.Cryptography, to avoid their unmanaged DLL entirely, my final ciphertext differs from theirs! I am using the same mode, same key, same IV and same plain text, yet the two implementations are not compatible. Shown below are the property settings for the RijndaelManaged object and the code; am I missing something that causes this incompatibility?
/// <summary>
/// Managed Encrypt
/// </summary>
/// <param name="buffer">provides the plain text and receives the same length cipher text</param>
static readonly string key = "abcdef0123456789";
static readonly string iv = "0123456789ABCDEF";
public static void Encrypt(ref byte[] buffer)
{
using (RijndaelManaged cipher = new RijndaelManaged())
{
cipher.Mode = CipherMode.CFB;
cipher.Key = Encoding.ASCII.GetBytes(key);
cipher.IV = Encoding.ASCII.GetBytes(iv);
cipher.Padding = PaddingMode.None;
cipher.FeedbackSize = 128;
ICryptoTransform encryptor = cipher.CreateEncryptor(cipher.Key, cipher.IV);
using (MemoryStream msEncrypt = new MemoryStream())
{
using (CryptoStream csEncrypt = new CryptoStream(msEncrypt, encryptor, CryptoStreamMode.Write))
{
using (BinaryWriter swEncrypt = new BinaryWriter(csEncrypt))
{
swEncrypt.Write(buffer);
}
buffer = msEncrypt.ToArray();
}
}
}
}
Alternatively, the algorithm I gleaned from the Lantronix architecture looks very straightforward: the API encrypts the block, and the XOR of that output with the plain text is done in the calling method. With the .NET library, though, I don't have access to that intermediate encrypted output (or is there one?), so I can't do the XOR manually after the encryption the way Lantronix does...
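For what it's worth, here is their scheme written out in managed terms as I understand it: encrypt the previous ciphertext block (initially the IV) with raw AES, then XOR that keystream with the plaintext. That is just CFB-128 done by hand. A minimal sketch on top of AES-ECB (EncryptCfb128 is a name I made up, and I haven't verified it against the DLL):
// using System;
// using System.Security.Cryptography;
static void EncryptCfb128(byte[] buffer, byte[] key, byte[] iv)
{
    using (var aes = Aes.Create())
    {
        aes.Key = key;
        aes.Mode = CipherMode.ECB;          // raw block encryption; the chaining is done below
        aes.Padding = PaddingMode.None;
        using (ICryptoTransform ecb = aes.CreateEncryptor())
        {
            byte[] feedback = (byte[])iv.Clone();   // shift register starts as the IV
            byte[] keystream = new byte[16];
            for (int i = 0; i < buffer.Length; i += 16)
            {
                ecb.TransformBlock(feedback, 0, 16, keystream, 0);   // E(K, feedback)
                int n = Math.Min(16, buffer.Length - i);
                for (int j = 0; j < n; j++)
                    buffer[i + j] ^= keystream[j];                   // ciphertext = keystream XOR plaintext
                Array.Copy(buffer, i, feedback, 0, n);               // next feedback block = ciphertext
            }
        }
    }
}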
The end goal is to stop using the unmanaged code while still generating the same ciphertext with fully managed .NET code.
Thanks for your help in advance.
p.s. I can provide the 3rd party C library cbx_enc.dll, if you need.
Edit: @Topaco, here is some sample data as requested. I haven't heard back from the vendor yet about distributing their DLL; working on it…
Common inputs to CFB:
byte[] buffer = Encoding.ASCII.GetBytes("AAAAAAAAAAAAAABBBBBBBBBBBBBBBBBD"); //plain text
string key = "abcdef0123456789";
string iv = "0123456789ABCDEF";
I/O from the wrapper to the unmanaged DLL:
PlainText Hex: 4141414141414141414141414141424242424242424242424242424242424244
CipherText Hex: C9094F820428E07AE035B6749E18546C62F9D5FD4A78480215DA3625D376A271
I/O from the managed code with FeedbackSize = 128; //CFB128:
PlainText Hex: 4141414141414141414141414141424242424242424242424242424242424244
CipherText Hex: 6A1A5088ACDA505B47192093DD06CD987868BFD85278A4D7D3120CC85FCD3D83
I/O from the managed code with FeedbackSize = 8 //CFB8:
PlainText Hex: 4141414141414141414141414141424242424242424242424242424242424244
CipherText Hex: 6ACA3B1159D38568504248CDFF159C87BB2D3850EDAEAD89493BD91087ED7507
I also did an additional test using ECB to see whether their API behaves like ECB (hence the need for the external XORing). So I passed the IV to my ECB code as the plain text, as shown below, and compared the result with their output right before the first XOR – the two don't match either!
Passed IV as the PlainText to ECB : 30313233343536373839414243444546
CipherText Hex: 2B5B11C9ED9B111A065861D29C478FDA
CipherText Hex from the unmanaged DLL, before the first XOR: 88480EC34569A13BA174F735DF59162E
And finally, here is my ECB implementation for the above test:
static readonly string key = "abcdef0123456789";
static readonly string iv = "0123456789ABCDEF";
public static void Encrypt(ref byte[] buffer)
{
buffer = Encoding.ASCII.GetBytes(iv);
Console.WriteLine($"PlainText: {HexHelper.ToHexString(buffer)}");
var aes = new AesManaged
{
KeySize = 128,
Key = Encoding.ASCII.GetBytes(key),
BlockSize = 128,
Mode = CipherMode.ECB,
Padding = PaddingMode.None,
IV = new byte[] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 }
};
ICryptoTransform encryptor = aes.CreateEncryptor(aes.Key, aes.IV);
buffer = encryptor.TransformFinalBlock(buffer, 0, buffer.Length);
Console.WriteLine($"CipherText: {HexHelper.ToHexString(buffer)}");
}
Thanks.

I'd like to thank everyone for the help, and especially @Topaco – your insight that the key has to be passed to the DLL hex-encoded, as the docs state, is what solved it!
Here is the revised wrapper code; as you can see, its ciphertext now matches the managed code's ciphertext! Perfect!!
static readonly string key = "61626364656630313233343536373839"; //"abcdef0123456789";
static readonly string iv = "0123456789ABCDEF";
public static void Encrypt(ref byte[] buffer)
{
Console.WriteLine($"PlainText: {HexHelper.ToHexString(buffer)}");
var keyPtr = Marshal.StringToHGlobalAnsi(key);
var ivPtr = Marshal.StringToHGlobalAnsi(iv);
byte[] temp = new byte[16];
Marshal.Copy(ivPtr, temp, 0, 16);
int index = 0;
for (int i = 0; i < buffer.Length; i++)
{
if (index == 0)
{
Marshal.Copy(temp, 0, ivPtr, 16);
unsafe
{
VC_blockEncrypt((char*) ivPtr, (char*) keyPtr, 0, (char*) ivPtr, 128);
}
Marshal.Copy(ivPtr, temp, 0, 16);
index = 16;
Console.WriteLine($"CipherText BeforeXOR: {HexHelper.ToHexString(temp)}");
}
temp[16 - index] ^= buffer[i];
buffer[i] = temp[16 - index];
index--;
}
Marshal.FreeHGlobal(ivPtr);
Marshal.FreeHGlobal(keyPtr);
Console.WriteLine($"CipherText: {HexHelper.ToHexString(buffer)}");
}
I/O from the revised wrapper code:
PlainText: 4141414141414141414141414141424242424242424242424242424242424244
CipherText: 6A1A5088ACDA505B47192093DD06CD987868BFD85278A4D7D3120CC85FCD3D83
I/O from the managed code:
PlainText: 4141414141414141414141414141424242424242424242424242424242424244
CipherText: 6A1A5088ACDA505B47192093DD06CD987868BFD85278A4D7D3120CC85FCD3D83
Cheers!!

Related

Specified padding mode is not valid for this algorithm when using AES & PKCS#7 padding in .Net core 2.0

I spent a whole day investigating this and searched all the related questions on Stack Overflow, so please don't point to possible duplicates.
The code below gives me a System.Security.Cryptography.CryptographicException: 'Specified padding mode is not valid for this algorithm.'
Using the very same parameters on this website: http://aes.online-domain-tools.com, it decrypts perfectly into "Hello world" followed by five 0x05 bytes of padding (PKCS#7 padding).
However, the code below always throws the exception when calling TransformFinalBlock().
Context:
Console application running on Win8.1 with .NET Core 2.0 / Algorithm is AES / CBC / padding PKCS#7
I also tried the solution proposed here: Specified padding mode is not valid for this algorithm - c# - System.Security.Cryptography, but with no success. (I also don't understand why, if the IV is already set on the SymmetricAlgorithm instance, it has to be passed again later when decrypting.)
static void Main(string[] args)
{
string encryptedStr = "e469acd421dd71ade4937736c06fdc9d";
string passphraseStr = "1e089e3c5323ad80a90767bdd5907297b4138163f027097fd3bdbeab528d2d68";
string ivStr = "07dfd3f0b90e25e83fd05ba338d0be68";
// Convert hex strings to their ASCII representation
ivStr = HexStringToString(ivStr);
passphraseStr = HexStringToString(passphraseStr);
encryptedStr = HexStringToString(encryptedStr);
// Convert our ASCII strings to byte arrays
byte[] encryptedBytes = Encoding.ASCII.GetBytes(encryptedStr);
byte[] key = Encoding.ASCII.GetBytes(passphraseStr);
byte[] iv = Encoding.ASCII.GetBytes(ivStr);
// Configure our AES decryptor
SymmetricAlgorithm algorithm = Aes.Create();
algorithm.Mode = CipherMode.CBC;
algorithm.Padding = PaddingMode.PKCS7;
algorithm.KeySize = 256;
//algorithm.BlockSize = 128;
algorithm.Key = key;
algorithm.IV = iv;
Console.WriteLine("IV length " + iv.Length); // 16
Console.WriteLine("Key length " + key.Length); // 32
ICryptoTransform transform = algorithm.CreateDecryptor(algorithm.Key, algorithm.IV);
// Perform decryption
byte[] outputBuffer = transform.TransformFinalBlock(encryptedBytes, 0, encryptedBytes.Length);
// Convert it back to a string
string result = Encoding.ASCII.GetString(outputBuffer);
Console.WriteLine(result);
Console.ReadLine();
}
public static string HexStringToString(string hexString)
{
var sb = new StringBuilder();
for (var i = 0; i < hexString.Length; i += 2)
{
var hexChar = hexString.Substring(i, 2);
sb.Append((char)Convert.ToByte(hexChar, 16));
}
return sb.ToString();
}
The problem is in the way you convert the hex string to a byte array. Debug your code and check the value of the encryptedBytes array. You'll see the following array:
{ 0x3f, 0x69, 0x3f, 0x3f, 0x21, 0x3f, 0x71, 0x3f, 0x3f, 0x3f, 0x77, 0x36, 0x3f, 0x6f, 0x3f, 0x3f }
which is far from the input e469acd421dd71ade4937736c06fdc9d.
You shouldn't use a System.String object as a mere holder of binary char codes, because .NET strings are UTF-16 encoded.
Now that the root cause is clear, the fix is straightforward. Change your HexStringToString method so that it converts the hex string to a byte array directly:
public static byte[] HexStringToByteArray(string hexString)
{
if (hexString.Length % 2 != 0)
{
throw new InvalidOperationException($"Invalid hex string '{hexString}'");
}
byte[] bytes = new byte[hexString.Length / 2];
for (var i = 0; i < hexString.Length; i += 2)
{
var hexChar = hexString.Substring(i, 2);
bytes[i / 2] = Convert.ToByte(hexChar, 16);
}
return bytes;
}
Then adjust the code in Main():
byte[] encryptedBytes = HexStringToByteArray(encryptedStr);
byte[] key = HexStringToByteArray(passphraseStr);
byte[] iv = HexStringToByteArray(ivStr);
This will give you the desired "Hello world" in the result variable.
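As a side note: the question targets .NET Core 2.0, but if you can move to .NET 5 or later, the built-in Convert.FromHexString does the same conversion and the custom helper is no longer needed:
// .NET 5+ only; equivalent to HexStringToByteArray above
byte[] encryptedBytes = Convert.FromHexString(encryptedStr);
byte[] key = Convert.FromHexString(passphraseStr);
byte[] iv = Convert.FromHexString(ivStr);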

TripleDES Length of the data to encrypt is invalid

I have the following code:
public static string Encrypt3Des(string cipherString)
{
string result = "";
byte[] keyArray;
byte[] ivArray;
byte[] toEncryptArray = Enc3DesPerChar(cipherString);
//string toEncryptString = ByteArrayToString(toEncryptArray);
// Get the key from config file
System.Configuration.AppSettingsReader settingsReader = new AppSettingsReader();
string key = (string)settingsReader.GetValue("SecurityKey", typeof(String));
string iv = (string)settingsReader.GetValue("InitializationVector", typeof(String));
keyArray = StringToByteArray(key);
ivArray = StringToByteArray(iv);
TripleDESCryptoServiceProvider tdes = new TripleDESCryptoServiceProvider();
//set the secret key for the tripleDES algorithm
tdes.Key = keyArray;
tdes.IV = ivArray;
//ChiperMode
tdes.Mode = CipherMode.CBC;
//PaddingMode(if any extra byte added)
tdes.Padding = PaddingMode.None;
ICryptoTransform cTransform = tdes.CreateEncryptor();
//transform the specified region of bytes array to resultArray
byte[] resultArray = cTransform.TransformFinalBlock(toEncryptArray, 0, toEncryptArray.Length);
//Release resources held by TripleDes Encryptor
tdes.Clear();
result = ByteArrayToString(resultArray);
return result;
}
And these are my helper methods:
protected static string ByteArrayToString(byte[] ba)
{
StringBuilder hex = new StringBuilder(ba.Length * 2);
foreach (byte b in ba)
hex.AppendFormat("{0:x2}", b);
return hex.ToString();
}
protected static byte[] StringToByteArray(String hex)
{
int NumberChars = hex.Length;
byte[] bytes = new byte[NumberChars / 2];
for (int i = 0; i < NumberChars; i += 2)
bytes[i / 2] = Convert.ToByte(hex.Substring(i, 2), 16);
return bytes;
}
protected static byte[] Enc3DesPerChar(String toEncrypt)
{
string toAsciiString = ByteArrayToString(Encoding.ASCII.GetBytes(toEncrypt));
string toRoll = toAsciiString;
int NumberChars = toRoll.Length;
byte[] bytes = new byte[NumberChars / 2];
for (int i = 0; i < NumberChars; i += 2)
{
bytes[i / 2] = Convert.ToByte(toRoll.Substring(i, 2), 16);
}
return bytes;
}
Everything works fine with the above method, except that it cannot accept fewer than 8 characters.
The block of code that raises the error:
byte[] resultArray = cTransform.TransformFinalBlock(toEncryptArray, 0, toEncryptArray.Length);
Error message :
Length of the data to encrypt is invalid.
Example input:
Encrypt3Des("14022000"); // encrypts, because it is 8 characters or more
Encrypt3Des("1402200"); // throws, because it is only 7 characters
Does anybody know why this happens or how I can fix it? (I don't know whether it comes from my encryption method, but I know a web app that uses exactly the same approach to encrypt strings, and that one works.)
EDIT:
The tool that I used for the manual encryption: 3des
The options must be:
Text input type
Plaintext input text
3DES function
CBC mode
Fixed Key Hex
Fixed Init Vector
You are using PaddingMode.None, so the input length must already be a multiple of the 8-byte 3DES block size. Set the padding mode to PKCS7 instead.
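A minimal sketch of that change (note that the ciphertext will then differ from tools that pad with zero bytes, which is what the solution below relies on):
tdes.Mode = CipherMode.CBC;
tdes.Padding = PaddingMode.PKCS7; // pads "1402200" (7 bytes) up to a full 8-byte block automatically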
OK, I think I just found the solution (my client told me how): I need to pad the input with null characters before the loop. A null byte is "00" in the hex representation, so I PadRight the hex string with '0' to 16 characters (8 bytes), and one of my methods becomes:
protected static byte[] Enc3DesPerChar(String toEncrypt)
{
string toAsciiString = ByteArrayToString(Encoding.ASCII.GetBytes(toEncrypt));
string toRoll = toAsciiString.PadRight(16,'0');
int NumberChars = toRoll.Length;
byte[] bytes = new byte[NumberChars / 2];
for (int i = 0; i < NumberChars; i += 2)
{
bytes[i / 2] = Convert.ToByte(toRoll.Substring(i, 2), 16);
}
return bytes;
}
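The same zero padding can also be done on the byte array directly, without the detour through a hex string. A sketch (it rounds any input length up to the next multiple of the 8-byte block size, which is slightly more general than PadRight(16, '0')):
protected static byte[] Enc3DesPerChar(string toEncrypt)
{
    byte[] ascii = Encoding.ASCII.GetBytes(toEncrypt);
    int padded = Math.Max(8, ((ascii.Length + 7) / 8) * 8); // round up to the 3DES block size
    byte[] bytes = new byte[padded];
    Array.Copy(ascii, bytes, ascii.Length);                 // trailing bytes stay 0x00 (the "null" padding)
    return bytes;
}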

AES C# Encryption Decryption FIPS

I'm running the following test, which should produce a specific ciphertext. They provide the key, IV and plaintext string as seen below.
But I am getting "Specified initialization vector (IV) does not match the block size for this algorithm."
I've been stuck on this for a while, can't find a good simple example, and have tried a number of combinations.
Below is my C# code. I tried to keep it very simple.
string AesPlainText = "1654001d3e1e9bbd036a2f26d9a77b7f";
string AesKey = "3ccb6039c354c9de72adc9ffe9f719c2c8257446c1eb4b86f2a5b981713cf998";
string AesIV = "ce7d4f9679dfc3930bc79aab81e11723";
AesCryptoServiceProvider aes = new AesCryptoServiceProvider();
aes.KeySize = 256;
aes.IV = HexToByteArray(AesIV);
aes.Key = HexToByteArray(AesKey);
aes.Mode = CipherMode.CBC;
// Convert string to byte array
byte[] src = Encoding.Unicode.GetBytes(AesPlainText);
// encryption
using (ICryptoTransform encrypt = aes.CreateEncryptor())
{
byte[] dest = encrypt.TransformFinalBlock(src, 0, src.Length);
// Convert byte array to Base64 strings
Console.WriteLine(Convert.ToBase64String(dest));
}
UPDATED PER ANSWER:
Thanks, great observation. I changed the GetBytes call to HexToByteArray in the above example and it works now.
public static byte[] HexToByteArray(String hex)
{
int NumberChars = hex.Length;
byte[] bytes = new byte[NumberChars / 2];
for (int i = 0; i < NumberChars; i += 2)
bytes[i / 2] = Convert.ToByte(hex.Substring(i, 2), 16);
return bytes;
}
Your plaintext, key and IV appear to be specified in hexadecimal, so you need to decode the hex to get the underlying bytes instead of running the characters through a text encoding.
You can get a byte array from hex here. Note that the method's name should contain something with "hex" in it; don't call it StringToByteArray or atoi or something misleading like that.
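Concretely, the only line of the snippet above that has to change is the plaintext conversion (using the HexToByteArray helper shown in the update):
// decode the hex plaintext instead of treating it as UTF-16 text
byte[] src = HexToByteArray(AesPlainText);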

Encrypt with iOS; Decrypt with .Net

I need to encrypt a string (an XML file, actually) on an iPhone or iPad and then decrypt it with a .Net application. Thanks to David Veksler's question here, AES interoperability between .Net and iPhone?, and blog post here, http://automagical.rationalmind.net/2009/02/12/aes-interoperability-between-net-and-iphone/, I think I am quite close to accomplishing this.
But in the decrypted string (XML) returned by the C# method, the first 16 characters are gibberish. Beginning with the 17th character, the decrypted string matches the string that was encrypted by the objective-c method.
I followed David's code as closely as possible, but may have changed a couple of things after some trial and error. Here is the encryption code (the password and initVector are just hard-coded in there for now):
CCCryptorStatus result = CCCryptorCreate(kCCEncrypt,
kCCAlgorithmAES128,
kCCOptionPKCS7Padding, // 0x0000 or kCCOptionPKCS7Padding
(const void *)[#"1234567891123456" dataUsingEncoding:NSUTF8StringEncoding].bytes,
[#"1234567891123456" dataUsingEncoding:NSUTF8StringEncoding].length,
(const void *)[#"0000000000000000" dataUsingEncoding:NSUTF8StringEncoding].bytes,
&thisEncipher
);
uint8_t *bufferPtr = NULL;
size_t bufferPtrSize = 0;
size_t remainingBytes = 0;
size_t movedBytes = 0;
size_t plainTextBufferSize = 0;
size_t totalBytesWritten = 0;
uint8_t *ptr;
NSData *plainText = [xmlFileText dataUsingEncoding:NSASCIIStringEncoding];
plainTextBufferSize = [plainText length];
bufferPtrSize = CCCryptorGetOutputLength(thisEncipher, plainTextBufferSize, true);
bufferPtr = malloc(bufferPtrSize * sizeof(uint8_t));
memset((void *)bufferPtr, 0x0, bufferPtrSize);
ptr = bufferPtr;
remainingBytes = bufferPtrSize;
result = CCCryptorUpdate(thisEncipher,
(const void *)[plainText bytes],
plainTextBufferSize,
ptr,
remainingBytes,
&movedBytes
);
ptr += movedBytes;
remainingBytes -= movedBytes;
totalBytesWritten += movedBytes;
result = CCCryptorFinal(thisEncipher,
ptr,
remainingBytes,
&movedBytes
);
totalBytesWritten += movedBytes;
if (thisEncipher)
{
(void) CCCryptorRelease(thisEncipher);
thisEncipher = NULL;
}
if (result == kCCSuccess)
{
NSData *encryptedData = [NSData dataWithBytes:(const void *)bufferPtr length:(NSUInteger)totalBytesWritten];
[[encryptedData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithCarriageReturn] writeToFile:docFile atomically:NO encoding:NSUTF8StringEncoding error:nil];
NSLog(#"%d:%d:%d:%#:%#", xmlFileText.length,
encryptedData.length,
[encryptedData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithCarriageReturn].length,
encryptedData,
[encryptedData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithCarriageReturn]);
if (bufferPtr)
free(bufferPtr);
return;
}
And here is the decryption code:
public static string DecryptString(string base64StringToDecrypt, string passphrase)
{
//Set up the encryption objects
using (AesCryptoServiceProvider acsp = GetProvider(Encoding.Default.GetBytes(passphrase)))
{
byte[] RawBytes = Convert.FromBase64String(base64StringToDecrypt);
ICryptoTransform ictD = acsp.CreateDecryptor();
//RawBytes now contains original byte array, still in Encrypted state
//Decrypt into stream
MemoryStream msD = new MemoryStream(RawBytes, 0, RawBytes.Length);
CryptoStream csD = new CryptoStream(msD, ictD, CryptoStreamMode.Read);
//csD now contains original byte array, fully decrypted
//return the content of msD as a regular string
return (new StreamReader(csD)).ReadToEnd();
}
}
From spot-comparing a few, it appears that the NSData, encryptedData contains the same values as the byte[], RawBytes. But the XML string returned after StreamReader.ReadToEnd() matches the NSString, xmlFileText, except for the first 16 characters. I suspect the problem is either the way I'm encrypting to obtain NSData *encryptedData, or the way I'm converting that to a Base64-Encoded String and writing that to the file, or the way I'm decrypting byte[] RawBytes, or the way I'm converting the decrypted csD back to a string. If anyone can see where I'm going wrong, I will appreciate it.
Update: After David's comments I'm taking a closer look at the IV. I'm trying to use 16 zeros for now.
On iOS, I'm using:
(const void *)[#"0000000000000000" dataUsingEncoding:NSUTF8StringEncoding].bytes
And on .Net I'm using:
new byte[] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 }
These may not be equivalent.
If just the first part of your message is garbled, then it's likely that your IV (Initialization Vector) is not identical at the encryption and decryption ends. The IV affects the first block of data, so it makes sense that having the wrong IV would cause your first block to be wrong but the rest to be right.
On one end of your code, a string of '0' characters is used as the IV. At the other end, a byte array of 0-value bytes is used as the IV. These are not the same; a '0' char is not necessarily a 0 byte value. You must make the IVs identical.
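To make the two ends match, derive the .Net IV from the same characters the iOS side uses, for example:
// ASCII '0' is byte 0x30, not 0x00 – this matches [@"0000000000000000" dataUsingEncoding:NSUTF8StringEncoding]
byte[] iv = Encoding.ASCII.GetBytes("0000000000000000"); // sixteen 0x30 bytes
// ...whereas new byte[16] is sixteen 0x00 bytes – a different IV entirely.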

CNG AES -> C# AES

I want to make a program in C# that can open KeePass 1.x kdb files. I downloaded the sources and am trying to port the password-database reading functionality. The database contents are encrypted. The encryption key is obtained the following way:
The user enters a password;
The SHA256 hash of the password is calculated and split into two 128-bit halves;
Several rounds of AES are applied to each half of the hash, using a key from the database header;
The halves are concatenated back together;
The result is salted with a salt from the database header;
The SHA256 hash of the step 4 result is calculated. That is the encryption key.
I'm stuck on step 3. KeePass uses CNG for AES. Simplified source (for one half of the hash; the other half gets the same treatment):
BCRYPT_ALG_HANDLE hAes = NULL;
BCRYPT_KEY_HANDLE hKey = NULL;
BYTE pbKey32[32] = <encryption key>;
BYTE pbData16[16] = <half of hash from step 2>;
BCryptOpenAlgorithmProvider(&hAes, BCRYPT_AES_ALGORITHM, NULL, 0);
DWORD dwKeyObjLen = 0;
ULONG uResult = 0;
BCryptGetProperty(hAes, BCRYPT_OBJECT_LENGTH, (PUCHAR)&dwKeyObjLen, sizeof(DWORD), &uResult, 0);
BCryptSetProperty(hAes, BCRYPT_CHAINING_MODE, (PUCHAR)BCRYPT_CHAIN_MODE_ECB, static_cast<ULONG>((wcslen(BCRYPT_CHAIN_MODE_ECB) + 1) * sizeof(wchar_t)), 0);
BCRYPT_KEY_DATA_BLOB_32 keyBlob;
ZeroMemory(&keyBlob, sizeof(BCRYPT_KEY_DATA_BLOB_32));
keyBlob.dwMagic = BCRYPT_KEY_DATA_BLOB_MAGIC;
keyBlob.dwVersion = BCRYPT_KEY_DATA_BLOB_VERSION1;
keyBlob.cbKeyData = 32;
memcpy(keyBlob.pbData, pbKey32, 32);
pKeyObj = new UCHAR[dwKeyObjLen];
BCryptImportKey(hAes, NULL, BCRYPT_KEY_DATA_BLOB, &hKey, pKeyObj.get(), dwKeyObjLen, (PUCHAR)&keyBlob, sizeof(BCRYPT_KEY_DATA_BLOB_32), 0);
for (int i = 0; i < rounds; ++i)
{
BCryptEncrypt(hKey, pbData16, 16, NULL, NULL, 0, pbData16, 16, &uResult, 0);
}
So, as far as I understand, it uses the AES algorithm in ECB chaining mode and passes NULL and 0 as the 5th and 6th arguments of BCryptEncrypt, meaning it does not use an initialization vector.
Now, how do I do the same in C#? I wrote the following function to do one round of the transformation (based on an MSDN sample):
public static byte[] KeyTransform(byte[] buffer, byte[] key)
{
Aes aes = Aes.Create();
aes.Key = key;
aes.BlockSize = 128;
aes.KeySize = key.Length * 8;
aes.Mode = CipherMode.ECB;
//aes.IV = new byte[16] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0};
ICryptoTransform ct = aes.CreateEncryptor();
using (MemoryStream ms = new MemoryStream())
{
using (CryptoStream cs = new CryptoStream(ms, ct, CryptoStreamMode.Write))
{
using (BinaryWriter bw = new BinaryWriter(cs))
{
bw.Write(buffer);
}
cs.Flush();
}
return ms.ToArray();
}
}
Then I compare the buffers after one round of AES in the original code and in mine. My code produces different results from the original. How do I fix it?
By the way, whether I specify an IV or not, my code produces a different result every time (so I believe an IV is always generated and used). If I try to set aes.IV to null, it throws an exception saying that it can't be set to null.
It seems that initializing ICryptoTransform like this:
ICryptoTransform ct = aes.CreateEncryptor(key, new byte[] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 });
does the trick.
The only thing that worries me is that the resulting memory stream has 32 bytes instead of 16, but if I drop the last 16 of them it produces what I need.
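Those extra 16 bytes are the PKCS7 padding block that Aes appends by default when the input is an exact multiple of the block size. Turning padding off yields exactly 16 bytes, so the truncation is no longer needed. A sketch of one round without the padding block:
public static byte[] KeyTransform(byte[] buffer, byte[] key)
{
    using (Aes aes = Aes.Create())
    {
        aes.Key = key;
        aes.Mode = CipherMode.ECB;
        aes.Padding = PaddingMode.None; // input is already a whole block, so no padding block is added
        using (ICryptoTransform ct = aes.CreateEncryptor(key, new byte[16])) // the IV is ignored in ECB
        {
            return ct.TransformFinalBlock(buffer, 0, buffer.Length); // 16 bytes in, 16 bytes out
        }
    }
}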
