I need to encrypt a string (an XML file, actually) on an iPhone or iPad and then decrypt it with a .Net application. Thanks to David Veksler's question, AES interoperability between .Net and iPhone?, and his blog post, http://automagical.rationalmind.net/2009/02/12/aes-interoperability-between-net-and-iphone/, I think I am quite close to accomplishing this.
But in the decrypted string (XML) returned by the C# method, the first 16 characters are gibberish. Beginning with the 17th character, the decrypted string matches the string that was encrypted by the Objective-C method.
I followed David's code as closely as possible, but may have changed a couple of things after some trial and error. Here is the encryption code (the password and initVector are just hard-coded in there for now):
CCCryptorStatus result = CCCryptorCreate(kCCEncrypt,
kCCAlgorithmAES128,
kCCOptionPKCS7Padding, // 0x0000 or kCCOptionPKCS7Padding
(const void *)[#"1234567891123456" dataUsingEncoding:NSUTF8StringEncoding].bytes,
[#"1234567891123456" dataUsingEncoding:NSUTF8StringEncoding].length,
(const void *)[#"0000000000000000" dataUsingEncoding:NSUTF8StringEncoding].bytes,
&thisEncipher
);
uint8_t *bufferPtr = NULL;
size_t bufferPtrSize = 0;
size_t remainingBytes = 0;
size_t movedBytes = 0;
size_t plainTextBufferSize = 0;
size_t totalBytesWritten = 0;
uint8_t *ptr;
NSData *plainText = [xmlFileText dataUsingEncoding:NSASCIIStringEncoding];
plainTextBufferSize = [plainText length];
bufferPtrSize = CCCryptorGetOutputLength(thisEncipher, plainTextBufferSize, true);
bufferPtr = malloc(bufferPtrSize * sizeof(uint8_t));
memset((void *)bufferPtr, 0x0, bufferPtrSize);
ptr = bufferPtr;
remainingBytes = bufferPtrSize;
result = CCCryptorUpdate(thisEncipher,
(const void *)[plainText bytes],
plainTextBufferSize,
ptr,
remainingBytes,
&movedBytes
);
ptr += movedBytes;
remainingBytes -= movedBytes;
totalBytesWritten += movedBytes;
result = CCCryptorFinal(thisEncipher,
ptr,
remainingBytes,
&movedBytes
);
totalBytesWritten += movedBytes;
if (thisEncipher)
{
(void) CCCryptorRelease(thisEncipher);
thisEncipher = NULL;
}
if (result == kCCSuccess)
{
NSData *encryptedData = [NSData dataWithBytes:(const void *)bufferPtr length:(NSUInteger)totalBytesWritten];
[[encryptedData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithCarriageReturn] writeToFile:docFile atomically:NO encoding:NSUTF8StringEncoding error:nil];
NSLog(#"%d:%d:%d:%#:%#", xmlFileText.length,
encryptedData.length,
[encryptedData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithCarriageReturn].length,
encryptedData,
[encryptedData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithCarriageReturn]);
if (bufferPtr)
free(bufferPtr);
return;
}
And here is the decryption code:
public static string DecryptString(string base64StringToDecrypt, string passphrase)
{
//Set up the encryption objects
using (AesCryptoServiceProvider acsp = GetProvider(Encoding.Default.GetBytes(passphrase)))
{
byte[] RawBytes = Convert.FromBase64String(base64StringToDecrypt);
ICryptoTransform ictD = acsp.CreateDecryptor();
//RawBytes now contains original byte array, still in Encrypted state
//Decrypt into stream
MemoryStream msD = new MemoryStream(RawBytes, 0, RawBytes.Length);
CryptoStream csD = new CryptoStream(msD, ictD, CryptoStreamMode.Read);
//csD now contains original byte array, fully decrypted
//return the content of msD as a regular string
return (new StreamReader(csD)).ReadToEnd();
}
}
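GetProvider isn't shown in the question; it comes from the linked blog post. Below is a minimal sketch of what it presumably does — the direct use of the passphrase bytes and the all-zero IV are assumptions, and the IV turns out to be the crux of the problem:
// A minimal sketch of what GetProvider might look like -- an assumption,
// since the method isn't shown here. The IV is the detail that has to
// match the iOS side exactly.
private static AesCryptoServiceProvider GetProvider(byte[] key)
{
    AesCryptoServiceProvider acsp = new AesCryptoServiceProvider();
    acsp.Mode = CipherMode.CBC;        // CommonCrypto's default chaining mode
    acsp.Padding = PaddingMode.PKCS7;  // matches kCCOptionPKCS7Padding
    acsp.Key = key;                    // assumes the passphrase bytes are used directly
    acsp.IV = new byte[16];            // sixteen 0x00 bytes
    return acsp;
}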
From spot-comparing a few values, it appears that the NSData, encryptedData, contains the same values as the byte[], RawBytes. But the XML string returned after StreamReader.ReadToEnd() matches the NSString, xmlFileText, except for the first 16 characters. I suspect the problem is in one of four places: the way I'm encrypting to obtain NSData *encryptedData, the way I'm converting that to a Base64-encoded string and writing it to the file, the way I'm decrypting byte[] RawBytes, or the way I'm converting the decrypted csD back to a string. If anyone can see where I'm going wrong, I would appreciate it.
Update: After David's comments I'm taking a closer look at the IV. I'm trying to use 16 zeros for now.
On iOS, I'm using:
(const void *)[#"0000000000000000" dataUsingEncoding:NSUTF8StringEncoding].bytes
And on .Net I'm using:
new byte[] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 }
These may not be equivalent.
If just the first part of your message is garbled, then it's likely that your IV (Initialization Vector) is not identical at the encryption and decryption ends. The IV affects the first block of data, so it makes sense that having the wrong IV would cause your first block to be wrong but the rest to be right.
On one end of your code, a string of "0" characters is used as the IV. At the other end, a byte array of 0-value bytes is used as the IV. These are not the same: a '0' char is not a 0 byte value (in ASCII it is 0x30). You must make the IVs identical.
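For example, to match the iOS side's sixteen ASCII '0' characters, the .Net side would need something like the following sketch rather than an array of zero-value bytes:
// The iOS code uses the UTF-8 bytes of the string "0000000000000000",
// i.e. sixteen 0x30 bytes -- not sixteen 0x00 bytes.
byte[] iv = Encoding.ASCII.GetBytes("0000000000000000"); // sixteen 0x30 bytes
// ...whereas new byte[16] gives sixteen 0x00 bytes, which garbles the first block.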
Related
I have a 3rd party AES library for C (from Lantronix). I wrapped their API in C# managed code as shown below, and it works:
[DllImport("cbx_enc.dll", CharSet = CharSet.Ansi, CallingConvention = CallingConvention.Cdecl)]
static extern unsafe void VC_blockEncrypt(char* iv, char* key, int length, char* text, int RDkeyLen);
/// <summary>
/// Managed Encrypt Wrapper
/// </summary>
/// <param name="buffer">provides the plain text and receives the same length cipher text</param>
static readonly string key = "abcdef0123456789";
static readonly string iv = "0123456789ABCDEF";
public static void Encrypt(ref byte[] buffer)
{
var keyPtr = Marshal.StringToHGlobalAnsi(key);
var ivPtr = Marshal.StringToHGlobalAnsi(iv);
byte[] temp = new byte[16];
Marshal.Copy(ivPtr, temp, 0, 16);
int index = 0;
for (int i = 0; i < buffer.Length; i++)
{
if (index == 0)
{
Marshal.Copy(temp, 0, ivPtr, 16);
unsafe
{
VC_blockEncrypt((char*) ivPtr, (char*) keyPtr, 0, (char*) ivPtr, 128);
}
Marshal.Copy(ivPtr, temp, 0, 16);
index = 16;
}
temp[16 - index] ^= buffer[i];
buffer[i] = temp[16 - index];
index--;
}
Marshal.FreeHGlobal(ivPtr);
Marshal.FreeHGlobal(keyPtr);
}
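(For reference, this loop is CFB128 implemented by hand: each pass through the if block encrypts the previous ciphertext block, seeded with the IV, to produce a keystream block, and the lines below it XOR that keystream with the plaintext and feed the resulting ciphertext back into temp.)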
Now, when I wrote my own using System.Security.Cryptography, to completely avoid using their unmanaged DLL, my final ciphertext differs from theirs! I am using the same mode, same key, same IV and same plain text, yet the algorithms are not compatible. Shown below are the property settings for the RijndaelManaged object and the code; am I missing something that causes this incompatibility?
/// <summary>
/// Managed Encrypt
/// </summary>
/// <param name="buffer">provides the plain text and receives the same length cipher text</param>
static readonly string key = "abcdef0123456789";
static readonly string iv = "0123456789ABCDEF";
public static void Encrypt(ref byte[] buffer)
{
using (RijndaelManaged cipher = new RijndaelManaged())
{
cipher.Mode = CipherMode.CFB;
cipher.Key = Encoding.ASCII.GetBytes(key);
cipher.IV = Encoding.ASCII.GetBytes(iv);
cipher.Padding = PaddingMode.None;
cipher.FeedbackSize = 128;
ICryptoTransform encryptor = cipher.CreateEncryptor(cipher.Key, cipher.IV);
using (MemoryStream msEncrypt = new MemoryStream())
{
using (CryptoStream csEncrypt = new CryptoStream(msEncrypt, encryptor, CryptoStreamMode.Write))
{
using (BinaryWriter swEncrypt = new BinaryWriter(csEncrypt))
{
swEncrypt.Write(buffer);
}
buffer = msEncrypt.ToArray();
}
}
}
}
Alternatively, the algorithm I deduced from the Lantronix architecture looks very straightforward: the API does the block encryption, and XORing the output with the plain text is done in the calling method. Whereas with the .NET library I don't have such access to the intermediate encrypted output (or is there one?), so I can't XOR manually after the encryption the way Lantronix does.
The end goal is to stop using the unmanaged code, yet still generate the same ciphertext using fully managed .NET code; a rough sketch of the manual XOR idea follows.
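For what it's worth, CFB128 can be rebuilt from ECB plus a manual XOR in fully managed code; a rough sketch, assuming a 16-byte-aligned buffer (the helper name is mine):
// Rough sketch of CFB128 built from ECB plus a manual XOR, mirroring the
// Lantronix flow. Assumes buffer.Length is a multiple of 16.
static void EncryptCfb128Manually(byte[] buffer, byte[] key, byte[] iv)
{
    using (var aes = new AesManaged { Key = key, Mode = CipherMode.ECB, Padding = PaddingMode.None })
    using (ICryptoTransform ecb = aes.CreateEncryptor())
    {
        byte[] feedback = (byte[])iv.Clone();
        byte[] keystream = new byte[16];
        for (int offset = 0; offset < buffer.Length; offset += 16)
        {
            // Keystream block = ECB-encrypt the previous ciphertext block (the IV the first time).
            ecb.TransformBlock(feedback, 0, 16, keystream, 0);
            for (int j = 0; j < 16; j++)
            {
                buffer[offset + j] ^= keystream[j]; // ciphertext = plaintext XOR keystream
                feedback[j] = buffer[offset + j];   // feed the ciphertext back in
            }
        }
    }
}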
Thanks for your help in advance.
p.s. I can provide the 3rd party C library cbx_enc.dll, if you need.
Edit: @Topaco, here are some sample data as requested. I haven't heard back from the vendor about distributing their DLL; working on it…
Common inputs to CFB:
byte[] buffer = Encoding.ASCII.GetBytes("AAAAAAAAAAAAAABBBBBBBBBBBBBBBBBD"); //plain text
string key = "abcdef0123456789";
string iv = "0123456789ABCDEF";
I/O from the wrapper to the unmanaged DLL:
PlainText Hex: 4141414141414141414141414141424242424242424242424242424242424244
CipherText Hex: C9094F820428E07AE035B6749E18546C62F9D5FD4A78480215DA3625D376A271
I/O from the managed code with FeedbackSize = 128; //CFB128:
PlainText Hex: 4141414141414141414141414141424242424242424242424242424242424244
CipherText Hex: 6A1A5088ACDA505B47192093DD06CD987868BFD85278A4D7D3120CC85FCD3D83
I/O from the managed code with FeedbackSize = 8 //CFB8:
PlainText Hex: 4141414141414141414141414141424242424242424242424242424242424244
CipherText Hex: 6ACA3B1159D38568504248CDFF159C87BB2D3850EDAEAD89493BD91087ED7507
I also did an additional test using ECB to see whether their API behaves like ECB (hence the need for external XORing). So I passed the IV to my ECB code as the plain text, as shown below, and compared the result with their output right before the first XOR – the two don't match either!
Passed IV as the PlainText to ECB : 30313233343536373839414243444546
CipherText Hex: 2B5B11C9ED9B111A065861D29C478FDA
CipherText Hex from the unmanaged DLL, before the first XOR: 88480EC34569A13BA174F735DF59162E
And finally, here is my ECB implementation for the above test:
static readonly string key = "abcdef0123456789";
static readonly string iv = "0123456789ABCDEF";
public static void Encrypt(ref byte[] buffer)
{
buffer = Encoding.ASCII.GetBytes(iv);
Console.WriteLine($"PlainText: {HexHelper.ToHexString(buffer)}");
var aes = new AesManaged
{
KeySize = 128,
Key = Encoding.ASCII.GetBytes(key),
BlockSize = 128,
Mode = CipherMode.ECB,
Padding = PaddingMode.None,
IV = new byte[] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 } // ignored in ECB mode, but the property must still be block-sized
};
ICryptoTransform encryptor = aes.CreateEncryptor(aes.Key, aes.IV);
buffer = encryptor.TransformFinalBlock(buffer, 0, buffer.Length);
Console.WriteLine($"CipherText: {HexHelper.ToHexString(buffer)}");
}
Thanks.
I'd like to express my thanks to all for the help, and especially to @Topaco – your insight about hex-encoding the key, per the docs, is what made it work!
Here is the revised wrapper code; as you can see, its ciphertext now matches the managed code's ciphertext. Perfect!!
static readonly string key = "61626364656630313233343536373839"; //"abcdef0123456789";
static readonly string iv = "0123456789ABCDEF";
public static void Encrypt(ref byte[] buffer)
{
Console.WriteLine($"PlainText: {HexHelper.ToHexString(buffer)}");
var keyPtr = Marshal.StringToHGlobalAnsi(key);
var ivPtr = Marshal.StringToHGlobalAnsi(iv);
byte[] temp = new byte[16];
Marshal.Copy(ivPtr, temp, 0, 16);
int index = 0;
for (int i = 0; i < buffer.Length; i++)
{
if (index == 0)
{
Marshal.Copy(temp, 0, ivPtr, 16);
unsafe
{
VC_blockEncrypt((char*) ivPtr, (char*) keyPtr, 0, (char*) ivPtr, 128);
}
Marshal.Copy(ivPtr, temp, 0, 16);
index = 16;
Console.WriteLine($"CipherText BeforeXOR: {HexHelper.ToHexString(temp)}");
}
temp[16 - index] ^= buffer[i];
buffer[i] = temp[16 - index];
index--;
}
Marshal.FreeHGlobal(ivPtr);
Marshal.FreeHGlobal(keyPtr);
Console.WriteLine($"CipherText: {HexHelper.ToHexString(buffer)}");
}
I/O from the revised wrapper code:
PlainText: 4141414141414141414141414141424242424242424242424242424242424244
CipherText: 6A1A5088ACDA505B47192093DD06CD987868BFD85278A4D7D3120CC85FCD3D83
I/O from the managed code:
PlainText: 4141414141414141414141414141424242424242424242424242424242424244
CipherText: 6A1A5088ACDA505B47192093DD06CD987868BFD85278A4D7D3120CC85FCD3D83
Cheers!!
I've ported some C# over to PHP, which I thought was successful, but it turns out that the PHP version, using phpseclib, is appending extra bytes to my result.
My C# result is what it should be, but the PHP result has seemingly random bytes appended to the string.
The PHP code I'm using to decode is:
$rijndael = new Rijndael();
$rijndael->setKey(hex2bin($rgbKey));
$rijndael->setIV(hex2bin($rgbIv));
$rijndael->setKeyLength(256);
$rijndael->setBlockLength(128);
$rijndael->disablePadding();
$result = $rijndael->decrypt(rtrim(hex2bin(implode("", $hexdata))));
The C# function that's decoding is:
int num2 = 0;
int count1 = rijndaelManaged.BlockSize / 8;
byte[] buffer4 = new byte[count1];
fileStream.Seek((long)Start, SeekOrigin.Begin);
using (CryptoStream cryptoStream = new CryptoStream((Stream)memoryStream, decryptor, CryptoStreamMode.Write)) {
while (fileStream.Position < fileStream.Length - 128L) {
int count2 = fileStream.Read(buffer4, 0, count1);
num2 += count2;
cryptoStream.Write(buffer4, 0, count2);
if (count2 <= 0)
break;
}
cryptoStream.FlushFinalBlock();
cryptoStream.Close();
}
fileStream.Close();
If anyone could point me in the right direction to prevent the extra bytes, that'd be great. The other thing I should note is that the number of extra bytes varies based on the file being decoded.
It appears that removing $rijndael->disablePadding(); fixed the problem!
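That makes sense: the "extra" trailing bytes are almost certainly PKCS#7 padding, which the C# CryptoStream strips automatically but which phpseclib leaves in place when padding is disabled (the padding length depends on the input length, which is why the byte count varied by file). Conceptually, stripping it by hand would look like this sketch:
// Conceptual sketch only: PKCS#7 padding ends with N copies of the byte value N,
// so the last byte says how many bytes to strip (1..16 for a 128-bit block).
static byte[] StripPkcs7Padding(byte[] data)
{
    int pad = data[data.Length - 1];
    if (pad < 1 || pad > 16)
        throw new ArgumentException("Invalid PKCS#7 padding");
    byte[] result = new byte[data.Length - pad];
    Array.Copy(data, result, result.Length);
    return result;
}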
I'm trying to port the following code to Java, but I'm wondering how this can work in C#.
The provider is being initialized with KeySize = 192, but the key is only 16 bytes long.
When doing this in Java, I got an error about an incorrect key size.
Can someone explain what is going on here?
public static byte[] TripleDesEncryptOneBlock(byte[] plainText)
{
//This is key of length 16 bytes (128 bits)
byte[] key = Encoding.ASCII.GetBytes("0123456789abcdef");
byte[] result;
try
{
int count = (plainText.Length > 8) ? 8 : plainText.Length;
byte[] array = new byte[8];
Buffer.BlockCopy(plainText, 0, array, 0, count);
ICryptoTransform cryptoTransform = new TripleDESCryptoServiceProvider
{
KeySize = 192,
Key = key,
Mode = CipherMode.ECB,
Padding = PaddingMode.None
}.CreateEncryptor();
byte[] array2 = cryptoTransform.TransformFinalBlock(array, 0, 8);
result = array2;
}
catch (Exception)
{
result = null;
}
return result;
}
It's copying the first 8 bytes of the key to the end of the key. The TDEA standard calls this "keying option 2", and it only provides 80 bits of effective security (that is, it's a weak cipher).
Many Java providers won't do this automatically; by forcing the application to do it explicitly, unsuspecting fallback to a weaker scheme is less likely.
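If a Java provider insists on a 24-byte DESede key, the same widening can be done explicitly before handing the key over; a small sketch in C# terms (the helper name is mine):
// Keying option 2 made explicit: a 16-byte (two-key) 3DES key K1|K2 becomes
// the 24-byte key K1|K2|K1 that three-key APIs such as Java's DESede expect.
static byte[] ExpandTwoKeyTripleDesKey(byte[] key16)
{
    byte[] key24 = new byte[24];
    Buffer.BlockCopy(key16, 0, key24, 0, 16); // K1|K2
    Buffer.BlockCopy(key16, 0, key24, 16, 8); // append K1 again
    return key24;
}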
I'm running the following test, which should produce a specific ciphertext. They provide the key, IV and plaintext string, as seen below.
But I am getting "Specified initialization vector (IV) does not match the block size for this algorithm."
I've been stuck on this for a while, can't find a good simple example, and have tried a combination of things.
Below is my C# code. I tried to keep it very simple.
string AesPlainText = "1654001d3e1e9bbd036a2f26d9a77b7f";
string AesKey = "3ccb6039c354c9de72adc9ffe9f719c2c8257446c1eb4b86f2a5b981713cf998";
string AesIV = "ce7d4f9679dfc3930bc79aab81e11723";
AesCryptoServiceProvider aes = new AesCryptoServiceProvider();
aes.KeySize = 256;
aes.IV = HexToByteArray(AesIV);
aes.Key = HexToByteArray(AesKey);
aes.Mode = CipherMode.CBC;
// Convert string to byte array
byte[] src = Encoding.Unicode.GetBytes(AesPlainText);
// encryption
using (ICryptoTransform encrypt = aes.CreateEncryptor())
{
byte[] dest = encrypt.TransformFinalBlock(src, 0, src.Length);
// Convert byte array to Base64 strings
Console.WriteLine(Convert.ToBase64String(dest));
}
UPDATED PER ANSWER:
Thanks, great observation. I changed Encoding.Unicode.GetBytes to use HexToByteArray in the above example and it works now.
public static byte[] HexToByteArray(String hex)
{
int NumberChars = hex.Length;
byte[] bytes = new byte[NumberChars / 2];
for (int i = 0; i < NumberChars; i += 2)
bytes[i / 2] = Convert.ToByte(hex.Substring(i, 2), 16);
return bytes;
}
Your plaintext, key and IV seem to be specified in hexadecimals, so you need to decode the hexadecimals to get to the underlying bytes instead of performing UTF8 encoding.
You can get a byte array from hex here. Note that the name of the method should have something with hex in it; don't call it StringToByteArray or atoi or something stupid like that.
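On .NET 5 and later, Convert.FromHexString does the same decoding in one call:
byte[] iv = Convert.FromHexString("ce7d4f9679dfc3930bc79aab81e11723"); // .NET 5+ only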
I am trying to do some conversion in C#, and I am not sure how to do this:
private int byteArray2Int(byte[] bytes)
{
// bytes = new byte[] {0x01, 0x03, 0x04};
// how to convert this byte array to an int?
return BitConverter.ToInt32(bytes, 0); // is this correct?
// because if I have a bytes = new byte [] {0x32} => I got an exception
}
private string byteArray2String(byte[] bytes)
{
return System.Text.ASCIIEncoding.ASCII.GetString(bytes);
// but then I got a problem that if a byte is 0x00, it shows 0x20
}
Could anyone give me some ideas?
BitConverter is the correct approach.
Your problem is that you only provided 8 bits when you promised 32. Try instead a valid 32-bit number in the array, such as new byte[] { 0x32, 0, 0, 0 }.
If you want an arbitrary length array converted, you can implement this yourself:
ulong ConvertLittleEndian(byte[] array)
{
int pos = 0;
ulong result = 0;
foreach (byte by in array) {
result |= ((ulong)by) << pos;
pos += 8;
}
return result;
}
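For example, ConvertLittleEndian(new byte[] { 0x01, 0x03, 0x04 }), using the question's sample bytes, returns 0x040301 (262913).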
It's not clear what the second part of your question (involving strings) is supposed to produce, but I guess you want hex digits? BitConverter can help with that too, as described in an earlier question.
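For instance, BitConverter.ToString produces hyphen-separated hex digits:
byte[] data = { 0x00, 0x20, 0xFF };
Console.WriteLine(BitConverter.ToString(data)); // prints "00-20-FF"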
byte[] bytes = { 0, 0, 0, 25 };
// If the system architecture is little-endian (that is, little end first),
// reverse the byte array.
if (BitConverter.IsLittleEndian)
Array.Reverse(bytes);
int i = BitConverter.ToInt32(bytes, 0);
Console.WriteLine("int: {0}", i);
This is correct, but you're missing that BitConverter.ToInt32 "wants" 32 bits (32/8 = 4 bytes) of information to make a conversion, so you cannot convert just one byte: new byte[] {0x32} gives exactly the trouble you have. And do not forget about the encoding you use: from encoding to encoding you get a different byte count per symbol.
A fast and simple way of doing this is just to copy the bytes to an integer using Buffer.BlockCopy:
UInt32[] pos = new UInt32[1];
byte[] stack = ...
Buffer.BlockCopy(stack, 0, pos, 0, 4);
This has the added benefit of being able to parse numerous integers into an array just by manipulating offsets.
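One caveat: Buffer.BlockCopy copies raw bytes in the machine's native byte order, so on a little-endian system the value read out matches the ConvertLittleEndian interpretation shown earlier.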