How to decrypt a string in C# that was encrypted in iOS using Rijndael - C#

I'm trying to encrypt and decrypt a string using Objective-C and C#. Both work fine within their own platform, but when I try to decrypt in C# a string that was encrypted in iOS, I get an error.
This is the code I used in Objective-C:
- (NSData *)AES256EncryptWithKey:(NSString *)key Data:(NSData *)data
{
    char keyPtr[kCCKeySizeAES256 + 1]; // room for terminator (unused)
    bzero(keyPtr, sizeof(keyPtr));     // fill with zeroes (for padding)
    [key getCString:keyPtr maxLength:sizeof(keyPtr) encoding:NSUTF8StringEncoding];

    NSUInteger dataLength = [data length];
    NSData *iv = [@"abcdefghijklmnopqrstuvwxyz123456" dataUsingEncoding:NSUTF8StringEncoding];

    size_t bufferSize = dataLength + kCCBlockSizeAES128;
    void *buffer = malloc(bufferSize);

    size_t numBytesEncrypted = 0;
    CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding,
                                          keyPtr, kCCKeySizeAES256,
                                          [iv bytes],               /* initialization vector (optional) */
                                          [data bytes], dataLength, /* input */
                                          buffer, bufferSize,       /* output */
                                          &numBytesEncrypted);
    if (cryptStatus == kCCSuccess)
    {
        return [NSData dataWithBytesNoCopy:buffer length:numBytesEncrypted];
    }
    free(buffer); // free the buffer
    return nil;
}
I want to know how to decrypt this in C#. I set the block size to 256 and the IV size to 32, and used RijndaelManaged(). I'm not using a salt or password.
Error: something like "Padding is invalid and cannot be removed."
I also tried setting the padding to PKCS7, None and Zeros, but nothing helped with decryption.
Can anyone help with this?
Edit:
Here is my C# code:
public string DecryptString(string encrypted)
{
    string result = null;
    _encoder = new UTF8Encoding();
    if (!string.IsNullOrWhiteSpace(encrypted) && (encrypted.Length >= 32))
    {
        var messageBytes = Convert.FromBase64String(encrypted);
        using (var rm = new RijndaelManaged())
        {
            rm.BlockSize = _blockSize;
            rm.Key = _encoder.GetBytes("mykey_here");
            rm.IV = _encoder.GetBytes("abcdefghijklmnopqrstuvwxyz123456");
            rm.Padding = PaddingMode.Zeros;
            var decryptor = rm.CreateDecryptor(rm.Key, messageBytes.Take(_ivSize).ToArray());
            result = _encoder.GetString(Transform(messageBytes.Skip(_ivSize).ToArray(), decryptor));
        }
    }
    return result;
}

protected byte[] Transform(byte[] buffer, ICryptoTransform transform)
{
    byte[] result;
    using (var stream = new MemoryStream())
    using (var cs = new CryptoStream(stream, transform, CryptoStreamMode.Write))
    {
        cs.Write(buffer, 0, buffer.Length);
        cs.FlushFinalBlock();
        result = stream.ToArray();
    }
    return result;
}

iOS (Common Crypto) explicitly specifies all encryption parameters, while the C# code determines many parameters implicitly. Those implicit parameters simplify usage but are problematic when trying to achieve interoperability.
The C# class RijndaelManaged allows parameters to be specified explicitly; change your code to do so, in particular BlockSize (128), KeySize (256), Mode (CipherMode.CBC) and Padding (PaddingMode.PKCS7). The defaults for Mode and Padding are OK. See the RijndaelManaged documentation.
AES and Rijndael are not the same: AES uses only a block size of 128 bits (16 bytes), while Rijndael allows several block sizes. So you need to specify a block size of 128 bits for Rijndael, and the IV is then also 128 bits (16 bytes).
Both support encryption keys of 128, 192 and 256 bits.
You would probably be better off using the AesManaged class rather than the RijndaelManaged class. See the AesManaged documentation.
The C# side expects the data to be Base64 encoded, but the iOS side does not show that encoding operation; make sure it is being done on the iOS side.
Since you are using an IV, make sure you are using CBC mode on both sides. In Common Crypto CBC mode is the default; make sure CBC mode is also being used on the C# side.
Make sure the C# side is using PKCS#7 or PKCS#5 padding; they are equivalent here. PKCS#7 appears to be the default on the C# side, so this should be OK.
It is best to use a key of exactly the specified size and not rely on default padding. In Common Crypto the key size is explicitly specified and the key is null-padded if the supplied key is too short. The C# code looks like it determines the key size from the supplied key; in this case the key is 10 bytes, so the decryption key probably defaults to 128 bits and is internally padded with nulls. On iOS you are explicitly specifying a key size of 256 bits. This is a mismatch that needs to be fixed: supply a key that is exactly the size specified on the iOS side.
Then there is the IV: the C# code expects the IV to be prepended to the encrypted data, but the iOS code does not do that. The solution is to change the iOS code to prepend the IV to the encrypted data, and to change the IV to be 16 bytes, the AES block size.
Finally, provide hex dumps of the data in, data out, IV and key just before and after the encryption call if you need more help.
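Putting those points together, here is a minimal sketch (not a drop-in fix) of what the C# decryption side could look like once the iOS side uses a 32-byte key, a 16-byte IV, prepends the IV to the ciphertext and Base64-encodes the result. The method name and key handling are illustrative assumptions, not values from the question:

using System;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

// Assumes: 32-byte UTF-8 key, 16-byte IV prepended to the ciphertext,
// CBC mode and PKCS7 padding on both sides. "key" must be exactly 32 bytes.
public static string DecryptFromIos(string base64Cipher, string key)
{
    byte[] all = Convert.FromBase64String(base64Cipher);
    byte[] iv = all.Take(16).ToArray();      // first 16 bytes = IV
    byte[] cipher = all.Skip(16).ToArray();  // the rest = ciphertext

    using (var aes = new RijndaelManaged())
    {
        aes.BlockSize = 128;                 // AES block size
        aes.KeySize = 256;                   // matches kCCKeySizeAES256
        aes.Mode = CipherMode.CBC;           // Common Crypto's default mode
        aes.Padding = PaddingMode.PKCS7;     // matches kCCOptionPKCS7Padding
        aes.Key = Encoding.UTF8.GetBytes(key);
        aes.IV = iv;

        using (var decryptor = aes.CreateDecryptor())
        using (var ms = new MemoryStream())
        using (var cs = new CryptoStream(ms, decryptor, CryptoStreamMode.Write))
        {
            cs.Write(cipher, 0, cipher.Length);
            cs.FlushFinalBlock();
            return Encoding.UTF8.GetString(ms.ToArray());
        }
    }
}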

Related

TripleDES TransformFinalBlock occasionally giving 'Bad Data. ' error

I have the following code:
public static string PerformEncryption(string text, string uniqueKey, bool encrypt = false)
{
    byte[] textBytes = encrypt ? Encoding.UTF8.GetBytes(text) : Convert.FromBase64String(text);
    byte[] resultArray;
    var staticKey = Convert.FromBase64String(ConfigReader.SecretKey);
    using (TripleDESCryptoServiceProvider tDes = new TripleDESCryptoServiceProvider())
    {
        tDes.Mode = CipherMode.ECB;
        tDes.Padding = PaddingMode.PKCS7;
        tDes.Key = GenerateTripleDesKey(uniqueKey, staticKey);
        ICryptoTransform cTransform = encrypt ? tDes.CreateEncryptor() : tDes.CreateDecryptor();
        resultArray = cTransform.TransformFinalBlock(textBytes, 0, textBytes.Length);
        tDes.Clear();
    }
    if (encrypt)
        return Convert.ToBase64String(resultArray, 0, resultArray.Length);
    return Encoding.UTF8.GetString(resultArray);
}
private static byte[] GenerateTripleDesKey(string uniqueKey, byte[] staticKey)
{
    byte[] keyArray;
    using (SHA512CryptoServiceProvider hash = new SHA512CryptoServiceProvider())
        keyArray = hash.ComputeHash(Encoding.UTF8.GetBytes(string.Format("{0}{1}", uniqueKey, staticKey)));
    byte[] trimmedBytes = new byte[24];
    Buffer.BlockCopy(keyArray, 0, trimmedBytes, 0, 24);
    return trimmedBytes;
}
 
PerformEncryption is used as a helper method to encrypt or decrypt a string. A secret key is also supplied for either operation.
It is used in a web API application that is consumed by a mobile app on Android and iOS devices.
The Bad Data error occurs for a large portion of the users on Android, with a much smaller occurrence of the error on iOS. Any tests I have conducted on similar mobile devices do not reproduce the issue.
The only way I can reproduce the error is if I modify the string value in a unit test after it has been encrypted.
The web API uses async/await, so I'm not sure if this has something to do with it.
Is there anything I have missed in the code above, or anything that is bad practice?
I don't have access to the raw request being sent to the server, so I can't determine whether the encrypted value has had stray characters appended to it by Android/iOS in the request.
My other thoughts are:
Should I switch from Encoding.UTF8.GetBytes() to the ASCII equivalent, if encoding is causing issues between the iOS and Android environments?
Should I switch to a different algorithm entirely, such as AES?
3DES should not be used for new code; use AES instead.
Do not use ECB mode, it is insecure; see the ECB mode article and scroll down to the penguin. Instead use CBC mode with a random IV, and simply prefix the encrypted data with the IV for use in decryption (see the sketch after this list).
PBKDF2 is more secure than a simple hash or even a salted hash. The main difference is iteration, in order to make the key take longer to calculate; 100,000 iterations is common for PBKDF2.
UTF-8 is preferable to ASCII, which is too limited.
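To make those recommendations concrete, here is a hedged sketch of AES in CBC mode with a random IV prefixed to the ciphertext and a PBKDF2-derived key. The class name, the fixed salt handling and the iteration count are illustrative assumptions, not a drop-in replacement for PerformEncryption:

using System;
using System.Security.Cryptography;
using System.Text;

// Sketch only: AES-CBC with a random IV prepended to the output and a
// PBKDF2-derived 256-bit key. A real implementation would store a random
// salt alongside the ciphertext instead of hard-coding one.
public static class AesHelper
{
    public static byte[] DeriveKey(string passphrase, byte[] salt)
    {
        using (var kdf = new Rfc2898DeriveBytes(passphrase, salt, 100000))
            return kdf.GetBytes(32);
    }

    public static string Encrypt(string plainText, byte[] key)
    {
        using (var aes = Aes.Create())
        {
            aes.Key = key;
            aes.Mode = CipherMode.CBC;
            aes.Padding = PaddingMode.PKCS7;
            aes.GenerateIV();                // fresh random IV per message

            using (var enc = aes.CreateEncryptor())
            {
                byte[] plain = Encoding.UTF8.GetBytes(plainText);
                byte[] cipher = enc.TransformFinalBlock(plain, 0, plain.Length);

                byte[] output = new byte[aes.IV.Length + cipher.Length];
                Buffer.BlockCopy(aes.IV, 0, output, 0, aes.IV.Length);
                Buffer.BlockCopy(cipher, 0, output, aes.IV.Length, cipher.Length);
                return Convert.ToBase64String(output);   // IV + ciphertext
            }
        }
    }

    public static string Decrypt(string base64Cipher, byte[] key)
    {
        byte[] input = Convert.FromBase64String(base64Cipher);
        using (var aes = Aes.Create())
        {
            aes.Key = key;
            aes.Mode = CipherMode.CBC;
            aes.Padding = PaddingMode.PKCS7;

            byte[] iv = new byte[aes.BlockSize / 8];     // 16 bytes
            Buffer.BlockCopy(input, 0, iv, 0, iv.Length);
            aes.IV = iv;

            using (var dec = aes.CreateDecryptor())
            {
                byte[] plain = dec.TransformFinalBlock(input, iv.Length, input.Length - iv.Length);
                return Encoding.UTF8.GetString(plain);
            }
        }
    }
}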

interoperability of encryption between C++ and C# with Crypto

I am trying to encrypt a string in C++ with the Crypto++ library in a Qt project and decrypt it in C# in a web application. Here is my code.
C++ code, using the Crypto++ library:
std::string Crypter::encrypt(const std::string& str_in, const std::string& key, const std::string& iv)
{
    std::string str_out;
    CryptoPP::CFB_Mode<CryptoPP::AES>::Encryption encryption((byte*)key.c_str(), key.length(), (byte*)iv.c_str());
    qDebug() << encryption.DefaultKeyLength();
    qDebug() << encryption.DefaultIVLength();
    CryptoPP::StringSource encryptor(str_in, true,
        new CryptoPP::StreamTransformationFilter(encryption,
            new CryptoPP::Base64Encoder(
                new CryptoPP::StringSink(str_out),
                false // do not append a newline
            )
        )
    );
    return str_out;
}
The function is called like this:
std::string str = "123456789012345";
std::string key = "01234567891234560123456789123456"; // 32 bytes
std::string iv = "0123456789123456"; // 16 bytes
std::string str_encrypted = c->encrypt(str, key, iv);
std::string str_decrypted = c->decrypt(str_encrypted, key, iv);
std::cout << "str_encrypted: " << str_encrypted << std::endl;
std::cout << "str_decrypted: " << str_decrypted << std::endl;
This code produces the following result:
Plain text: "123456789012345"
Encrypted value (base64): 3Qo/6hWctRiID3txA9nC
Here is the equivalent code I have written in C#:
private void button1_Click(object sender, EventArgs e)
{
    string strOutput = Encrypt("123456789012345");
    Debug.WriteLine("Encrypted value is: " + strOutput);
}

private string Encrypt(string clearText)
{
    byte[] clearBytes = Encoding.ASCII.GetBytes(clearText + "\0");
    using (Aes encryptor = Aes.Create("AES"))
    {
        encryptor.BlockSize = 128;
        encryptor.KeySize = 128;
        encryptor.Mode = CipherMode.CFB;
        encryptor.Key = Encoding.ASCII.GetBytes("01234567891234560123456789123456");
        encryptor.IV = Encoding.ASCII.GetBytes("0123456789123456");
        using (MemoryStream ms = new MemoryStream())
        {
            using (CryptoStream cs = new CryptoStream(ms, encryptor.CreateEncryptor(), CryptoStreamMode.Write))
            {
                cs.Write(clearBytes, 0, clearBytes.Length);
                cs.Close();
            }
            byte[] bt = ms.ToArray();
            clearText = Convert.ToBase64String(bt);
        }
    }
    return clearText;
}
This produces the following result:
Encrypted value is: 3YklwM2vG20ZmkOT029jTTL7FlSZHrh0RfvaT1FFa2k=
Can someone please tell me what I am missing? What is the correct way to get the same output from both languages?
My objective here is to encrypt a value in C++ and decrypt it in C#.
Edit
I made the following changes:
Replaced "Hello world" with "123456789012345"
Changed the encoding from UTF to ASCII
Added a null byte at the end of the C# string
Changed the mode to CFB
I have also updated the original results above with the new results.
Unfortunately, even after these changes, the two outputs still do not match.
I have ensured that both inputs are the same.
Your C++ code works in terms of std::string, which is most likely holding text encoded under an ANSI code page. When you pass it into that CryptoPP::StringSource, I expect it works on the bytes of that text directly, without transforming them to any other encoding.
Your C# code is passing the result of Encoding.Unicode.GetBytes, which means the encryption is working on the bytes of UTF-16 encoded data.
Since the encodings are different, the byte representations are different; and since the bytes are different, the encrypted results are different.
You need to get both pieces of code working under the same scheme.
If ANSI (or even just ASCII) characters are all you need to deal with (which is probably the case given your C++ code), you could modify the C# code to use Encoding.Default.GetBytes (or possibly Encoding.ASCII.GetBytes) to get the bytes of clearText.
EDIT
Looking further, your C++ code is using CryptoPP::CFB_Mode while your C# code is using encryptor.Mode = CipherMode.CBC;. Those modes need to match, otherwise the algorithm is applied differently.
You may need to go over other properties, such as padding, to ensure both are working under the same scheme.
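As a quick illustration of the encoding point above (a standalone sketch, not the asker's code), the same text produces different byte sequences depending on the Encoding used, so the input to the cipher differs before encryption even starts:

using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        string text = "123456789012345";

        byte[] ascii = Encoding.ASCII.GetBytes(text);   // 15 bytes, one per character
        byte[] utf16 = Encoding.Unicode.GetBytes(text); // 30 bytes, two per character

        Console.WriteLine(BitConverter.ToString(ascii));
        Console.WriteLine(BitConverter.ToString(utf16));
        // Different input bytes necessarily give different ciphertext,
        // regardless of key, IV or mode.
    }
}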
There appear to be two underlying issues. The following code will produce the same output as the Crypto++ code (3Qo/6hWctRiID3txA9nC):
byte[] clearBytes = Encoding.ASCII.GetBytes(clearText);
using (var encryptor = RijndaelManaged.Create())
{
    encryptor.KeySize = 128;
    encryptor.Padding = PaddingMode.Zeros;
    encryptor.Mode = CipherMode.CFB;
    encryptor.Key = Encoding.ASCII.GetBytes("01234567891234560123456789123456");
    encryptor.IV = Encoding.ASCII.GetBytes("0123456789123456");
    using (MemoryStream ms = new MemoryStream())
    {
        using (CryptoStream cs = new CryptoStream(ms, encryptor.CreateEncryptor(), CryptoStreamMode.Write))
        {
            cs.Write(clearBytes, 0, clearBytes.Length);
            cs.Close();
        }
        Array.Copy(ms.ToArray(), clearBytes, clearBytes.Length);
        clearText = Convert.ToBase64String(clearBytes);
    }
}
return clearText;
Likewise, the following Crypto++ implementation will provide the value .NET returned in your example (3YklwM2vG20ZmkOT029j).
std::string encrypt(const std::string& str_in, const std::string& key, const std::string& iv)
{
    std::string str_out;
    CryptoPP::AES::Encryption e1((byte*)key.c_str(), key.length());
    // use feedback size of 1 byte.
    CryptoPP::CFB_Mode_ExternalCipher::Encryption encryption(e1, (byte*)iv.c_str(), 1);
    CryptoPP::StringSource encryptor(str_in, true,
        new CryptoPP::StreamTransformationFilter(encryption,
            new CryptoPP::Base64Encoder(
                new CryptoPP::StringSink(str_out),
                false // do not append a newline
            )
        )
    );
    return str_out;
}
A few notes:
It's not necessary to append a trailing zero to the string.
The Crypto++ implementation does not allow padding in Cipher Feedback (CFB) mode. The .NET implementation requires padding; however, the excess data can be truncated manually (as is done in the .NET example above). (See http://social.msdn.microsoft.com/Forums/vstudio/en-US/a1be5f49-5f0f-4f5f-b01c-af46fdc71915/des-encryption-cfb-mode).
See this post on the implications of using AES in place of Rijndael as the CSP. In particular, the following warning applies to CFB mode:
Essentially, if you want to use RijndaelManaged as AES you need to make sure that:
The block size is set to 128 bits
You are not using CFB mode, or if you are, the feedback size is also 128 bits
In this case, using CFB mode introduces additional complications. Note that this is a consequence of using CFB; if you use Cipher Block Chaining (CBC) mode, both Aes and Rijndael return the same result as Crypto++ for the given key and value (IwffxivpwdSuS9BV0KeyCg==).
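For reference, a minimal sketch of that CBC variant on the .NET side, assuming PKCS#7 padding on both sides (the matching Crypto++ code would use CBC_Mode<AES>::Encryption with a plain StreamTransformationFilter, which applies PKCS padding by default in CBC mode):

private static string EncryptCbc(string clearText)
{
    byte[] clearBytes = Encoding.ASCII.GetBytes(clearText);
    using (Aes aes = Aes.Create())
    {
        aes.Mode = CipherMode.CBC;
        aes.Padding = PaddingMode.PKCS7;
        aes.Key = Encoding.ASCII.GetBytes("01234567891234560123456789123456"); // same key as the question
        aes.IV = Encoding.ASCII.GetBytes("0123456789123456");                  // same IV as the question

        using (ICryptoTransform enc = aes.CreateEncryptor())
        {
            byte[] cipher = enc.TransformFinalBlock(clearBytes, 0, clearBytes.Length);
            return Convert.ToBase64String(cipher);
        }
    }
}

For the 15-byte test string this produces a single padded 16-byte block, which lines up with the 24-character Base64 value quoted above.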

AES Encryption in MySQL, Decryption in C#.NET

Mysql :
SELECT AES_ENCRYPT('Test','pass')
AES_ENCRYPT() and AES_DECRYPT() enable encryption and decryption of data using the official AES (Advanced Encryption Standard) algorithm, previously known as “Rijndael.” Encoding with a 128-bit key length is used, but you can extend it up to 256 bits by modifying the source. We chose 128 bits because it is much faster and it is secure enough for most purposes.
http://dev.mysql.com/doc/refman/5.5/en/encryption-functions.html#function_aes-encrypt
I was trying to convert that encrypted string into a decrypted string in C#.NET, but I don't get the results I expect.
http://msdn.microsoft.com/en-us/library/system.security.cryptography.rijndael.aspx#Y0
C#
static string DecryptStringFromBytes(byte[] cipherText, byte[] Key, byte[] IV)
In this method I pass the ciphertext and the key value that I used in MySQL, and
Rijndael.Create().IV for byte[] IV.
I use the code but I don't get the expected result.
Please review the code and comment; I don't know where I made a mistake.
What you are doing is following a road of pain. Either encrypt/decrypt in MySQL and use an encrypted connection to the database (if that matters), or encrypt/decrypt in your .NET application and store the encrypted data in a suitable column.
Mixing AES implementations is prone to mistakes, and things can break more easily if you change versions of .NET or MySQL.
Now, to know exactly what is wrong, we need to know whether the IV is compatible between MySQL and .NET, or else find out what MySQL's implementation uses as the IV and supply that.
The other potential source of problems is how you generated the byte arrays (we are not seeing that in your example). You have to consider character-encoding issues when generating the arrays if the key is textual.
In the comments of this MySQL docs link there is information about the missing parameters.
After long hours, I found a solution to this issue.
A couple of FYIs:
By default, MySQL's AES_ENCRYPT uses a 128-bit key in ECB mode, which does not require an IV.
The padding mode they use is not specified, but they do say they pad. For padding I use PaddingMode.Zeros.
In C#, use AesManaged, not RijndaelManaged, since the latter is not recommended anymore.
If your key is longer than 128 bits (16 bytes), then use the function below to create the correct key size, since the default MySQL AES algorithm uses 128-bit keys.
Make sure you play around with the correct encoding and know exactly what type of character encoding you will receive back when translating the bytes to characters.
For more info go here: https://forums.mysql.com/read.php?38,193084,195959#msg-195959
Code:
public static string DecryptAESStringFromBytes(byte[] encryptedText, byte[] key)
{
    // Check arguments.
    if ((encryptedText == null || encryptedText.Length <= 0) || (key == null || key.Length <= 0))
    {
        throw new ArgumentNullException("Missing arguments");
    }
    string decryptedText = null;
    // Create an AES object with the specified key and IV.
    using (AesManaged aesFactory = new AesManaged())
    {
        aesFactory.KeySize = 128;
        aesFactory.Key = AESCreateKey(key, aesFactory.KeySize / 8);
        aesFactory.IV = new byte[16];
        aesFactory.BlockSize = 128;
        aesFactory.Mode = CipherMode.ECB;
        aesFactory.Padding = PaddingMode.Zeros;
        // Create a decryptor to perform the stream transform.
        ICryptoTransform decryptor = aesFactory.CreateDecryptor();
        // Create the streams used for decryption.
        using (MemoryStream stream = new MemoryStream())
        {
            using (CryptoStream decryptStream = new CryptoStream(stream, decryptor, CryptoStreamMode.Write))
            {
                decryptStream.Write(encryptedText, 0, encryptedText.Length);
            }
            decryptedText = Encoding.Default.GetString(stream.ToArray());
        }
    }
    return decryptedText.Trim();
}

public static byte[] AESCreateKey(byte[] key, int keyLength)
{
    // Create the real key with the given key length.
    byte[] realkey = new byte[keyLength];
    // XOR each byte of the key given with the real key until there's nothing left.
    // This allows for keys longer than our key length and pads short keys to the required length.
    for (int i = 0; i < key.Length; i++)
    {
        realkey[i % keyLength] ^= key[i];
    }
    return realkey;
}
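For illustration, this is roughly how the helper above might be called with the raw bytes returned by AES_ENCRYPT('Test','pass'); the data-access helper is hypothetical, and the key encoding must match what MySQL saw:

byte[] keyBytes = Encoding.UTF8.GetBytes("pass");
byte[] encryptedBytes = GetEncryptedColumnFromMySql(); // hypothetical helper returning the VARBINARY/BLOB value
string plainText = DecryptAESStringFromBytes(encryptedBytes, keyBytes);
Console.WriteLine(plainText); // expected: "Test"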
Here is some working code for achieving the same encryption via C# as MySQL:
public byte[] AESEncrypt(byte[] plaintext, byte[] key)
{
    /*
     * Block length: 128 bit
     * Block mode: ECB
     * Data padding: padded with bytes whose value equals the number of padded bytes (done automagically)
     * Key padding: 0x00 padded to a multiple of 16 bytes
     * IV: none
     */
    RijndaelManaged aes = new RijndaelManaged();
    aes.BlockSize = 128;
    aes.Mode = CipherMode.ECB;
    aes.Key = key;
    ICryptoTransform encryptor = aes.CreateEncryptor();
    MemoryStream mem = new MemoryStream();
    CryptoStream cryptStream = new CryptoStream(mem, encryptor, CryptoStreamMode.Write);
    cryptStream.Write(plaintext, 0, plaintext.Length);
    cryptStream.FlushFinalBlock();
    byte[] cypher = mem.ToArray();
    cryptStream.Close();
    cryptStream = null;
    encryptor.Dispose();
    aes = null;
    return cypher;
}
For details see MySQL Bug # 16713
EDIT:
Since the above relies on officially undocumented information (though it works), I would recommend avoiding it and using one of the options described in the answer from Vinko Vrsalovic.
If you run SELECT AES_ENCRYPT('Test','pass'),
you are sending the password over the network unencrypted, so anyone who can see the traffic can decrypt the data.
AES_ENCRYPT is meant to store data so that if the database gets hacked your data is safe; it is not meant to protect data in transit.
If you want encryption over the network, connect to your MySQL server using an SSL socket.
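If encryption in transit is the goal, here is a hedged example of requesting TLS from C#, assuming MySQL Connector/NET (MySql.Data); the server, database and credentials are placeholders:

using MySql.Data.MySqlClient;

// Request TLS on the client connection so keys and plaintext never cross the
// wire in the clear.
var conn = new MySqlConnection(
    "Server=db.example.com;Database=mydb;Uid=appuser;Pwd=secret;SslMode=Required;");
conn.Open();
// Any queries issued over this connection, including AES_ENCRYPT/AES_DECRYPT
// calls, are now protected in transit.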

C# AES-256 Encryption

I am using RijndaelManaged to build a simple encryption/decryption utility. It is working fine, but I am trying to integrate it with another program created on Unix (Oracle). My problem is that for shorter input strings I get exactly the same encrypted hex as the Unix code generates, but for longer strings half of my encrypted hex matches and the other half is different:
Unix Output:
012345678901234 - 00984BBED076541E051A239C02D97117
0123456789012345678 - A0ACE158AD8CF70CEAE8F76AA27F62A30EA409ECE2F7FF84F1A9AF50817FC0C4
Windows Output (my code):
012345678901234 - 00984BBED076541E051A239C02D97117 (same as above)
0123456789012345678 - A0ACE158AD8CF70CEAE8F76AA27F62A3D9A1B396A614DA2C1281AA1F48BC3EBB (half exactly same as above)
My Windows code is:
public string Encrypt(byte[] PlainTextBytes, byte[] KeyBytes, string InitialVector)
{
    byte[] InitialVectorBytes = Encoding.ASCII.GetBytes(InitialVector);
    RijndaelManaged SymmetricKey = new RijndaelManaged();
    SymmetricKey.Mode = CipherMode.ECB;
    SymmetricKey.Padding = PaddingMode.PKCS7;
    ICryptoTransform Encryptor = SymmetricKey.CreateEncryptor(KeyBytes, InitialVectorBytes);
    MemoryStream MemStream = new MemoryStream();
    CryptoStream CryptoStream = new CryptoStream(MemStream, Encryptor, CryptoStreamMode.Write);
    CryptoStream.Write(PlainTextBytes, 0, PlainTextBytes.Length);
    CryptoStream.FlushFinalBlock();
    byte[] CipherTextBytes = MemStream.ToArray();
    MemStream.Close();
    CryptoStream.Close();
    return ByteToHexConversion(CipherTextBytes);
}
Unix (PL/SQL) code:
FUNCTION Encrypt_Card (plain_card_id VARCHAR2)
    RETURN RAW AS
    num_key_bytes    NUMBER := 256/8;       -- key length 256 bits (32 bytes)
    encrypted_raw    RAW (2000);            -- stores encrypted binary text
    encryption_type  PLS_INTEGER :=         -- total encryption type
                         DBMS_CRYPTO.ENCRYPT_AES256
                       + DBMS_CRYPTO.CHAIN_CBC
                       + DBMS_CRYPTO.PAD_PKCS5;
    key_bytes_raw    RAW(64) := my_hex_key;
BEGIN
    encrypted_raw := DBMS_CRYPTO.ENCRYPT
        (
            src => UTL_I18N.STRING_TO_RAW (plain_card_id, 'AL32UTF8'),
            typ => encryption_type,
            key => key_bytes_raw
        );
    RETURN encrypted_raw;
EXCEPTION
    WHEN OTHERS THEN
        dbms_output.put_line (plain_card_id || ' - ' || SUBSTR(SQLERRM,1,100));
        RETURN HEXTORAW ('EEEEEE');
END;
The only difference I see is the use of PKCS5 versus PKCS7, but .NET doesn't have PKCS5.
What abc said, and also: you don't seem to have any IV (initialization vector) in your PL/SQL code at all.
The fact that the first parts are the same has to do with the different modes (ECB and CBC). ECB encrypts each block separately, while CBC uses the previous block when encrypting the next one.
What happens here is that since you use CBC and do not set an IV, the IV is all zeroes.
That means that the first block of ECB encryption and CBC encryption will be the same
(since A XOR 0 = A).
You need to make sure you use the same encryption mode in both systems, and if you decide on CBC, make sure you use the same IV.
You use ECB in one case and CBC in the other case.
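As a sketch of what matching the two sides could look like (an assumption-laden example, not the asker's code): AES-256 in CBC mode with PKCS7 padding, which coincides with DBMS_CRYPTO.PAD_PKCS5 for 16-byte AES blocks, and an all-zero 16-byte IV, which is what the PL/SQL code effectively uses when no IV is supplied. In production a random IV should be used and shared instead.

public string EncryptLikeOracle(byte[] plainTextBytes, byte[] keyBytes)
{
    // plainTextBytes should be the UTF-8 bytes of the card id, matching
    // UTL_I18N.STRING_TO_RAW(plain_card_id, 'AL32UTF8') on the Oracle side.
    using (var aes = Aes.Create())
    {
        aes.KeySize = 256;
        aes.Mode = CipherMode.CBC;
        aes.Padding = PaddingMode.PKCS7;   // equivalent to PAD_PKCS5 for AES blocks
        aes.Key = keyBytes;                // the same 32 raw key bytes as key_bytes_raw
        aes.IV = new byte[16];             // all-zero IV; insecure, but matches the PL/SQL default

        using (ICryptoTransform enc = aes.CreateEncryptor())
        {
            byte[] cipher = enc.TransformFinalBlock(plainTextBytes, 0, plainTextBytes.Length);
            return BitConverter.ToString(cipher).Replace("-", ""); // hex, like ByteToHexConversion
        }
    }
}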

Decrypting RijndaelManaged Encrypted strings with CryptDecrypt

OK, I'm trying to use the Win32 Crypto API in C++ to decrypt a string encrypted in C# (.NET 2) with the RijndaelManaged class, but I'm having no luck at all: I either get gibberish or a bad-data Win32 error code. All my keys, IVs and the salt match; I've checked them in the watch window for both test apps. I've spent all day looking at it and I'm officially stuck.
Anyway, here is the C#:
Rfc2898DeriveBytes pdb = new Rfc2898DeriveBytes(GetPassPhrase(), salt, 1000);
RijndaelManaged rijndael = new RijndaelManaged();
rijndael.BlockSize = 128;
rijndael.KeySize = 256;
rijndael.Mode = CipherMode.CBC;
rijndael.Key = pdb.GetBytes(m_KeySize);
rijndael.IV = GetIV(iv);
ICryptoTransform encryptor = rijndael.CreateEncryptor();
MemoryStream msEncrypt = new MemoryStream();
CryptoStream csEncrypt = new CryptoStream(msEncrypt, encryptor, CryptoStreamMode.Write);
Byte[] encryptedBytes = null;
Byte[] toBeEncrypted = UnicodeEncoding.Unicode.GetBytes(value);
csEncrypt.Write(toBeEncrypted, 0, toBeEncrypted.Length);
csEncrypt.FlushFinalBlock();
encryptedBytes = msEncrypt.ToArray();
The C++ to decrypt it is:
keyBlob.hdr.bType = PLAINTEXTKEYBLOB;
keyBlob.hdr.bVersion = CUR_BLOB_VERSION;
keyBlob.hdr.reserved = 0;
keyBlob.hdr.aiKeyAlg = CALG_AES_256;
keyBlob.cbKeySize = KEY_SIZE;
keyBlob.rgbKeyData = &byKey[0];

if ( CryptImportKey( hProv, (const LPBYTE) &keyBlob, sizeof(BLOBHEADER) + sizeof(DWORD) + KEY_SIZE, 0, CRYPT_EXPORTABLE, &hKey ) )
{
    if ( CryptSetKeyParam( hKey, KP_IV, (const BYTE *) &byIV, 0))
    {
        DWORD dwLen = iDestLen;
        if ( CryptDecrypt( hKey, 0, TRUE, 0, pbyData, &dwLen))
        {
            if ( dwLen < (DWORD) *plOutSize)
            {
                memcpy_s(pbyOutput, *plOutSize, pbyData, dwLen);
                *plOutSize = dwLen;
                bRet = TRUE;
            }
        }
        else
        {
            // Log
            DWORD dwErr = ::GetLastError();
            int y = 0;
        }
    }
}
I'm calling CryptAcquireContext successfully and my C++ executes fine. Can anyone spot the error of my ways? It's starting to depress me now :(
OK, my fault: I didn't include the struct definition for the key blob in the C++, and it turns out you need a contiguous block of data for the key together with the header, but I was using the MSDN example that had a pointer to the key data, which is wrong!
I see that you are using CBC chaining mode to encrypt the plain text.
Are you sure you are using the same chaining mode to decrypt the cypher text?
(I am sorry. I am not able to understand that from the code)
There are a handful of things you should check, since some of the code (declarations etc.) is missing:
Block size - this usually should be the same as the key size; I think it might even be the default, since you don't specify it on the C++ side. Set it to 256 on the C# side; I suggest you explicitly specify it in C++ too.
Padding - the managed classes have PKCS7 as their default padding; I think it's the default for the CryptoAPI functions too, but I'm not sure.
I assume that GetPassPhrase, GetIV etc. give you the same keys you're using on the C++ side?
It's not clear how the encrypted data is passed between the programs; is it possible there is some kind of translation error? E.g. Base64, URL encoding, etc.
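On that last point, a hedged illustration (reusing the variable names from the question's C# snippet) of making the hand-off to the C++ side unambiguous while debugging:

// Illustrative only: Base64-encode the ciphertext for transport and dump key
// and IV as hex so both programs can be compared byte for byte.
byte[] encryptedBytes = msEncrypt.ToArray();
string transport = Convert.ToBase64String(encryptedBytes);
Console.WriteLine("cipher (base64): " + transport);
Console.WriteLine("key (hex): " + BitConverter.ToString(rijndael.Key));
Console.WriteLine("iv  (hex): " + BitConverter.ToString(rijndael.IV));
// The C++ side must Base64-decode 'transport' back to raw bytes before calling
// CryptDecrypt; any URL-encoding or text re-encoding in between will corrupt the data.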
