C# Verify JSON string via signature and RSA public key

I'm creating an IAP plugin for a local Android market that I think has an API similar to Google Play's.
I built the Android side, and it returns all responses from the market to the Unity C# side.
Everything works fine, but I can't figure out how to verify the signature of the response.
I'm new to cryptography and have been searching about this for days.
Please correct me if I'm wrong.
They use a hash algorithm to sign the data and encrypt that hash with the private key.
I must decrypt the signature with the public key and compare the hashes.
This is my public key (Base64):
MIHNMA0GCSqGSIb3DQEBAQUAA4G7ADCBtwKBrwDltnT/AaF3oMt+F3lza5JEvce0GLS1L9v9Z53lQ3LNluGk0eI2ukgWm7tAiIOLQgn11Sb9mW2VWkYTWGnZ1AZtY0GwdUQJUr7u3CWNznE6XH4UCVOVhGDCLnFrG8BcfDelhcfReGZQ3izOefhc4Oq6vZf5PfLwximK+FH27fR6XL8vg3yyK4LSwT764Dfd6H3IGes6EdTx/C3C690jdyMvhi2Q3qBiqfepHzW/jV8CAwEAAQ==
This key is in ASN.1 DER format.
I broke it down and found this data:
SEQUENCE
  SEQUENCE
    OBJECT IDENTIFIER 1.2.840.113549.1.1.1 rsaEncryption (PKCS #1)
    NULL
  BIT STRING
    SEQUENCE
      INTEGER 969837669069837851043825067021609343597253227631794160042862620526559…
      INTEGER 65537
As I read on the net, the first INTEGER is the Modules and the second INTEGER is the Exponent.
So in C# I write code like this:
var parameter = new RSAParameters
{
    Modulus = HexToByteArray(/* "first_INTEGER" */),
    Exponent = BitConverter.GetBytes(/* "second_INTEGER" */)
};
The market sends me JSON like this:
{"orderId": "0j8oJgE0Bett-neB", "purchaseToken": "0j8oJgE0Bett-neB", "developerPayload": "payload", "packageName": "com.some.market", "purchaseState": 0, "purchaseTime": 1520676644872, "productId": "card-1"}
The signature is like this:
hTFeQd25PZJ2DhGmXd0eO+C+oBeWsg983I4e5ztXtKAUrOIaNBaqAxHU3vW8acBs1I9fE5cxx/DI/sQGY4QSvpDnSm9aYz3do3joHPOXIVvXjSJfejxwzp9DKMUPd6LrgtxkaGevG+94NuKHFxpCdZlovEPXRJZyEznbASuYLqeW0KjP3jnvvw2O5iNlQRdh98h4Q18bSsaxq9zaRKExFLHkhNf/yO5m84kRB1G8
I've searched for a method to do this, but I don't know which one is right for me.
My verification code is this:
using (var rsa = new RSACryptoServiceProvider())
{
    rsa.ImportParameters(parameter);
    var hash = new SHA1Managed();
    bool dataOK = rsa.VerifyData(hash.ComputeHash(Encoding.UTF8.GetBytes(json)),
        CryptoConfig.MapNameToOID("SHA1"), Encoding.UTF8.GetBytes(signature));
}
How do I correctly convert the signature to byte[] to verify? (with an encoding, or what?)
I've searched a lot, but the more I search, the more confused I get.
Am I going the wrong way, or using the wrong method, or...?
Why should the workflow be this complicated?
Can anyone help me, please?
Thanks.

OK, I'll answer in order:
They use a hash algorithm to sign the data and encrypt that hash with the private key. I must decrypt the signature with the public key and compare the hashes.
No, that's not correct. You should use a signature verification method, as you're currently doing. Seeing a signature as the encryption of a hash is incorrect; even the latest RSA standards go out of their way to explain this. For RSA, the internal padding method is different. For ECDSA, there is no direct encryption/decryption possible using the same scheme.
As I read on the net, the first INTEGER is the Modules and the second INTEGER is the Exponent.
Yes, although it is spelled modulus, not modules. And it is the public exponent; there is also a private exponent for the private key. Also, neither word is capitalized.
How do I correctly convert the signature to byte[] to verify? (with an encoding, or what?)
Standard Base64 decoding, as already mentioned in the comment section. Note that the key and signature sizes are not common ones (but that's OK in itself).
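In code, that is just a standard Base64 decode; for example (a minimal sketch, using the signature string from the question):

// The signature is Base64 text; decode it, never pass it through Encoding.GetBytes.
byte[] signatureBytes = Convert.FromBase64String(signature);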
Why should the workflow be this complicated?
Well, somebody has to code it in the end, and crypto is hard. But to make it easier for you: the entire ASN.1 structure is called a SubjectPublicKeyInfo structure; if you look on the internet you will find pre-made code to import from such a structure.
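For example, on .NET Core 3.0 and later the import is built in; on older frameworks you would need one of those pre-made parsers (a sketch, assuming publicKeyBase64 holds the Base64 string from the question):

using System;
using System.Security.Cryptography;

using (RSA rsa = RSA.Create())
{
    // The Base64 key decodes to a DER-encoded SubjectPublicKeyInfo structure.
    rsa.ImportSubjectPublicKeyInfo(Convert.FromBase64String(publicKeyBase64), out _);
    // rsa now holds the modulus and public exponent parsed from that structure.
}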
To finally verify the signature: make sure that you use the correct signature format (RSA PKCS#1 v1.5 or RSA-PSS) and that you know exactly which binary data is fed to the signature generation function. For instance, the signature over the JSON could be over ASCII or UTF-8 bytes, or it could be over UTF-16 LE or BE.
Better to ask the creator of the signature.
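Putting the pieces together, a minimal verification sketch, assuming .NET Core 3.0+, PKCS#1 v1.5 with SHA-1, and UTF-8 encoded JSON (all assumptions the market would have to confirm):

using System;
using System.Security.Cryptography;
using System.Text;

static bool VerifyResponse(string json, string signatureBase64, string publicKeyBase64)
{
    byte[] data = Encoding.UTF8.GetBytes(json);                    // assumed UTF-8
    byte[] signature = Convert.FromBase64String(signatureBase64);  // standard Base64
    using (RSA rsa = RSA.Create())
    {
        rsa.ImportSubjectPublicKeyInfo(Convert.FromBase64String(publicKeyBase64), out _);
        // VerifyData hashes the data itself; do not pass a pre-computed hash.
        return rsa.VerifyData(data, signature, HashAlgorithmName.SHA1, RSASignaturePadding.Pkcs1);
    }
}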

Related

How can I get PHP to sign an input exactly the same as C#?

I'm working with an older API that requires XMLs to be signed. There is no proper documentation and the only code example was given in C#. I need to port this example to PHP. However, the code I've written in PHP gets a different output even when provided with the same input, causing the API call to fail.
I've narrowed it to this function in C#:
public byte[] CreateSignature(byte[] hash)
{
    RSAPKCS1SignatureFormatter signatureFormatter = new RSAPKCS1SignatureFormatter(pfxCert.PrivateKey);
    signatureFormatter.SetHashAlgorithm("SHA1");
    return signatureFormatter.CreateSignature(hash);
}
Here's the same operation in PHP:
public function createSignature($hashByteArray, $certArray) {
    $hash = implode(array_map("chr", $hashByteArray));
    $hashEncoded = base64_encode($hash);
    openssl_sign($hashEncoded, $signature, $certArray);
    return unpack("C*", $signature);
}
Note: openssl_sign can't take a byte array as input, so this is a possible point of difference. I've tried all the algorithms provided by openssl_get_md_methods() and also phpseclib; neither has matched the output.
I've created a GitHub gist of the same example input and certificate in both C# and PHP to better illustrate the issue.
How can I get the same signing output in PHP as C#, given the same input?
There's a fundamental difference between them.
The RSAPKCS1SignatureFormatter.CreateSignature method expects a hash of the data. openssl_sign, on the other hand, expects the data itself.
From PHP: openssl_sign - Manual
openssl_sign() computes a signature for the specified data by generating a cryptographic digital signature using the private key associated with priv_key_id. Note that the data itself is not encrypted.
Apparently you generate a hash of some data for use with the C# API. Do not do that with openssl_sign; instead, call it with the original data. openssl_sign will hash it before signing; this is by design.
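In C# terms, the relationship looks like this (a sketch, assuming SHA-1 and PKCS#1 v1.5 as in the original code; privateKeyRsa and data are placeholder variables):

using System.Security.Cryptography;

// These two calls produce the same signature for the same data:

// 1) Hash first, then sign the hash (what RSAPKCS1SignatureFormatter does):
byte[] hash = SHA1.Create().ComputeHash(data);
var formatter = new RSAPKCS1SignatureFormatter(privateKeyRsa);
formatter.SetHashAlgorithm("SHA1");
byte[] sig1 = formatter.CreateSignature(hash);

// 2) Sign the data directly and let the API hash it (what openssl_sign does):
byte[] sig2 = privateKeyRsa.SignData(data, HashAlgorithmName.SHA1, RSASignaturePadding.Pkcs1);

So the PHP fix is to drop the manual hashing and base64_encode, and pass the original data straight to openssl_sign.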

How to validate the garbage value returned by Bouncy Castle rsa decryption with wrong key

I am looking for a reliable way to validate whether the string returned after decrypting the encrypted payload is a garbage value (like: �=z�{���Z���:���k/����˃�d�A��*�Ԥ�= �?M����5).
I am trying to decrypt the encrypted string with a different public key, so, as expected, I am getting a garbage string like the one posted above. My question is: what is the best way to validate in C# whether the returned string is a valid string or some garbage value?
Found an answer by Jon Skeet here: https://stackoverflow.com/a/7254261/2858235
foreach (char c in value)
{
    if (c < 32 || c > 126)
    {
        ...
    }
}
The solution you propose will only check whether the returned string is ASCII (technically, a subset of ASCII)... Considering your name and surname, you should have first-hand knowledge that the world isn't made of ASCII strings; Unicode was "invented" for a reason. Now, as written here, RSA is "malleable". A very simple solution is to append/prepend the hash of the original text (for example, using SHA-256) to the text before encrypting everything, and then verify whether the hash is correct after decryption.
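A sketch of that idea (assuming a 32-byte SHA-256 hash was prepended to the plaintext before encryption):

using System;
using System.Linq;
using System.Security.Cryptography;

static bool LooksValid(byte[] decrypted)
{
    // The first 32 bytes are the SHA-256 hash prepended before encryption;
    // if it doesn't match the rest, the decryption produced garbage.
    byte[] expected = decrypted.Take(32).ToArray();
    byte[] body = decrypted.Skip(32).ToArray();
    using (var sha = SHA256.Create())
        return expected.SequenceEqual(sha.ComputeHash(body));
}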
If you're concerned with RSA alone, use a proper padding scheme like Optimal Asymmetric Encryption Padding (OAEP), as defined in PKCS#1 v2.0. It uses hashing internally to check whether there are errors. C# has native support for it, and you will get an error if something went wrong, such as a broken ciphertext or the wrong key being used.
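For example (a sketch; rsa and ciphertext are placeholders):

using System.Security.Cryptography;

try
{
    // OAEP embeds an integrity check, so decrypting with the wrong key or a
    // tampered ciphertext throws instead of returning garbage bytes.
    byte[] plaintext = rsa.Decrypt(ciphertext, RSAEncryptionPadding.OaepSHA1);
}
catch (CryptographicException)
{
    // Wrong key, wrong padding, or corrupted ciphertext.
}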
If you're concerned with hybrid encryption (RSA + some block cipher), I suggest you use RSA-OAEP along with an authenticated mode of operation such as AES-GCM (available in C# through BouncyCastle). That way, you can detect all (malicious) manipulations.

C# .NET Core encryption

I'm trying to use a user's public key to encrypt a string in a sort of PGP fashion, but I keep getting the error:
bignum routines:BN_mod_inverse:no inverse
I've looked around and cannot find anything specific about what I'm doing wrong, nor anything relevant to .NET Core.
I'm using the following code:
byte[] publicKey = Encoding.UTF8.GetBytes(key);
RSA rsa = RSA.Create();
RSAParameters RSAKeyInfo = new RSAParameters();
RSAKeyInfo.Modulus = publicKey;
RSAKeyInfo.Exponent = new byte[]{1,0,1};
rsa.ImportParameters(RSAKeyInfo);
var encrypted = rsa.Encrypt(Encoding.UTF8.GetBytes(user.challenge.text), RSAEncryptionPadding.Pkcs1);
It's entirely possible I'm going about this entirely wrong, so any thoughts or suggestions would be great!
Your overall structure (build RSAParameters, call ImportParameters, call Encrypt) is valid, which suggests that your error is in the Modulus recovery.
If your Modulus is input as a string, it's likely encoded as (most to least likely):
Base64 (Convert.FromBase64String)
Hex (may need a manual parser)
UTF-8
UTF-8 is really unlikely, since the Modulus value can contain bytes whose value is 0 (and other invalid/unexpected UTF-8 sequences). While all even-length byte sequences encoded as hex can be validly decoded as Base64, it's extraordinarily unlikely that you'd misinterpret them given two or three different inputs.
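If it does turn out to be Base64 (the most common case), the fix is a one-line change (a sketch, using the key variable from the question):

// Decode the Base64 text into the raw modulus bytes instead of UTF-8-encoding the characters.
byte[] publicKey = Convert.FromBase64String(key);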
Other noteworthy things:
RSA is IDisposable; you should put it in a using statement to ensure resources are freed as soon as you are done with them.
The Exponent is usually 0x010001, but that isn't required. Unless you have a guaranteed constraint on it, you should be serializing it, too.
And if it is guaranteed to always be 0x010001, why make a new one each time? Save it as a static field and make the GC's job easier.
As Maarten said, RSA-ENC-PKCS1 is susceptible to a padding oracle attack, so (especially if your data goes over the wire) you should use OAEP.
In the context of .NET, OaepSHA1 has the best support (all inbox providers). OAEP with a SHA-2 algorithm is only supported by RSACng (or the opaque RSA.Create() on Windows).
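Putting those points together, a corrected sketch (assuming the key string really is Base64 and that OaepSHA1 is acceptable; swap in a hex decoder or another padding if not):

using System;
using System.Security.Cryptography;

static class ChallengeEncryptor
{
    // 65537, allocated once instead of per call
    private static readonly byte[] PublicExponent = { 0x01, 0x00, 0x01 };

    public static byte[] EncryptChallenge(string modulusBase64, byte[] challengeBytes)
    {
        var keyInfo = new RSAParameters
        {
            Modulus = Convert.FromBase64String(modulusBase64),  // decode, don't UTF-8-encode
            Exponent = PublicExponent
        };
        using (RSA rsa = RSA.Create())  // RSA is IDisposable
        {
            rsa.ImportParameters(keyInfo);
            return rsa.Encrypt(challengeBytes, RSAEncryptionPadding.OaepSHA1);  // OAEP, not Pkcs1
        }
    }
}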

Can't validate signature value using certificate public key provided in an xml (X509Certificate2)

I have spent hundreds of hours researching this subject, and another senior programmer who coded the original project also could not make it work. I have an xml with a SignatureValue parameter, a Certificate (X509Certificate2) and a Digest Value. The signature value given in the xml was created by hashing the concatenated fields (which equal the Digest Value) with SHA1, then encrypting that hash with the private key. The private key has been removed from the certificate for privacy, so I only have the public key. Now, no matter how I code around it, I always get a false value back (as in, VerifyHash returns false). Here is the code I am using:
// Need your help please.
static void VerifyHash(string sigVal, string digestVal, System.Security.Cryptography.X509Certificates.X509Certificate2 cert)
{
    byte[] sigValInBytes = Convert.FromBase64String(sigVal);
    try
    {
        using (RSACryptoServiceProvider rsaProviderDecrypt = (RSACryptoServiceProvider)cert.PublicKey.Key)
        {
            // Line below always returns FALSE no matter how I code it. Here I want to verify the
            // freshly calculated digest value, now hashed, against the signature value.
            rsaProviderDecrypt.Decrypt(sigValInBytes, false);
            rsaProviderDecrypt.Dispose();
        }
    }
    catch { }
}
// At the main program I get the certificate from the given xml and call the method above:
main
{
    // Code below gets the certificate details from a given xml; the details of each variable are confirmed to be accurate.
    char[] Base64_x509ByteArray = t.DigitalSignatures.First().X509Data.ToCharArray();
    byte[] x509ByteArray = Convert.FromBase64CharArray(Base64_x509ByteArray, 0, Base64_x509ByteArray.Length);
    // Here I am creating the certificate from the gathered data:
    System.Security.Cryptography.X509Certificates.X509Certificate2 cert = new System.Security.Cryptography.X509Certificates.X509Certificate2(x509ByteArray);
    VerifyHash(t.DigitalSignatures.FirstOrDefault().SignatureValue.Trim(), concatenatedFieldValues, cert);
}
Some shots in the dark:
Find the piece that is broken: try doing the entire "encrypt / hash check" process in code without transferring anything over XML. If you can hash a string locally and the hashes match, then the problem is in the XML. Otherwise, the problem is in the cert or decryptor.
If the problem is on the cert / encryptor side, try hash matching with a local .NET cryptography class. If that fails, the problem is an encryption setting. Otherwise, it is the cert.
BIG shot in the dark: the call to Dispose right after the hash check. It shouldn't matter, but that caused an issue while I was decrypting using the Rijndael algorithm. Best guess was that the optimizer was closing the stream early or something weird like that. Moving the constructor out of the using statement and manually calling Dispose after accessing the result fixed that "optimization".
Might try a reversible encryption algorithm. Rijndael is native to .NET and is reversible. Good for debug and proof-of-concept work. (Note: it uses time as part of the salt, so RJ doesn't match hashes, it decrypts. So not good for passwords in production environments.)
If the XML is the cause, check the encodings. Encryption is very sensitive to encodings, and XML serializers are finicky beasts to begin with. The strings may look the same but be represented differently, or have extra control characters added. Sql Server nvarchars are UCS-2, varchars are iso-8859-1, C# strings are UTF-16, etc. It is easy for encodings to mismatch, and an encoding change would easily cause this. Try converting the original value to utf-16 before inserting it into the Xml, and set the Xml declaration encoding to utf-16, just to be safe.
Note about Notepad: if you have opened the Xml in Notepad to take a quick look or edit, and saved it, there are probably extra "end of line" characters on your strings now. If you did the same in Word... oh my... Might want to try an original copy.
Failing that, try generating new encrypted values and see if they match.
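For completeness, the verification step itself is normally done with VerifyData (or VerifyHash) rather than Decrypt, since .NET's RSA Decrypt requires a private key. A sketch assuming SHA-1, PKCS#1 v1.5, and UTF-8 data (all of which must match whatever produced the signature):

using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using System.Text;

static bool VerifySignature(string signedData, string sigVal, X509Certificate2 cert)
{
    byte[] signature = Convert.FromBase64String(sigVal);
    using (RSA rsa = cert.GetRSAPublicKey())
    {
        return rsa.VerifyData(Encoding.UTF8.GetBytes(signedData), signature,
                              HashAlgorithmName.SHA1, RSASignaturePadding.Pkcs1);
    }
}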

Is this the right way to use BouncyCastle.net to get an AES derived key from ECDH?

X9ECParameters curve = NistNamedCurves.GetByName("P-521");
ECDomainParameters ecparam = new ECDomainParameters(curve.Curve, curve.G, curve.N, curve.H, curve.GetSeed());
ECKeyPairGenerator generator = new ECKeyPairGenerator();
generator.Init(new ECKeyGenerationParameters(ecparam, new SecureRandom()));
AsymmetricCipherKeyPair ackp1 = generator.GenerateKeyPair();
AsymmetricCipherKeyPair ackp2 = generator.GenerateKeyPair();
then,
ECDHWithKdfBasicAgreement agreement = new ECDHWithKdfBasicAgreement("2.16.840.1.101.3.4.42", new ECDHKekGenerator(DigestUtilities.GetDigest("SHA256")));
agreement.Init(ackp1.PrivateKey);
BigInteger agInt = agreement.CalculateAgreement(ackp2.PublicKey);
byte[] aeskey = agInt.ToByteArrayUnsigned();
This goes through without generating any errors, and I verified that the "aeskey" is the same when I swap in the other pair of public/private keys.
I found zero examples of this kind of usage with Google.
The code seems correct to me, but having to provide the DER OID for AES256 (instead of the string "AES256", which bombs in CalculateAgreement) makes me suspicious that I am doing something wrong.
This was reposted from this question on crypto.stackexchange.
You seem to be heading in the right direction, though I'm not sure your OID is correct. It looks very suspicious to me; a quick internet search did not turn up any expected results.
According to RFC 2631:
algorithm is the ASN.1 algorithm OID of the CEK wrapping algorithm with which this KEK will be used. Note that this is NOT an AlgorithmIdentifier, but simply the OBJECT IDENTIFIER. No parameters are used.
So the use of an OID is correct, but the OID itself may not be. I would expect an OID that indicates e.g. AES in CBC or GCM mode. This won't show up if you use an invalid OID on both sides, of course; it is only used to generate your key, not when you actually use it.
Note that the code of Bouncy Castle seems to be vulnerable to a bug that was also in the Java DH code: a BigInteger is always encoded in the minimal number of bytes. However, the key generated by any normal key agreement is a specific number of bytes, including initial 00-valued bytes. This means that just calling BigInteger.ToByteArray will generate the wrong number of bytes (or an illegal value) about once in 256 times, as leading zeros are lost during the conversion. Again, this won't make any difference during testing against identical code on the same system, but DH against other systems will fail now and then. (I've reported this to Bouncy Castle for Java; it has been confirmed and fixed in Bouncy Castle 1.50.)
ECDHWithKdfBasicAgreement is a little awkward since it's a port of something that only exists in the JCE parts of the Java build. As @owlstead points out, you need to deal with the BigInteger/byte[] conversion. In this case, with the latest code, you can use:
int keyLen = GeneratorUtilities.GetDefaultKeySize(algorithm) / 8;
byte[] key = BigIntegers.AsUnsignedByteArray(keyLen, agInt);
or, of course, just pad it out to the size you know you need. I think the AES256 thing is fixed in the latest code too. The code is now on GitHub (https://github.com/bcgit/bc-csharp), but a new beta build of the C# version is (finally) a mere day or two away as well.
