I need to load an OpenSSL private key into a C# based application.
The commands I used to generate the key are:
$ openssl ecparam -name prime256v1 -genkey -noout -out eckey.pem
$ openssl ec -in eckey.pem
read EC key
writing EC key
-----BEGIN EC PRIVATE KEY-----
MHcCAQEEIMiuwhV+yI0od5E5pSU6ZGuUcflskYD4urONi1g3G7EPoAoGCCqGSM49
AwEHoUQDQgAEe+C/M6u171u5CcL2SQKuFEb+OIEibjw1rx+S5LK4gNNePlDV/bqu
Ofjwc5JDqXA07shbfHNIPUn6Hum7qdiUKg==
-----END EC PRIVATE KEY-----
$ openssl pkcs8 -topk8 -nocrypt -in eckey.pem -out ec2.pem
$ cat ec2.pem
-----BEGIN PRIVATE KEY-----
MIGHAgEAMBMGByqGSM49AgEGCCqGSM49AwEHBG0wawIBAQQgyK7CFX7IjSh3kTml
JTpka5Rx+WyRgPi6s42LWDcbsQ+hRANCAAR74L8zq7XvW7kJwvZJAq4URv44gSJu
PDWvH5LksriA014+UNX9uq45+PBzkkOpcDTuyFt8c0g9Sfoe6bup2JQq
-----END PRIVATE KEY-----
The C# code I'm using
string privKeyPKCS8 = @"MIGHAgEAMBMGByqGSM49AgEGCCqGSM49AwEHBG0wawIBAQQgyK7CFX7IjSh3kTmlJTpka5Rx+WyRgPi6s42LWDcbsQ+hRANCAAR74L8zq7XvW7kJwvZJAq4URv44gSJuPDWvH5LksriA014+UNX9uq45+PBzkkOpcDTuyFt8c0g9Sfoe6bup2JQq";
byte[] privKeyBytes = Convert.FromBase64String(privKeyPKCS8); // the PEM body is Base64, not UTF-8 text
var pubCNG = CngKey.Import(privKeyBytes, CngKeyBlobFormat.EccPrivateBlob);
What is the correct way to load the EC based key into CngKey?
EDIT
The key within the base 64 encoding adheres to the following format:
ECPrivateKey ::= SEQUENCE {
version INTEGER { ecPrivkeyVer1(1) } (ecPrivkeyVer1),
privateKey OCTET STRING,
parameters [0] ECParameters {{ NamedCurve }} OPTIONAL,
publicKey [1] BIT STRING OPTIONAL
}
Using the secp256r1 curve and a public key in uncompressed point format.
Your key PEM / ASCII armor (the header, footer and base64) is encoded using the format described in RFC 5915: Elliptic Curve Private Key Structure. This format was first specified by the Standards for Efficient Cryptography Group (SECG), which is also where the named curve secp256r1 got its name. The curve is supported by Microsoft CNG.
ECPrivateKey ::= SEQUENCE {
version INTEGER { ecPrivkeyVer1(1) } (ecPrivkeyVer1),
privateKey OCTET STRING,
parameters [0] ECParameters {{ NamedCurve }} OPTIONAL,
publicKey [1] BIT STRING OPTIONAL
}
You first need to convert this "raw" EC private key structure to a PKCS#8 structure using the command in the (updated) question:
openssl pkcs8 -topk8 -nocrypt -in eckey.pem -out ec2.pem
to get:
SEQUENCE (3 elem)
  INTEGER 0                                # version of PKCS#8 structure
  SEQUENCE (2 elem)
    OBJECT IDENTIFIER 1.2.840.10045.2.1    # it's an EC key
    OBJECT IDENTIFIER 1.2.840.10045.3.1.7  # it's secp256r1
  OCTET STRING (1 elem)                    # the key structure
    SEQUENCE (3 elem)
      INTEGER 1                            # version
      OCTET STRING (32 byte)               # private key value (removed)
      [1] (1 elem)
        BIT STRING (520 bit)               # public key value (removed)
The resulting structure isn't that different; what you are seeing is essentially the same as the initial structure, except that the PKCS#8 wrapper puts an Object Identifier (OID) designating the key type and an OID for the curve in front, while your original key only carries the curve OID afterwards as a parameter. Both also carry the (optional) public key value in the BIT STRING.
Because of those OIDs, a PKCS#8 decoder recognizes the key type and returns the EC private key.
The EccPrivateBlob that you were using requires a Microsoft-specific structure. See also my question here. It won't work with the structures mentioned above.
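If you can target .NET Core 3.0 or later, one option is to skip CngKey entirely and hand the PKCS#8 bytes to ECDsa directly; ImportPkcs8PrivateKey parses exactly the structure dumped above. A minimal sketch under that assumption, using the base64 body of ec2.pem from the question:

using System;
using System.Security.Cryptography;

class Pkcs8EcImportSketch
{
    static void Main()
    {
        // Base64 body of ec2.pem (the -----BEGIN/END PRIVATE KEY----- block without the armor)
        string privKeyPKCS8 =
            "MIGHAgEAMBMGByqGSM49AgEGCCqGSM49AwEHBG0wawIBAQQgyK7CFX7IjSh3kTml" +
            "JTpka5Rx+WyRgPi6s42LWDcbsQ+hRANCAAR74L8zq7XvW7kJwvZJAq4URv44gSJu" +
            "PDWvH5LksriA014+UNX9uq45+PBzkkOpcDTuyFt8c0g9Sfoe6bup2JQq";
        byte[] pkcs8Bytes = Convert.FromBase64String(privKeyPKCS8);

        using var ecdsa = ECDsa.Create();
        // Parses the PKCS#8 wrapper (key-type OID, curve OID, embedded ECPrivateKey)
        ecdsa.ImportPkcs8PrivateKey(pkcs8Bytes, out int bytesRead);

        // Quick sanity check: sign something with the imported key
        byte[] signature = ecdsa.SignData(new byte[] { 1, 2, 3 }, HashAlgorithmName.SHA256);
        Console.WriteLine($"Imported {bytesRead} bytes, signature length {signature.Length}");
    }
}

On older frameworks without ImportPkcs8PrivateKey you would need to parse the ASN.1 yourself or use a library such as BouncyCastle.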
Related
I've been developing a project involving digital signatures. I'm given a public key generated through the GetPublicKeyString method of X509Certificate2.
This is the given public key:
3082010A0282010100D8F165280738A827AA1D2960D09ECA7EA3928BC3B6BA4CF034BEFE6E00DB434793B96C2FD2411BE086B6CE4B30441D18DFF377497F99F043E74B2C30555D2157C2838C9E7BDD566F0633CC0BE6F2649278BB8FE15EB2D4020C087B2252D8ED024DD260A3AA07F399D90EBDA722BEFECAD6D7A011A30F9013C4AFE3E15BA5E5B9028DE51079CDE125555B82E4269699E662D8BF182DF173FCC90135FC117B8254BF7036FE0A9B0667C0266E8EF94F9EE630CA0CF611FF0BB2BBFD99A7B9D9CC6C93D272A7DC87FE59D35C34D9487F1038179C6576114994C942F826168BDF112DD17FC5725684AE74359FED8C753D1D7F01AB8F5A070E93DD89CA0C9BC2520C850203010001
But I'm unable to construct a correct PEM from this.
This is the code I have used to generate the PEM from the hex string:
byte[] certBytes = Enumerable.Range(0, certHex.Length)
    .Where(x => x % 2 == 0)
    .Select(x => Convert.ToByte(certHex.Substring(x, 2), 16))
    .ToArray();

string certPem = "-----BEGIN RSA PUBLIC KEY-----\n" +
    Convert.ToBase64String(certBytes, Base64FormattingOptions.InsertLineBreaks).TrimEnd() +
    "\n-----END RSA PUBLIC KEY-----\n";
Generated certPem
-----BEGIN CERTIFICATE-----
MIIBCgKCAQEA2PFlKAc4qCeqHSlg0J7KfqOSi8O2ukzwNL7+bgDbQ0eTuWwv0kEb4Ia2zkswRB0Y
3/N3SX+Z8EPnSywwVV0hV8KDjJ573VZvBjPMC+byZJJ4u4/hXrLUAgwIeyJS2O0CTdJgo6oH85nZ
Dr2nIr7+ytbXoBGjD5ATxK/j4Vul5bkCjeUQec3hJVVbguQmlpnmYti/GC3xc/zJATX8EXuCVL9w
Nv4KmwZnwCZujvlPnuYwygz2Ef8Lsrv9mae52cxsk9Jyp9yH/lnTXDTZSH8QOBecZXYRSZTJQvgm
FovfES3Rf8VyVoSudDWf7Yx1PR1/AauPWgcOk92JygybwlIMhQIDAQAB
-----END CERTIFICATE-----
But the result is failing when I use
X509Certificate2 cert = new X509Certificate2(Encoding.ASCII.GetBytes(certPem));
Going through some past questions I saw that the RSA exponent and modulus values are separated using a ',' delimiter, but here it is not present.
Kindly provide some insights to solve this issue.
Edit
Decryption Logic
csp = RSA.Create();
csp.ImportFromPem(certPem.ToCharArray());
var decryptedBytes = csp.Decrypt(encryptedData, RSAEncryptionPadding.Pkcs1);
Exception occurs during csp.Decrypt
Internal.Cryptography.CryptoThrowHelper.WindowsCryptographicException: 'Key does not exist.'
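For reference, the hex dump above appears to be a PKCS#1 RSAPublicKey (a SEQUENCE of modulus and public exponent), not a full X.509 certificate, so wrapping it in CERTIFICATE armor will not parse, and a public key alone cannot decrypt anything. A minimal sketch of importing such bytes directly on .NET Core 3.0+ via RSA.ImportRSAPublicKey, offered as an illustration rather than a confirmed fix for the question's setup:

using System;
using System.Linq;
using System.Security.Cryptography;

class RsaPublicKeyImportSketch
{
    static void Main()
    {
        // Hex string copied from the question (assumed to be a PKCS#1 RSAPublicKey)
        string certHex = "3082010A0282010100D8F165280738A827AA1D2960D09ECA7EA3928BC3B6BA4CF034BEFE6E00DB434793B96C2FD2411BE086B6CE4B30441D18DFF377497F99F043E74B2C30555D2157C2838C9E7BDD566F0633CC0BE6F2649278BB8FE15EB2D4020C087B2252D8ED024DD260A3AA07F399D90EBDA722BEFECAD6D7A011A30F9013C4AFE3E15BA5E5B9028DE51079CDE125555B82E4269699E662D8BF182DF173FCC90135FC117B8254BF7036FE0A9B0667C0266E8EF94F9EE630CA0CF611FF0BB2BBFD99A7B9D9CC6C93D272A7DC87FE59D35C34D9487F1038179C6576114994C942F826168BDF112DD17FC5725684AE74359FED8C753D1D7F01AB8F5A070E93DD89CA0C9BC2520C850203010001";

        byte[] keyBytes = Enumerable.Range(0, certHex.Length)
            .Where(x => x % 2 == 0)
            .Select(x => Convert.ToByte(certHex.Substring(x, 2), 16))
            .ToArray();

        using var rsa = RSA.Create();
        // Interprets the bytes as a PKCS#1 RSAPublicKey (modulus + exponent)
        rsa.ImportRSAPublicKey(keyBytes, out int bytesRead);

        // A public key can only encrypt or verify; Decrypt requires the private key
        byte[] ciphertext = rsa.Encrypt(new byte[] { 1, 2, 3 }, RSAEncryptionPadding.Pkcs1);
        Console.WriteLine($"Read {bytesRead} bytes, ciphertext length {ciphertext.Length}");
    }
}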
I have the following problem that I've been dealing with for a couple of hours now, and it's driving me nuts.
Context
I have a legacy database that stored passwords using the following algorithm. The legacy code used a Python library.
PBKDF2 with SHA256
1000 iterations
Salt has a length of 8
Password is stored like this $salt$hashedPassword
I'm switching the login flow for the new system and I need to migrate that old algorithm to a new one. The new system uses .NET Core.
Question
Is what I'm trying to do even possible? How can I achieve it?
What my logic dictates is that I can take the salt and recreate the hashing algorithm using the .NET Core crypto library, but it's not working and the function always returns false.
Legacy Code
from werkzeug.security import generate_password_hash, check_password_hash
def setPassword(self, password):
    self.password = generate_password_hash(password, method='pbkdf2:sha256')
Where generate_password_hash comes from the library, this is the code
SALT_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
def generate_password_hash(password, method="pbkdf2:sha256", salt_length=8):
    """Hash a password with the given method and salt with a string of
    the given length. The format of the string returned includes the method
    that was used so that :func:`check_password_hash` can check the hash.

    The format for the hashed string looks like this::

        method$salt$hash

    This method can **not** generate unsalted passwords but it is possible
    to set param method='plain' in order to enforce plaintext passwords.
    If a salt is used, hmac is used internally to salt the password.

    If PBKDF2 is wanted it can be enabled by setting the method to
    ``pbkdf2:method:iterations`` where iterations is optional::

        pbkdf2:sha256:80000$salt$hash
        pbkdf2:sha256$salt$hash

    :param password: the password to hash.
    :param method: the hash method to use (one that hashlib supports). Can
        optionally be in the format ``pbkdf2:<method>[:iterations]``
        to enable PBKDF2.
    :param salt_length: the length of the salt in letters.
    """
    salt = gen_salt(salt_length) if method != "plain" else ""
    h, actual_method = _hash_internal(method, salt, password)
    return "%s$%s$%s" % (actual_method, salt, h)


def gen_salt(length):
    """Generate a random string of SALT_CHARS with specified ``length``."""
    if length <= 0:
        raise ValueError("Salt length must be positive")
    return "".join(_sys_rng.choice(SALT_CHARS) for _ in range_type(length))
Code
using System;
using System.Security.Cryptography;
using System.Text;

namespace test_pwd
{
    class Program
    {
        static void Main(string[] args)
        {
            var res = SameHash("Qwerty12",
                "84e8c8a5dbdafaf23523ffa5dfecf29d53522a35ca4c76fa877c5fcf9eb4b654",
                "laSgSC6R");
            Console.WriteLine(res);
        }

        public static bool SameHash(string userpwd, string storedHash, string storedSalt)
        {
            var saltByte = Encoding.UTF8.GetBytes(storedSalt);
            var rfc = new Rfc2898DeriveBytes(userpwd, saltByte, 1000);
            var baseString = Convert.ToBase64String(rfc.GetBytes(64));
            return baseString == storedHash;
        }
    }
}
Base string is converted into
k6vhCweBNz8ymMeEdhi+1czrea+oTTYLrW1OuwdinA78AFyEXKitpKUGLCt1ZdyS1Vka8Cptzd5u5Uzdbi4MbA==
which is not the same as the stored password hash I'm sending. What am I doing wrong, or is this idea even feasible?
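Two details stand out when comparing against werkzeug: its pbkdf2:sha256 hashes are stored as lowercase hex of a 32-byte digest, not Base64 of 64 bytes, and the Rfc2898DeriveBytes constructor used above defaults to SHA-1 when no HashAlgorithmName is given. A minimal sketch of what a matching check might look like, assuming the legacy value really is 1000-iteration PBKDF2-SHA256 hex as described in the question (unverified against the actual data):

using System;
using System.Security.Cryptography;
using System.Text;

class WerkzeugPbkdf2Sketch
{
    static void Main()
    {
        bool match = SameHash(
            "Qwerty12",
            "84e8c8a5dbdafaf23523ffa5dfecf29d53522a35ca4c76fa877c5fcf9eb4b654",
            "laSgSC6R");
        Console.WriteLine(match);
    }

    public static bool SameHash(string password, string storedHash, string storedSalt)
    {
        byte[] salt = Encoding.UTF8.GetBytes(storedSalt);

        // Explicitly request SHA256; the shorter constructor overload defaults to SHA-1
        using var pbkdf2 = new Rfc2898DeriveBytes(password, salt, 1000, HashAlgorithmName.SHA256);

        // werkzeug's pbkdf2:sha256 produces a 32-byte digest encoded as lowercase hex
        byte[] derived = pbkdf2.GetBytes(32);
        string hex = BitConverter.ToString(derived).Replace("-", "").ToLowerInvariant();

        return hex == storedHash;
    }
}

If the legacy hashes were generated with a different iteration count (newer werkzeug versions use much higher defaults), the count is embedded in the stored method$salt$hash string and should be read from there.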
Is there a way to convert a string to a GUID using SHA256 without truncating the hash to 16 bytes?
Currently I have this:
using (SHA256 sha2 = SHA256.Create())
{
    // "input" stands in for the string variable being hashed
    var hash = sha2.ComputeHash(Encoding.Default.GetBytes(input));
    return new Guid(hash.Take(16).ToArray());
}
A hash is not the same as a Guid. Trying to equate the two is incorrect.
If you want a unique identifier:
return Guid.NewGuid();
That'll give you one.
If you want the hash, store it as bytes or a string, not a Guid.
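To make the distinction concrete, here is a small sketch of both options, a random Guid versus keeping the full digest as a string. The input text and the use of Convert.ToHexString (.NET 5+) are illustrative choices, not something from the question:

using System;
using System.Security.Cryptography;
using System.Text;

class HashVsGuidSketch
{
    static void Main()
    {
        // Option 1: a unique identifier - no relation to any input string
        Guid id = Guid.NewGuid();
        Console.WriteLine(id);

        // Option 2: keep the full 32-byte SHA-256 digest, e.g. as a hex string
        using SHA256 sha2 = SHA256.Create();
        byte[] hash = sha2.ComputeHash(Encoding.UTF8.GetBytes("some input"));
        string hashHex = Convert.ToHexString(hash); // BitConverter.ToString on older targets
        Console.WriteLine(hashHex);
    }
}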
I'm trying to get the HMAC SHA256 value (str_signature). I followed the Ruby code from this post, although his example was converting code from Java (with a hex key).
C#
string strRawSignature = "200123123891:12|11231231|GET|just%20test%20value";
// Convert signature to byte array in order to compute the hash
byte[] bSignature = Encoding.UTF8.GetBytes(strRawSignature);
// Convert ApiKey to byte array - for initializing HMACSHA256
byte[] bSecretKey = Convert.FromBase64String(strApiKey);
string strSignature = "";
using (HMACSHA256 hmac = new HMACSHA256(bSecretKey))
{
// Compute signature hash
byte[] bSignatureHash = hmac.ComputeHash(bSignature);
// Convert signature hash to Base64String for transmission
strSignature = Convert.ToBase64String(bSignatureHash);
}
Ruby
require "openssl"
require "base64"
digest = OpenSSL::Digest.new('sha256')
key = [ 'xiIm9FuYhetyijXA2QL58TRlvhuSJ73FtdxiSNU2uHE=' ]
#this is just a dummy signature to show what the possible values are
signature = "200123123891:12|11231231|GET|just%20test%20value"
hmac = OpenSSL::HMAC.digest(digest, key.pack("m*"), signature)
str_signature = Base64.urlsafe_encode64(hmac)
example result: "B0NgX1hhW-rsnadD2_FF-grcw9pWghwMWgG47mU4J94="
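One detail that may matter when comparing the two outputs: Ruby's Base64.urlsafe_encode64 (used above) produces '-' and '_' where C#'s Convert.ToBase64String produces '+' and '/'. A small sketch of producing the URL-safe form on the C# side, assuming the receiving service expects that alphabet (an illustration, not something stated in the question); the key and message are the dummy values from the snippets above:

using System;
using System.Security.Cryptography;
using System.Text;

class UrlSafeHmacSketch
{
    static void Main()
    {
        string strApiKey = "xiIm9FuYhetyijXA2QL58TRlvhuSJ73FtdxiSNU2uHE=";
        string strRawSignature = "200123123891:12|11231231|GET|just%20test%20value";

        using var hmac = new HMACSHA256(Convert.FromBase64String(strApiKey));
        byte[] hash = hmac.ComputeHash(Encoding.UTF8.GetBytes(strRawSignature));

        // Standard Base64, then translated to the URL-safe alphabet used by urlsafe_encode64
        string urlSafe = Convert.ToBase64String(hash).Replace('+', '-').Replace('/', '_');
        Console.WriteLine(urlSafe);
    }
}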
Update:
Changed the pack method to output base64 strings.
Edited variable names for consistency.
References:
Used hexdigest, which has a different output string length.
This example uses the digest method, although I'm not sure what value the key parameter has; hopefully it's a base64-encoded string.
This uses hexdigest again. I am pretty sure that digest is the method to go with rather than hexdigest, since the hexdigest output is a longer string compared to the sample HMAC value I have from the C# script.
Finally got the monkey off my back!
I don't really need to create a sha256 digest object after all; I just had to pass 'sha256' as the parameter.
require 'openssl'
require "base64"
#API_KEY = base64 encoded string
key = Base64.decode64(API_KEY)
hash = OpenSSL::HMAC.digest('sha256', key, "Message")
puts Base64.encode64(hash)
thanks to this link
I've been coming up against this for a few hours now and I can't seem to find a solution. I'm trying to hash a string using SHA512 and put it in a header for an HTTP request. I know that HTTP headers only like ASCII characters, but every time I generate the hash I only get non-ASCII results. Is there any way to force an ASCII return? Here's my function generating the hash:
private static string GenerateSignatureHeader(string data, string timestamp){
//Encode to UTF8 required for SHA512
byte[] encData = ASCIIEncoding.UTF8.GetBytes (data);
byte[] hash;
using (SHA512 shaM = new SHA512Managed ()) {
hash = shaM.ComputeHash (encData);
}
return ASCIIEncoding.Default.GetString (hash);
}
And adding to HTTP Request here:
request.Headers.Add("signature", GenerateSignatureHeader(body, timestamp));
Is what I'm doing here correct? The error is thrown when trying to add the header to the request:
Caused by: md52ce486a14f4bcd95899665e9d932190b.JavaProxyThrowable: System.ArgumentException: invalid header value: 4Ɓj�M�P��hM�$�
�s;��6��1!��,�y��.x;��d�G��2�#1'��1�
Parameter name: headerValue
System.Net.WebHeaderCollection.AddWithoutValidate (string,string)
System.Net.WebHeaderCollection.Add (string,string)
So I'm assuming that it's the non-ASCII characters that are causing this?
A hash function returns pseudo-random bytes. Such a byte array very likely doesn't correspond to a valid character encoding such as ASCII. That is why you're getting those unprintable character placeholders.
You need to encode the output. For example using Base64 or Hex.
return System.Convert.ToBase64String(hash);
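Put together, the question's function might look like this with the encoding fix applied, returning Base64 so the header value is plain ASCII (hex via BitConverter would work equally well; the unused timestamp parameter from the question is omitted here):

using System;
using System.Security.Cryptography;
using System.Text;

static class SignatureHeaderSketch
{
    public static string GenerateSignatureHeader(string data)
    {
        // Hash the UTF-8 bytes of the payload
        byte[] encData = Encoding.UTF8.GetBytes(data);
        using (SHA512 shaM = SHA512.Create())
        {
            byte[] hash = shaM.ComputeHash(encData);
            // Base64 keeps the header value within the ASCII range
            return Convert.ToBase64String(hash);
        }
    }
}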