Is there any way to perform private key encryption in C#?
I know about the standard RSACryptoServiceProvider in System.Security.Cryptography, but these classes provide only public key encryption and private key decryption. They also provide digital signature functionality, which internally uses private key encryption, but there are no publicly accessible functions to perform private key encryption and public key decryption.
I've found this article on CodeProject, which is a very good starting point for performing this kind of encryption; however, I was looking for some ready-to-use code, as the code in the article can hardly encrypt arbitrarily long byte arrays containing random values (that means any values, including zeroes).
Do you know some good components (preferably free) to perform private key encryption?
I use .NET 3.5.
Note: I know this is generally considered a bad way of using asymmetric encryption (encrypting with the private key and decrypting with the public key), but I just need to use it that way.
Additional Explanation
Consider you have
var bytes = new byte[30] { /* ... */ };
and you want to use 2048-bit RSA to ensure no one has changed anything in this array.
Normally, you would use a digital signature (e.g. over a RIPEMD-160 hash), which you then attach to the original bytes and send to the receiver.
So, you have 30 bytes of original data plus an additional 256 bytes of digital signature (because it is 2048-bit RSA), which is 286 bytes overall. However, only 160 bits of those 256 bytes are actually the hash, so exactly 1888 bits (236 bytes) are unused.
So, my idea was this:
Take the 30 bytes of original data, attach the hash (20 bytes) to it, and now encrypt these 50 bytes. You get a 256-byte message, which is much shorter than 286 bytes, because "you were able to push the actual data inside the digital signature".
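For comparison, here is a minimal sketch of the conventional sign-and-append approach described above (RSACryptoServiceProvider with a SHA-1 based signature; the random data and on-the-fly key are illustrative only), which is where the 286-byte figure comes from:

using System;
using System.Security.Cryptography;

class SignAndAppendDemo
{
    static void Main()
    {
        // 30 bytes of arbitrary original data
        var data = new byte[30];
        new RNGCryptoServiceProvider().GetBytes(data);

        using (var rsa = new RSACryptoServiceProvider(2048))
        {
            // PKCS#1 v1.5 signature over a SHA-1 hash: always 256 bytes for a 2048-bit key
            byte[] signature = rsa.SignData(data, new SHA1CryptoServiceProvider());

            // attach the signature to the original data
            var message = new byte[data.Length + signature.Length];
            Buffer.BlockCopy(data, 0, message, 0, data.Length);
            Buffer.BlockCopy(signature, 0, message, data.Length, signature.Length);

            Console.WriteLine(message.Length);   // 286 = 30 bytes data + 256 bytes signature
        }
    }
}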
ECDSA Resources
MSDN
Eggheadcafe.com
c-plusplus.de
MSDN Blog
Wiki
DSA Resources
CodeProject
MSDN 1
MSDN 2
MSDN 3
Final Solution
If anyone is interested in how I've solved this problem: I'm going to use 1024-bit DSA and SHA-1, which is widely supported on many different versions of Windows (Windows 2000 and newer), the security is good enough (I'm not signing orders, I just need to ensure that some child can't crack the signature on his iPhone :-D), and the signature is only 40 bytes long.
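For anyone curious, a minimal sketch of that final approach (DSACryptoServiceProvider, which always hashes with SHA-1; key handling is illustrative only) looks roughly like this:

using System;
using System.Security.Cryptography;

class DsaSignatureDemo
{
    static void Main()
    {
        var data = new byte[30];
        new RNGCryptoServiceProvider().GetBytes(data);

        using (var dsa = new DSACryptoServiceProvider(1024))
        {
            byte[] signature = dsa.SignData(data);    // 40 bytes: the 160-bit r and s values
            Console.WriteLine(signature.Length);      // 40

            bool valid = dsa.VerifyData(data, signature);
            Console.WriteLine(valid);                 // True
        }
    }
}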
What you are trying to design is known as a "Signature scheme with message recovery".
Designing a new signature scheme is hard. Designing a new signature scheme with message recovery is harder. I don't know all the details about your design, but there is a good chance that it is susceptible to a chosen message attack.
One proposal for signature schemes with message recovery is RSA PSS-R. Unfortunately, this proposal is covered by a patent.
The IEEE P1363 standardization group once discussed adding signature schemes with message recovery. I'm not sure about the current state of this effort, but it might be worth checking out.
Your public key is a subset of your private key. You can use your private key as a public key, since only the components of the full key that are required will be used.
In .NET, both your private and public keys are stored in the RSAParameters struct. The struct contains fields for:
D
DP
DQ
Exponent
InverseQ
Modulus
P
Q
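A small sketch illustrating the point (the 2048-bit key size is arbitrary): exporting with includePrivateParameters = false returns only the public subset (Exponent and Modulus), while exporting with true returns all eight fields.

using System;
using System.Security.Cryptography;

class RsaParametersDemo
{
    static void Main()
    {
        using (var rsa = new RSACryptoServiceProvider(2048))
        {
            RSAParameters priv = rsa.ExportParameters(true);    // D, DP, DQ, InverseQ, P, Q, Exponent, Modulus
            RSAParameters pub  = rsa.ExportParameters(false);   // Exponent and Modulus only

            Console.WriteLine(priv.D != null);      // True  (private component present)
            Console.WriteLine(pub.D == null);       // True  (no private components)
            Console.WriteLine(pub.Modulus.Length);  // 256 bytes for a 2048-bit key
        }
    }
}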
If you're at the point where the data is so small that the digital signature is huge in comparison, then you have excess signature. The solution isn't to roll your own algorithm, but to cut down what's there. You definitely don't want to try to combine a key with the hash in an amateurish way: this has been broken already, which is why we have HMACs.
So here's the basic idea:
Create a session key using a cryptographically strong RNG.
Transmit it via PKE.
Use the session key to generate an HMAC-SHA1 (or HMAC-RIPEMD160, or whatever).
If the size of the hash is absurdly large for the given data, cut it in half by XORing the top with the bottom. Repeat as needed.
Send the data and the (possibly cut-down) hash.
The receiver uses the data and the session key to regenerate the hash and then compares it with the one transmitted (possibly after first cutting it down.)
Change session keys often.
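A hedged sketch of steps 3 to 5 in C# (HMAC-SHA1 keyed with a random session key, plus the optional XOR folding; the public-key exchange of the session key is not shown, and the helper name is made up):

using System;
using System.Security.Cryptography;

class HmacFoldDemo
{
    // Cut the MAC in half by XORing the top half with the bottom half.
    static byte[] FoldInHalf(byte[] mac)
    {
        var folded = new byte[mac.Length / 2];
        for (int i = 0; i < folded.Length; i++)
            folded[i] = (byte)(mac[i] ^ mac[i + folded.Length]);
        return folded;
    }

    static void Main()
    {
        var rng = new RNGCryptoServiceProvider();

        var sessionKey = new byte[16];   // would be transmitted via public-key encryption
        rng.GetBytes(sessionKey);

        var data = new byte[30];         // the data to protect
        rng.GetBytes(data);

        using (var hmac = new HMACSHA1(sessionKey))
        {
            byte[] mac = hmac.ComputeHash(data);   // 20 bytes
            byte[] tag = FoldInHalf(mac);          // 10 bytes, if that is still enough for you
            Console.WriteLine(tag.Length);
        }

        // The receiver recomputes the HMAC over the received data with the same
        // session key, folds it the same way, and compares it with the received tag.
    }
}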
This is a compromise between the insanity of rolling your own system and using an ill-fitting one.
I'm wide open to constructive criticism...
I get it now, after reading the comments.
The answer is: don't do it.
Cryptographic signature algorithms are not algorithms from which you can pick and choose - or modify - steps. In particular, supposing a signature sig looks something like encrypt(hash), orig + sig is not the same as encrypt(orig + hash). Further, even outdated signature algorithms like PKCS #1 v1.5 are not as simple as encrypt(hash) in the first place.
A technique like the one you describe sacrifices security for the sake of cleverness. If you don't have the bandwidth for a 256 byte signature, then you need one of:
a different algorithm,
more bandwidth, or
a smaller key.
And if you go with (1), please be sure it's not an algorithm you made up! The simple fact is that crypto is hard.
Related
I'm quite new to .NET and have a question regarding DataProtector.
When using DataProtector.Protect without any configuration, the resulting output becomes too long for the API I need to pass it to. I was wondering if using the configuration methods (as seen here) would help. I tried the following in the class where I needed to protect the data:
var serviceCollection = new ServiceCollection();
serviceCollection.AddDataProtection()
.UseCustomCryptographicAlgorithms(new ManagedAuthenticatedEncryptionSettings()
{
// a type that subclasses SymmetricAlgorithm
EncryptionAlgorithmType = typeof(Aes),
// specified in bits
EncryptionAlgorithmKeySize = 128,
// a type that subclasses KeyedHashAlgorithm
ValidationAlgorithmType = typeof(HMACSHA256)
});
var services = serviceCollection.BuildServiceProvider();
_protector = services.GetDataProtector("MyClass.v1");
var protect = _protector.Protect(JsonConvert.SerializeObject(myData));
However, even after changing the EncryptionAlgorithmKeySize from the default 256 to the minimum 128, 'protect' still resulted in output of the same length, which makes me think that either the configuration isn't working or the configuration doesn't affect the output length.
Does anyone know if this is being done the right way, or if there is a better way to reduce the output length?
For example, a simple 9-character string gets encrypted to 134 characters.
Any help is much appreciated, thanks!
DPAPI is meant to secure data-at-rest, not data for transmission.
Ryan Dobbs is correct, above (or below? I can't figure out how Stack Overflow sorts unaccepted answers...): weakening your encryption to attain a smaller payload is a very bad idea. The right way to address this is to secure the connection (SSL/TLS), so you can just send things as plaintext, or (as Ryan suggests) drop a properly encrypted payload somewhere that both sender and receiver can access it.
But to answer your question more directly: the payload size is controlled by the hashing function. The encryption key size only tells you the cryptographic complexity of the encryption algorithm -- how hard the encryption is to break. The part that says HMACSHA256 is an HMAC based on SHA-256, which means it produces a 256-bit (32-byte) output.
MD5 is 128-bit but it's generally insecure (only good for checksums).
The documentation says the key size and hash size must be equivalent, so you can't go down to 128 bits with SHA. The shortest SHA available is the old SHA-1 algorithm (HMACSHA1), which is 160 bits, but the expectation is that anything less than 256 bits will be insecure relatively soon. The SHA-2 family yields HMACSHA256 and HMACSHA512.
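A quick way to see why shrinking the encryption key doesn't shrink the payload: the HMAC tag length depends only on the hash algorithm, not on the input length or the encryption key size (the throwaway key below is for illustration only).

using System;
using System.Security.Cryptography;
using System.Text;

class HmacSizeDemo
{
    static void Main()
    {
        var key = new byte[32];
        new RNGCryptoServiceProvider().GetBytes(key);

        using (var hmac = new HMACSHA256(key))
        {
            // The tag is always 32 bytes (256 bits), whatever the input size.
            Console.WriteLine(hmac.ComputeHash(Encoding.UTF8.GetBytes("123456789")).Length);  // 32
            Console.WriteLine(hmac.ComputeHash(new byte[10000]).Length);                      // 32
        }
    }
}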
In my scenario, I would like to encrypt a very big number (10^27) using a private key and later be able to decrypt it using a public key. The problem I have is that I want to keep the size of the encrypted text as small as possible.
I know that .NET has support for public key encryption (RSACryptoServiceProvider), but the encrypted text gets so huge.
Would it work to instead treat the private key as a public key?
Would Elliptic curve cryptography produce a smaller output?
First of all, if you want to achieve confidentiality you should always encrypt with the public key, not the private key. RSA encryption is not defined for encryption with the private key, and the results may vary (especially the kind of padding that is applied).
For direct RSA encryption, the size of the encrypted message is identical to the modulus size. The modulus should be at least 2048 bits these days, and your message is only about 90 bits (a 27-digit decimal number takes roughly 27 · log2(10) ≈ 90 bits). So RSA would have a large overhead, independent of the key used. Using ECIES is therefore likely to give significant benefits.
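To illustrate the overhead (a sketch only; the 12-byte buffer simply stands in for a roughly 90-bit number, and the key is generated on the fly):

using System;
using System.Security.Cryptography;

class RsaSizeDemo
{
    static void Main()
    {
        var smallValue = new byte[12];   // about the size of a 27-digit (10^27) number
        new RNGCryptoServiceProvider().GetBytes(smallValue);

        using (var rsa = new RSACryptoServiceProvider(2048))
        {
            byte[] ciphertext = rsa.Encrypt(smallValue, true);   // OAEP padding
            Console.WriteLine(smallValue.Length);                // 12
            Console.WriteLine(ciphertext.Length);                // 256: the modulus size, regardless of input
        }
    }
}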
Is there an existing secure implementation to achieve following:
Guid original = Guid.NewGuid();
Guid inverted = MysteryImplementation(original, salt); // salt is some sort of input
Guid shouldBeOriginal = MysteryImplementation(inverted, salt);
Assert.AreEqual(original, shouldBeOriginal, "MysteryImplementation did no work");
EDIT:
As this got downvoted (although I'm a bit unsure why), I think more background is needed:
In a place far far away, there is an application in which primary keys are stored as GUIDs. In this application these GUIDs are exposed to web clients.
In my pursuit of improving the status quo, I had the idea of mapping these GUIDs with user session data in order to mitigate the risk of accidental/malicious leakage of primary keys. Mapping these GUIDs has the added benefit that it would also allow an easier implementation of working copies for the objects that the GUIDs refer to.
Those were the reasons why I decided to start looking for "secure" way to map GUIDs.
To answer comments:
- Mapping should preserve global uniqueness when compared to all other GUIDs (I wouldn't want those mapped GUIDs to collide with existing GUIDs).
- "Secure" in this context means that it should be impossible to figure out the original GUIDs without knowing the cipher key (a typical crypto requirement, which I think translates to the mapped GUIDs having a uniform distribution).
You can easily do this:
Guid original = Guid.NewGuid();
byte[] encrypted = Encrypt(original, key);
Guid decrypted = Decrypt(encrypted, key);
Any symmetric encryption algorithm will do, from ROT13 on up. However, that's not what you asked for. What you asked for is an algorithm that has two properties:
The encrypt and decrypt algorithms are exactly the same.
The encrypted form of a GUID is also a valid globally unique identifier.
There are plenty of algorithms where the encryption and decryption processes are different but not actually that many where they are exactly the same. The simplest algorithm where encryption and decryption are the same is:
Generate a crypto-strength random one-time pad of the same length as the plaintext.
XOR the plaintext with the pad to produce the ciphertext.
To decrypt, XOR the ciphertext with the pad.
However that algorithm does not necessarily maintain the property that the ciphertext is a valid GUID.
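A hedged sketch of that XOR idea (the helper names are made up for illustration): the same call both "encrypts" and "decrypts", and the 16-byte result can be wrapped back into a Guid, but, as noted, nothing guarantees that result is still globally unique.

using System;
using System.Security.Cryptography;

static class GuidXor
{
    // pad16 must be 16 crypto-random bytes, kept secret and reused for the round trip
    public static Guid Mix(Guid value, byte[] pad16)
    {
        byte[] bytes = value.ToByteArray();
        for (int i = 0; i < bytes.Length; i++)
            bytes[i] ^= pad16[i];
        return new Guid(bytes);
    }
}

class GuidXorDemo
{
    static void Main()
    {
        var pad = new byte[16];
        new RNGCryptoServiceProvider().GetBytes(pad);

        Guid original = Guid.NewGuid();
        Guid mixed    = GuidXor.Mix(original, pad);   // "encrypt"
        Guid restored = GuidXor.Mix(mixed, pad);      // "decrypt" is the same operation

        Console.WriteLine(original == restored);      // True
    }
}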
Can you explain why it is that you need the ciphertext to be a valid GUID? The property that a GUID has to have is that it needs to be globally unique; how are you planning on guaranteeing global uniqueness? What stops you from encrypting one GUID that you generated into another GUID that someone else has generated unbeknownst to you?
More generally, can you explain what problem you are trying to solve in the first place? Nine times out of ten that I see someone trying to use cryptography, they're using it for the wrong purpose.
Yes. Those mystery algorithms are called symmetric ciphers. What you call salt is just the key for the algorithm.
However, it might be a bit harder getting a GUID back from that, since encryption algorithms usually operate either on streams or blocks of data, and by modifying a GUID you compromise its GU (globally unique) property.
Any symmetric encryption will do:
http://msdn.microsoft.com/en-us/library/as0w18af(v=vs.110).aspx
I have an unencrypted/unencoded string - "565040574". I also have the encrypted/encoded string for this string - "BSubW2AUWrSCL7dk9ucoiA==".
It looks like this string has been Base64ed after encryption, but I don't know which encryption algorithm has been used. If I convert "BSubW2AUWrSCL7dk9ucoiA==" string to bytes using Convert.FromBase64String("BSubW2AUWrSCL7dk9ucoiA=="), I get 16 bytes.
Is there anything using which I can know what type of encryption has been used to encrypt the "565040574" to "BSubW2AUWrSCL7dk9ucoiA=="?
No, there is nothing to tell you how it was encrypted. If you don't have the key to decrypt it then you will be out of luck anyway.
If the plan was to save this to a file or send it in email then it would be base-64 encoded, so that was a good guess.
You may be able to narrow down what it is not by looking at the fact that you have 7 bytes of padding perhaps, but whether it was IDEA or Blowfish or AES, there is no way to know.
Looking at it, off the top of my head I would say AES, and more specifically Rijndael.
EDIT:
Just to add, as I said in my comment: without the key you will never know what this is. I am taking a best-guess approach, based on implementations that could be termed "more common", which could also be a complete oversight on my part.
Remember that if you can ever outright say what algorithm a ciphertext is in, never, ever use that algorithm.
What can you tell from the data you have? Well, the most concrete bit of information you have is that 9 bytes of cleartext encrypts to 16 bytes of ciphertext. Since it is unlikely that a data compression algorithm is being used on such a small chunk of data, this means we can make an educated guess that:
It is encrypted with a block cipher, with a block size <= 128 bits.
The encryption mode is ECB, since there is no room for an IV.
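You can check the size observation directly (no assumptions beyond the Base64 string from the question):

using System;

class CiphertextLengthDemo
{
    static void Main()
    {
        byte[] bytes = Convert.FromBase64String("BSubW2AUWrSCL7dk9ucoiA==");
        Console.WriteLine(bytes.Length);   // 16: exactly one block of a 128-bit block cipher
    }
}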
I want to encrypt a string and embed it in a URL, so I want to make sure the encrypted output isn't bigger than the input.
Is AES the way to go?
It's impossible to create any algorithm which will always create a smaller output than the input but can reverse any output back to the input. If you allow "no bigger than the input" then basically you're just talking about isomorphic algorithms, where the output is always the same size as the input. This is due to the pigeonhole principle.
Added to that, encryption usually has a little bit of padding (e.g. "to the nearest 8 bytes, rounded up" - in AES, that's 16 bytes). Oh, and on top of that you've got the issue of converting between text and binary. Encryption algorithms usually work in binary, but URLs are in text. Even if you assume ASCII, you could end up with an encrypted binary value which isn't ASCII. The simplest way of representing arbitrary binary data in text is to use base64. There are other alternatives which would be highly fiddly, but the general "convert text to binary, encrypt, convert binary to text" pattern is the simplest one.
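A small sketch of that pattern with AES (AesCryptoServiceProvider; the key and IV are throwaway values, and in practice the IV would also have to be transmitted), showing how an 11-character input grows to 24 Base64 characters:

using System;
using System.Security.Cryptography;
using System.Text;

class EncryptForUrlDemo
{
    static void Main()
    {
        string input = "hello world";   // 11 characters

        using (var aes = new AesCryptoServiceProvider())        // generates a random key and IV
        using (ICryptoTransform encryptor = aes.CreateEncryptor())
        {
            byte[] plaintext  = Encoding.UTF8.GetBytes(input);
            byte[] ciphertext = encryptor.TransformFinalBlock(plaintext, 0, plaintext.Length);

            // PKCS#7 padding rounds 11 bytes up to 16, then Base64 expands 16 bytes to 24 characters.
            string forUrl = Convert.ToBase64String(ciphertext);
            Console.WriteLine(ciphertext.Length);   // 16
            Console.WriteLine(forUrl.Length);       // 24
        }
    }
}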
Simple answer is no.
Any symmetric encryption algorithm (AES included) will produce an output that is at minimum the same size as the input, but often slightly larger. As Jon Skeet points out, this is usually because of padding or alignment.
Of course you could compress your string using zlib and encrypt but you'd need to decompress after decrypting.
Disclaimer - compressing the string with zlib will not guarantee it comes out smaller though
What matters is not really the cipher that you use, but the encryption mode that you use. For example, the CTR mode has no length expansion, but every encryption needs a new, distinct starting point for the counter. Other modes like OFB and CFB (or CBC with ciphertext stealing) also don't need to be padded to a multiple of the block length of the cipher, but they need an IV. It is unclear from your question whether there is some information available from which an IV could be derived pseudorandomly, and whether any of these modes would be appropriate. It is also unclear whether you need authentication, or whether you need semantic security; i.e. is it a problem if you encrypt the same string twice and you get the same ciphertext twice?
If we are talking about symmetric encryption, where the original string must be recoverable from the ciphertext, it is not possible. I think that unless you use hashes (SHA-1, SHA-256, ...) you will never obtain a ciphered string smaller than the original text. The problem with hashes is that they are not a solution for retrieving the original string, because they are one-way algorithms.
When using AES, the output data will be rounded up to a specific length (e.g. a length divisible by 16).
If you want to transfer secret data to another website, an HTTP POST may do better than embedding the data in the URL.
Also just another thing to clarify:
Not only is it true that symmetric encryption algorithms produce an output that is at least as large as the input, the same is true of asymmetric encryption.
"Asymmetric encryption" and "cryptographic hashes" are two different things.
Asymmetric encryption (e.g. RSA) means that given the output (i.e. the ciphertext), you can get the input (i.e. the plaintext) back if you have the right key, it's just that decrypting requires a different key than the key used for encrypting. For asymmetric encryption, the same "pigeonhole principle" argument applies.
Cryptographic hashes (e.g. SHA-1) mean that given the output (i.e. the hash) you can't get the input back, and you can't even find a different input that hashes to the same value (assuming the hash is secure). For cryptographic hashes, the hash can be shorter than the input. (In fact the hash is the same size regardless of the length of the input.)
And also one more thing: In any secure encryption system the ciphertext will be longer than the plaintext. This is because there are multiple possible ciphertexts that any given plaintext could encrypt to (e.g. using different IVs.) If this were not the case then the cipher would leak information because if two identical plaintexts were encrypted, they would encrypt to identical ciphertexts, and an adversary would then know that the plaintexts were the same.