I'm using SHA-256 with Hex to hash some text. However I am finding that the hashed text in my Node implementation differs from my .NET C# implementation.
In NodeJS I have the following:
return crypto.createHash('sha256').update(text).digest('hex');
and in .NET C# I have:
private static string Hash(string text)
{
    byte[] bytes = Encoding.Unicode.GetBytes(text);
    using (var generator = new SHA256Managed())
    {
        byte[] hash = generator.ComputeHash(bytes);
        return BytesToHex(hash);
    }
}

private static string BytesToHex(byte[] bytes)
{
    var hex = new StringBuilder(bytes.Length * 2);
    foreach (byte b in bytes)
    {
        hex.AppendFormat("{0:x2}", b);
    }
    return hex.ToString();
}
return Hash(text);
What have I done wrong in the NodeJS version? When I try to use my NodeJS app with hashes created by my .NET app, the hashes don't match up!
Update: Apparently this could be due to charset...
So I tried:
return crypto.createHash('sha256').update(text, 'utf8').digest('hex');
But the hash is the same as before: using utf8 instead of binary hasn't actually made any difference to the returned hash, and it's still mismatched from the .NET version.
The problem is a mismatch in the encodings used by the two systems: Encoding.Unicode uses UTF-16, not UTF-8. The update you made on the NodeJS side is still required, since omitting the input_encoding parameter means it defaults to binary. However, you also need to update the .NET side to use UTF-8 encoding:
byte[] bytes = Encoding.UTF8.GetBytes(text);
Alternatively, you can update the Node side to use UTF-16 via the undocumented ucs2 encoding:
crypto.createHash('sha256').update(text, 'ucs2')
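For illustration, here is a minimal, self-contained C# sketch (the class name and sample text are mine) showing that the two encodings of the same text produce different SHA-256 digests:

using System;
using System.Security.Cryptography;
using System.Text;

class EncodingHashDemo
{
    static string Sha256Hex(byte[] bytes)
    {
        using (var sha = SHA256.Create())
        {
            byte[] hash = sha.ComputeHash(bytes);
            var hex = new StringBuilder(hash.Length * 2);
            foreach (byte b in hash)
                hex.AppendFormat("{0:x2}", b);
            return hex.ToString();
        }
    }

    static void Main()
    {
        string text = "hello";
        // Matches Node's update(text, 'utf8')
        Console.WriteLine(Sha256Hex(Encoding.UTF8.GetBytes(text)));
        // Matches Node's update(text, 'ucs2')
        Console.WriteLine(Sha256Hex(Encoding.Unicode.GetBytes(text)));
    }
}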
I have strings of sensitive information that I need to collect from my users. I am using a WPF PasswordBox to request this information. For the uninitiated, the PasswordBox control provides a SecurePassword property which is a SecureString object rather than an insecure string object. Within my application, the data from the PasswordBox gets passed as a SecureString to an encryption method.
What I need to be able to do is marshal the data to a byte array that essentially represents a .Net string value without first converting the data to a .Net string. More specifically, given a SecureString with a value such as...
abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890`~!@#$%^&*()_-+={[}]|:;"'<,>.?/ ≈篭母
...how can I convert it to a byte array that is the equivalent of a .Net string that's been serialized and written to a stream with a StreamWriter?
By using Marshal.SecureStringToCoTaskMemUnicode(...) I am able to do this with more traditional, western text. However, when I created the above text string using additional, atypical characters and a string of Japanese text (see the last few characters), my method of getting a Unicode byte array assigned to the IntPtr position no longer seems to work properly.
How can I emit the data of a SecureString in a secure way such that the returned byte data is structured the same as the byte data of a standard .Net string, serialized to binary output?
NOTE
Please ignore all security concerns at the moment. I am working on making various security upgrades to my application. For now, I need to use a SecureString for getting the sensitive data to the encryptor. The decryptor (for now) will still need to decrypt this data to string values, which is why I need to somehow serialize the data in the SecureString to a binary format similar to the binary format of the string object.
I agree that this approach is a bit unfortunate; however, I'm having to make incremental improvements to an existing application, and the first phase is locking down the data in SecureString objects from the user to the encryptor.
If you need to write a secure string to a stream, I'd suggest creating a method like this:
public static class Extensions {
    public static void WriteSecure(this StreamWriter writer, SecureString sec) {
        int length = sec.Length;
        if (length == 0)
            return;
        IntPtr ptr = Marshal.SecureStringToBSTR(sec);
        try {
            // each char in that string is 2 bytes, not one (it's a UTF-16 string)
            for (int i = 0; i < length * 2; i += 2) {
                // so use ReadInt16 and convert the resulting "short" to char
                var ch = Convert.ToChar(Marshal.ReadInt16(ptr + i));
                // write
                writer.Write(ch);
            }
        }
        finally {
            // don't forget to zero the memory
            Marshal.ZeroFreeBSTR(ptr);
        }
    }
}
If you really need a byte array, you can reuse this method too:
byte[] result;
using (var ms = new MemoryStream()) {
    using (var writer = new StreamWriter(ms)) {
        writer.WriteSecure(secureString);
    }
    result = ms.ToArray();
}
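One caveat: StreamWriter defaults to UTF-8, so the bytes in result won't have the UTF-16 layout of a raw .NET string. If you want the UTF-16 bytes instead, you could (as a sketch) pass the encoding explicitly; note that this also emits a 2-byte UTF-16 byte order mark at the start of the stream:

using (var writer = new StreamWriter(ms, Encoding.Unicode)) {
    writer.WriteSecure(secureString);
}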
Though the method from the first comment might be a bit more performant (not sure if that's important for you).
I have C++ code which I want to rewrite in C#. This part
case ID_TYPE_UNICODE_STRING:
    if (items[i].GetUString().length() > 0xFFFF)
        throw dppError("error");

    // GetUString returns a std::wstring object
    DataSize = (WORD)(sizeof(WCHAR) * (items[i].GetUString().length()));
    blob.AppendData((const BYTE *)&DataSize, sizeof(WORD)); // blob is a byte array
    blob.AppendData((const BYTE *)items[i].GetUString().c_str(), DataSize);
    break;
This basically serializes the length in bytes of a Unicode string, and then the string itself, into a byte array.
Here comes my problem (this code then sends the data to a server): I don't know which encoding is used in the above lines of code (UTF16, UTF8, etc.).
So I don't know the best way to reimplement it in C#.
How can I work out which encoding is used in this C++ project?
And if I can't find the encoding used in the C++ project, then, given that the endianness is the same as stated in the accepted answer of this question, do you think the two methods (GetBytes and GetString) in the accepted answer will work for me (for serializing the Unicode string as in the C++ project and retrieving it back)? e.g.
these two:
static byte[] GetBytes(string str)
{
    byte[] bytes = new byte[str.Length * sizeof(char)];
    System.Buffer.BlockCopy(str.ToCharArray(), 0, bytes, 0, bytes.Length);
    return bytes;
}

static string GetString(byte[] bytes)
{
    char[] chars = new char[bytes.Length / sizeof(char)];
    System.Buffer.BlockCopy(bytes, 0, chars, 0, bytes.Length);
    return new string(chars);
}
Or am I better off learning what encoding is used in the C++ project?
I will then need to reconstruct the string from the byte array in the same way too. And if I am better off learning which encoding was used in C++, how do I get the length of the string in bytes in C#? Using System.Text.ASCII.WhateverEncodingWasUsedinC++.GetByteCount(string)?
PS. Do you think the C++ code works in an encoding-agnostic way? If so, how can I do the same in C#?
UPDATE: I am guessing the encoding used is UTF16, because I saw it mentioned in several variable names, so I will assume UTF16 is used and look for alternative solutions if something doesn't work out during testing. In that case, what is the best way to get the number of bytes of a UTF16 string? Is the following method OK: System.Text.Encoding.Unicode.GetByteCount(string)?
Feedback and comments welcome. Am I wrong somewhere in my reasoning? Thanks.
Change the method like this to get the byte[] equivalent of the input string:
static byte[] GetBytes(string str)
{
    UnicodeEncoding uEncoding = new UnicodeEncoding();
    byte[] stringContentBytes = uEncoding.GetBytes(str);
    return stringContentBytes;
}
For the reverse:
static string GetString(byte[] bytes)
{
    UnicodeEncoding uEncoding = new UnicodeEncoding();
    string stringContent = uEncoding.GetString(bytes);
    return stringContent;
}
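For what it's worth, if the C++ side does use UTF-16 (as your variable-name evidence suggests), a minimal sketch of the same record layout in C# might look like this. SerializeUnicodeString is a hypothetical helper name, and it assumes UTF-16 little-endian as used on Windows:

using System;
using System.IO;
using System.Text;

static byte[] SerializeUnicodeString(string str)
{
    // Length in bytes of the UTF-16 representation; this answers the
    // GetByteCount question: use Encoding.Unicode, not an ASCII type.
    int byteCount = Encoding.Unicode.GetByteCount(str);
    if (byteCount > 0xFFFF)
        throw new ArgumentException("error");

    using (var ms = new MemoryStream())
    using (var writer = new BinaryWriter(ms))
    {
        writer.Write((ushort)byteCount);              // WORD-sized length prefix (little-endian)
        writer.Write(Encoding.Unicode.GetBytes(str)); // the string bytes, no terminator
        return ms.ToArray();
    }
}

Reading it back is the mirror image: read the ushort length, read that many bytes, and call Encoding.Unicode.GetString on them.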
I'm actually trying to implement a very simple login mechanism for an app I'm developing in Visual C# .NET 2.0 on an embedded device. After some research, I found a code sample on MSDN that performs password hashing:
How to store passwords
Unfortunately, when I try to use it, that code sample raises a FormatException on the call to byte.Parse on the substrings of the hexadecimal string SaltValue. I really have trouble understanding why, since I haven't made any changes to the code.
Here is the code:
using System;
using System.Collections.Generic;
using System.Text;
using System.Security.Cryptography;
using System.Globalization;
private const int SaltValueSize = 4;

private static string GenerateSaltValue()
{
    UnicodeEncoding utf16 = new UnicodeEncoding();
    if (utf16 != null)
    {
        // Create a random number object seeded from the value
        // of the last random seed value. This is done
        // interlocked because it is a static value and we want
        // it to roll forward safely.
        Random random = new Random(unchecked((int)DateTime.Now.Ticks));
        if (random != null)
        {
            // Create an array of random values.
            byte[] saltValue = new byte[SaltValueSize];
            random.NextBytes(saltValue);

            // Convert the salt value to a string. Note that the resulting string
            // will still be an array of binary values and not a printable string.
            // Also it does not convert each byte to a double byte.
            //Original line:
            //string saltValueString = utf16.GetString(saltValue);
            //Replaced by:
            string saltValueString = utf16.GetString(saltValue, 0, SaltValueSize);

            // Return the salt value as a string.
            return saltValueString;
        }
    }
    return null;
}
private static string HashPassword(string clearData, string saltValue, HashAlgorithm hash)
{
    UnicodeEncoding encoding = new UnicodeEncoding();
    if (clearData != null && hash != null && encoding != null)
    {
        // If the salt string is null or the length is invalid then
        // create a new valid salt value.
        if (saltValue == null)
        {
            // Generate a salt string.
            saltValue = GenerateSaltValue();
        }

        // Convert the salt string and the password string to a single
        // array of bytes. Note that the password string is Unicode and
        // therefore may or may not have a zero in every other byte.
        byte[] binarySaltValue = new byte[SaltValueSize];

        //FormatException raised here
        binarySaltValue[0] = byte.Parse(saltValue.Substring(0, 2), System.Globalization.NumberStyles.HexNumber, CultureInfo.InvariantCulture.NumberFormat);
        binarySaltValue[1] = byte.Parse(saltValue.Substring(2, 2), System.Globalization.NumberStyles.HexNumber, CultureInfo.InvariantCulture.NumberFormat);
        binarySaltValue[2] = byte.Parse(saltValue.Substring(4, 2), System.Globalization.NumberStyles.HexNumber, CultureInfo.InvariantCulture.NumberFormat);
        binarySaltValue[3] = byte.Parse(saltValue.Substring(6, 2), System.Globalization.NumberStyles.HexNumber, CultureInfo.InvariantCulture.NumberFormat);

        //...
        //Some more code
        //...
    }
}
I have only changed one line:
string saltValueString = utf16.GetString(saltValue);
to
string saltValueString = utf16.GetString(saltValue, 0, SaltValueSize);
because the first version of the method doesn't seem to be available for embedded C#. But anyway, I've tested without changing this line (in a non-embedded environment), and it still raised a FormatException.
I've copied the SaltValueSize value from that other MSDN code sample (which is related):
How to validate passwords
The test that raises the exception:
HashPassword("youpi", null, new SHA1CryptoServiceProvider());
The problem lies in the fact that your GenerateSaltValue method does not return a string of hexadecimal digits.
It returns a string of random characters that usually are not valid hexadecimal symbols; for me it created a string of mostly Chinese characters, which certainly aren't parseable by the Byte.Parse method.
Also, your example pertains to Microsoft Commerce Server; I have no idea what that is.
"SOLUTION:"
I am not sure what this example wants to accomplish with its string-to-hex-to-binary conversions, but for it to execute successfully, GenerateSaltValue should be something like:
public static string ByteArrayToString(byte[] byteArray)
{
    StringBuilder hex = new StringBuilder(byteArray.Length * 2);
    foreach (byte b in byteArray)
        hex.AppendFormat("{0:x2}", b);
    return hex.ToString();
}

// Renamed GenerateSaltValue method
private static string GenerateHexSaltString()
{
    Random random = new Random();

    // Create an array of random values.
    byte[] saltValue = new byte[SaltValueSize];
    random.NextBytes(saltValue);

    string saltValueString = ByteArrayToString(saltValue);

    // Return the salt value as a string.
    return saltValueString;
}
And your program will "work", thanks to How do you convert Byte Array to Hexadecimal String, and vice versa?
BUT:
Using Random for salt creation is a bad idea.
The string-to-hex-to-binary conversion looks even worse.
And there are other problems...
SO:
Read some articles that really pertain to C# password hashing and encryption, like:
Hash and salt passwords in C#
And be very attentive while searching for code examples; they could use another version, platform, or even language. Good luck.
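As a minimal sketch of the safer direction (this substitutes the framework's cryptographic RNG for Random; it reuses the ByteArrayToString helper and SaltValueSize constant from above):

private static string GenerateHexSaltString()
{
    // RNGCryptoServiceProvider produces cryptographically strong
    // random bytes, unlike System.Random.
    var rng = new RNGCryptoServiceProvider();
    byte[] saltValue = new byte[SaltValueSize];
    rng.GetBytes(saltValue);

    // Hex-encode so that byte.Parse(..., NumberStyles.HexNumber)
    // can read the salt back.
    return ByteArrayToString(saltValue);
}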
I was provided the following code sample in Java and I'm having trouble converting it to C#. How would I go about converting this so it'll work in .NET 4.5?
public static String constructOTP(final Long counter, final String key)
    throws NoSuchAlgorithmException, DecoderException, InvalidKeyException
{
    // setup the HMAC algorithm, setting the key to use
    final Mac mac = Mac.getInstance("HmacSHA512");

    // convert the key from a hex string to a byte array
    final byte[] binaryKey = Hex.decodeHex(key.toCharArray());

    // initialize the HMAC with a key spec created from the key
    mac.init(new SecretKeySpec(binaryKey, "HmacSHA512"));

    // compute the OTP using the bytes of the counter
    byte[] computedOtp = mac.doFinal(
        ByteBuffer.allocate(8).putLong(counter).array());

    //
    // increment the counter and store the new value
    //

    // return the value as a hex encoded string
    return new String(Hex.encodeHex(computedOtp));
}
Here is the C# code that I've come up with, thanks to Duncan pointing out the HMACSHA512 class, but I'm unable to verify the results match without installing Java, which I can't do on this machine. Does this code match the above Java?
public string ConstructOTP(long counter, string key)
{
    var mac = new HMACSHA512(ConvertHexStringToByteArray(key));
    var buffer = BitConverter.GetBytes(counter);

    Array.Resize(ref buffer, 8);

    var computedOtp = mac.ComputeHash(buffer);

    var hex = new StringBuilder(computedOtp.Length * 2);
    foreach (var b in computedOtp)
        hex.AppendFormat("{0:x2}", b);

    return hex.ToString();
}
A SecretKeySpec is used to convert binary input into something that is recognised by Java security providers as a key. It does little more than decorate the bytes with a little post-it note saying "Pssst, it's an HmacSHA512 key...".
You can basically ignore it as a Java-ism. For your .NET code, you just need to find a way of declaring what the HMAC key is. Looking at the HMACSHA512 class, this seems quite straightforward: there is a constructor that takes a byte array containing your key value.
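One caveat worth flagging on the C# port above: Java's ByteBuffer.putLong writes the counter big-endian, while BitConverter.GetBytes is little-endian on most platforms, so the two digests can disagree even with identical keys. A minimal sketch that forces big-endian order (taking the already-decoded key bytes, so it doesn't depend on the hex-decoding helper above):

public static string ConstructOtp(long counter, byte[] binaryKey)
{
    // Match Java's big-endian ByteBuffer.putLong.
    byte[] buffer = BitConverter.GetBytes(counter);
    if (BitConverter.IsLittleEndian)
        Array.Reverse(buffer);

    using (var mac = new HMACSHA512(binaryKey))
    {
        byte[] computedOtp = mac.ComputeHash(buffer);
        var hex = new StringBuilder(computedOtp.Length * 2);
        foreach (byte b in computedOtp)
            hex.AppendFormat("{0:x2}", b);
        return hex.ToString();
    }
}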
I have a method that currently returns a string converted from a byte array:
public static readonly UnicodeEncoding ByteConverter = new UnicodeEncoding();

public static string Decrypt(string textToDecrypt, string privateKeyXml)
{
    if (string.IsNullOrEmpty(textToDecrypt))
    {
        throw new ArgumentException(
            "Cannot decrypt null or blank string"
        );
    }
    if (string.IsNullOrEmpty(privateKeyXml))
    {
        throw new ArgumentException("Invalid private key XML given");
    }

    byte[] bytesToDecrypt = Convert.FromBase64String(textToDecrypt);
    byte[] decryptedBytes;
    using (var rsa = new RSACryptoServiceProvider())
    {
        rsa.FromXmlString(privateKeyXml);
        decryptedBytes = rsa.Decrypt(bytesToDecrypt, FOAEP);
    }
    return ByteConverter.GetString(decryptedBytes);
}
I'm trying to update this method to instead return a SecureString, but I'm having trouble converting the return value of RSACryptoServiceProvider.Decrypt from byte[] to SecureString. I tried the following:
var secStr = new SecureString();
foreach (byte b in decryptedBytes)
{
    char[] chars = ByteConverter.GetChars(new[] { b });
    if (chars.Length != 1)
    {
        throw new Exception(
            "Could not convert a single byte into a single char"
        );
    }
    secStr.AppendChar(chars[0]);
}
return secStr;
However, using this SecureString equality tester, the resulting SecureString was not equal to the SecureString constructed from the original, unencrypted text. My Encrypt and Decrypt methods worked before, when I was just using string everywhere, and I've also tested the SecureString equality code, so I'm pretty sure the problem here is how I'm trying to convert byte[] into SecureString. Is there another route I should take for using RSA encryption that would allow me to get back a SecureString when I decrypt?
Edit: I didn't want to convert the byte array to a regular string and then stuff that string into a SecureString, because that seems to defeat the point of using a SecureString in the first place. However, is it also bad that Decrypt returns byte[] and I'm then trying to stuff that byte array into a SecureString? It's my guess that if Decrypt returns a byte[], then that's a safe way to pass around sensitive information, so converting one secure representation of the data to another secure representation seems okay.
A char and a byte can be used interchangeably with casting, so modify your second chunk of code like so:
var secStr = new SecureString();
foreach (byte b in decryptedBytes)
{
    secStr.AppendChar((char)b);
}
return secStr;
This should work properly, but keep in mind that you're still bringing the unencrypted information into the clear in memory, so there's a point at which it could be compromised (which sort of defeats the purpose of a SecureString).
Update:
A byte[] of your sensitive information is not secure. You can look at it in memory and see the information (especially if it's just a string). The individual bytes will be in the exact order of the string, so reading it is pretty straightforward.
I was (actually about an hour ago) just struggling with this same issue myself, and as far as I know there is no good way to go straight from the decrypter to the SecureString unless the decrypter is specifically programmed to support this strategy.
I think the problem might be your ByteConverter.GetChars method. I can't find that class or method in the MSDN docs; I'm not sure if that is a typo or a homegrown function. Regardless, it is most likely not interpreting the encoding of the bytes correctly. Instead, use the UTF8Encoding's GetChars method. It will properly convert the bytes back into a .NET string, assuming they were encrypted from a .NET string object originally. (If not, you'll want to use the GetChars method on the encoding that matches the original string.)
You're right that using arrays is the most secure approach. Because the decrypted representations of your secret are stored in byte or char arrays, you can easily clear them out when done, so your plaintext secret isn't left in memory. This isn't perfectly secure, but more secure than converting to a string. Strings can't be changed and they stay in memory until they are garbage collected at some indeterminate future time.
var secStr = new SecureString();
var chars = System.Text.Encoding.UTF8.GetChars(decryptedBytes);
for (int idx = 0; idx < chars.Length; ++idx)
{
    secStr.AppendChar(chars[idx]);
    // Clear out the chars as you go.
    chars[idx] = '\0';
}
// Clear the decrypted bytes from memory, too.
Array.Clear(decryptedBytes, 0, decryptedBytes.Length);
return secStr;
Based on Coding Gorilla's answer, I tried the following in my Decrypt method:
string decryptedString1 = string.Empty;
foreach (byte b in decryptedBytes)
{
    decryptedString1 += (char)b;
}
string decryptedString2 = ByteConverter.GetString(decryptedBytes);
When debugging, decryptedString1 and decryptedString2 were not equal:
decryptedString1 "m\0y\0V\0e\0r\0y\0L\0o\0n\0g\0V\03\0r\0y\05\03\0c\0r\03\07\0p\04\0s\0s\0w\00\0r\0d\0!\0!\0!\0"
decryptedString2 "myVeryLongV3ry53cr37p4ssw0rd!!!"
So it looks like I can just go through the byte[] array, cast directly to char, and skip \0 characters. Like Coding Gorilla said, though, this does seem to partly defeat the point of SecureString again, because the sensitive data is floating about in memory in little byte-sized chunks. Any suggestions for getting RSACryptoServiceProvider.Decrypt to return a SecureString directly?
Edit: yep, this works:
var secStr = new SecureString();
foreach (byte b in decryptedBytes)
{
    var c = (char)b;
    if ('\0' == c)
    {
        continue;
    }
    secStr.AppendChar(c);
}
return secStr;
Edit: correction: this works with plain old English strings. Encrypting and then attempting to decrypt the string "標準語 明治維新 english やった" doesn't work as expected because the resulting decrypted string, using this foreach (byte b in decryptedBytes) technique, does not match the original unencrypted string.
Edit: using the following works for both:
var secStr = new SecureString();
foreach (char c in ByteConverter.GetChars(decryptedBytes))
{
    secStr.AppendChar(c);
}
return secStr;
This still leaves a byte array and a char array of the password in memory, which sucks. Maybe I should find another RSA class that returns a SecureString. :/
What if you stuck to UTF-16?
Internally, .NET (and therefore, SecureString) uses UTF-16 (double byte) to store string contents. You could take advantage of this and translate your protected data two bytes (i.e. 1 char) at a time...
When you encrypt, peel off a char and use Encoding.Unicode.GetBytes() to get your two bytes, and push those two bytes into your encryption stream. In reverse, when you are reading from your encrypted stream, read two bytes at a time and use Encoding.Unicode.GetString() to get your char back.
It probably sounds awful, but it keeps all the characters of your secret string from being all in one place, AND it gives you the reliability of character "size" (you won't have to guess if the next single byte is a char, or a UTF marker for a double-wide char). There's no way for an observer to know which characters go with which, nor in which order, so guessing the secret should be near impossible.
Honestly, this is just a suggested idea... I'm about to try it myself, and see how viable it is. My goal is to produce extension methods (SecureString.Encrypt and ICrypto.ToSecureString, or something like that).
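As a rough sketch of the read side of that idea (the ReadSecure name and the stream parameter are illustrative; it assumes the decrypted stream yields UTF-16LE bytes):

static SecureString ReadSecure(Stream decryptedStream)
{
    var secStr = new SecureString();
    var pair = new byte[2];

    // Read one UTF-16 code unit (two bytes) at a time so the whole
    // plaintext never exists in memory as a single string.
    while (decryptedStream.Read(pair, 0, 2) == 2)
    {
        secStr.AppendChar(Encoding.Unicode.GetChars(pair)[0]);
        pair[0] = 0; // scrub the buffer as we go
        pair[1] = 0;
    }

    secStr.MakeReadOnly();
    return secStr;
}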
Use System.Text.Encoding.Default.GetString (GetString on MSDN).
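A minimal usage sketch (decryptedBytes as in the question; note that Encoding.Default reflects the machine's ANSI code page, so given the UnicodeEncoding converter used above, Encoding.Unicode may be the closer match):

string plaintext = System.Text.Encoding.Default.GetString(decryptedBytes);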