How to UNHEX() MySQL binary string in C# .NET?

I need to use HEX() in MySQL to get data out of the database and process it in C# WinForms code. The hex string needs to be decoded in C#. Is there an equivalent of the UNHEX() function?
From MySQL Doc:
For a string argument str, HEX() returns a hexadecimal string
representation of str where each byte of each character in str is
converted to two hexadecimal digits. (Multi-byte characters therefore
become more than two digits.) The inverse of this operation is
performed by the UNHEX() function.
For a numeric argument N, HEX() returns a hexadecimal string
representation of the value of N treated as a longlong (BIGINT)
number. This is equivalent to CONV(N,10,16). The inverse of this
operation is performed by CONV(HEX(N),16,10).
mysql> SELECT 0x616263, HEX('abc'), UNHEX(HEX('abc'));
        -> 'abc', 616263, 'abc'
mysql> SELECT HEX(255), CONV(HEX(255),16,10);
        -> 'FF', 255

You can use the not-widely-known SoapHexBinary class (System.Runtime.Remoting.Metadata.W3cXsd2001, available on .NET Framework) to parse a hex string:
string hex = "616263";
byte[] byteArr = System.Runtime.Remoting.Metadata.W3cXsd2001.SoapHexBinary.Parse(hex).Value;
string str = System.Text.Encoding.UTF8.GetString(byteArr);
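On .NET 5 or later there is also the built-in Convert.FromHexString, which does the same job (a minimal sketch, assuming the hex represents UTF-8 text):
using System;
using System.Text;

string hex = "616263";
byte[] bytes = Convert.FromHexString(hex); // { 0x61, 0x62, 0x63 }
string str = Encoding.UTF8.GetString(bytes);
Console.WriteLine(str); // abc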

After fetching the HEX()-encoded string from the database, you can "unhex" it this way (needs using System.Text; and using System.Globalization;):
public static string Hex2String(string input)
{
    var builder = new StringBuilder();
    // Walk the string two hex digits at a time; throws if the input is not well-formed hex
    for (int i = 0; i < input.Length; i += 2)
    {
        string hexdec = input.Substring(i, 2);
        int number = Int32.Parse(hexdec, NumberStyles.HexNumber);
        builder.Append((char)number);
    }
    return builder.ToString();
}
The method walks the hexadecimal string two digits at a time, parses each pair, and appends the corresponding char to the StringBuilder. Note that it maps each byte to one char, so it only round-trips single-byte data (plain ASCII/Latin-1); multi-byte UTF-8 data should instead be collected into a byte array and decoded with Encoding.UTF8.GetString.
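A quick usage check, matching the MySQL example above (assuming the Hex2String helper is in scope):
string s = Hex2String("616263");
Console.WriteLine(s); // abc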

Related

How to convert a number (larger than 0xFFFF) that represents a Unicode character to its equivalent string in C#

For example, the character '𠀀' in CJK Unified Ideographs Extension B has the Unicode code point U+20000. A char in C# can't represent such a character, so I wonder if I could convert it to a string. My question is:
If I give you a number like 0x20000, how to convert it and let me get its equivalent string like "𠀀"
You can use char.ConvertFromUtf32 for that:
int utf32 = 0x20000;
string text = char.ConvertFromUtf32(utf32);
string itself is a sequence of UTF-16 code units, in this case U+D840 and U+DC00, which you can see by printing out the individual char values:
int utf32 = 0x20000;
string text = char.ConvertFromUtf32(utf32);
Console.WriteLine(((int) text[0]).ToString("x4")); // d840
Console.WriteLine(((int) text[1]).ToString("x4")); // dc00
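For the opposite direction, char.ConvertToUtf32 turns the surrogate pair back into the code point (a short sketch):
int utf32 = 0x20000;
string text = char.ConvertFromUtf32(utf32);
int roundTripped = char.ConvertToUtf32(text, 0);
Console.WriteLine(roundTripped.ToString("x")); // 20000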

Is there a different way to convert to hexadecimal?

I'm trying to write some information to a special device that requires me to encode the string as, and I quote, "an even number of bytes to write (1-32, base 10)".
The example string provided, "DE AD BE EF CA FE", works.
I have converted my string to decimal and from decimal to hexadecimal.
string TextToConvert = "Test Andrei";
TextToConvert = ConvertStringToHex(TextToConvert, Encoding.UTF8);
List<char> Chars = TextToConvert.ToCharArray().ToList();
string CharValue = "";
string secondHexConvert = "";
foreach (char c in Chars)
{
    CharValue += Convert.ToInt32(c);
    secondHexConvert += Convert.ToString(c, 16) + " ";
}
string hexValue = String.Format("{0:X}", CharValue) + " ";
I found a tool on the internet that converts to hexadecimal and it works, but I can't figure out what type of encoding it uses. The site is: https://codebeautify.org/decimal-hex-converter
It converts the decimal "841011151163265110100114101105" to the hex "a9d741e82c990000000000000".
To convert such a big integer to a hexadecimal string, use the aptly named BigInteger type (in System.Numerics):
var num = BigInteger.Parse("841011151163265110100114101105");
string hex = num.ToString("X");
Console.WriteLine(hex);
will output:
0A9D741E82C98FC6A137B75371
But here's a snag: the output you showed in your question is somewhat different. Let me show it together with what the code above produces:
0A9D741E82C98FC6A137B75371
a9d741e82c990000000000000
As you can see, the numbers start the same but your example then ends up with lots of zeroes.
The only way I can see this happening is that the site is in fact not using a type that can hold that many significant digits, so you get a rounding error.
Many dynamic programming languages let you use floating point numbers and integers interchangeably; I guess that is what happened here: a floating point type that can only hold 17-18 significant digits or so was used, and the rest of the precision was lost. .NET, however, doesn't have built-in support for converting floating point types to hexadecimal.
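You can reproduce that kind of precision loss by routing the value through a double first (a sketch; the exact digits depend on rounding, but the trailing zeros appear just as in the website's output):
using System;
using System.Numerics;

double approx = double.Parse("841011151163265110100114101105");
var truncated = new BigInteger(approx); // exact integer value of the rounded double
Console.WriteLine(truncated.ToString("X")); // ends in a long run of zeros, unlike the exact value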
You can see that .NET produces the exact value by converting back:
Console.WriteLine(BigInteger.Parse(hex, System.Globalization.NumberStyles.HexNumber));
outputs:
841011151163265110100114101105
In other words, I'm not sure you can get the exact same results in .NET.
Corollary: Don't use that site for this kind of conversion!
You can use the following code to convert a string to hexadecimal:
public static string ConvertStringToHex(String input, System.Text.Encoding encoding)
{
    // Encode the string to bytes, then write each byte as two uppercase hex digits
    byte[] stringBytes = encoding.GetBytes(input);
    StringBuilder sbBytes = new StringBuilder(stringBytes.Length * 2);
    foreach (byte b in stringBytes)
    {
        sbBytes.AppendFormat("{0:X2}", b);
    }
    return sbBytes.ToString();
}
And you just call it using:
string testString = "11111111";
string hex = ConvertStringToHex(testString, System.Text.Encoding.Unicode);
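Since the device example "DE AD BE EF CA FE" uses space-separated byte pairs, you may want that spaced form as well; BitConverter.ToString gives it to you almost for free (a sketch, UTF-8 assumed; remember the device wants an even number of bytes, so you may need to pad):
using System;
using System.Text;

byte[] bytes = Encoding.UTF8.GetBytes("Test Andrei");
string spaced = BitConverter.ToString(bytes).Replace("-", " ");
Console.WriteLine(spaced); // 54 65 73 74 20 41 6E 64 72 65 69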

Reading A Long From A File After Parsing [C#]

I am trying to read from a file that is structured as such:
VariableName:14326A6AC
Value:Long
Value:Long
I am trying to read it as shown below, but I get a format error. When I add the formatting for hexadecimal (the format the longs are in), they are converted to decimal. Is there a way to keep them as hexadecimal so I don't have to convert back from decimal to hex?
public static long returnLineValue(string lineName)
{
    var lines = File.ReadLines(filePath);
    foreach (var line in lines)
    {
        if (line != null)
        {
            char split = ':';
            if (line.Contains(lineName))
            {
                string[] s = line.Split(split);
                return Int64.Parse(s[1]);
            }
        }
    }
    return 0;
}
This is what you need:
return Convert.ToInt64(s[1], 16);
16 is base 16 (hexadecimal). This function converts a hexadecimal string to a long.
You have to allow hexadecimal values in Parse:
...
// The same Parse but with hexadecimals allowed
return Int64.Parse(s[1], NumberStyles.AllowHexSpecifier, CultureInfo.InvariantCulture);
...
Whenever you want to represent an Int64 in hexadecimal form, use formatting:
Int64 value = 255;
String result = value.ToString("X"); // "X" for hexadecimal, capital letters
// "FF"
Console.Write(result);
Try this:
return Int64.Parse(s[1], System.Globalization.NumberStyles.HexNumber);
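Putting it together, a sketch of the whole method with hex parsing (assuming the same file layout as above; filePath is passed in here instead of being a field):
using System.Globalization;
using System.IO;

public static long ReturnLineValue(string lineName, string filePath)
{
    foreach (var line in File.ReadLines(filePath))
    {
        if (line != null && line.Contains(lineName))
        {
            string[] s = line.Split(':');
            // Parse the right-hand side as hexadecimal instead of decimal
            return long.Parse(s[1], NumberStyles.HexNumber, CultureInfo.InvariantCulture);
        }
    }
    return 0;
}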

C# Converting Problems with String Array to Int Array

I'm working with C# and I have a problem converting a string array to an int array.
First I read a number as a string from the console:
Console.WriteLine("Geben Sie die Nummer ein:");
string wert = Console.ReadLine();
Then I converted the string to an array:
char[] wertarray = wert.ToCharArray();
string wertarray1 = new string(wertarray);
And now comes the problem. I want to convert the string array to an int array, but for example for wertarray1[0] = '1', the int array gets the value 49.
int wertarray2 = Convert.ToInt16(wertarray1[0]);
Normally the int value should be 1, but I don't know where the problem is.
I tried the solutions for "convert a string array to an int array" from this forum, but I still had the problem that the int value gets a strange number.
I'm looking forward to your help.
Thanks :-).
Convert.ToInt16(Char) takes the numeric value of the char (i.e. its Unicode code-point value) and returns that number. While you might think Convert.ToInt16('1') should return 1, consider what would happen if you tried Convert.ToInt16('#') for example.
Use Int16.Parse (or TryParse) to actually parse a string to numbers. As you're working with individual characters to represent 0-9 you might as well do it using simple arithmetic without the need to call any Parse function:
String line = Console.ReadLine();
List<Int16> numbers = new List<Int16>( line.Length );
foreach (Char c in line)
{
    Int16 charValue = (Int16)c;
    // '0' is 48 and '9' is 57, so anything outside that range is not an ASCII digit
    if (charValue < 48 || charValue > 57) throw new Exception("char is not a digit");
    numbers.Add((Int16)(charValue - 48)); // cast back to Int16 because the subtraction yields an int
}
The answers already provided explain your problem and offer solutions too.
In general, you can convert each char to a string and parse it to an integer (not the best performance, though).
If your string is all digits:
var numStr = "136";
var numbers = numStr.Select(n => int.Parse(n.ToString())).ToList(); // {1, 3, 6}
If your string contains non-digit characters too:
var mixStr = "1.k78Tj_n";
int temp;
var numbers2 = new List<int>();
mixStr.ToList().ForEach(n =>
{
if (int.TryParse(n.ToString(), out temp))
numbers2.Add(temp);
}); //{ 1, 7, 8 }
You're getting the Unicode code point value of the characters in your input stream: 49 is the code point of the character '1'.
If you want to convert a unicode character that is a digit to the numeric value of that digit, you can use System.Globalization.CharUnicodeInfo.GetDecimalDigitValue(char c):
var wertarray2 = wert.Select(c => (short)CharUnicodeInfo.GetDecimalDigitValue(c)).ToArray();
This handles all Unicode decimal digits (for example Arabic-Indic or fullwidth digits), not just the standard ASCII digits 0-9.
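Note that GetDecimalDigitValue returns -1 for characters that are not decimal digits, so if the input may contain other characters you can filter first (a sketch):
using System.Globalization;
using System.Linq;

var digitsOnly = wert.Where(char.IsDigit)
                     .Select(c => (short)CharUnicodeInfo.GetDecimalDigitValue(c))
                     .ToArray();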

How to convert Hex to Chinese ASCII character in c#? [duplicate]

I have the following code to convert from HEX to ASCII.
//Hexadecimal to ASCII conversion
private static string hex2ascii(string hexString)
{
    MessageBox.Show(hexString);
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i <= hexString.Length - 2; i += 2)
    {
        sb.Append(Convert.ToString(Convert.ToChar(Int32.Parse(hexString.Substring(i, 2), System.Globalization.NumberStyles.HexNumber))));
    }
    return sb.ToString();
}
input hexString = D3FCC4A7B6FABBB7
output return = Óüħ¶ú»·
The output that I need is 狱魔耳环, but I am getting Óüħ¶ú»· instead.
How would I make it display the correct string?
First, convert the hex string to a byte[], e.g. using the code at How do you convert Byte Array to Hexadecimal String, and vice versa?. Then use System.Text.Encoding.Unicode.GetString(myArray) to turn it into a string, but use the proper encoding, which might not be Unicode. Judging from your example it is a two-byte legacy encoding, which, incidentally, is not "ASCII" (ASCII is 7-bit).
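A sketch of that approach, assuming the bytes are in a legacy Chinese code page such as GBK/GB2312 (which matches the expected output above). On .NET Core / .NET 5+ you also need the System.Text.Encoding.CodePages package and a provider registration first:
using System;
using System.Text;

// Only needed on .NET Core / .NET 5+; on .NET Framework GB2312 is available out of the box
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);

string hexString = "D3FCC4A7B6FABBB7";
byte[] bytes = new byte[hexString.Length / 2];
for (int i = 0; i < bytes.Length; i++)
    bytes[i] = Convert.ToByte(hexString.Substring(i * 2, 2), 16);

string text = Encoding.GetEncoding("GB2312").GetString(bytes);
Console.WriteLine(text); // expected: 狱魔耳环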
