Need to convert string/char to ASCII values - C#

I need to convert each character of a string to its hex ASCII value. Referring to the ASCII table, a few examples:
char '1' = 31
'2' = 32
'3' = 33
'4' = 34
'5' = 35
'A' = 41
'a' = 61, etc.
So for string str = "12345";
I need to get the converted str = "3132333435".

I think this is all you'll need:
string finalValue = "";
byte[] ascii = Encoding.ASCII.GetBytes(yourString);
foreach (byte b in ascii)
{
    finalValue += b.ToString("X");
}
More info on MSDN: http://msdn.microsoft.com/en-us/library/system.text.encoding.ascii.aspx
Edit: to hex:
string finalValue = "";
foreach (char c in myString)
{
    int value = Convert.ToInt32(c);
    finalValue += value.ToString("X");
    // or finalValue = String.Format("{0}{1:X}", finalValue, value);
}
// use finalValue

var hex = string.Join("", from c in "12345" select ((int)c).ToString("X"));

string s = "abc123";
foreach (char c in s)
{
    Response.Write((int)c + ",");
}

To get it in a single line, and more readable (imo):
var result = "12345".Aggregate("", (res, c) => res + ((byte)c).ToString("X"));
this returns "3132333435", just as you requested :)
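Pulling the snippets above together, here is a minimal sketch (the helper name `ToHex` is my own) that uses a StringBuilder and "X2" so that bytes below 0x10 still produce two hex digits:

```csharp
using System;
using System.Text;

class Program
{
    // Convert each ASCII byte of the input to exactly two hex digits.
    static string ToHex(string input)
    {
        var sb = new StringBuilder(input.Length * 2);
        foreach (byte b in Encoding.ASCII.GetBytes(input))
            sb.Append(b.ToString("X2")); // "X2" pads e.g. 0x09 to "09"
        return sb.ToString();
    }

    static void Main()
    {
        Console.WriteLine(ToHex("12345")); // prints "3132333435"
    }
}
```

The StringBuilder avoids allocating a new string on every concatenation inside the loop.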


Convert a binary string to int in C#

I have this code:
string result = "";
foreach (char item in texte)
{
    result += Convert.ToString(item, 2).PadLeft(8, '0');
}
So I have a string named result which is the binary representation of a word like 'bonjour'.
For texte = "bonjour" I get result = "01100010011011110110111001101010011011110111010101110010".
And when I do
Console.WriteLine(result[0])
I obtain 0, as I expected, but if I do
Console.WriteLine((int)result[0])
or
Console.WriteLine(Convert.ToInt32(result[0]))
I obtain 48!
I don't want 48; I want 0 or 1 as an integer.
Could you help me please?
You can just subtract 48 from it:
Console.WriteLine(result[0] - 48);
because the digit characters '0'–'9' are encoded as 48 to 57. Equivalently, result[0] - '0' does the same thing without the magic number.
If you want to access each bit by index, I suggest using a BitArray instead:
var bytes = Encoding.ASCII.GetBytes("someString");
var bitArray = new BitArray(bytes);
// now you can access the first bit like so:
bool bit = bitArray.Get(0);              // returns a bool
int bitValue = bitArray.Get(0) ? 1 : 0;  // gives you a 1 or 0
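One caveat worth knowing, sketched below: BitArray indexes bits least-significant-first within each byte, so Get(0) is the low bit of the first byte, not the leftmost character of the padded binary string built above.

```csharp
using System;
using System.Collections;
using System.Text;

class Program
{
    static void Main()
    {
        var bytes = Encoding.ASCII.GetBytes("b"); // 'b' = 0x62 = binary 01100010
        var bits = new BitArray(bytes);
        // Get(0) is the LOW bit of 0x62, i.e. the rightmost digit of "01100010".
        Console.WriteLine(bits.Get(0) ? 1 : 0); // prints 0
        Console.WriteLine(bits.Get(1) ? 1 : 0); // prints 1
    }
}
```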
You can try this code:
string a = "23jlfdsa890123kl21";
byte[] data = System.Text.Encoding.Default.GetBytes(a);
StringBuilder result = new StringBuilder(data.Length * 8);
foreach (byte b in data)
{
    result.Append(Convert.ToString(b, 2).PadLeft(8, '0'));
}
Just do this:
Console.WriteLine(Convert.ToInt32(Convert.ToString(result[0])));
You're expecting it to behave the same as Convert.ToInt32(string input), but you're actually invoking Convert.ToInt32(char input), and if you check the docs, they explicitly state it returns the Unicode value (in this case the same as the ASCII value).
http://msdn.microsoft.com/en-us/library/ww9t2871(v=vs.110).aspx

How to sum existing byte array value with another HEX value in C#?

I need to decode a string in C#. The algorithm is performed on hex values, so in C# I think I need to convert to a byte array — am I right? So I made a byte array from a string:
string Encoded = "ENCODEDSTRINGSOMETHING";
byte[] ba = Encoding.Default.GetBytes(Encoded);
Now I need to modify each byte in the array: start by adding the hex value 0x20 to the first byte, then for each following byte decrease the offset by 0x01 and add it to that byte. Then I need to convert the byte array back to a string and print it. In Python this is very easy:
def decode():
    strEncoded = "ENCODEDSTRINGSOMETHING"
    strDecoded = ""
    counter = 0x20
    for ch in strEncoded:
        ch_mod = ord(ch) + counter
        counter -= 1
        strDecoded += chr(ch_mod)
    print("%s" % strDecoded)

if __name__ == '__main__':
    decode()
How can I do it in C#? Thank you very much.
Here's a rough outline of how to do what you are trying to do. Might need to change it a bit to fit your problem/solution.
public string Encode(string input, int initialOffset = 0x20)
{
    string result = "";
    foreach (var c in input)
    {
        result += (char)(c + (initialOffset--));
    }
    return result;
}
Try this code (the byte array from your question isn't actually needed, since you can work on the characters directly):
string Encoded = "ENCODEDSTRINGSOMETHING";
string strDecoded = "";
int counter = 0x20;
foreach (char c in Encoded)
{
    int ch_mod = (int)c + counter;
    counter -= 1;
    strDecoded += (char)ch_mod;
}
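For completeness, the same loop wrapped in a method (the name `Decode` is my own, mirroring the Python version):

```csharp
using System;

class Program
{
    static string Decode(string encoded)
    {
        string decoded = "";
        int counter = 0x20; // starting offset, as in the Python version
        foreach (char c in encoded)
        {
            decoded += (char)(c + counter); // add the current offset
            counter -= 1;                   // then decrement it for the next char
        }
        return decoded;
    }

    static void Main()
    {
        // 'A' (65) + 0x20 (32) = 97 = 'a'
        Console.WriteLine(Decode("A")); // prints "a"
    }
}
```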

Decoding string Char.Parse issue

I have the following code to encode plain text:
int AddNumber;
int AsciiNumber;
string OneChar;
string String1 = "DAVE SMITH";
string String2 = "";
for (int i = 0; i < String1.Length; i++)
{
    AddNumber = i + 95;
    AsciiNumber = (int)Char.Parse(String1.Substring(i, 1));
    byte[] NewAscii = new byte[] { Convert.ToByte(AsciiNumber + AddNumber) };
    // Get string of the NewAscii
    OneChar = Encoding.GetEncoding(1252).GetString(NewAscii);
    String2 = String2 + OneChar;
}
The problem I have is how to decode the string back to plain text. Here is my attempt code:
String1 = "";
for (int i = 0; i < String2.Length; i++)
{
    AddNumber = i + 95;
    AsciiNumber = (int)Char.Parse(String2.Substring(i, 1));
    byte[] NewAscii = new byte[] { Convert.ToByte(AsciiNumber - AddNumber) };
    // Get string of the NewAscii
    OneChar = Encoding.GetEncoding(1252).GetString(NewAscii);
    String1 = String1 + OneChar;
}
The problem is that above, on processing the encoded empty space (between DAVE and SMITH), the value AsciiNumber = (int)Char.Parse(String2.Substring(i,1)) is 402 where it should be 131.
Do you see what I am misunderstanding?
By adding 99 (95 plus index 4) to a space (ASCII 32) you end up with byte 131. You then ask for the Windows-1252 character at 131, which is a Latin ƒ, and store it in C#'s native Unicode string; C# maps that ƒ to UTF-16 for storage in memory. Later, you ask for that character back: its Unicode code point is U+0192, which is decimal 402. Trying to convert that back to a Windows-1252 byte by subtracting obviously fails, since 402 no longer fits in a byte.
What you probably want to do is use Encoding.GetBytes to convert the Unicode text back to Windows-1252 before manipulating the characters.
For the decoding part
String1 = "";
for (int i = 0; i < String2.Length; i++)
{
    var charByte = System.Text.Encoding.GetEncoding(1252).GetBytes(String2.Substring(i, 1));
    AddNumber = i + 95;
    AsciiNumber = Convert.ToInt32(charByte[0]) - AddNumber;
    String1 += Convert.ToChar(AsciiNumber);
}
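A quick round-trip sketch of the encode/decode pair. Note that on .NET Core/.NET 5+, Encoding.GetEncoding(1252) requires registering CodePagesEncodingProvider (from the System.Text.Encoding.CodePages package) first; on .NET Framework it works out of the box.

```csharp
using System;
using System.Text;

class Program
{
    static void Main()
    {
        // Encoding.RegisterProvider(CodePagesEncodingProvider.Instance); // needed on .NET Core/5+
        var enc1252 = Encoding.GetEncoding(1252);
        string plain = "DAVE SMITH";

        // Encode: shift each Windows-1252 byte up by (95 + index).
        string encoded = "";
        for (int i = 0; i < plain.Length; i++)
        {
            byte b = enc1252.GetBytes(plain.Substring(i, 1))[0];
            encoded += enc1252.GetString(new[] { (byte)(b + i + 95) });
        }

        // Decode: convert back to 1252 bytes first, then shift down.
        string decoded = "";
        for (int i = 0; i < encoded.Length; i++)
        {
            byte b = enc1252.GetBytes(encoded.Substring(i, 1))[0];
            decoded += (char)(b - (i + 95));
        }

        Console.WriteLine(decoded == plain); // prints True
    }
}
```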

RC4 Encryption non-alphanumeric wrong

Background: I'm trying to convert Mike Shaffer's VB RC4 encryption to C# (https://web.archive.org/web/20210927195845/https://www.4guysfromrolla.com/articles/091802-1.3.aspx). See a previous question of mine at Converting Mike Shaffer's RC4Encryption to C#.
It seems my encryption is not working.
Using the demo page at: https://web.archive.org/web/20000303125329/http://www.4guysfromrolla.com:80/demos/rc4test.asp, with password of "abc":
Plain text: og;|Q{Fe should result in
A2 FA E2 55 09 A4 AB 16
However, my code is generating the 5th char as 9, instead of 09:
A2 FA E2 55 9 A4 AB 16
Another example - Plain text: cl**z!Ss should result in
AE F1 F3 03 22 FE BE 00
However, my code is generating:
AE F1 F3 3 22 FE BE 0
It seems it's only a problem with certain non-alphanumeric characters.
Here's my code:
private static string EnDeCrypt(string text)
{
    int i = 0;
    int j = 0;
    string cipher = "";

    // Call our method to initialize the arrays used here.
    RC4Initialize(password);

    // Set up a for loop. Again, we use the Length property
    // of our String instead of the Len() function
    for (int a = 1; a <= text.Length; a++)
    {
        // Initialize an integer variable we will use in this loop
        int itmp = 0;

        // Like the RC4Initialize method, we need to use the %
        // in place of Mod
        i = (i + 1) % 256;
        j = (j + sbox[i]) % 256;
        itmp = sbox[i];
        sbox[i] = sbox[j];
        sbox[j] = itmp;

        int k = sbox[(sbox[i] + sbox[j]) % 256];

        // Again, since the return type of String.Substring is a
        // string, we need to convert it to a char using
        // String.ToCharArray() and specifying that we want the
        // first value, [0].
        char ctmp = text.Substring(a - 1, 1).ToCharArray()[0];
        itmp = ctmp; // there's an implicit conversion from char to int
        int cipherby = itmp ^ k;
        cipher += (char)cipherby; // just cast cipherby to a char
    }

    // Return the value of cipher as the return value of our method
    return cipher;
}

public static string ConvertAsciiToHex(string input)
{
    return string.Join(string.Empty, input.Select(c => Convert.ToInt32(c).ToString("X")).ToArray());
}

public static string Encrypt(string text)
{
    return ConvertAsciiToHex(EnDeCrypt(text));
}
Here's how I get my encrypted result:
var encryptedResult = RC4Encrypt.Encrypt(valuetoencrypt);
The output is correct (leading zeros don't change the value); your code simply isn't padding values that fit in a single hex digit (such as 9, 3, or 0). Use .ToString("X2") instead of .ToString("X").
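A minimal illustration of the difference between the two format strings:

```csharp
using System;

class Program
{
    static void Main()
    {
        byte b = 0x09;
        Console.WriteLine(b.ToString("X"));  // prints "9"  - single digit, corrupts a hex stream
        Console.WriteLine(b.ToString("X2")); // prints "09" - always two digits per byte
    }
}
```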

C#: How do i modify this code to split the string with hyphens?

Hello, I need some help modifying this code so it splits this string with hyphens:
string KeyString = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890";
I would like to lay it out like this:
1234-1234-1234-1234
with a length of 4 chars per segment and about 16 chars in total.
Full Code
private static string GetKey()
{
    string KeyString = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890";
    char[] chars = KeyString.ToCharArray();
    RNGCryptoServiceProvider crypto = new RNGCryptoServiceProvider();
    byte[] data = new byte[1];
    crypto.GetNonZeroBytes(data);
    data = new byte[8];
    crypto.GetNonZeroBytes(data);
    StringBuilder result = new StringBuilder(8);
    foreach (byte b in data)
    {
        result.Append(chars[b % (chars.Length - 1)]);
    }
    return result.ToString();
}
The code is used for generating random IDs.
Any help would be appreciated.
Can't... resist... regex... solution
string KeyString = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890";
Console.WriteLine(
    new System.Text.RegularExpressions.Regex(".{4}")
        .Replace(KeyString, "$&-").TrimEnd('-'));
outputs
abcd-efgh-ijkl-mnop-qrst-uvwx-yzAB-CDEF-GHIJ-KLMN-OPQR-STUV-WXYZ-1234-5678-90
But yes, if this is a Guid, by all means, use that type.
How about:
StringBuilder result = new StringBuilder();
for (var i = 0; i < data.Length; i++)
{
    if (i > 0 && (i % 4) == 0)
        result.Append("-");
    result.Append(chars[data[i] % (chars.Length - 1)]);
}
// No trailing '-' to trim: the hyphen is inserted before each new segment.
It might be worthwhile to have a look at Guid; it produces strings in the format xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx (32 hex digits). If you definitely want 16 chars, you could take 16 chars from the middle of the 32 generated by Guid:
Guid guid = Guid.NewGuid(); /* f.e.: 0f8fad5b-d9cb-469f-a165-70867728950e */
string randomid = guid.ToString().Substring(4, 19);
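For a modern take (a sketch: on current .NET, RNGCryptoServiceProvider is obsolete and RandomNumberGenerator.Fill is the usual replacement), here is the generation and hyphenation combined. Note that b % chars.Length has a slight modulo bias, since 256 is not a multiple of 62 — fine for IDs, not for secrets.

```csharp
using System;
using System.Linq;
using System.Security.Cryptography;

class Program
{
    static string GetKey(int segments = 4, int segmentLength = 4)
    {
        const string chars = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890";
        var data = new byte[segments * segmentLength];
        RandomNumberGenerator.Fill(data); // cryptographically strong random bytes
        var raw = new string(data.Select(b => chars[b % chars.Length]).ToArray());
        // Join every 4-character slice with a hyphen: "xxxx-xxxx-xxxx-xxxx".
        return string.Join("-", Enumerable.Range(0, segments)
            .Select(i => raw.Substring(i * segmentLength, segmentLength)));
    }

    static void Main()
    {
        Console.WriteLine(GetKey()); // e.g. "aB3x-9ZkQ-mN2p-7RtV"
    }
}
```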
