When I use string.Split and Convert.ToInt32 the value changes, why? - c#

For example,
string a = "4,3,2";
a.Split(',');
int one = Convert.ToInt32(a[0]);
int two = Convert.ToInt32(a[2]);
If I were to Console.WriteLine(a[0]); it would give me 4, and Console.WriteLine(a[2]) would give me 2. However, Console.WriteLine(one) and Console.WriteLine(two) give me 52 and 50 respectively. Why is this so?

Character '4' is Unicode code point 52 and character '3' is Unicode code point 51. You're converting characters instead of strings. The problem is that you're ignoring the result of a.Split(',') and then indexing the individual characters of a, and Convert.ToInt32(char) does:
Converts the value of the specified Unicode character to the equivalent 32-bit signed integer.
and
The ToInt32(Char) method returns a 32-bit signed integer that represents the UTF-16 encoded code unit of the value argument.
Instead use the strings after splitting:
string[] split = a.Split(',');
int one = Convert.ToInt32(split[0]);
int two = Convert.ToInt32(split[1]);
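For completeness, a minimal console sketch of the corrected flow (the sample input is the one from the question):
using System;

class Program
{
    static void Main()
    {
        string a = "4,3,2";
        string[] split = a.Split(',');   // ["4", "3", "2"]

        // Convert.ToInt32(string) parses the text, so "4" becomes 4, not 52.
        int one = Convert.ToInt32(split[0]);
        int two = Convert.ToInt32(split[1]);

        Console.WriteLine(one); // 4
        Console.WriteLine(two); // 3
    }
}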

Related

c# - How to convert from int to short?

Doing an exercise where the output is different from the input.
For example, if I input the number "6" into the console, the output will be 54. What is the reason behind this output, and how could I get the same number as the input?
static void Main()
{
    int intVal;
    short shortVal;
    intVal = Console.Read();
    shortVal = (short)intVal;
    Console.WriteLine("{0}", shortVal);
}
The answer is in the documentation for Console.Read:
Returns [...] The next character from the input stream, or negative one (-1) if there are currently no more characters to be read.
It is written a bit sloppily, because technically it returns the Unicode character code of the character.[1]
If you type in the character '6', what is the ordinal Unicode character value for this character? It's 54 (hex 0x0036; think of ASCII codes, but with Unicode rather than ASCII as the encoding). And that's precisely what you are seeing here.
If you want to get the numeric value of the digit the character represents (which is not the same as the numerical Unicode character code), take a look at the Unicode character codes for the characters 0...9. They occupy the character code range from 0x0030 (for the '0' character) to 0x0039 (for the '9' character). It shouldn't be hard to notice that you can simply subtract the Unicode character code of the '0' character to get the value of the digit these characters represent:
int intVal = Console.Read();
if (intVal < '0' || intVal > '9')
{
Console.WriteLine("Not a numerical digit.");
}
else
{
var digitValue = intVal - '0';
Console.WriteLine("{0}", digitValue);
}
Now, this hopefully helps further your understanding of character codes. However, the example code I have given here could be written differently, as the char type has a convenient method that provides the digit value of characters which represent digits (or other numerical values, like Roman numerals for example):
int intVal = Console.Read();
if (intVal <= -1)
{
Console.WriteLine("No character entered");
}
else
{
var digitValue = char.GetNumericValue((char) intVal);
if (digitValue < 0)
{
Console.WriteLine("Not a numerical digit.");
}
else
{
Console.WriteLine("{0}", digitValue);
}
}
[1] If the method were to return a character, its return type would be char and not int. However, the char type can be converted/cast to a numeric value such as int or short and back, to get a character's Unicode character value and vice versa, so this distinction between characters and Unicode character code values seldom matters in practical situations. Here, though, it may be useful to make the distinction to get a better understanding of what you observed.
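As a small illustration of that round trip, using nothing but standard casts (in a console context):
char c = '6';
int code = c;              // implicit char -> int: 54, the UTF-16 code unit
char back = (char)code;    // explicit int -> char: '6' again
Console.WriteLine($"{c} -> {code} -> {back}");   // prints: 6 -> 54 -> 6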

Convert int to alphabetical value

I would like to convert ints between 1-26 to the corresponding alphabetical char letter.
I saw somewhere I could use char outPut = (char)currentValue; or Convert.ToChar(currentValue), however both do not seem to work. I am testing this by having the program print the char to the console. Both attempts have resulted in an empty character being printed.
I am making a program that takes in a char letter, converts it to an int to do some maths on it, and then converts the final int back to a char letter and prints the char to the console.
char outPut = (char)currentValue;
would give you a char representing a non-printable control character (the integer values below 32 map to control characters, which is why you see an empty character printed).
What you actually want is a char representing a letter. The following code will give you the ith letter of the alphabet in lowercase:
char c = (char)((int)'a' + i - 1);
Or in uppercase:
char c = (char)((int)'A' + i - 1);
It uses the fact that letters are represented by consecutive integers.
You can offset by whatever character you want. In this case, I use 'a'.
Console.WriteLine((char)(5 + 'a'));
As your range starts with 1, you need to use 'a'-1.
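A quick check of that offset, assuming the range 1-26 should map to 'a'-'z':
Console.WriteLine((char)(1 + 'a' - 1));    // a
Console.WriteLine((char)(26 + 'a' - 1));   // z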
int and char can be converted back and forth using the ASCII table.
The ASCII value for 'A' is 65, so you can get char 'A' by doing
char a = (char) 65;
To convert it back to the 1-26 representation, just do
int num = a - 65 + 1;
You would need to do some boundary check to make sure the number is within range of course.
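Putting that together, a sketch of helpers for both directions with a range check (the method names are illustrative, not from the question):
// Hypothetical helpers for the 1-26 <-> 'A'-'Z' mapping described above.
static char NumberToLetter(int num)
{
    if (num < 1 || num > 26)
        throw new ArgumentOutOfRangeException(nameof(num));
    return (char)('A' + num - 1);    // 1 -> 'A', 26 -> 'Z'
}

static int LetterToNumber(char letter)
{
    if (letter < 'A' || letter > 'Z')
        throw new ArgumentOutOfRangeException(nameof(letter));
    return letter - 'A' + 1;         // 'A' -> 1, 'Z' -> 26
}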

How can I get third and fourth letters from string?

I want to get the third and fourth letters from PlayerPrefs.GetString("String") and parse them to int.
For example:
string playerLevelstr = PlayerPrefs.GetString("Player")[2] + PlayerPrefs.GetString("Player")[3];
// My Player string is "0012000000"; when I add the third and fourth letters, playerLevelstr should be "12" but it is "96".
int playerLevelint = int.Parse(playerLevelstr);
The indexer on a string returns a char.
If you use the + operator on chars together, it essentially does integer math on the two chars.
See this question for more information on that.
Though that means you should get 99 ('1' is 49, '2' is 50), not 96. But maybe that was a typo on one end or the other?
Regardless, you should either convert the chars to strings (.ToString() on them) or use the Substring function on the string instead. And don't forget your length/null checks!
This code PlayerPrefs.GetString("Player")[2] returns a char, which is converted to an int (the ASCII value of the character) when you add it to another char.
Do this instead:
string playerLevelstr = PlayerPrefs.GetString("Player").Substring(2,2);
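For illustration, here are both approaches on a plain string literal standing in for PlayerPrefs.GetString("Player") (PlayerPrefs is Unity's API; indexing and Substring behave the same on any string):
string player = "0012000000";   // stand-in for PlayerPrefs.GetString("Player")

// Option 1: convert each char to a string before concatenating.
string levelA = player[2].ToString() + player[3].ToString();   // "12"

// Option 2: take a two-character substring starting at index 2.
string levelB = player.Substring(2, 2);                        // "12"

int playerLevel = int.Parse(levelB);                           // 12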

Why does Visual Studio add a slash at the start of an int, when it's converted to char?

When I convert int 0 to char, it changes to \0. Why? I need char 0.
\0 is just how Visual Studio shows the null character in debug windows.
The \ is an escape sequence, so \0 tells you this is the null character.
If you want the character '0', you need to use the correct ASCII value for that character - 48.
char n = (char)0; // The _null_ character, displayed \0
char z = (char)48; // The character 0
char z2 = '0'; // The character 0, using a character literal
To receive the char '0', convert the int 48 to char, or simply assign it directly:
var c = '0';
When you convert int 0 to char, you receive the ASCII symbol at decimal position 0, which is the null character. The char '0' has the ASCII value 48, so converting 48 to char will result in the '0' char.
Check ASCII symbols and their decimal representations at ASCII Table. Remember that any time you convert an int to a char, this ASCII table is taken into account.
If you want the char 0, you need to use either a character literal:
'0'
or, you can use
0.ToString()[0]
Depending on where the value is coming from.
\0 is an escape sequence for the character with the ASCII value 0, and this is different from the character for the digit 0.
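For instance, if the zero arrives as an int variable (the name digit below is just illustrative), the ToString route looks like this:
int digit = 0;
char c = digit.ToString()[0];   // '0' - works for any single-digit value 0-9
Console.WriteLine(c);           // prints: 0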

Convert Integer to Ascii string in C#

I want to convert an integer to a 3-character ASCII string. For example, if the integer is 123, then my ASCII string will also be "123". If the integer is 1, then my ASCII string will be "001". If the integer is 45, then my ASCII string will be "045". So far I've tried Convert.ToString but could not get the result. How?
int myInt = 52;
string myString = myInt.ToString("000");
myString is "052" now. Hope it will help
Answer for the new question:
You're looking for String.PadLeft. Use it like myInteger.ToString().PadLeft(3, '0'). Or, simply use the "0" custom format specifier. Like myInteger.ToString("000").
Answer for the original question, returning strings like "0x31 0x32 0x33":
String.Join(" ",myInteger.ToString().PadLeft(3,'0').Select(x=>String.Format("0x{0:X}",(int)x))
Explanation:
The first ToString() converts your integer 123 into its string representation "123".
PadLeft(3,'0') pads the returned string out to three characters using a 0 as the padding character
Strings are enumerable as an array of char, so .Select selects into this array
For each character in the array, format it as 0x then the value of the character
Casting the char to int will allow you to get the ASCII value (you may be able to skip this cast, I am not sure)
The "X" format string converts a numeric value to hexadecimal
String.Join(" ", ...) puts it all back together again with spaces in between
It depends on whether you actually want ASCII characters or text. The code below will do both.
int value = 123;
// Convert value to text, adding leading zeroes.
string text = value.ToString().PadLeft(3, '0');
// Convert text to ASCII.
byte[] ascii = Encoding.ASCII.GetBytes(text);
Realise that .NET doesn't use ASCII for text manipulation. You can save ASCII to a file, but if you're using string objects, they're encoded in UTF-16.
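For example, with value = 123 the snippet above produces text = "123" and the ASCII bytes 49, 50, 51, which you can verify like this (string.Join accepts the byte array directly):
Console.WriteLine(text);                      // 123
Console.WriteLine(string.Join(" ", ascii));   // 49 50 51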
