Why does Parse work and Convert doesn't in C#?

I want to square every digit in str and concatenate it to pow.
I have this simple code:
string str = "1234";
string pow = "";
foreach(char c in str)
{
pow += Math.Pow(Convert.ToInt32(c), 2);
}
It should return 14916 - instead it returns: 2401250026012704
But if I use int.Parse(c), it returns the correct number.
foreach(char c in str)
{
int i = int.Parse(c.ToString());
pow += Math.Pow(i, 2);
}
Why does Parse work and Convert doesn't?

From the documentation of Convert.ToInt32(char):
The ToInt32(Char) method returns a 32-bit signed integer that represents the UTF-16 encoded code unit of the value argument.
Therefore, for example, the char '1' will be converted to the integer value 49, as defined in the UTF-16 encoding: https://asecuritysite.com/coding/asc2.
An alternative to the int.Parse(c.ToString()) approach is Char.GetNumericValue:
foreach(char c in str)
{
pow += Math.Pow(char.GetNumericValue(c), 2);
}
This converts the char to the numeric value it represents.
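A minimal side-by-side sketch (a small console program; the loop is the one from the question) showing why the two conversions differ:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Convert.ToInt32(char) returns the UTF-16 code unit, not the digit value.
        Console.WriteLine(Convert.ToInt32('1'));      // 49
        // char.GetNumericValue returns the numeric value the digit represents.
        Console.WriteLine(char.GetNumericValue('1')); // 1

        string str = "1234";
        string pow = "";
        foreach (char c in str)
        {
            pow += Math.Pow(char.GetNumericValue(c), 2);
        }
        Console.WriteLine(pow); // 14916
    }
}
```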


C# - Are type conversions (numbers to char) made based on ASCII values?

I'm new to C#, and I'm trying to understand more about data types and conversion. I have the following console program:
using System;
namespace First_lesson
{
class Program
{
static void Main(string[] args)
{
char singleLetter = 'B';
short num = 291;
Console.WriteLine("{0} {1}", num, singleLetter);
singleLetter = Convert.ToChar(num);
Console.WriteLine(singleLetter);
Console.WriteLine("{0} {1}", sizeof(short), sizeof(char));
}
}
}
I don't understand why, when I convert num to a char, I get a lowercase g instead of the hashtag symbol #.
Isn't the conversion made based on ASCII values?
Convert.ToChar returns a char (which is a UTF-16 code unit) that has the specified numeric value.
291 (decimal) is the same as 0x123 (hex), so Convert.ToChar(291) returns a char containing the character U+0123, which is LATIN SMALL LETTER G WITH CEDILLA, i.e., ģ.
When you print this to a non-Unicode console, the accent is stripped off and an ASCII g is printed instead.
The char keyword is used to declare an instance of the System.Char structure that the .NET Framework uses to represent a Unicode character. The value of a Char object is a 16-bit numeric (ordinal) value.
Actually, one char represents a single Unicode (UTF-16) code unit. The decimal value 35 is the code of the # sign, so to print # you need num = 35:
public static void Main()
{
char singleLetter = 'B';
short num = 35;
Console.WriteLine("{0} {1}", num, singleLetter);
singleLetter = Convert.ToChar(num);
Console.WriteLine(singleLetter);
Console.WriteLine("{0} {1}", sizeof(short), sizeof(char));
Console.ReadLine();
}
Try this:
short singleLetter = Convert.ToInt16('B');
Console.WriteLine(singleLetter);
'B' has the decimal value 66.
You can use this page as a reference for the Unicode table
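A short sketch (assuming a Unicode-capable console) of what Convert.ToChar actually produces for 291 versus 35:

```csharp
using System;

class Program
{
    static void Main()
    {
        // 291 decimal == 0x123 == U+0123, LATIN SMALL LETTER G WITH CEDILLA.
        char c = Convert.ToChar(291);
        Console.WriteLine(c == '\u0123');      // True
        // 35 decimal == U+0023, which is '#'.
        Console.WriteLine(Convert.ToChar(35)); // #
    }
}
```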

How to change a char to the previous one (alphabetically) in a string using C#?

Hello, I am searching for a good approach to change a single char in a string to the previous char (alphabetically). For example, given the string "abcd", I want to change the char 'd' to 'c'. How do I change a char to the one before it alphabetically?
i want to use the approach here:
int StringSize=0;
string s=" ";
s = Console.ReadLine();
StringSize = s.Length;
s.Replace(s[StringSize-1],the previous char);
i want to change the char s [StringSize-1] to the previous char of it.
I've tried to do this based on the ASCII code of the character, but I didn't find a method to convert from char to ASCII.
A C# char is a UTF-16 code unit (its numeric value coincides with ASCII for characters in the ASCII range), but to do math on it you need a number.
So:
Cast to int
Do your math (subtract 1)
Cast back to char
char newChar = (char)((int)oldChar - 1);
Or in your code:
s = s.Replace(s[StringSize-1], (char)((int)s[StringSize-1] - 1));
Caveats:
This won't work with 'a' or 'A'
Strings are immutable; you can't just change a character in place. You can create a new string with the character replaced, but that isn't technically the same string.
Replace returns a new string; it does not change the string it is called on, so you have to assign the result back. The solution is:
s = s.Replace(s[StringSize-1], the previous char);
var str = "abcd";
for (int i = 0; i < str.Length; i++)
{
    // Note: Replace changes every occurrence of str[i], not just position i.
    str = str.Replace(str[i], (char)(str[i] - 1));
}
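Since Replace swaps every occurrence of a character, a sketch that changes only the last position copies the string into a char[] first:

```csharp
using System;

class Program
{
    static void Main()
    {
        string s = "abcd";
        // Strings are immutable, so copy to a mutable char[] first.
        char[] chars = s.ToCharArray();
        // Decrement only the last character ('d' -> 'c').
        chars[chars.Length - 1] = (char)(chars[chars.Length - 1] - 1);
        s = new string(chars);
        Console.WriteLine(s); // abcc
    }
}
```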

How to UNHEX() MySQL binary string in C# .NET?

I need to use HEX() in MySQL to get data out of the database and process in C# WinForm code. The binary string needs to be decoded in C#, is there an equivalent UNHEX() function?
From MySQL Doc:
For a string argument str, HEX() returns a hexadecimal string
representation of str where each byte of each character in str is
converted to two hexadecimal digits. (Multi-byte characters therefore
become more than two digits.) The inverse of this operation is
performed by the UNHEX() function.
For a numeric argument N, HEX() returns a hexadecimal string
representation of the value of N treated as a longlong (BIGINT)
number. This is equivalent to CONV(N,10,16). The inverse of this
operation is performed by CONV(HEX(N),16,10).
mysql> SELECT 0x616263, HEX('abc'), UNHEX(HEX('abc'));
        -> 'abc', 616263, 'abc'
mysql> SELECT HEX(255), CONV(HEX(255),16,10);
        -> 'FF', 255
You can use the little-known SoapHexBinary class to parse the hex string:
string hex = "616263";
var byteArr = System.Runtime.Remoting.Metadata.W3cXsd2001.SoapHexBinary.Parse(hex).Value;
var str = Encoding.UTF8.GetString(byteArr);
After fetching the binary string from the database, you can "unhex" it this way:
public static string Hex2String(string input)
{
var builder = new StringBuilder();
for(int i = 0; i < input.Length; i+=2){ //throws an exception if not properly formatted
string hexdec = input.Substring(i, 2);
int number = Int32.Parse(hexdec, NumberStyles.HexNumber);
char charToAdd = (char)number;
builder.Append(charToAdd);
}
return builder.ToString();
}
The method rebuilds the string two hex digits at a time, parsing each pair as a number and appending its char representation to the builder.
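On .NET 5 or later (an assumption about your target framework), Convert.FromHexString does the unhexing directly, and it also handles multi-byte UTF-8 characters that a per-char loop would mangle:

```csharp
using System;
using System.Text;

class Program
{
    static void Main()
    {
        // HEX('abc') in MySQL yields "616263"; decode it back to bytes...
        byte[] bytes = Convert.FromHexString("616263");
        // ...then interpret those bytes with the connection's character set (UTF-8 here).
        Console.WriteLine(Encoding.UTF8.GetString(bytes)); // abc
    }
}
```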

Convert ASCII to HEX keeping line breaks

OK, so I'm making an ASCII-to-HEX converter and it works fine, but when I insert line breaks it replaces them with this character -> Ú
i.e. it turns this
1
2
3
into this
1Ú2Ú3
Code under command buttons
private void asciiToHex_Click(object sender, EventArgs e)
{
HexConverter HexConvert =new HexConverter();
string sData=textBox1.Text;
textBox2.Text = HexConvert.StringToHexadecimal(sData);
}
private void hexToAscii_Click(object sender, EventArgs e)
{
HexConverter HexConvert = new HexConverter();
string sData = textBox1.Text;
textBox2.Text = HexConvert.HexadecimalToString(sData);
}
Code under HexConverter.cs
public class HexConverter
{
public string HexadecimalToString(string Data)
{
string Data1 = "";
string sData = "";
while (Data.Length > 0)
//first take two hex value using substring.
//then convert Hex value into ascii.
//then convert ascii value into character.
{
Data1 = System.Convert.ToChar(System.Convert.ToUInt32(Data.Substring(0, 2), 16)).ToString();
sData = sData + Data1;
Data = Data.Substring(2, Data.Length - 2);
}
return sData;
}
public string StringToHexadecimal(string Data)
{
//first take each character using substring.
//then convert character into ascii.
//then convert ascii value into Hex Format
string sValue;
string sHex = "";
foreach (char c in Data.ToCharArray())
{
sValue = String.Format("{0:X}", Convert.ToUInt32(c));
sHex = sHex + sValue;
}
return sHex;
}
}
Any Ideas?
The problem is that String.Format("{0:X}", Convert.ToUInt32(c)) does not zero-pad its output to two digits, so \r\n becomes DA instead of 0D0A. You'll get a similar problem, but worse, with \t (which becomes 9 instead of 09, which will cause misalignment for subsequent characters as well).
To zero-pad to two digits, you can use X2 instead of bare X; or, more generally, you can use Xn to zero-pad to n digits. (See the "Standard Numeric Format Strings" page on MSDN.)
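A small sketch of the difference, using the CR and LF characters from the question:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Bare {0:X} drops the leading zero: 13 -> "D", 10 -> "A",
        // so "\r\n" comes out as "DA" (and 0xDA is 'Ú').
        Console.WriteLine(String.Format("{0:X}{1:X}", (int)'\r', (int)'\n'));   // DA
        // {0:X2} zero-pads each value to two digits, so the output round-trips.
        Console.WriteLine(String.Format("{0:X2}{1:X2}", (int)'\r', (int)'\n')); // 0D0A
    }
}
```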
Instead of
System.Convert.ToUInt32(hexString), use
uint.Parse(hexString, System.Globalization.NumberStyles.AllowHexSpecifier);
MSDN says the "AllowHexSpecifier flag indicates that the string to be parsed is always interpreted as a hexadecimal value"
How to: Convert Between Hexadecimal Strings and Numeric Types
The laziest thing you could do is a string.Replace("Ú", "\r\n") on the result. Unless there were a compelling reason not to do it this way, I would start here.
Otherwise, in your Char loop, look for the NewLine char and add it as-is to your string.

How can I convert a char to a char* in C#?

How can I convert a char to a char* in C#?
I'm initializeing a String object like this:
String test=new String('c');
and I'm getting this error:
Argument '1': cannot convert from 'char' to 'char*'
That is a bit of a strange way to initialize a string, if you know beforehand what you want to store in it.
You can simply use:
String test="c";
If you have a specific need to convert a char variable to a string, you can use the built in ToString() function:
String test = myCharVariable.ToString();
unsafe
{
char c = 'c';
char *ch = &c;
}
Your example has a String and a compile error from using one of the String constructor overloads, so I'm guessing you really just want an array of chars, aka a String and maybe not a char*.
In which case:
char c = 'c';
string s = c.ToString(); // or...
string s1 = "" + c;
Also available:
unsafe
{
char c = 'c';
char* ch = &c;
string s1 = new string(ch);
string s2 = new string(c, 1); // count must be 1; a count of 0 yields an empty string
}
string myString1 = new string(new char[] {'a'});
string myString2 = 'a'.ToString();
string myString3 = "a";
string myString4 = new string('a', 1);
unsafe {
char a = 'a';
string myString5 = new string(&a);
}
There is no overload of the public constructor for String that accepts a single char as a parameter. The closest match is
public String(char c, int count)
which creates a new String that repeats the char c count times. Thus, you could say
string s = new string('c', 1);
There are other options. There is a public constructor of String that accepts a char[] as a parameter:
public String(char[] value)
This will create a String that is initialized with the Unicode characters in value. Thus you could say
char c = 'c';
string s = new String(new char[] { c });
Another option is to say
char c = 'c';
string s = c.ToString();
But the most straightforward approach that most will expect to see is
string s = "c";
As for converting a char to a char * you can not safely do this. If you want to use the overload of the public constructor for String that accepts a char * as a parameter, you could do this:
unsafe {
char c = 'c';
char *p = &c;
string s = new string(p);
}
Can't hurt to have yet another answer:
string test = string.Empty + 'c';
The String class has many constructors, if all you're after is to create a string containing one character, you can use the following:
String test = new String(new char[] { 'c' });
If you are hard coding it, is there a reason you cant just use:
String test = "c";
How about:
var test = 'c'.ToString();
When using a char in the String constructor, you should also give a count parameter to specify how many times that character should be added to the string:
String test=new String('c', 1);
See also here.
use
String test = "Something";
String test = new String(new char[]{'c'});
The easiest way to do this conversion from your example is just change the type of quotes you are using from single quotes
String test = new String('c');
to double quotes and remove the constructor call:
String test = "c";
unsafe
{
char c = 'R';
char* pc = &c;
}
Using single quotes (as in your question: 'c') means you are creating a char. Using double quotes, e.g. "c", means you are creating a string. These are not interchangeable types in C#.
A char*, as you might be aware, is how strings are commonly represented in C++, and C# supports some of C++'s conventions. This means a char* can easily (for the programmer, at least) be converted to a string in C#. Unfortunately, a char is not inherently a char*, so the same cannot be done.
