Convert chars corresponding to digits to uint - c#

I need to traverse a string, which should be a string of digits, and perform some arithmetic operations on those digits:
for (int i = data.Length - 1; i >= 0; --i)
{
    uint curDigit;
    // Convert the current rightmost digit to uint; if the conversion fails, return false (valid data should be numeric)
    try
    {
        curDigit = Convert.ToUInt32(data[i]);
        //arithmetic ops...
    }
    catch
    {
        return false;
    }
}
I tested it with the following input data string:
"4000080706200002"
For i = 15, corresponding to the rightmost digit 2, I get 50 as the output from
curDigit = Convert.ToUInt32(data[i]);
Can someone please explain what is wrong and how to correct the issue?

50 is the ASCII code for '2'. What you need is '2' - '0' (50 - 48):
byte[] digits = "4000080706200002".Select(x => (byte)(x - '0')).ToArray();
http://www.asciitable.com/

What you are getting back is the ASCII value of the character '2'. You can call ToString on the character and then call Convert.ToUInt32. Consider this example:
char x = '2';
uint curDigit = Convert.ToUInt32(x.ToString());
This will give you back 2 as curDigit.
For your code you can just use:
curDigit = Convert.ToUInt32(data[i].ToString());
Another option is to use char.GetNumericValue like:
uint curDigit = (UInt32) char.GetNumericValue(data[i]);
char.GetNumericValue returns a double, which you can cast to UInt32.

The problem is that data[i] returns a char, which is essentially an integer holding the ASCII code of the character. So '2' corresponds to 50.
There are two things you can do to overcome this behaviour:
curDigit = Convert.ToUInt32(data[i] - '0'); // Better: subtract the ASCII representation of '0' from the char
curDigit = Convert.ToUInt32(data.Substring(i, 1)); // Use Substring to get a string instead of a char. Note that this method is less efficient, as converting from a string essentially splits it into chars and subtracts '0' from each of them anyway.

You're getting the ASCII (or Unicode) values for those characters. The problem is that the code points for the characters '0' … '9' are not 0 … 9, but 48 … 57. To fix this, you need to adjust by that offset. For example:
curDigit = Convert.ToUInt32(data[i] - '0');
Or
curDigit = Convert.ToUInt32(data[i] - 48);
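Applied to the loop from the question, the fix could look roughly like this (a sketch, not the original poster's final code; the char.IsDigit check is my own way of keeping the "return false for non-numeric data" behaviour without the try/catch):
for (int i = data.Length - 1; i >= 0; --i)
{
    // Valid data should be numeric; bail out on anything that is not a digit.
    if (!char.IsDigit(data[i]))
        return false;
    uint curDigit = (uint)(data[i] - '0');
    //arithmetic ops...
}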

Rather than messing around with ASCII calculations you could use UInt32.TryParse as an alternative solution. However, this method requires a string input, not a char, so you would have to modify your approach a little:
string input = "4000080706200002";
string[] digits = input.Select(x => x.ToString()).ToArray();
foreach (string digit in digits)
{
    uint curDigit = 0;
    if (UInt32.TryParse(digit, out curDigit))
    {
        //arithmetic ops...
    }
    //else failed to parse
}

Related

C#--Counting the number of occurrences of a single digit(int) in a string--not working with int to char converted Variable

I'm trying to count the number of times an int appears in a string
int count = numbers
.Where(x => x == '0')
.Count();
If I type in the literal character I want to check for, as seen above with '0', it works.
But I want to use this as a method where I can check for other digits. This, unfortunately, doesn't work when I convert an int to a char and insert the digit variable:
char digit = Convert.ToChar(0);
int count= numbers
.Where(x => x == digit)
.Count();
What am I doing wrong?
Convert.ToChar(Int32) returns the Unicode character equivalent to the value of the passed int. For example, it will convert 65 to A:
Console.WriteLine(Convert.ToChar(65)); // prints "A"
If I understand your requirement correctly, you can call ToString and take the first char of the resulting string instead of Convert.ToChar(0):
char digit = 0.ToString()[0];
Or as this answer suggests:
char c = (char)(i + 48);
Convert.ToChar(0) converts your value to a character according to the character encoding table (e.g. ASCII). 0 in this case is the null character, whereas the character '0' has the value 48 according to the table.
You can use this snippet to compare your int digit to the chars:
string numbers = "33";
int digit = 3;
int count = numbers
    // '0' - '0' (48 - 48) will give you the int value 0,
    // '1' - '0' (49 - 48) will give you 1, etc.
    .Where(x => x - '0' == digit)
    .Count();
But keep in mind that your digit might be greater than 9; then it would take more than one character, which is a slightly different problem.
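If you do need to handle values above 9, one option (a minimal sketch of my own, with made-up example values) is to compare against the number's string form instead of a single char, for example by scanning with String.IndexOf:
string numbers = "123312";
string needle = 12.ToString(); // the multi-digit value to count, as text
int count = 0;
int pos = 0;
// Count non-overlapping occurrences of "12" in the input.
while ((pos = numbers.IndexOf(needle, pos)) != -1)
{
    count++;
    pos += needle.Length;
}
Console.WriteLine(count); // 2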

Display the sum of digits in a number entered by the user

I need to get a number from the user and display the sum of that number's digits. For example, the sum of the digits in the number 12329 is 17.
Here's what I tried to do and it is giving me the ASCII code instead:
Console.WriteLine("please enter a number: ");
string num = Console.ReadLine();
int len = num.Length;
int[] nums = new int[len];
int sum = 0;
int count = 0;
while (count < len)
{
    nums[count] = Convert.ToInt32(num[count]);
    count++;
}
for (int i = 0; i < len; i++)
    sum += nums[i];
Console.WriteLine(sum);
This is a very common mistake. char is really just a number - the encoding value of the character represented by the char. When you do Convert.ToInt32 on it, it sees the char as a number and says "alright let's just convert this number to 32 bits and return!" instead of trying to parse the character.
"Wait, where have I used a char in my code?" you might ask. Well, here:
Convert.ToInt32(num[count]) // 'num[count]' evaluates to 'char'
To fix this, you need to convert the char to a string:
nums[count] = Convert.ToInt32(num[count].ToString());
^^^^^^^^^^^^^^^^^^^^^
Now you are calling a different overload of the ToInt32 method, which actually tries to parse the string.
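To make the difference between the two overloads concrete, here is a small illustration (my own example, assuming a console program):
char c = '2';
Console.WriteLine(Convert.ToInt32(c));            // 50 - Convert.ToInt32(char) returns the code unit value
Console.WriteLine(Convert.ToInt32(c.ToString())); // 2  - Convert.ToInt32(string) parses the text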
When you access your string with an index (in your case num[count]) you get a char, and because of that you are getting ASCII values. You can convert the char to a string with .ToString(), in your case nums[count] = Convert.ToInt32(num[count].ToString());. Here is another approach to your problem:
string number = Console.ReadLine();
int sum = 0;
foreach (var item in number)
{
    sum += Convert.ToInt32(item.ToString());
}
Console.WriteLine(sum);
As you noticed, Convert.ToInt32(num[count]) will only return the Unicode code of the char you want to convert, because when you use the [] operator on a string, you get read-only access to the individual characters of the string [1].
And so you are using Convert.ToInt32(Char), which
Converts the value of the specified Unicode character to the equivalent 32-bit signed integer.
One way to get the numeric value of a char digit is to use Char.GetNumericValue(), which
Converts a specified numeric Unicode character to a double-precision floating-point number.
By using System.Linq you can cut your code down to just a few lines:
Console.WriteLine("please enter a number: ");
string num = Console.ReadLine(); // "12329"
int sum = (int)num.Select(n => Char.GetNumericValue(n)).Sum();
Console.WriteLine(sum); // 17
What does this line of code do?
num.Select(n => Char.GetNumericValue(n)) iterates over each char in your string, like your while loop, converts each value to a double, and returns an IEnumerable<double>. Sum() then iterates over each value of the IEnumerable<double> and calculates the sum as a double. And since you want an integer as the result, the (int) casts the double to an integer value.
Side-Note:
You could check your input, if it is really an integer.
For example:
int intValue;
if (Int32.TryParse(num, out intValue))
{
    // run the linq
}
else
{
    // re-prompt, exit, ...
}
If you use Char.GetNumericValue() on a letter it will return -1, so, for example, the sum for string num = "123a29"; will be 16.
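If you would rather ignore non-digit characters than let them subtract from the sum, one option (a small sketch of my own, not from the original answer) is to filter with char.IsDigit first:
string num = "123a29";
int sum = (int)num.Where(char.IsDigit).Select(Char.GetNumericValue).Sum();
Console.WriteLine(sum); // 17 - the letter 'a' is skipped instead of contributing -1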

Array of chars in hex format to integer?

I have an API which returns a byte[] over the network which represents information about a device.
It is in format 15ab1234cd\r\n where the first 2 characters are a HEX representation of the amount of data in the message.
I am aware I can convert this to a string via ASCIIEncoding.ASCII.GetString, and then use Convert.ToInt32(string.Substring(0, 2), 16) to achieve this. However the whole thing stays a byte array throughout the life of the whole program I am writing, and I don't want to convert to a string just for the purpose of getting the packet length.
Any suggestions of converting array of chars in hex format to an int in C#?
There is no .NET-provided function that does it. Converting the first 2 bytes to a string with Encoding.GetString is very readable (though possibly not the most performant):
var hexValue = ASCIIEncoding.ASCII.GetString(byteData, 0, 2);
var intValue = Convert.ToInt32(hexValue, 16);
You can also easily write the conversion code yourself: map the '0'-'9' and 'a'-'f' / 'A'-'F' ranges to their corresponding integer values and combine them.
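A minimal sketch of that hand-rolled mapping (the helper names HexDigitToInt and HexPairToInt are mine, not from the original answer):
static int HexDigitToInt(byte b)
{
    char c = char.ToUpper((char)b);
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    throw new ArgumentException("Not a hex digit: " + c);
}

static int HexPairToInt(byte[] data)
{
    // The first two bytes are ASCII hex characters, most significant first.
    return HexDigitToInt(data[0]) * 16 + HexDigitToInt(data[1]);
}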
Here is a one-statement conversion, strictly for entertainment purposes. The resulting lambda (everything before ((byte)'0',(byte)'A') in the sample) takes 2 byte arguments, assumes they are ASCII characters, and converts them into an integer:
((Func<Func<char,int>, Func<byte, byte, int>>)
(charToInt=> (c, c1)=>
charToInt(char.ToUpper((char)c)) * 16 + charToInt(char.ToUpper((char)c1))))
((Func<char, int>)(
c => c >= '0' && c <='9' ? c-'0' : c >='A' && c <= 'F' ? c - 'A' + 10 : 0))
((byte)'0',(byte)'A')
If you know the first two values are valid hexadecimal characters (0-9, A-F, a-f), it is possible to convert them to an integer using bitwise operators:
int GetIntFromHexBytes(byte[] s, int start, int length)
{
    int ret = 0;
    for (int i = start; i < start + length; i++)
    {
        ret <<= 4;
        ret |= (byte)((s[i] & 0x0f) + ((s[i] & 0x40) >> 6) * 9);
    }
    return ret;
}
(This works because s[i] & 0x0f returns the 4 least significant bits, which range from 0-9 for the characters '0'-'9' and from 1-6 for both capital and lowercase letters ('a'-'f' and 'A'-'F'). s[i] & 0x40 is 0 for numeric characters and 0x40 for alphabetic characters; shifting right by six bits gives 0 for numeric characters and 1 for alphabetic ones, and multiplying that by 9 adds a bias of 9 for alphabetic characters, mapping A-F and a-f from 1-6 to 10-15.)
Given the byte array:
byte[] b = { (byte)'7', (byte)'f', (byte)'1', (byte)'c' };
Calling GetIntFromHexBytes(b, 0, 2) will return 127 (0x7f), the first two bytes of the array, as required.
As a caution: this approach does no bounds checking. A check can be added in the loop if needed to ensure that the input bytes are valid hex characters.
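For example, a minimal guard along those lines (my own sketch of the loop inside GetIntFromHexBytes, not part of the original answer) could use Uri.IsHexDigit:
for (int i = start; i < start + length; i++)
{
    // Reject anything that is not 0-9, A-F, or a-f before using it.
    if (!Uri.IsHexDigit((char)s[i]))
        throw new ArgumentException("Invalid hex character at index " + i);
    ret <<= 4;
    ret |= (byte)((s[i] & 0x0f) + ((s[i] & 0x40) >> 6) * 9);
}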

Convert int32 to string in base 16

I'm currently trying to convert a .NET JSON Encoder to NETMF but have hit a problem with Convert.ToString() as there isn't such thing in NETMF.
The original line of the encoder looks like this:
Convert.ToString(codepoint, 16);
And after looking at the documentation for Convert.ToString(Int32, Int32), it says it converts an Int32 to its string representation in base 2, 8, 10 or 16, with the int as the first parameter and the base as the second.
What is some low-level code to do this, or how would I go about doing it?
As you can see from the code, I only need conversion from an Int32 to base 16.
EDIT
Ah, the encoder also then wants to do:
PadLeft(4, '0');
on the string. Is this just adding four '0' characters ('0' + '0' + '0' + '0') to the start of the string?
If you mean you want to change a 32-bit integer value into a string which shows the value in hexadecimal:
string hex = intValue.ToString("x");
For variations, please see Stack Overflow question Convert a number into the hex value in .NET.
Disclaimer: I'm not sure if this function exists in NETMF, but it is so fundamental that I think it should.
Here’s some sample code for converting an integer to hexadecimal (base 16):
int num = 48764; // assign your number
// Generate hexadecimal number in reverse.
var sb = new StringBuilder();
do
{
    sb.Append(hexChars[num & 15]);
    num >>= 4;
}
while (num > 0);
// Pad with leading 0s for a minimum length of 4 characters.
while (sb.Length < 4)
    sb.Append('0');
// Reverse string and get result.
char[] chars = new char[sb.Length];
sb.CopyTo(0, chars, 0, sb.Length);
Array.Reverse(chars);
string result = new string(chars);
PadLeft(4, '0') prepends leading 0s to the string to ensure a minimum length of 4 characters.
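As a quick illustration of that behaviour in full .NET (my own example; PadLeft itself may not be available in NETMF, which is why the code above pads manually), PadLeft only adds as many zeros as are needed to reach the requested total width:
Console.WriteLine("7f".PadLeft(4, '0'));   // 007f
Console.WriteLine("270f".PadLeft(4, '0')); // 270f (already 4 characters, nothing added)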
The hexChars value lookup may be trivially defined as a string:
internal static readonly string hexChars = "0123456789ABCDEF";
Edit: Replacing StringBuilder with List<char>:
// Generate hexadecimal number in reverse.
List<char> builder = new List<char>();
do
{
    builder.Add(hexChars[num & 15]);
    num >>= 4;
}
while (num > 0);
// Pad with leading 0s for a minimum length of 4 characters.
while (builder.Count < 4)
    builder.Add('0');
// Reverse string and get result.
char[] chars = new char[builder.Count];
for (int i = 0; i < builder.Count; ++i)
    chars[i] = builder[builder.Count - i - 1];
string result = new string(chars);
Note: Refer to the “Hexadecimal Number Output” section of Expert .NET Micro Framework for a discussion of this conversion.

How to convert an integer to fixed length hex string in C#?

I have an integer variable with max value of 9999.
I can convert to fixed length string (4-characters):
value.ToString("0000");
and I can convert it to hex:
value.ToString("X");
I want to convert it to a hex string of four characters (padded with 0 at the left if the value is converted to less than four digits hex value). I tried the following which didn't work.
value.ToString("0000:X");
OK, I can check the length of hex string and pad left with zeros.
But is there any straightforward way?
Use a number after the X format specifier to specify the left padding: value.ToString("X4").
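For example (using 9999, the question's maximum value):
int value = 9999;
Console.WriteLine(value.ToString("X4")); // 270F
Console.WriteLine(value.ToString("x4")); // 270f (lowercase hex digits)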
String.Format("{0:X4}", intValue)
Here is another method. You can define a function and pass it two values: the first is the actual number (as a string), and the second is the length to pad to. For example:
public string FixZero(string str, int maxlength)
{
    string zero = "000000000000000000000000000000000000000";
    int length = str.Length;
    int diff = maxlength - length;
    string z = zero.Substring(0, diff); // take as many zeros as are missing
    z = z + str;
    return z;
}
If you need integers in the format 0012, call FixZero("12", 4); or for 0001234, FixZero("1234", 7).
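As a side note (my own observation, not part of the original answer), the built-in String.PadLeft mentioned earlier does the same padding in one call, assuming it is available on your platform:
string padded = "12".PadLeft(4, '0');              // "0012"
string padded2 = 1234.ToString().PadLeft(7, '0');  // "0001234"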
