Convert.FromBase64String(...) throws a FormatException - C#

The following line of code runs fine in IIS Express:
Convert.FromBase64String("dmVoaWNsZUlkPTE0MTM=??");
But when run on my local IIS 8 server, it throws the following exception:
System.FormatException: The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.
Why is this happening?

The last two characters "??" are not valid in a base 64 string.
Have a read here: https://en.wikipedia.org/wiki/Base64
The string should end in a valid Base64 character (A-Z, a-z, 0-9, + or /) or be padded with one or two = characters.
Edit — Decoding the string without the ? characters returns "vehicleId=1413", so I guess it's just a case of removing them.
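For example, something like this (a minimal sketch; the variable names are just illustrative, and Encoding comes from System.Text):
string raw = "dmVoaWNsZUlkPTE0MTM=??";
string cleaned = raw.TrimEnd('?');                     // "dmVoaWNsZUlkPTE0MTM="
byte[] bytes = Convert.FromBase64String(cleaned);
Console.WriteLine(Encoding.UTF8.GetString(bytes));     // vehicleId=1413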

Related

The \U Escape Sequence in C#

I am experimenting with escape sequences and cannot really use the \U sequence (UTF-32).
It does not compile because the compiler does not recognize the sequence for some reason.
It recognizes it as UTF-16.
Could you please help me?
Console.WriteLine("\U00HHHHHH");
Your problem is that you copied \U00HHHHHH from the documentation page Strings (C# Programming Guide): String Escape Sequences:
But \U00HHHHHH is not itself a valid UTF-32 escape sequence -- it's a mask where each H indicates where a hex character must be typed. The reason it's not valid is that hexadecimal numbers consist of the digits 0-9 and the letters A-F or a-f -- and H is not one of these characters. And the literal mentioned in the comments, "\U001effff", does not work because it falls outside the range of valid UTF-32 character values specified immediately thereafter in the docs:
(range: 000000 - 10FFFF; example: \U0001F47D = "👽")
The C# compiler actually checks whether the specified UTF-32 character is valid according to these rules:
// These compile because they're valid Hex numbers in the range 000000 - 10FFFF padded to 8 digits with leading zeros:
Console.WriteLine("\U0001F47D");
Console.WriteLine("\U00000000");
Console.WriteLine("\U0010FFFF");
// But these don't.
// H is not a valid Hex character:
// Compilation error (line 16, col 22): Unrecognized escape sequence
Console.WriteLine("\U00HHHHHH");
// This is outside the range of 000000 - 10FFFF:
// Compilation error (line 19, col 22): Unrecognized escape sequence
Console.WriteLine("\U001effff");
See https://dotnetfiddle.net/KezdTG.
As an aside, to properly display Unicode characters in the Windows console, see How to write Unicode characters to the console?.

How do I get the Unicode values of a multipoint character in C#?

𝐀 = (\ud835\udc00) in UTF-16
How do I do this conversion in C#?
The following code doesn't allow me to enter the character above. It throws the error "Too many characters in character literal"; I guess that's because the character is made up of more than one code unit.
string hex = ((int)'𝐀').ToString("X4");
string hex = ((int)'A').ToString("X4");// This works
Any help would be much appreciated!
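One way to do this, as a rough sketch: pass the character as a string rather than a char (a char cannot hold a supplementary character such as 𝐀) and use char.ConvertToUtf32:
string s = "𝐀";
// The single Unicode code point (UTF-32 value) starting at index 0:
int codePoint = char.ConvertToUtf32(s, 0);
Console.WriteLine(codePoint.ToString("X"));        // 1D400
// The individual UTF-16 code units (the surrogate pair):
foreach (char c in s)
    Console.WriteLine(((int)c).ToString("X4"));    // D835, then DC00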

Decode base64 string C#

I tried to decode the following Base64 string in C#:
PGlmcmFtZSBzcmM9Imh0dHA6Ly9lbWJlZC5yZWR0dWJlLmNvbS8\/aWQ9Mzg1NjAmYmdjb2x
vcj0wMDAwMDAiIGZyYW1lYm9yZGVyPSIwIiB3aWR0aD0iNDM0IiBoZWlnaHQ9IjM0NCIgc2Nyb2xsaW5n
PSJubyIgYWxsb3dmdWxsc2NyZWVuPjwvaWZyYW1lPg==
But I'm getting an error:
The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.
Even if I remove the last = in the string above, I still get the same error.
Here is the code I use:
byte[] decodedBytes = Convert.FromBase64String(embedCode);
string decodedText = Encoding.UTF8.GetString(decodedBytes);
Why is that?
Thank you.
The correct Base64 string is:
PGlmcmFtZSBzcmM9Imh0dHA6Ly9lbWJlZC5yZWR0dWJlLmNvbS8/aWQ9Mzg1NjAmYmdjb2x
vcj0wMDAwMDAiIGZyYW1lYm9yZGVyPSIwIiB3aWR0aD0iNDM0IiBoZWlnaHQ9IjM0NCIgc2Nyb2xsaW5n
PSJubyIgYWxsb3dmdWxsc2NyZWVuPjwvaWZyYW1lPg==
Well, this is not a valid Base64 string. A Base64 string cannot contain the \ character. Remove that character and it will work.
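For example, roughly (embedCode is the variable from the question; this assumes the stray backslash is the only problem):
string cleaned = embedCode.Replace("\\", string.Empty);   // drop the backslash, leaving the valid "/"
byte[] decodedBytes = Convert.FromBase64String(cleaned);
string decodedText = Encoding.UTF8.GetString(decodedBytes);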

What is the difference between Convert.ToBase64String(byte[]) and HttpServerUtility.UrlTokenEncode(byte[])?

I'm trying to remove a dependence on System.Web.dll from a Web API project, but have stumbled on a call to HttpServerUtility.UrlTokenEncode(byte[] input) (and its corresponding decode method) that I don't know how to replace while keeping backwards compatibility. The documentation says that this method
Encodes a byte array into its equivalent string representation using base 64 digits, which is usable for transmission on the URL.
I tried substituting with Convert.ToBase64String(byte[] input) (and its corresponding decode method), which is very similarly described in the docs:
Converts an array of 8-bit unsigned integers to its equivalent string representation that is encoded with base-64 digits.
However, they don't seem to be entirely equivalent; when using Convert.FromBase64String(string input) to decode a string encoded with HttpServerUtility, I get an exception stating
The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.
What is the difference between these two conversion utilities? What's the correct way to remove this dependence on System.Web.HttpServerUtility?
Some users have suggested that this is a duplicate of this one, but I disagree. That question is about base-64-encoding a string in a url-safe manner in general, but I need to reproduce the exact behavior of HttpServerUtility but without a dependency on System.Web.
I took DGibbs at their word and used the source. It turns out the following happens in the HttpServerUtility methods:
Encoding to Base64
1. Use System.Convert to convert the input to Base64.
2. Replace + with - and / with _. Example: Foo+bar/=== becomes Foo-bar_===.
3. Replace the = padding at the end of the string with an integer denoting how many there were. Example: Foo-bar_=== becomes Foo-bar_3.
Decoding from Base64
1. Replace the digit at the end of the string with the same number of = signs. Example: Foo-bar_3 becomes Foo-bar_===.
2. Replace - with + and _ with /. Example: Foo-bar_=== becomes Foo+bar/===.
3. Use System.Convert to decode the preprocessed input from Base64.
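A rough sketch of those steps without System.Web (the method names below are illustrative, and edge cases such as null or empty input are ignored):
static string UrlTokenEncode(byte[] input)
{
    string base64 = Convert.ToBase64String(input);                  // step 1: plain Base64
    string trimmed = base64.TrimEnd('=');                           // steps 2-3: swap the unsafe
    int padCount = base64.Length - trimmed.Length;                  // characters and append the
    return trimmed.Replace('+', '-').Replace('/', '_') + padCount;  // number of stripped '='
}

static byte[] UrlTokenDecode(string token)
{
    int padCount = token[token.Length - 1] - '0';                   // last char is the pad count
    string base64 = token.Substring(0, token.Length - 1)
                         .Replace('-', '+')
                         .Replace('_', '/') + new string('=', padCount);
    return Convert.FromBase64String(base64);
}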
HttpServerUtility.UrlTokenEncode(byte[] input) produces a URL-safe Base64 string. In Base64 the +, / and = characters are valid, but they are not URL safe; this method replaces those characters, whereas Convert.ToBase64String(byte[] input) does not. You can probably drop the reference and do it yourself.
Usually, '+' is replaced with '-', '/' with '_', and the '=' padding is simply removed.
Accepted answer here gives a code example: How to achieve Base64 URL safe encoding in C#?

What is the difference between Console.WriteLine('single quote'); and Console.WriteLine("double quote");

Why does Visual Studio throw an error on
Console.WriteLine('string with single quote');
but not on:
Console.WriteLine("string with double quote");
?
Thank you.
Single quotes (') are used for the char data type, which can hold only a single character, hence the name. Escape sequences such as '\n', '\r', etc. look like two characters, but they still represent a single char when compiled.
A double quote (") denotes a string literal. .NET strings are UTF-16 encoded, generally two bytes per character, with surrogate pairs for characters outside the Basic Multilingual Plane; the in-memory encoding is UTF-16, not ASCII or UTF-8.
Console.WriteLine('') accepts a char literal, so trying to pass more than one character generates a compile error.
Console.WriteLine("") accepts a string, which can contain whole words.
