Bytes from string from bytes not equal to original bytes - C#

I have an RSA encryption implementation and I'm trying to make it encrypt and decrypt text of any length. The problem I've hit: after encrypting a block of bytes, if I convert it to a string (using Encoding.ASCII.GetString) and then back (with GetBytes), I don't get the same array of bytes. Same with UTF8. I'm not really into encodings; can someone help me solve this so I can convert the encrypted bytes into a string, pass it to the decryption algorithm, and have it receive the proper bytes?
byte[] bytes = new byte[] { 0xe1, 0xde, 0x4a, 0x10, 0xea, 0x74, 0x8f, 0x18, 0xd7, 0x93, 0x04, 0x7a, 0x10, 0xb2, 0xa8, 0xfa, 0x11, 0x00, 0x7a, 0xfb, 0xcb,
0x19, 0xb7, 0xf5, 0x25, 0x26, 0x6d, 0xa0, 0x0d, 0xdc, 0xe5, 0x0a };
Console.WriteLine(string.Join(" ", bytes));
// 225 222 74 16 234 116 143 24 215 147 4 122 16 178 168 250 17 0 122 251 203 25 183 245 37 38 109 160 13 220 229 10
byte[] bytes2 = Encoding.ASCII.GetBytes(Encoding.ASCII.GetString(bytes));
Console.WriteLine(string.Join(" ", bytes2));
// 63 63 74 16 63 116 63 24 63 63 4 122 16 63 63 63 17 0 122 63 63 25 63 63 37 38 109 63 13 63 63 10
byte[] bytes3 = Encoding.UTF8.GetBytes(Encoding.ASCII.GetString(bytes));
Console.WriteLine(string.Join(" ", bytes3));
// 63 63 74 16 63 116 63 24 63 63 4 122 16 63 63 63 17 0 122 63 63 25 63 63 37 38 109 63 13 63 63 10
The byte array above is what I got from encrypting "hello world!" with a 32-byte key in my encryption algorithm. As you can see, neither ASCII nor UTF8 gives me back my original array of bytes after the round trip through a string.

You're not supposed to use the Encoding.XYZ objects to convert sequences of bytes into strings unless those bytes actually make up text in that encoding. Their purpose is the bytes-to-string conversion when you're reading text from some medium that serves bytes, such as a FileStream, and the bytes have to be correctly encoded for the encoding you choose. You cannot convert arbitrary sequences of bytes to strings using these encodings; as you've already observed, they mangle the result. You might get lucky for quite a few byte sequences, but if you're using any of the cryptographically secure encryption algorithms, that luck runs out immediately.
Instead, use something like Base64 or Base85. Base64 is built into .NET, so if you have this code:
byte[] original = ...
string encoded = Encoding.ASCII.GetString(original);
byte[] decoded = Encoding.ASCII.GetBytes(encoded);
all you have to do is change to this:
byte[] original = ...
string encoded = Convert.ToBase64String(original);
byte[] decoded = Convert.FromBase64String(encoded);
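As a quick sanity check (a minimal sketch, reusing the byte array from the question), the Base64 round trip is lossless:

byte[] original = new byte[] { 0xe1, 0xde, 0x4a, 0x10, 0xea, 0x74, 0x8f, 0x18 }; // any byte values work
string encoded = Convert.ToBase64String(original);
byte[] decoded = Convert.FromBase64String(encoded);
Console.WriteLine(original.SequenceEqual(decoded)); // True (SequenceEqual needs using System.Linq)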

Related

Remove 00 from Byte Array in C# (Encoding to String)

I use an NFC reader to read data. The data is stored as byte[] bytes_credentials. Receiving data works fine, but now I want to decode the 16-byte array to a string. I use string str = Encoding.Default.GetString(new_data); to decode the array. The string output works fine if the byte array is fully "booked up" (16 of 16 bytes used), for example: This is an test! -> 54 68 69 73 20 69 73 20 61 6e 20 74 65 73 74 21. The problem I am facing is that if the array is not fully "booked up", it looks like this: This is! -> 54 68 69 73 20 69 73 21 00 00 00 00 00 00 00 00, and the console output looks like this: This is!????????.
How can I CHECK for and REMOVE the empty bytes (00) from the array, so that the output looks like this: This is!
Thank you for helping!
I have checked previous posts but none of them work for me...
You can use LINQ to filter your array further:
using System.Linq;
using System.Text;

var bytes_credentials = new byte[]
{
    0x54, 0x68, 0x69, 0x73, 0x20, 0x69, 0x73, 0x21,
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
};
Console.WriteLine(Encoding.Default.GetString(
bytes_credentials.Where(b => b != 0).ToArray()
));
To learn more about LINQ, take a look here: https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/concepts/linq/
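Note that Where(b => b != 0) removes every zero byte, including any that might legitimately sit in the middle of the payload. If you only want to strip the trailing padding, a sketch like this works (assuming the zeros only ever appear at the end):

int end = bytes_credentials.Length;
while (end > 0 && bytes_credentials[end - 1] == 0)
    end--; // walk back past the trailing zero bytes
Console.WriteLine(Encoding.Default.GetString(bytes_credentials, 0, end));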

Openssl Command Line for Triple DES HMAC like C# MACTripleDES

Can anyone explain how to compute a TDES MAC with the OpenSSL command line?
I am trying to duplicate some functionality of a working C# program in C with the OpenSSL API, and am having trouble reproducing the .NET MACTripleDES.ComputeHash function in OpenSSL. Here is an example with bogus data and key:
using (MACTripleDES hmac = new MACTripleDES(Utilities.HexStringToByteArray("112233445566778899aabbccddeeff00")))
{
    // Compute the hash of the input data.
    byte[] hashValue = hmac.ComputeHash(Utilities.HexStringToByteArray("001000000000000000000000000000008000000000000000"));
    string signature = Utilities.ByteArrayToHexString(hashValue);
    PrintToFeedback("Bogus Signature = " + signature);
}
The result is "Bogus Signature = A056D11063084B3E". My new C program has to produce the same hash of that data in order to interoperate with its wider environment, but the way to do this in OpenSSL eludes me. This shows that the OpenSSL data starts out the same as the C# data:
cmd>od -tx1 bsigin
0000000 00 10 00 00 00 00 00 00 00 00 00 00 00 00 00 00
0000020 80 00 00 00 00 00 00 00
Stringified, 001000000000000000000000000000008000000000000000 matches the C# string.
cmd>openssl dgst -md5 -mac hmac -macopt hexkey:112233445566778899aabbccddeeff00 bsigin
HMAC-MD5(bsigin)= 7071d693451da3f2608531ee43c1bb8a
That output is too long, and my expected data is not a substring of it. Same for -sha1 etc. I tried encrypting and computing the digest separately, with no luck. Microsoft doesn't document what kind of hash MACTripleDES performs, and I can't find documentation on how to set up a MAC with TDES in OpenSSL.
So I'm hoping someone here knows enough about both platforms to give me a decent hint.
Command line answer:
cmd>openssl enc -des-ede-cbc -K 112233445566778899aabbccddeeff00 -iv 0000000000000000 -in bsigin -out bsigout
cmd>od -tx1 bsigout
0000000 7c de 93 c6 5f b4 03 21 aa c0 89 b8 ae f3 da 5d
0000020 a0 56 d1 10 63 08 4b 3e 4c 03 41 d6 dd 9e e4 32
        ^^^^^^^^^^^^^^^^^^^^^^^
That is, the command-line form returns 32 bytes, and bytes 16..23 contain the MAC.
API answer:
DES_key_schedule SchKey1, SchKey2;
DES_cblock iv = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };
DES_set_key((C_Block *)Key1, &SchKey1);
DES_set_key((C_Block *)Key2, &SchKey2);
/* 2-key TDES: reuse SchKey1 as the third key schedule (EDE with K1, K2, K1). */
DES_ede3_cbc_encrypt((unsigned char *)input_data, (unsigned char *)cipher, inputLength,
                     &SchKey1, &SchKey2, &SchKey1, &iv, DES_ENCRYPT);
Where Key1 is the Lkey (the left 8 bytes of the 16-byte TDES key) and Key2 is the Rkey (the right 8 bytes). This call only populates 24 bytes of cipher, as opposed to the 32-byte output of the command-line version; you still take bytes 16..23. Hopefully the supporting declarations are intuitive.
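For anyone comparing the two platforms: what MACTripleDES computes is a CBC-MAC, i.e. TDES-CBC-encrypt the data with a zero IV and keep the last ciphertext block. A minimal C# sketch of that computation (assuming the input is already a multiple of 8 bytes, as in the example above; TdesCbcMac is a hypothetical helper name):

using System;
using System.Security.Cryptography;

static byte[] TdesCbcMac(byte[] key, byte[] data)
{
    using (TripleDES tdes = TripleDES.Create())
    {
        tdes.Key = key;                  // 16-byte (2-key) TDES key
        tdes.Mode = CipherMode.CBC;
        tdes.IV = new byte[8];           // zero IV
        tdes.Padding = PaddingMode.None; // input assumed block-aligned
        using (ICryptoTransform enc = tdes.CreateEncryptor())
        {
            byte[] cipher = enc.TransformFinalBlock(data, 0, data.Length);
            byte[] mac = new byte[8];
            Buffer.BlockCopy(cipher, cipher.Length - 8, mac, 0, 8);
            return mac; // should match bytes 16..23 of the openssl enc output
        }
    }
}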

File_Decryption - JUNK Character found in decrypted file

I have an encryption tool that encrypts files. When I studied an encrypted file, I found it writes the name of the .PEM key inside the encrypted file.
The encryption logic appears to be the commonly used hybrid scheme: the tool supports encryption of any file, which means RSA keys cannot be used to encrypt the data directly, so it creates a key (K), encrypts K with the RSA public key, and then uses K to encrypt the file.
I wrote the C# code below. It mostly works, but for big files I get some junk characters in the middle, like:
aaaaaaaaaaaaaaaa
??M'yaaaaaaaaaa?
My decryption code is:
System.Security.Cryptography.TripleDESCryptoServiceProvider tripleDES = new System.Security.Cryptography.TripleDESCryptoServiceProvider();
tripleDES.Key = result; // 16-byte key
tripleDES.Mode = System.Security.Cryptography.CipherMode.CBC;
byte[] IV = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };
tripleDES.IV = IV;
tripleDES.Padding = System.Security.Cryptography.PaddingMode.Zeros;
System.Security.Cryptography.ICryptoTransform cTransform = tripleDES.CreateDecryptor();
byte[] resultArray = cTransform.TransformFinalBlock(enc_data, 0, enc_data.Length);
//string s = Convert.ToBase64String(resultArray);
string x = System.Text.Encoding.ASCII.GetString(resultArray);
System.IO.File.WriteAllText(@"D:\570_f.txt", x);
tripleDES.Clear();
1) The code works almost everywhere, but in places I found 8 bytes of junk characters replacing real text. [main problem]
.................okokokookokok8bytejunkokokokokokokookko..............8bytjunkokokokokokokokokokoko............
2) I don't know what padding scheme was used during encryption; I tried decrypting with zero padding mode.
----testing with different length files-----
(A) input file: 10224 bytes | encrypted file with tool: x | decrypted file with above code: 10232 bytes
    ok data + last 8 hex bytes: 3F 00 00 00 00 00 00 00
(B) input file: 10242 bytes | encrypted file with tool: x | decrypted file with above code: 10248 bytes
    ok data + last 8 hex bytes: 0D 3F 3F 3F 3C 56 31 65
(C) input file: 10258 bytes | encrypted file with tool: x | decrypted file with above code: 10264 bytes
    ok data + last 24 hex bytes: 0A 3F 3F 14 4D 27 79 0F 61 61 61 61 61 61 61 61 61 61 3F 00 00 00 00 00
NOTE - the file contains only the character a (hex value 61)
Any advice here would be great to hear.
Finally found that this encryption tool processes input in blocks of n bytes. A full n-byte block gets no padding, while any block shorter than n bytes is padded with 0x80 followed by 0x00 bytes to make its length a multiple of 8.
I decrypt the same way: divide the full file into n-byte blocks, decrypt each block, collect the output in a buffer, and at the end convert the full buffer into a string and write it to a file.
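For the unpadding step, a minimal sketch of stripping that 0x80-then-0x00 padding from the tail of the decrypted data (assuming the data really ends with that pattern; StripPadding is a hypothetical helper name):

static byte[] StripPadding(byte[] data)
{
    int end = data.Length;
    while (end > 0 && data[end - 1] == 0x00)
        end--;                   // walk back past the trailing zero bytes
    if (end > 0 && data[end - 1] == 0x80)
        end--;                   // drop the 0x80 marker itself
    byte[] result = new byte[end];
    Buffer.BlockCopy(data, 0, result, 0, end);
    return result;
}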

Setting the BluetoothLEAdvertisement.Flags in C# for iBeacon advertisement

MS provides samples to send and receive Bluetooth Low Energy advertisements.
I saw this very helpful answer for breaking down the iBeacon packet. There's also an example for setting BluetoothLEAdvertisement.ManufacturerData per the iBeacon standard.
May I ask how I can set the Flags of the BluetoothLEAdvertisement?
For example, set the value to:
02-01-06
Thanks
Edit 1:
Here's the code:
using System;
using System.Management;
using System.Text.RegularExpressions;
using Windows.Devices.Bluetooth.Advertisement;
using System.Runtime.InteropServices.WindowsRuntime;

namespace BLE_iBeacon
{
    class IBeacon
    {
        static void Main()
        {
            Console.WriteLine("Advertising as iBeacon. Press Enter to exit");
            // Create and initialize a new publisher instance.
            BluetoothLEAdvertisementPublisher publisher = new BluetoothLEAdvertisementPublisher();
            // Add a manufacturer-specific section:
            var manufacturerData = new BluetoothLEManufacturerData();
            // Set the company ID for the manufacturer data.
            // 0x004C Apple, Inc.
            manufacturerData.CompanyId = 0x004C;
            byte[] dataArray = new byte[] {
                // last 2 bytes of Apple's iBeacon prefix
                0x02, 0x15,
                // UUID E4 C8 A4 FC F6 8B 47 0D 95 9F 29 38 2A F7 2C E7
                0xE4, 0xC8, 0xA4, 0xFC,
                0xF6, 0x8B, 0x47, 0x0D,
                0x95, 0x9F, 0x29, 0x38,
                0x2A, 0xF7, 0x2C, 0xE7,
                // Major
                0x00, 0x00,
                // Minor
                0x00, 0x01,
                // TX power
                0xC5
            };
            manufacturerData.Data = dataArray.AsBuffer();
            // Add the manufacturer data to the advertisement publisher:
            publisher.Advertisement.ManufacturerData.Add(manufacturerData);
            publisher.Advertisement.Flags = BluetoothLEAdvertisementFlags.GeneralDiscoverableMode;
            publisher.Start();
            Console.Read();
            publisher.Stop();
        }
    }
}
Edit 2:
In the C# code, if I do not set the Flags, my Windows laptop advertises a raw packet like:
04 3E 27 02 01
02 01 0D 45 84 D3 68 21 1B 1A FF 4C 00
02 15 E4 C8 A4 FC F6 8B 47 0D 95 9F 29 38 2A F7 2C E7
00 00 00 01 C5 BA
My purpose is to use Raspberry Pis as BLE receivers. I used Radius Networks' code here. You can see in the ibeacon_scan script that they check the advertisement packet to decide whether it's an iBeacon by:
if [[ $packet =~ ^04\ 3E\ 2A\ 02\ 01\ .{26}\ 02\ 01\ .{14}\ 02\ 15 ]]; then
So the previous raw packet would not be recognized, since it is missing the flags part. I am wondering if I can advertise the packet with the Flags, like:
04 3E 2A 02 01
02 01 0D 45 84 D3 68 21 1B **02 01 1A** 1A FF 4C 00
02 15 E4 C8 A4 FC F6 8B 47 0D 95 9F 29 38 2A F7 2C E7
00 00 00 01 C5 BA
instead of changing the scan script in the pi.
iBeacon on Windows
The following code publishes an iBeacon on Windows 10 machines:
// Create and initialize a new publisher instance.
BluetoothLEAdvertisementPublisher publisher = new BluetoothLEAdvertisementPublisher();

// Add a manufacturer-specific section:
var manufacturerData = new BluetoothLEManufacturerData();

// Set the company ID for the manufacturer data.
// 0x004C Apple, Inc.
manufacturerData.CompanyId = 0x004c;

// Create the payload
var writer = new DataWriter();
byte[] dataArray = new byte[] {
    // last 2 bytes of Apple's iBeacon prefix
    0x02, 0x15,
    // UUID e2 c5 6d b5 df fb 48 d2 b0 60 d0 f5 a7 10 96 e0
    0xe2, 0xc5, 0x6d, 0xb5,
    0xdf, 0xfb, 0x48, 0xd2,
    0xb0, 0x60, 0xd0, 0xf5,
    0xa7, 0x10, 0x96, 0xe0,
    // Major
    0x00, 0x00,
    // Minor
    0x00, 0x01,
    // TX power
    0xc5
};
writer.WriteBytes(dataArray);
manufacturerData.Data = writer.DetachBuffer();

// Add the manufacturer data to the advertisement publisher:
publisher.Advertisement.ManufacturerData.Add(manufacturerData);
publisher.Start();
Proximity UUID
While testing this out, my iOS device would not recognize the Proximity UUID you provided. I'm guessing this is because you generated it yourself, so the app doesn't know what to look for. Instead, I used the proximity UUID from this answer which identifies the Windows 10 device as an AirLocate iBeacon.
Flags
Windows 10 does not currently allow developers to set the flags for a Bluetooth LE advertisement. Luckily, for the Windows device to be recognized as an iBeacon, you don't need those flags!
Ideally, you want to set the flags byte to 0x1a, but other values may still work. The important flag to set is General Discoverable (0x02).
BluetoothLEAdvertisementFlags is an enumeration of the bit values you can use.
My C# is very rusty, but you might try setting the flags value directly with a cast: publisher.Advertisement.Flags = (BluetoothLEAdvertisementFlags)0x1A;
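Equivalently, 0x1A decomposes into named members of that enumeration (a sketch; the combination below assumes the standard Windows.Devices.Bluetooth.Advertisement enum values):

publisher.Advertisement.Flags =
    BluetoothLEAdvertisementFlags.GeneralDiscoverableMode      // 0x02: LE General Discoverable
    | BluetoothLEAdvertisementFlags.DualModeControllerCapable  // 0x08: simultaneous LE + BR/EDR (controller)
    | BluetoothLEAdvertisementFlags.DualModeHostCapable;       // 0x10: simultaneous LE + BR/EDR (host)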

Encoding/Decoding hex packet

I want to send this hex packet:
00 38 60 dc 00 00 04 33 30 3c 00 00 00 20 63 62
39 62 33 61 36 37 34 64 31 36 66 32 31 39 30 64
30 34 30 63 30 39 32 66 34 66 38 38 32 62 00 06
35 2e 31 33 2e 31 00 00 02 3c
So I build the string:
string packet = "003860dc0000" + textbox1.Text + "00000020" + textbox2.Text + "0006" + textbox3.Text;
then "convert" it to ASCII:
conn_str = HexString2Ascii(packet);
then I send the packet... but I get this:
00 38 60 **c3 9c** 00 00 04 33 30 3c 00 00 00 20 63
62 39 62 33 61 36 37 34 64 31 36 66 32 31 39 30
64 30 34 30 63 30 39 32 66 34 66 38 38 32 62 00
06 35 2e 31 33 2e 31 00 00 02 3c **0a**
why??
Thank you!
P.S.
the function is:
private string HexString2Ascii(string hexString)
{
    byte[] tmp = new byte[hexString.Length / 2];
    int j = 0;
    int length = hexString.Length - 2;
    for (int i = 0; i <= length; i += 2)
    {
        tmp[j] = (byte)Convert.ToChar(Int32.Parse(hexString.Substring(i, 2), System.Globalization.NumberStyles.HexNumber));
        j++;
    }
    return Encoding.GetEncoding(1252).GetString(tmp);
}
EDIT:
If I convert directly to bytes, the hex packet gets encoded as its string representation:
00000000 30 30 33 38 36 30 64 63 30 30 30 30 30 34 33 33 003860dc 00000433
00000010 33 30 33 43 30 30 30 30 30 30 32 30 33 34 33 32 303C0000 00203432
00000020 36 33 36 33 33 35 33 39 33 32 33 34 36 36 33 39 63633539 32346639
00000030 36 33 33 39 33 31 33 39 33 30 33 36 33 33 36 35 63393139 30363365
00000040 33 35 36 33 36 35 36 35 36 35 33 31 33 39 33 38 35636565 65313938
00000050 36 33 33 31 36 34 33 34 36 33 33 30 30 30 30 36 63316434 63300006
00000060 33 35 32 65 33 31 33 33 32 65 33 31 30 30 30 30 352e3133 2e310000
00000070 30 32 33 43 023C
You cannot convert raw binary data to string data and expect things to just work. They are not the same. This is especially true when you mix up your character encodings.
C# characters are not ASCII characters. They are Unicode characters, represented by Unicode code points. When you then turn around and write those characters out, you need to specify what kind of data to write out. When you read your byte array into a string, using Encoding.GetEncoding(1252), you are getting the characters corresponding to code page 1252, in which 0xdc is a Ü.
But when your string is converted back into bytes to send over the network, it is written out as UTF-8. In UTF-8, U+00DC cannot be encoded as a single byte, since that byte value is used to indicate the start of a multi-byte sequence; instead it's encoded as the two-byte sequence 0xc3 0x9c. As far as C# is concerned, those two values are the same character. (I don't know where that extra 0x0a is coming from, but my guess is an errant line feed from one of your text boxes and/or some other part of your process.)
It's not clear what exactly you're trying to do, but I suspect you are converting way too many times for it to work out correctly. If you know the byte sequence you want to send, why not just build it as a byte[] directly? For example, use a MemoryStream and write the constant bytes you need into it.
To get the values out of your text boxes, your original code to "convert" the string of hex digits into a string of ASCII characters had the right idea. You just need to stop at the point where you have a byte array, since ultimately the byte array is what you want.
public byte[] GetBytesFrom(string hex)
{
    var length = hex.Length / 2;
    var result = new byte[length];
    for (var i = 0; i < length; i++)
    {
        // Each byte is two hex digits, so index into the string by i * 2.
        result[i] = byte.Parse(hex.Substring(i * 2, 2), NumberStyles.HexNumber);
    }
    return result;
}

// Variable portions of the packet structure.
byte[] segment2 = GetBytesFrom(textbox1.Text);
byte[] segment4 = GetBytesFrom(textbox2.Text);
byte[] segment6 = GetBytesFrom(textbox3.Text);

MemoryStream output = new MemoryStream();
output.Write(new byte[] { 0x00, 0x38, 0x60, 0xdc, 0x00, 0x00 }, 0, 6);
output.Write(segment2, 0, segment2.Length);
output.Write(new byte[] { 0x00, 0x00, 0x00, 0x20 }, 0, 4);
output.Write(segment4, 0, segment4.Length);
output.Write(new byte[] { 0x00, 0x06 }, 0, 2);
output.Write(segment6, 0, segment6.Length);
From here, you can use MemoryStream.CopyTo() to copy it to another stream, or MemoryStream.ToArray() to read the entire packet into a new byte array, or MemoryStream.GetBuffer() to get the underlying buffer (though that last one is rarely what you want, since it includes unused padding bytes).
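For instance, to hand the finished packet to a socket (a sketch, assuming a connected NetworkStream named networkStream):

byte[] packet = output.ToArray();              // exactly the bytes to put on the wire
networkStream.Write(packet, 0, packet.Length);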
