I am converting a Java class that converts BCD data to ASCII into a .NET BCD converter.
The following code was converted from Java, but it gives the wrong value: for example, for 123456789 it produces 123456153153.
public static string GetStringFromBcd(int[] b)
{
StringBuilder buffer = new StringBuilder();
foreach (var t in b)
{
if ((t & 0x0000000F) == 0x0000000F && ((t >> 4) & 0x0000000F) == 0x0000000F)
{
break;
}
buffer.Append((t & 0x0000000F) + "");
if ((t & 0x000000F0) != 0x000000F0)
{
buffer.Append(((t >> 4) & 0x0000000F) + "");
}
}
return buffer.ToString();
}
What could be the problem?
EDIT: ANSWER:
I got hold of the source program where the data had been BCD encoded.
I found that nothing was wrong in that logic. I then discovered the function where the data was converted from the network stream to a string, and later to a byte/int array.
Following is the code:
int bytesRead = tcpClient.Receive(message);//, 0, bytetoReadSize);
if (bytesRead == 0)
{
break;
//the client has disconnected from the server
}
//message has successfully been received
data += new ASCIIEncoding().GetString(message, 0, bytesRead);
Here is the problem: ASCIIEncoding cannot represent many of the encoded bytes and substitutes '?' (63) for them; putting 63 through the BCD conversion logic gives 153.
To resolve this error, I modified the last line: instead of decoding, I simply cast each received byte to a char.
foreach (byte b in message)
{
data += ((char) b);
}
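The lossy substitution is easy to reproduce in isolation. This standalone sketch (not part of the original program) shows that Encoding.ASCII turns any byte above 0x7F into 0x3F ('?', decimal 63), which is exactly where the garbage digits come from:

```csharp
using System;
using System.Text;

class AsciiFallbackDemo
{
    static void Main()
    {
        // Any byte outside the 7-bit ASCII range is replaced by '?' (0x3F).
        byte[] raw = { 0x99 };                    // a packed-BCD byte, not valid ASCII
        string s = Encoding.ASCII.GetString(raw); // lossy: becomes "?"
        byte roundTripped = (byte)s[0];
        Console.WriteLine(roundTripped);          // 63
        // 63 is 0x3F: low nibble 15 ("15") plus high nibble 3 ("3") -> "153",
        // matching the bad output from the BCD loop.
    }
}
```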
Here's a similar question that comes at this a few different ways.
Here's a site that has an excellent, detailed description of what you're facing.
It shouldn't be that hard, but processing the data as ints is going to be much harder than it needs to be.
Zoned decimal (BCD) is pretty straightforward to convert, but you have to be careful if you're taking files from a mainframe that were transferred in ASCII mode. The data can still be converted, but the byte values change due to the EBCDIC-to-ASCII conversion during the FTP.
If you're processing binary files, it's much easier to deal with.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace TestZoned
{
class Program
{
public static String zoneToString(byte[] zoneBytes)
{
Encoding ascii = Encoding.ASCII;
Encoding ebcdic = Encoding.GetEncoding("IBM037");
byte[] asciiBytes = null;
String str = null;
int zoneLen = zoneBytes.Length;
int i = zoneLen - 1;
int b1 = zoneBytes[i];
b1 = (b1 & 0xf0) >> 4;
switch (b1)
{
case 13:
case 11:
zoneBytes[i] = (byte)(zoneBytes[i] | 0xf0);
asciiBytes = Encoding.Convert(ebcdic, ascii, zoneBytes);
str = "-" + ASCIIEncoding.ASCII.GetString(asciiBytes);
break;
default:
zoneBytes[i] = (byte)(zoneBytes[i] | 0xf0);
asciiBytes = Encoding.Convert(ebcdic, ascii, zoneBytes);
str = ASCIIEncoding.ASCII.GetString(asciiBytes);
break;
}
return (str);
}
static void Main(string[] args)
{
byte[] array = { 0xf0, 0xf0, 0xf1 }; // 001
byte[] pos = { 0xf1, 0xf2, 0xf3, 0xf4, 0xf5, 0xf6, 0xf7, 0xf8, 0xf9 }; // 123456789
byte[] neg = { 0xf1, 0xf2, 0xf3, 0xf4, 0xf5, 0xf6, 0xf7, 0xf8, 0xd9 }; // -123456789
Console.WriteLine("Converted: {0}", zoneToString(array));
Console.WriteLine("Converted: {0}", zoneToString(pos));
Console.WriteLine("Converted: {0}", zoneToString(neg));
}
}
}
If you wanted to stick to something similar
public static String GetStringFromBcd(byte[] zoneBytes)
{
StringBuilder buffer = new StringBuilder();
int b1 = (zoneBytes[zoneBytes.Length - 1] & 0xf0) >> 4;
if ( (b1 == 13) || (b1 == 11) ) buffer.Append("-");
for (int i = 0; i < zoneBytes.Length; i++)
{
buffer.Append((zoneBytes[i] & 0x0f));
}
return buffer.ToString();
}
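A minimal harness for that method (the method body is repeated verbatim so the sketch compiles on its own):

```csharp
using System;
using System.Text;

class ZonedDemo
{
    // Same logic as GetStringFromBcd above
    public static string GetStringFromBcd(byte[] zoneBytes)
    {
        StringBuilder buffer = new StringBuilder();
        int b1 = (zoneBytes[zoneBytes.Length - 1] & 0xf0) >> 4;
        if ((b1 == 13) || (b1 == 11)) buffer.Append("-");
        for (int i = 0; i < zoneBytes.Length; i++)
        {
            buffer.Append((zoneBytes[i] & 0x0f));
        }
        return buffer.ToString();
    }

    static void Main()
    {
        Console.WriteLine(GetStringFromBcd(new byte[] { 0xf1, 0xf2, 0xf3 })); // 123
        Console.WriteLine(GetStringFromBcd(new byte[] { 0xf1, 0xf2, 0xd3 })); // -123 (D sign nibble)
    }
}
```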
Related
I'm using this sample for getting mail from a server. The problem is that the response contains Cyrillic symbols I cannot decode.
Here is a header:
Content-type: text/html; charset="koi8-r"
Content-Transfer-Encoding: quoted-printable
And receive response function:
static void receiveResponse(string command)
{
try
{
if (command != "")
{
if (tcpc.Connected)
{
dummy = Encoding.ASCII.GetBytes(command);
ssl.Write(dummy, 0, dummy.Length);
}
else
{
throw new ApplicationException("TCP CONNECTION DISCONNECTED");
}
}
ssl.Flush();
byte[] bigBuffer = new byte[1024*16];
int bites = ssl.Read(bigBuffer, 0, bigBuffer.Length);
byte[] buffer = new byte[bites];
Array.Copy(bigBuffer, 0, buffer, 0, bites);
sb.Append(Encoding.ASCII.GetString(buffer));
string result = sb.ToString();
// here is an unsuccessful attempt at decoding
result = Regex.Replace(result, @"=([0-9a-fA-F]{2})",
m => m.Groups[1].Success
? Convert.ToChar(Convert.ToInt32(m.Groups[1].Value, 16)).ToString()
: "");
byte[] bytes = Encoding.Default.GetBytes(result);
result = Encoding.GetEncoding("koi8r").GetString(bytes);
}
catch (Exception ex)
{
throw new ApplicationException(ex.ToString());
}
}
How do I decode the stream correctly? In the result string I get <p>=F0=D2=C9=D7=C5=D4 =D1 =F7=C1=CE=D1</p> instead of <p>Привет я Ваня</p>.
As @Max pointed out, you will need to decode the content using the encoding algorithm declared in the Content-Transfer-Encoding header.
In your case, it is the quoted-printable encoding.
You will need to decode the text of the message into an array of bytes and then you’ll need to convert that array of bytes into a string using the appropriate System.Text.Encoding. The name of the encoding to use will typically be specified in the Content-Type header as the charset parameter (in your case, koi8-r).
Since you already have the text as bytes in the buffer variable, simply perform the decoding on that:
byte[] buffer = new byte[bites];
int decodedLength = 0;
for (int i = 0; i < bites; i++) {
if (bigBuffer[i] == (byte) '=') {
if (bites > i + 2) { // both following bytes must be inside the buffer
// possible hex sequence
byte b1 = bigBuffer[i + 1];
byte b2 = bigBuffer[i + 2];
if (IsXDigit (b1) && IsXDigit (b2)) {
// decode
buffer[decodedLength++] = (byte) ((ToXDigit (b1) << 4) | ToXDigit (b2));
i += 2;
} else if (b1 == (byte) '\r' && b2 == (byte) '\n') {
// folded line, drop the '=\r\n' sequence
i += 2;
} else {
// error condition, just pass it through
buffer[decodedLength++] = bigBuffer[i];
}
} else {
// truncated? just pass it through
buffer[decodedLength++] = bigBuffer[i];
}
} else {
buffer[decodedLength++] = bigBuffer[i];
}
}
string result = Encoding.GetEncoding ("koi8-r").GetString (buffer, 0, decodedLength);
Custom functions:
static byte ToXDigit (byte c)
{
if (c >= 0x41) {
if (c >= 0x61)
return (byte) (c - (0x61 - 0x0a));
return (byte) (c - (0x41 - 0x0A));
}
return (byte) (c - 0x30);
}
static bool IsXDigit (byte c)
{
return (c >= (byte) 'A' && c <= (byte) 'F') || (c >= (byte) 'a' && c <= (byte) 'f') || (c >= (byte) '0' && c <= (byte) '9');
}
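As a quick sanity check, here is how the helpers decode one quoted-printable escape (a standalone sketch; ToXDigit is made public here only so it can be called from outside):

```csharp
using System;

class QpDemo
{
    // Same hex-digit helper as above
    public static byte ToXDigit(byte c)
    {
        if (c >= 0x41)
        {
            if (c >= 0x61)
                return (byte)(c - (0x61 - 0x0A));
            return (byte)(c - (0x41 - 0x0A));
        }
        return (byte)(c - 0x30);
    }

    static void Main()
    {
        // "=F0" decodes to the single byte 0xF0, which is 'П' in KOI8-R
        byte b = (byte)((ToXDigit((byte)'F') << 4) | ToXDigit((byte)'0'));
        Console.WriteLine(b); // 240
    }
}
```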
Of course, instead of writing your own hodge podge IMAP library, you could just use MimeKit and MailKit ;-)
I have a DSC IT-100 and I want to send it a command to invoke the alarm. However, I couldn't get it to work.
SerialPort myPort = new SerialPort("COM6");
myPort.BaudRate = 9600;
myPort.Parity = Parity.None;
myPort.StopBits = StopBits.One;
myPort.DataBits = 8;
myPort.Handshake = Handshake.None;
myPort.Open();
I am sending the data as hex. Here is my program:
var sendData = GetBytes("6541D2CRLF");
myPort.WriteLine(sendData);
private static string GetBytes(string input)
{
string result = "";
char[] values = input.ToCharArray();
foreach (char letter in values)
{
int value = Convert.ToInt32(letter);
result += String.Format("{0:X}", value);
}
return result;
}
The programming sheet says that:
Could you please help me figure out how to send this data to the COM port?
I believe you need the following method:
public static byte[] BuildPacket(string Command, string DataBytes)
{
List<byte> output = new List<byte>();
// Add command text
output.AddRange(Encoding.ASCII.GetBytes(Command));
// Add data bytes
output.AddRange(Encoding.ASCII.GetBytes(DataBytes));
// Add checksum
byte chkSum = 0;
foreach(byte b in output )
{
chkSum += b;
}
output.AddRange(Encoding.ASCII.GetBytes(chkSum.ToString("X")));
output.AddRange(Encoding.ASCII.GetBytes(Environment.NewLine));
return output.ToArray();
}
To get the bytes you need to send to the port use the method like this:
var sendData = BuildPacket("654","3");
myPort.Write(sendData, 0, sendData.Length); // BuildPacket already appends the newline
Having looked at your programming sheet, there seems to be an error:
The data byte shown is incorrect: the character '1' is NOT 33 in hex, it is 31. I believe the data byte should be 3 in your case. You can play around with this, but when I use the value 3 for the data byte I get the checksum they have calculated; when I use 1, I do not.
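That claim is easy to check numerically: the checksum is just the low byte of the sum of the ASCII codes, printed as two hex characters. A standalone sketch:

```csharp
using System;
using System.Text;

class ChecksumCheck
{
    // Low byte of the sum of the ASCII codes, as two hex characters
    public static string Checksum(string packet)
    {
        byte sum = 0;
        foreach (byte b in Encoding.ASCII.GetBytes(packet))
            sum += b;
        return sum.ToString("X2");
    }

    static void Main()
    {
        Console.WriteLine(Checksum("6543")); // D2 -> matches the sheet
        Console.WriteLine(Checksum("6541")); // D0 -> does not
    }
}
```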
Please note I don't have a COM port or your device to test this so your results may vary.
I read the manual for the device here.
I have this working, after a night of head-scratching.
The key is the checksum calculation. The code below connects to my IT-100 (running on an IP/serial adapter) and sends it both a "Poll" command and a "Status" command.
The hard part (which took me several hours of trial and error) was the bizarre checksum calculation. The algorithm, from the manual, is really strange:
Add all the ASCII Codes together
Truncate to 8 bits
Create a high and low nibble
Convert the values of the nibbles (0x01 -> 0x0F) to the ASCII code for the character.
For example, in step 4 above: 0x01 becomes '1', which is 0x31. Beyond stupid.
My complete code is below.
using System;
using System.Net.Sockets;
using System.Text;
namespace IT100TestDriver
{
class Program
{
static void Main(string[] args)
{
Socket s = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
var address = System.Net.IPAddress.Parse("192.168.1.250");
System.Net.EndPoint ep = new System.Net.IPEndPoint(address, 1024);
s.Connect(ep);
byte[] buffer = new byte[1024];
NetworkStream ns = new NetworkStream(s);
AsyncCallback callback = null;
callback = ar =>
{
int bytesRead = ns.EndRead(ar);
string fromIT100 = Encoding.ASCII.GetString(buffer, 0, bytesRead);
Console.Write(fromIT100);
ns.BeginRead(buffer, 0, 1024, callback, null);
};
Console.WriteLine("Connected to: {0}", ep.ToString());
ns.BeginRead(buffer, 0, 1024, callback, null);
while (true)
{
var ki = Console.ReadKey(true);
byte[] command = null;
switch (ki.KeyChar)
{
case 'p':
{
Console.WriteLine("Sending Ping");
command = Encoding.ASCII.GetBytes("000"); // ping
}
break;
case 's':
{
Console.WriteLine("Sending Status Request");
command = Encoding.ASCII.GetBytes("001"); // status request
}
break;
}
if (command != null)
{
byte[] crlf = { 0x0D, 0x0A };
ns.Write(command, 0, command.Length);
byte[] checksum = calculateChecksum(command);
ns.Write(checksum, 0, checksum.Length);
ns.Write(crlf, 0, crlf.Length);
}
}
}
private static byte[] calculateChecksum(byte[] dataToSend)
{
int sum = 0;
foreach(byte b in dataToSend)
{
sum += b;
}
int truncatedto8Bits = sum & 0x000000FF;
byte upperNibble = (byte)(((byte)(truncatedto8Bits & 0x000000F0)) >> 4);
byte lowerNibble = (byte) (truncatedto8Bits & 0x0000000F);
// value is 0x09, need to treat it as '9' and convert to ASCII (0x39)
byte upperNibbleAsAscii = (byte)nibbleToAscii(upperNibble);
byte lowerNibbleAsAscii = (byte)nibbleToAscii(lowerNibble);
return new byte[] { upperNibbleAsAscii, lowerNibbleAsAscii };
}
private static char nibbleToAscii(byte b)
{
switch (b)
{
case 0x00: return '0';
case 0x01: return '1';
case 0x02: return '2';
case 0x03: return '3';
case 0x04: return '4';
case 0x05: return '5';
case 0x06: return '6';
case 0x07: return '7';
case 0x08: return '8';
case 0x09: return '9';
case 0x0A: return 'A';
case 0x0B: return 'B';
case 0x0C: return 'C';
case 0x0D: return 'D';
case 0x0E: return 'E';
case 0x0F: return 'F';
default:
throw new ArgumentOutOfRangeException("Unknown Nibble");
}
}
}
}
Hi, I am using code from this link. Can you tell me why the signature verification is not working?
The Java signer uses BouncyCastleProvider with SHA1withRSA; here is the .NET verification code:
using System;
using System.IO;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using Org.BouncyCastle.Asn1;
using Org.BouncyCastle.Crypto;
using Org.BouncyCastle.Crypto.Parameters;
using Org.BouncyCastle.OpenSsl;
using Org.BouncyCastle.Security;
using Org.BouncyCastle.Utilities.Encoders;
namespace ConsoleApplication1
{
class Program
{
static void Main(string[] args)
{
string pubkey = @"MFwwDQYJKoZIhvcNAQEBBQADSwAwSAJBAMf54mcK3EYJn9tT9BhRoTX+8AkqojIyeSfog9ncYEye0VXyBULGg2lAQsDRt8lZsvPioORZW7eB6IKawshoWUsCAwEAAQ==";
String signature = "770bb2610bf6b2602ce2b3ad8489054f4ed59c9b0c9299327f76ecbc60a8bb9a725cfae901fc189d4bafcf73a2f4aed8dffe9842f7b6196ddfcd040c7271c7ca";
String signData = "C2:AE:D6:2B:DF:A4";
byte[] expectedSig = System.Convert.FromBase64String(signature);
byte[] baKey = System.Convert.FromBase64String(pubkey);
byte[] data = Encoding.UTF8.GetBytes(signData);
//Console.WriteLine(p.VerifyData(data, new SHA1CryptoServiceProvider(), expectedSig));
/* Init alg */
ISigner signer = SignerUtilities.GetSigner("SHA1withRSA");
/* Populate key */
signer.Init(false, DecodeX509PublicKey2(baKey));
/* Calculate the signature and see if it matches */
signer.BlockUpdate(data, 0, data.Length);
Console.WriteLine(signer.VerifySignature(expectedSig));
Console.In.ReadLine();
}
public static RsaKeyParameters DecodeX509PublicKey2(byte[] x509key)
{
byte[] SeqOID = { 0x2A, 0x86, 0x48, 0x86, 0xF7, 0x0D, 0x01, 0x01, 0x01 };
MemoryStream ms = new MemoryStream(x509key);
BinaryReader reader = new BinaryReader(ms);
if (reader.ReadByte() == 0x30)
ReadASNLength(reader); //skip the size
else
return null;
int identifierSize = 0; //total length of Object Identifier section
if (reader.ReadByte() == 0x30)
identifierSize = ReadASNLength(reader);
else
return null;
if (reader.ReadByte() == 0x06) //is the next element an object identifier?
{
int oidLength = ReadASNLength(reader);
byte[] oidBytes = new byte[oidLength];
reader.Read(oidBytes, 0, oidBytes.Length);
if (oidBytes.SequenceEqual(SeqOID) == false) //is the object identifier rsaEncryption PKCS#1?
return null;
int remainingBytes = identifierSize - 2 - oidBytes.Length;
reader.ReadBytes(remainingBytes);
}
if (reader.ReadByte() == 0x03) //is the next element a bit string?
{
ReadASNLength(reader); //skip the size
reader.ReadByte(); //skip unused bits indicator
if (reader.ReadByte() == 0x30)
{
ReadASNLength(reader); //skip the size
if (reader.ReadByte() == 0x02) //is it an integer?
{
int modulusSize = ReadASNLength(reader);
byte[] modulus = new byte[modulusSize];
reader.Read(modulus, 0, modulus.Length);
if (modulus[0] == 0x00) //strip off the first byte if it's 0
{
byte[] tempModulus = new byte[modulus.Length - 1];
Array.Copy(modulus, 1, tempModulus, 0, modulus.Length - 1);
modulus = tempModulus;
}
Array.Reverse(modulus); //convert to big-endian
if (reader.ReadByte() == 0x02) //is it an integer?
{
int exponentSize = ReadASNLength(reader);
byte[] exponent = new byte[exponentSize];
reader.Read(exponent, 0, exponent.Length);
Array.Reverse(exponent); //convert to big-endian
//RSAParameters RSAKeyInfo = new RSAParameters();
//RSAKeyInfo.Modulus = modulus;
//RSAKeyInfo.Exponent = exponent;
return MakeKey(BitConverter.ToString(modulus).Replace("-", string.Empty), BitConverter.ToString(exponent).Replace("-", string.Empty), false);
}
}
}
}
return null;
}
public static RsaKeyParameters MakeKey(String modulusHexString, String exponentHexString, bool isPrivateKey)
{
var modulus = new Org.BouncyCastle.Math.BigInteger(modulusHexString, 16);
var exponent = new Org.BouncyCastle.Math.BigInteger(exponentHexString, 16);
return new RsaKeyParameters(isPrivateKey, modulus, exponent);
}
public static RSACryptoServiceProvider DecodeX509PublicKey(byte[] x509key)
{
byte[] SeqOID = { 0x2A, 0x86, 0x48, 0x86, 0xF7, 0x0D, 0x01, 0x01, 0x01 };
MemoryStream ms = new MemoryStream(x509key);
BinaryReader reader = new BinaryReader(ms);
if (reader.ReadByte() == 0x30)
ReadASNLength(reader); //skip the size
else
return null;
int identifierSize = 0; //total length of Object Identifier section
if (reader.ReadByte() == 0x30)
identifierSize = ReadASNLength(reader);
else
return null;
if (reader.ReadByte() == 0x06) //is the next element an object identifier?
{
int oidLength = ReadASNLength(reader);
byte[] oidBytes = new byte[oidLength];
reader.Read(oidBytes, 0, oidBytes.Length);
if (oidBytes.SequenceEqual(SeqOID) == false) //is the object identifier rsaEncryption PKCS#1?
return null;
int remainingBytes = identifierSize - 2 - oidBytes.Length;
reader.ReadBytes(remainingBytes);
}
if (reader.ReadByte() == 0x03) //is the next element a bit string?
{
ReadASNLength(reader); //skip the size
reader.ReadByte(); //skip unused bits indicator
if (reader.ReadByte() == 0x30)
{
ReadASNLength(reader); //skip the size
if (reader.ReadByte() == 0x02) //is it an integer?
{
int modulusSize = ReadASNLength(reader);
byte[] modulus = new byte[modulusSize];
reader.Read(modulus, 0, modulus.Length);
if (modulus[0] == 0x00) //strip off the first byte if it's 0
{
byte[] tempModulus = new byte[modulus.Length - 1];
Array.Copy(modulus, 1, tempModulus, 0, modulus.Length - 1);
modulus = tempModulus;
}
Array.Reverse(modulus); //convert to big-endian
if (reader.ReadByte() == 0x02) //is it an integer?
{
int exponentSize = ReadASNLength(reader);
byte[] exponent = new byte[exponentSize];
reader.Read(exponent, 0, exponent.Length);
Array.Reverse(exponent); //convert to big-endian
RSACryptoServiceProvider RSA = new RSACryptoServiceProvider();
RSAParameters RSAKeyInfo = new RSAParameters();
RSAKeyInfo.Modulus = modulus;
RSAKeyInfo.Exponent = exponent;
RSA.ImportParameters(RSAKeyInfo);
return RSA;
}
}
}
}
return null;
}
public static int ReadASNLength(BinaryReader reader)
{
//Note: this method only reads lengths up to 4 bytes long as
//this is satisfactory for the majority of situations.
int length = reader.ReadByte();
if ((length & 0x00000080) == 0x00000080) //is the length greater than 1 byte
{
int count = length & 0x0000000f;
byte[] lengthBytes = new byte[4];
reader.Read(lengthBytes, 4 - count, count);
Array.Reverse(lengthBytes); //
length = BitConverter.ToInt32(lengthBytes, 0);
}
return length;
}
}
}
Java code used to sign signature data:
private static final java.security.Signature signer;
static final String transformation = "RSA/ECB/PKCS1Padding";
static {
try {
signer = java.security.Signature.getInstance("SHA1withRSA");
} catch (NoSuchAlgorithmException e) {
e.printStackTrace();
}
}
static String sign(String clearText) {
String signed = null;
try {
Security.addProvider(new org.bouncycastle.jce.provider.BouncyCastleProvider());
byte[] data = clearText.getBytes("UTF-8");
signer.initSign(getPrivateKey());
signer.update(data);
byte[] digitalSignature = signer.sign();
//--toHex
signed = org.apache.commons.codec.binary.Hex.encodeHexString(digitalSignature);
} catch (Exception e) {
e.printStackTrace();
}
return signed;
}
KeyPair generateKeyPair() {
KeyPair kp = null;
// Generate a key-pair
KeyPairGenerator kpg;
SecureRandom secureRandom;
try {
kpg = KeyPairGenerator.getInstance("RSA");
secureRandom = SecureRandom.getInstance("SHA1PRNG", "SUN");
secureRandom.setSeed(secureRandomSeed);
kpg.initialize(512, secureRandom);
kp = kpg.generateKeyPair();
} catch (Exception e) {
e.printStackTrace();
}
return kp;
}
Here is code in C# that signs and verifies:
static void test3()
{
AsymmetricCipherKeyPair keys = generateNewKeys();
/* Init alg */
ISigner sig = SignerUtilities.GetSigner("SHA1withRSA");
/* Populate key */
sig.Init(true, keys.Private);
/* Get the bytes to be signed from the string */
var bytes = Encoding.UTF8.GetBytes(signData);
/* Calc the signature */
sig.BlockUpdate(bytes, 0, bytes.Length);
byte[] signature = sig.GenerateSignature();
/* Base 64 encode the sig so its 8-bit clean */
var signedString = Convert.ToBase64String(signature);
Console.WriteLine(signedString);
string expectedSignature = signedString;
/* Init alg */
ISigner signer = SignerUtilities.GetSigner("SHA1withRSA");
/* Populate key */
signer.Init(false, keys.Public);
/* Get the signature into bytes */
var expectedSig = Convert.FromBase64String(expectedSignature);
/* Get the bytes to be signed from the string */
var msgBytes = Encoding.UTF8.GetBytes(signData);
/* Calculate the signature and see if it matches */
signer.BlockUpdate(msgBytes, 0, msgBytes.Length);
/*Verify*/
bool result= signer.VerifySignature(expectedSig);
Console.WriteLine(result);
}
There are a couple of problems here.
String signature = "770bb ... 1c7ca";
...
byte[] expectedSig = System.Convert.FromBase64String(signature);
You're Base64-decoding the signature, but it's not Base64 encoded; it's hex encoded.
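A hex string can be decoded by running Convert.ToByte over each two-character pair; FromHex here is a hypothetical helper name, not part of the original code:

```csharp
using System;
using System.Linq;

class HexDecode
{
    // Hypothetical helper: hex-encoded string -> byte array
    public static byte[] FromHex(string hex) =>
        Enumerable.Range(0, hex.Length / 2)
                  .Select(i => Convert.ToByte(hex.Substring(i * 2, 2), 16))
                  .ToArray();

    static void Main()
    {
        // First bytes of the posted signature
        byte[] expectedSig = FromHex("770bb261");
        Console.WriteLine(expectedSig[0]); // 119 (0x77)
    }
}
```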
The second problem is in the DecodeX509PublicKey methods (which admittedly is my mistake because I provided this code in another answer.) The specific problem lines are
Array.Reverse(modulus); //convert to big-endian
and
Array.Reverse(exponent); //convert to big-endian
I repeatedly read that ASN.1 and the .NET API use opposite endianness for their keys, and so I was under the impression that the endianness needed to be reversed to account for this. (I really should have done a test like your signature verification to be sure, rather than just looking at the key values in memory >.<) Regardless, remove these lines, fix the encoding problem, and your signature will verify properly (successfully tested using your sample data as well as my own).
Also, this line in your sign method isn't quite right:
Security.addProvider(new org.bouncycastle.jce.provider.BouncyCastleProvider());
By the time you get to that point in the code, the signer object has already been instantiated using the default provider. Also, you don't need to be adding the Bouncy Castle provider each time you want to sign some data, it will only actually add the provider the first time you make this call and will ignore it for all subsequent calls.
Further, the signer object is declared static, but your usage of it is not thread safe.
What you more likely want to do is add the provider in the static block and then instantiate the signer explicitly using the Bouncy Castle provider. If you don't explicitly specify Bouncy Castle as the provider (or add Bouncy Castle as the highest priority using insertProviderAt), the default provider will be used instead.
static {
try {
Security.addProvider(new org.bouncycastle.jce.provider.BouncyCastleProvider());
} catch (Exception e) {
e.printStackTrace();
}
}
...
String signed = null;
try {
java.security.Signature signer = java.security.Signature.getInstance("SHA1withRSA", "BC");
byte[] data = clearText.getBytes("UTF-8");
signer.initSign(getPrivateKey());
...
I'm writing some binary protocol messages in .Net using strings, and it mostly works, except for one particular case.
The message I'm trying to send is:
String cmdPacket = "\xFD\x0B\x16MBEPEXE1.";
myDevice.Write(Encoding.ASCII.GetBytes(cmdPacket));
(to help decode, those bytes are 253, 11, 22, then the ASCII chars: "MBEPEXE1.").
Except when I do the Encoding.ASCII.GetBytes, the 0xFD comes out as byte 0x3F
(value 253 changed to 63).
(I should point out that the \x0B and \x16 are interpreted correctly as Hex 0B & Hex 16)
I've also tried Encoding.UTF8 and Encoding.UTF7, to no avail.
I feel there is probably a good simple way to express values above 128 in Strings, and convert them to bytes, but I'm missing it.
Any guidance?
Setting aside whether what you are doing is good or bad: the ISO-8859-1 encoding maps each of its characters to the Unicode character with the same code.
// Bytes with all the possible values 0-255
var bytes = Enumerable.Range(0, 256).Select(p => (byte)p).ToArray();
// String containing the values
var all1bytechars = new string(bytes.Select(p => (char)p).ToArray());
// Sanity check
Debug.Assert(all1bytechars.Length == 256);
// The encoder, you could make it static readonly
var enc = Encoding.GetEncoding("ISO-8859-1"); // It is the codepage 28591
// string-to-bytes
var bytes2 = enc.GetBytes(all1bytechars);
// bytes-to-string
var all1bytechars2 = enc.GetString(bytes);
// check string-to-bytes
Debug.Assert(bytes.SequenceEqual(bytes2));
// check bytes-to-string
Debug.Assert(all1bytechars.SequenceEqual(all1bytechars2));
From the wiki:
ISO-8859-1 was incorporated as the first 256 code points of ISO/IEC 10646 and Unicode.
Or a simple and fast method to convert a string to a byte[] (with unchecked and checked variant)
public static byte[] StringToBytes(string str)
{
var bytes = new byte[str.Length];
for (int i = 0; i < str.Length; i++)
{
bytes[i] = checked((byte)str[i]); // Slower but throws OverflowException if there is an invalid character
//bytes[i] = unchecked((byte)str[i]); // Faster
}
return bytes;
}
ASCII is a 7-bit code. The high-order bit used to be used as a parity bit, so "ASCII" could have even, odd or no parity. You may notice that 0x3F (decimal 63) is the ASCII character ?. That is what non-ASCII octets (those greater than 0x7F/decimal 127) are converted to by the CLR's ASCII encoding. The reason is that there is no standard ASCII character representation of the code points in the range 0x80–0xFF.
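The substitution is easy to see directly; this sketch pushes "\xFD" through the ASCII encoder:

```csharp
using System;
using System.Text;

class AsciiSubstitution
{
    static void Main()
    {
        // U+00FD is outside 7-bit ASCII, so the encoder emits '?' (0x3F)
        byte[] encoded = Encoding.ASCII.GetBytes("\xFD");
        Console.WriteLine("0x{0:X2}", encoded[0]); // 0x3F
    }
}
```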
C# strings are UTF-16 encoded Unicode internally. If what you care about are the byte values of the strings, and you know that the strings are, in fact, characters whose Unicode code points are in the range U+0000 through U+00FF, then it's easy. Unicode's first 256 code points (0x00–0xFF), the Unicode blocks C0 Controls and Basic Latin (\x00-\x7F) and C1 Controls and Latin-1 Supplement (\x80-\xFF), are the "normal" ISO-8859-1 characters. A simple incantation like this:
String cmdPacket = "\xFD\x0B\x16MBEPEXE1.";
byte[] buffer = cmdPacket.Select(c=>(byte)c).ToArray() ;
myDevice.Write(buffer);
will get you the byte[] you want, in this case
// \xFD \x0B \x16 M B E P E X E 1 .
[ 0xFD , 0x0B , 0x16 , 0x4d , 0x42 , 0x45, 0x50 , 0x45 , 0x58 , 0x45 , 0x31 , 0x2E ]
With LINQ, you could do something like this:
String cmdPacket = "\xFD\x0B\x16MBEPEXE1.";
myDevice.Write(cmdPacket.Select(Convert.ToByte).ToArray());
Edit: Added an explanation
First, you recognize that your string is really just an array of characters. What you want is an "equivalent" array of bytes, where each byte corresponds to a character.
To get the array, you have to "map" each character of the original array as a byte in the new array. To do that, you can use the built-in System.Convert.ToByte(char) method.
Once you've described your mapping from characters to bytes, it's as simple as projecting the input string, through the mapping, into an array.
Hope that helps!
I use Windows-1252, as it seems to give the most bang for the byte, and it is compatible with all .NET string values.
You will probably want to comment out the ToLowerInvariant calls.
This was built for compatibility with SQL char (single byte).
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows;
namespace String1byte
{
/// <summary>
/// Interaction logic for MainWindow.xaml
/// </summary>
public partial class MainWindow : Window
{
public MainWindow()
{
InitializeComponent();
String8bit s1 = new String8bit("cat");
String8bit s2 = new String8bit("cat");
String8bit s3 = new String8bit("\xFD\x0B\x16MBEPEXE1.");
HashSet<String8bit> hs = new HashSet<String8bit>();
hs.Add(s1);
hs.Add(s2);
hs.Add(s3);
System.Diagnostics.Debug.WriteLine(hs.Count.ToString());
System.Diagnostics.Debug.WriteLine(s1.Value + " " + s1.GetHashCode().ToString());
System.Diagnostics.Debug.WriteLine(s2.Value + " " + s2.GetHashCode().ToString());
System.Diagnostics.Debug.WriteLine(s3.Value + " " + s3.GetHashCode().ToString());
System.Diagnostics.Debug.WriteLine(s1.Equals(s2).ToString());
System.Diagnostics.Debug.WriteLine(s1.Equals(s3).ToString());
System.Diagnostics.Debug.WriteLine(s1.MatchStart("ca").ToString());
System.Diagnostics.Debug.WriteLine(s3.MatchStart("ca").ToString());
}
}
public struct String8bit
{
private static Encoding EncodingUnicode = Encoding.Unicode;
private static Encoding EncodingWin1252 = System.Text.Encoding.GetEncoding("Windows-1252");
private byte[] bytes;
public override bool Equals(Object obj)
{
// Check for null values and compare run-time types.
if (obj == null) return false;
if (!(obj is String8bit)) return false;
String8bit comp = (String8bit)obj;
if (comp.Bytes.Length != this.Bytes.Length) return false;
for (Int32 i = 0; i < comp.Bytes.Length; i++)
{
if (comp.Bytes[i] != this.Bytes[i])
return false;
}
return true;
}
public override int GetHashCode()
{
UInt32 hash = (UInt32)(Bytes[0]);
for (Int32 i = 1; i < Bytes.Length; i++) hash = hash ^ (UInt32)(Bytes[i] << (i % 4) * 8);
return (Int32)hash;
}
public bool MatchStart(string start)
{
if (string.IsNullOrEmpty(start)) return false;
if (start.Length > this.Length) return false;
start = start.ToLowerInvariant(); // SQL is case insensitive
// Convert the string into a byte array
byte[] unicodeBytes = EncodingUnicode.GetBytes(start);
// Perform the conversion from one encoding to the other
byte[] win1252Bytes = Encoding.Convert(EncodingUnicode, EncodingWin1252, unicodeBytes);
for (Int32 i = 0; i < win1252Bytes.Length; i++) if (Bytes[i] != win1252Bytes[i]) return false;
return true;
}
public byte[] Bytes { get { return bytes; } }
public String Value { get { return EncodingWin1252.GetString(Bytes); } }
public Int32 Length { get { return Bytes.Count(); } }
public String8bit(string word)
{
word = word.ToLowerInvariant(); // SQL is case insensitive
// Convert the string into a byte array
byte[] unicodeBytes = EncodingUnicode.GetBytes(word);
// Perform the conversion from one encoding to the other
bytes = Encoding.Convert(EncodingUnicode, EncodingWin1252, unicodeBytes);
}
public String8bit(Byte[] win1252bytes)
{ // if reading from SQL char then read as System.Data.SqlTypes.SqlBytes
bytes = win1252bytes;
}
}
}
This question already has answers here:
Closed 12 years ago.
Possible Duplicate:
How do you convert Byte Array to Hexadecimal String, and vice versa, in C#?
For testing my encryption algorithm I have been provided with keys, plain text and their resulting cipher text.
The keys and plaintext are strings.
How do I convert them to a byte array?
From something like this: E8E9EAEBEDEEEFF0F2F3F4F5F7F8F9FA
To something like this:
byte[] key = new byte[16] { 0xE8, 0xE9, 0xEA, 0xEB, 0xED, 0xEE, 0xEF, 0xF0, 0xF2, 0xF3, 0xF4, 0xF5, 0xF7, 0xF8, 0xF9, 0xFA} ;
Thanks in advance :)
Do you need this?
static class HexStringConverter
{
public static byte[] ToByteArray(String HexString)
{
int NumberChars = HexString.Length;
byte[] bytes = new byte[NumberChars / 2];
for (int i = 0; i < NumberChars; i += 2)
{
bytes[i / 2] = Convert.ToByte(HexString.Substring(i, 2), 16);
}
return bytes;
}
}
Hope it helps.
Sample code from MSDN:
string hexValues = "48 65 6C 6C 6F 20 57 6F 72 6C 64 21";
string[] hexValuesSplit = hexValues.Split(' ');
foreach (String hex in hexValuesSplit)
{
// Convert the number expressed in base-16 to an integer.
int value = Convert.ToInt32(hex, 16);
// Get the character corresponding to the integral value.
string stringValue = Char.ConvertFromUtf32(value);
char charValue = (char)value;
Console.WriteLine("hexadecimal value = {0}, int value = {1}, char value = {2} or {3}", hex, value, stringValue, charValue);
}
You only have to change it to split the string on every two characters instead of on spaces.
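Adapted to a string without spaces, the same idea might look like this (a sketch):

```csharp
using System;

class HexPairs
{
    static void Main()
    {
        string hex = "E8E9EAEB";
        byte[] bytes = new byte[hex.Length / 2];
        // Take the string two characters at a time instead of splitting on spaces
        for (int i = 0; i < hex.Length; i += 2)
            bytes[i / 2] = Convert.ToByte(hex.Substring(i, 2), 16);
        Console.WriteLine(BitConverter.ToString(bytes)); // E8-E9-EA-EB
    }
}
```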
Did you mean this (going the other way, from bytes to a hex string)?
public static string ToHexString(byte[] Bytes)
{
    StringBuilder Result = new StringBuilder();
    string HexAlphabet = "0123456789ABCDEF";
    foreach (byte B in Bytes)
    {
        Result.Append(HexAlphabet[B >> 4]);
        Result.Append(HexAlphabet[B & 0xF]);
    }
    return Result.ToString();
}