Why is the endianness reversed after sending over TCP? (C#)

I have a client written in C# and a server written in Python. The messages that I send over the socket are 8 bytes followed by the data; the 8 bytes are the data length.
In C#, before sending, I convert the 8-byte data length to big-endian as shown:
public void Send(SSLMsg m)
{
    string json = m.Serialize();
    byte[] data = Encoding.ASCII.GetBytes(json);
    ulong dataLen = (ulong)data.Length;
    byte[] dataLenPacked = packIt(dataLen);
    Log("Sending " + dataLen + " " + json);
    sslStream.Write(dataLenPacked);
    sslStream.Write(data);
    sslStream.Flush();
}

private byte[] packIt(ulong n)
{
    byte[] bArr = BitConverter.GetBytes(n);
    if (BitConverter.IsLittleEndian)
        Array.Reverse(bArr, 0, 8);
    return bArr;
}
The message is sent successfully, and I am getting tied up in the Python server code, since the unpack format should be correct here, shouldn't it?
(length,) = unpack('>Q', data)
# len(data) is 8 here
# length is 1658170187863248538
Isn't '>' the big-endian format character? Why is my length so large?
UPDATE:
There was a bug where I was unpacking the wrong 8 bytes. That has been fixed; now that I am unpacking the correct data, I still have the same question.
(length,) = unpack('>Q', data)
# len(data) is 8 here
# length is 13330654897016668160L
The correct length is given only if I unpack using little-endian, even though I sent the bytes to the server using big-endian... so I am expecting >Q to work, but instead:
(length,) = unpack('<Q', data)
# len(data) is 8 here
# length is 185
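For reference, what packIt puts on the wire for a length of 185 can be checked with a quick standalone C# sketch (not part of the client above):

ulong n = 185;
byte[] b = BitConverter.GetBytes(n);         // on a little-endian machine: B9-00-00-00-00-00-00-00
if (BitConverter.IsLittleEndian)
    Array.Reverse(b, 0, 8);                  // big-endian wire order: 00-00-00-00-00-00-00-B9
Console.WriteLine(BitConverter.ToString(b)); // prints 00-00-00-00-00-00-00-B9

Unpacking those eight bytes with >Q yields 185, so if <Q is what works on the server, the server cannot be seeing these bytes in this order.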
Here is how I am receiving the bytes in Python:
while True:
    r, w, e = select.select(...)
    for c in r:
        if c == socket:
            connection_accept(c)
        else:
            # c is SSL-wrapped at this point
            read = 0
            data = []
            while read != 8:
                bytes = c.recv(min(8 - read, 8))
                read += len(bytes)
                data.append(bytes)
            joinedData = ''.join(data)
            # the below length is 13330654897016668160L
            # I am expecting it to be 185
            (length,) = unpack('>Q', joinedData)
            # the below length is 185; it should not be, however,
            # since the bytes were sent in big-endian
            (length,) = unpack('<Q', joinedData)

Something is wrong with your code:
length is 1658170187863248538
This is 1703010020BB4E9A in hex. This has nothing to do with a length of 8, no matter which endianness is involved. Instead it looks suspiciously like a TLS record:
17 - record type application data (decimal 23)
03 01 - protocol version TLS 1.0 (aka SSL 3.1)
00 20 - length of the following encrypted data (32 bytes)
..
Since, according to your code, you are doing SSL, there is probably something wrong in your receiver. My guess is that you read from the plain socket instead of the SSL socket, and thus read the encrypted data instead of the decrypted data.
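For illustration, a quick C# sketch that decodes the received value as a TLS record header (the byte values are taken from the hex dump above):

byte[] received = { 0x17, 0x03, 0x01, 0x00, 0x20, 0xBB, 0x4E, 0x9A };
byte recordType = received[0];                       // 0x17 = 23 = application data
int version = (received[1] << 8) | received[2];      // 0x0301 = TLS 1.0
int recordLength = (received[3] << 8) | received[4]; // 0x0020 = 32 bytes of ciphertext
Console.WriteLine("type={0}, version=0x{1:X4}, length={2}", recordType, version, recordLength);

If that decodes cleanly, the receiver is almost certainly reading raw TLS records rather than the decrypted stream.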

On the client side, when you write data to the stream, you're doing two Write calls:
sslStream.Write(dataLenPacked);
sslStream.Write(data);
sslStream.Flush();
MSDN says about NetworkStream.Write: "The Write method blocks until the requested number of bytes are sent or a SocketException is thrown." On the server side, there is no guarantee that you will receive all bytes in one receive call; it depends on the OS, the network driver and its configuration, and so on. So you have to handle this scenario. As I can see, you are handling it by reading 8 or fewer bytes at a time, but the socket.recv documentation recommends receiving in larger chunks. Here is my implementation of the server in Python. It creates a binary file in the current folder with the received bytes, which might be helpful for analyzing what went wrong. Use the -p/--port argument to set the listening port:
#!/usr/bin/env python
import sys, socket, io
import argparse
import struct

CHUNK_SIZE = 4096

def read_payload(connection, payload_len):
    recv_bytes = 0
    total_data = ""
    while recv_bytes < payload_len:
        data = connection.recv(CHUNK_SIZE)
        if not data:
            break
        total_data += data
        recv_bytes += len(data)
    if len(total_data) != payload_len:
        print >> sys.stderr, "-ERROR. Expected to read {0} bytes, but have read {1} bytes\n".format(payload_len, len(total_data))
    return total_data

def handle_connection(connection, addr):
    total_received = 0
    addrAsStr = "{0}:{1}".format(addr[0], addr[1])
    # write received bytes to a file for analysis
    filename = "{0}_{1}.bin".format(addr[0], addr[1])
    file = io.FileIO(filename, "w")
    print "Connection from {0}".format(addrAsStr)
    try:
        # loop handling data transfer for this particular connection
        while True:
            header = connection.recv(CHUNK_SIZE)
            header_len = len(header)
            total_received += header_len
            if header_len == 0:
                break
            if header_len < 8:
                print >> sys.stderr, "-ERROR. Received header with len {0} less than 8 bytes!\n".format(header_len)
                break
            print("Header len is {0} bytes".format(len(header)))
            # extract payload length - it's the first 8 bytes
            real_header = header[0:8]
            file.write(real_header)
            # more about unpack - https://docs.python.org/3/library/struct.html#module-struct
            # Byte order - network (= big-endian), type - unsigned long long (8 bytes)
            payload_len = struct.unpack("!Q", real_header)[0]
            print("Payload len is {0} bytes".format(payload_len))
            # extract any payload that arrived together with the header
            payload_in_header = header[8:] if header_len > 8 else ""
            if len(payload_in_header) > 0:
                print "Payload len in header is {0} bytes".format(len(payload_in_header))
                file.write(payload_in_header)
            # calculate what remains
            remains_payload_len = payload_len - len(payload_in_header)
            remains_payload = read_payload(connection, remains_payload_len)
            payload = payload_in_header + remains_payload
            print("Payload is '{0}'".format(payload))
            if remains_payload:
                file.write(remains_payload)
            else:
                break
            total_received += len(remains_payload)
    finally:
        file.close()
    return total_received

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('-p', '--port', required=True)
    args = parser.parse_args()
    # listen on a TCP socket on all interfaces
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(("0.0.0.0", int(args.port)))
    s.listen(1)
    # loop handling incoming connections
    while True:
        print "Waiting for a connection..."
        (connection, addr) = s.accept()
        addrAsStr = "{0}:{1}".format(addr[0], addr[1])
        try:
            total_received = handle_connection(connection, addr)
            print "Handled connection from {0}. Received: {1} bytes\n".format(addrAsStr, total_received)
        finally:
            # clean up the connection
            connection.close()

if __name__ == "__main__":
    main()
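Assuming the script is saved as server.py (the file name is arbitrary), it can be started with, for example:
python server.py --port 8080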
To make this example complete, here is the C# client. It uses one external library, Newtonsoft.Json, for serialization:
using Newtonsoft.Json;
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;

namespace SimpleTcpClient
{
    class SimpleTcpClient : IDisposable
    {
        readonly TcpClient _client;

        public SimpleTcpClient(string host, int port)
        {
            _client = new TcpClient(host, port);
        }

        public void Send(byte[] payload)
        {
            // Get the network byte order of the array length
            ulong length = (ulong)IPAddress.HostToNetworkOrder(payload.LongLength);
            var stream = _client.GetStream();
            // Write length
            stream.Write(BitConverter.GetBytes(length), 0, sizeof(long));
            // Write payload
            stream.Write(payload, 0, payload.Length);
            stream.Flush();
            Console.WriteLine("Have sent {0} bytes", sizeof(long) + payload.Length);
        }

        public void Dispose()
        {
            try { _client.Close(); }
            catch { }
        }
    }

    class Program
    {
        class DTO
        {
            public string Name { get; set; }
            public int Age { get; set; }
            public double Weight { get; set; }
            public double Height { get; set; }
            public string RawBase64 { get; set; }
        }

        static void Main(string[] args)
        {
            // Set server name/IP address
            string server = "192.168.1.101";
            // Set server port
            int port = 8080;
            string[] someNames = new string[]
            {
                "James", "David", "Christopher", "George", "Ronald",
                "John", "Richard", "Daniel", "Kennet", "Anthony",
                "Robert", "Charles", "Paul", "Steven", "Kevin",
                "Michae", "Joseph", "Mark", "Edward", "Jason",
                "Willia", "Thomas", "Donald", "Brian", "Jeff"
            };
            // Init random generator
            Random rnd = new Random(Environment.TickCount);
            int i = 1;
            while (true) {
                try {
                    using (var c = new SimpleTcpClient(server, port)) {
                        byte[] rawData = new byte[rnd.Next(16, 129)];
                        rnd.NextBytes(rawData);
                        // Create a random data transfer object
                        var d = new DTO() {
                            Name = someNames[rnd.Next(0, someNames.Length)],
                            Age = rnd.Next(10, 101),
                            Weight = rnd.Next(70, 101),
                            Height = rnd.Next(165, 200),
                            RawBase64 = Convert.ToBase64String(rawData)
                        };
                        // UTF-8 doesn't have endianness, so we can convert the JSON to a byte array and send it.
                        // More about it - https://stackoverflow.com/questions/3833693/isn-t-on-big-endian-machines-utf-8s-byte-order-different-than-on-little-endian
                        var bytes = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(d));
                        c.Send(bytes);
                    }
                }
                catch (Exception ex) {
                    Console.WriteLine("Got exception while sending: {0}\n", ex);
                }
                Thread.Sleep(200);
                i++;
            }
        }
    }
}

According to https://msdn.microsoft.com/en-us/library/z78xtwts(v=vs.110).aspx, you are reversing 8 elements starting at index 0 when you invoke:
if (BitConverter.IsLittleEndian)
    Array.Reverse(bArr, 0, 8);
and according to https://www.displayfusion.com/Discussions/View/converting-c-data-types-to-c/?ID=38db6001-45e5-41a3-ab39-8004450204b3 a ulong in C# is exactly 8 bytes, so the reversal covers the whole array.
I don't think that this is necessarily an answer, but maybe it's a clue?
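For illustration, a minimal sketch of what that overload actually does (the array contents are arbitrary):

byte[] a = { 1, 2, 3, 4, 5, 6, 7, 8 };
Array.Reverse(a, 0, 8);                 // reverses 8 elements starting at index 0
Console.WriteLine(string.Join(",", a)); // prints 8,7,6,5,4,3,2,1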

Related

TCP c# Getting specific protocol information from a byte array

When I receive a connection, I get sent a sort of "Connect Request" message with that connection, and in that byte array I have 2 headers and then the data: one header that is 16 bytes and one header that is 48 bytes.
But it seems that I am doing something wrong here. I can read the data I receive just fine, but when trying to translate the headers over, something seems to be wrong.
I have been given documentation saying that in the first header the length of the request is saved at "byte index 2 with a byte length of 2", so bytes 2 and 3 of the array. I also know how each bit should function: bits 10-15 are each set to 0, and bits 0-9 contain the actual length count.
In my example I receive the data, split it up so I have my 2 headers as their own arrays, and try to look at the data. I tried to convert the bytes into an int for the length, but that made no sense, so I even split up the 2 bytes to look at them, but they return data that does not correspond with what I'm told: "Byte nr 2 returns 00000000" and "Byte nr 3 returns 00010101".
Here's my code. I hope someone can tell me where I've gone wrong; I'm certainly getting some data, since I can read the data part of the message without issue.
public static void StartData(TcpListener listener)
{
    while (true)
    {
        TcpClient client = listener.AcceptTcpClient();
        Console.WriteLine("Client accepted." + listener.Pending());
        NetworkStream stream = client.GetStream();
        StreamWriter sw = new StreamWriter(client.GetStream());
        byte[] buffer = new byte[client.ReceiveBufferSize];
        int bytesRead = stream.Read(buffer, 0, client.ReceiveBufferSize);
        byte[] header = new byte[16];
        byte[] encHeader = new byte[48];
        for (int i = 0; i < 63; i++)
        {
            if (i <= 15)
            {
                Console.WriteLine("added to header " + i);
                header[i] = buffer[i];
            }
            else
            {
                Console.WriteLine("added to headerEnc " + i);
                encHeader[i - 15] = buffer[i - 15];
            }
        }
        Console.WriteLine("Byte nr 2 " + Convert.ToString(header[2], 2).PadLeft(8, '0') + " Byte nr 3 " + Convert.ToString(header[3], 2).PadLeft(8, '0'));
        // Byte nr 2 00000000 Byte nr 3 00010101
        int dataLength = BitConverter.ToInt32(header, 2);
        Console.WriteLine("Data lenght int is " + dataLength);
        // result for data length is 790959360
    }
}
As Jereon van Langen commented, it was indeed because the length field is big-endian.
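For reference, a minimal sketch of reading that length field directly (it assumes, per the documentation quoted in the question, a big-endian 16-bit field in bytes 2 and 3 whose bits 10-15 are zero):

// header[2] is the high byte, header[3] the low byte (big-endian).
int dataLength = (header[2] << 8) | header[3];
dataLength &= 0x03FF; // keep only bits 0-9, which carry the length count
// For the observed bytes (00000000, 00010101) this yields 0x15 = 21.

BitConverter.ToInt32(header, 2) fails here both because it reads 4 bytes instead of 2 and because it assumes the machine's native (usually little-endian) byte order.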

How to parse binary data transmitted over GPIB with IVI library

I am working with a Fluke 8588 and communicating with it using Ivi.Visa.Interop. I am trying to use the digitizer function to get a large number of samples of a 5 V 1 kHz sine wave. To improve the transfer time of the data, the manual mentions a setting for using a binary packed data format. It provides 2- and 4-byte packing.
This is the smallest example I could put together:
using System;
using System.Threading;
using Ivi.Visa.Interop;

namespace Example
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Initiallizing Equipment");
            int timeOut = 3000;
            string resourceName = "GPIB0::1::INSTR";
            ResourceManager rm = new ResourceManager();
            FormattedIO488 fluke8588 = new FormattedIO488
            {
                IO = (IMessage)rm.Open(resourceName, AccessMode.NO_LOCK, timeOut)
            };
            Console.WriteLine("Starting Setup");
            fluke8588.WriteString("FORMAT:DATA PACKED,4");
            fluke8588.WriteString("TRIGGER:COUNT 100000");
            Console.WriteLine("Initiate Readings");
            fluke8588.WriteString("INITIATE:IMMEDIATE");
            Thread.Sleep(3000);
            Console.WriteLine("Readings Complete");
            Console.WriteLine("Fetching Reading");
            fluke8588.WriteString("FETCH?");
            string response = fluke8588.ReadString();
            Byte[] bytes = System.Text.Encoding.ASCII.GetBytes(response);
            fluke8588.WriteString("FORMAT:DATA:SCALE?");
            double scale = Convert.ToDouble(fluke8588.ReadString());
            int parityMask = 0x8;
            for (int i = 0; i < 100000; i += 4)
            {
                int raw = (int)((bytes[i] << 24) | (bytes[i + 1] << 16) | (bytes[i + 2] << 8) | (bytes[i + 3]));
                int parity = (parityMask & bytes[i]) == parityMask ? -1 : 1;
                int number = raw;
                if (parity == -1)
                {
                    number = ~raw * parity;
                }
                Console.WriteLine(number * scale);
            }
            Console.Read();
        }
    }
}
The resulting data looks like this: (screenshot omitted)
I performed the steps "manually" using a tool called NI Max: I get a header followed by the 10 4-byte integers, ending with a newline character. The negative integers are two's complement, which was not specified in the manual but which I was able to determine once I had enough samples.
TRIGGER:COUNT was only set to 10 at the time this image was taken.
How can I get this result in C#?
I found that I was using the wrong encoding: changing from System.Text.Encoding.ASCII.GetBytes(response) to
System.Text.Encoding encoding = System.Text.Encoding.GetEncoding(1252);
Byte[] bytes = encoding.GetBytes(response);
got the desired result.
That said, I also learned there is an alternative to FormattedIO488.ReadString for binary data: FormattedIO488.ReadIEEEBlock(IEEEBinaryType.BinaryType_I4) returns an array of integers and requires no extra effort twiddling bits. This is the solution I would suggest.
using System;
using System.Linq;
using Ivi.Visa.Interop;
using System.Threading;
using System.Collections.Generic;

namespace Example
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Initiallizing Equipment");
            int timeOut = 3000;
            string resourceName = "GPIB0::1::INSTR";
            ResourceManager rm = new ResourceManager();
            FormattedIO488 fluke8588 = new FormattedIO488
            {
                IO = (IMessage)rm.Open(resourceName, AccessMode.NO_LOCK, timeOut)
            };
            Console.WriteLine("Starting Setup");
            fluke8588.WriteString("FORMAT:DATA PACKED,4");
            fluke8588.WriteString("TRIGGER:COUNT 100000");
            Console.WriteLine("Initiate Readings");
            fluke8588.WriteString("INITIATE:IMMEDIATE");
            Thread.Sleep(3000);
            Console.WriteLine("Readings Complete");
            Console.WriteLine("Fetching Reading");
            fluke8588.WriteString("FETCH?");
            List<int> response = new List<int>(fluke8588.ReadIEEEBlock(IEEEBinaryType.BinaryType_I4));
            fluke8588.WriteString("FORMAT:DATA:SCALE?");
            double scale = Convert.ToDouble(fluke8588.ReadString());
            foreach (var value in response.Select(i => i * scale).ToList())
            {
                Console.WriteLine(value);
            }
            Console.Read();
        }
    }
}
The resulting data looks like: (screenshot omitted)

Read arduino Serial.Write with C#

I have an Arduino that sends information read from its analog pins over the serial port. However, in the Arduino code (which I cannot modify), Serial.write() is used instead of Serial.print() to send a buffer of chars. As a consequence, if my C# software reads the data with a "simple" ReadLine(), the data is incomprehensible. How can I read this type of data in C#?
This is the Arduino code:
#include <compat/deprecated.h>
#include <FlexiTimer2.h>

#define TIMER2VAL (1024/256) // 256Hz - frequency

volatile unsigned char myBuff[8];
volatile unsigned char c=0;
volatile unsigned int myRead=0;
volatile unsigned char mych=0;
volatile unsigned char i;

void setup() {
  pinMode(9, OUTPUT);
  noInterrupts();
  myBuff[0] = 0xa5; //START 0
  myBuff[1] = 0x5a; //START 1
  myBuff[2] = 2;    //myInformation
  myBuff[3] = 0;    //COUNTER
  myBuff[4] = 0x02; //CH1 HB
  myBuff[5] = 0x00; //CH1 LB
  myBuff[6] = 0x02; //CH2 HB
  myBuff[7] = 0x00; //CH2 LB
  myBuff[8] = 0x01; //END
  FlexiTimer2::set(TIMER2VAL, Timer2);
  FlexiTimer2::start();
  Serial.begin(57600);
  interrupts();
}

void Timer2()
{
  for(mych=0;mych<2;mych++){
    myRead = analogRead(mych);
    myBuff[4+mych] = ((unsigned char)((myRead & 0xFF00) >> 8)); // Write HB
    myBuff[5+mych] = ((unsigned char)(myRead & 0x00FF));        // Write LB
  }
  // SEND
  for(i=0;i<8;i++){
    Serial.write(myBuff[i]);
  }
  myBuff[3]++;
}

void loop() {
  __asm__ __volatile__ ("sleep");
}
And this is the C# method that reads from the serial port:
public void StartRead()
{
    msp.Open(); // Open the serial port
    while (!t_suspend)
    {
        i++;
        String r = msp.ReadLine();
        Console.WriteLine(i + ": " + r);
    }
}
EDIT: I would like as output an array of strings that corresponds to the Arduino output. If I record everything as an array of bytes, I don't have the information about the start and the end of the array.
I can edit the code as:
public void StartRead()
{
    msp.Open(); // Open the serial port
    ASCIIEncoding ascii = new ASCIIEncoding();
    while (!t_suspend)
    {
        i++;
        int r = msp.ReadByte();
        String s = ascii.GetString((byte)r); // here there is an error: it requires a byte[] array, not a single byte
        Console.WriteLine(i + ": " + r);
    }
}
How can I get the same Arduino array values (but as a string) in my C# software, considering that the start value is always 0xa5 and the end is 0x01?
Arduino sends a telegram of several bytes. You can read it into a byte array:
byte[] telegram = new byte[msp.BytesToRead];
msp.Read(telegram, 0, msp.BytesToRead);
To get the data from the byte array you have to interpret the bytes (see the example below).
Of course you could also create a string from the properties of the Telegram class:
class Telegram {
    public Telegram(byte[] tel) {
        // Check start bytes ( 0xa5, 0x5a );
        Info = tel[2];
        Counter = tel[3];
        Channel1 = BitConverter.ToInt16(new byte[] { tel[5], tel[4] }, 0); // Switch lo/hi byte
        Channel2 = BitConverter.ToInt16(new byte[] { tel[7], tel[6] }, 0); // Switch lo/hi byte
        // check tel[8] == 1 for end of telegram
    }

    public int Info { get; private set; }
    public int Counter { get; private set; }
    public int Channel1 { get; private set; }
    public int Channel2 { get; private set; }
}
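A hypothetical usage sketch (it assumes a complete 9-byte telegram, from the 0xa5/0x5a start bytes through the 0x01 end byte, is available on the port, and that msp is the SerialPort from the question):

// Read exactly 9 bytes: a5, 5a, info, counter, ch1 hi/lo, ch2 hi/lo, end.
byte[] raw = new byte[9];
int read = 0;
while (read < raw.Length)
    read += msp.Read(raw, read, raw.Length - read);

var t = new Telegram(raw);
// Build the string representation the question asks for.
Console.WriteLine(string.Format("info={0} counter={1} ch1={2} ch2={3}",
    t.Info, t.Counter, t.Channel1, t.Channel2));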

Transferring Base64 string from C# to Python and vice versa

I'm building a project that includes file transfer through sockets. My server is written in Python and my client in C#. Since Python isn't very friendly when it comes to charsets, I transfer a file uploaded from the client by converting it into base64 and decoding it in Python. That works perfectly. For some reason, when I do the opposite, encoding the text in Python and decoding it after transferring, it throws errors.
Have a look -
This is the correct sending from client to server:
List<byte> b = Encoding.ASCII.GetBytes(Convert.ToBase64String(cont)).ToList();
int size = Encoding.ASCII.GetByteCount(st) + b.Count;
string pack = size + ":" + st;
buffer = Encoding.ASCII.GetBytes(pack);
List<byte> a = buffer.ToList();
a.AddRange(b);
connection.Send(a.ToArray());
And the Python side:
base64.b64decode(params[2])
And this works.
When I do the opposite (with the same libraries), it produces an error:
string res = SendRecv("1?" + basepath + v[0]);
res = res.Remove(res.Length - 1).Substring(1);//because it is sent quoted
byte[] converted = Convert.FromBase64String(res.Replace(" ",String.Empty));
saved.Write(converted, 0, converted.Length);
saved.Close();
The SendRecv (send and receive) methods:
private void Send(string st)
{
    int size = Encoding.ASCII.GetByteCount(st);
    string pack = size + ":" + st;
    buffer = Encoding.ASCII.GetBytes(pack);
    connection.Send(buffer);
}

private string Recv()
{
    try
    {
        buffer = new byte[2];
        connection.Receive(buffer, 2, SocketFlags.Partial);
        string header = Encoding.ASCII.GetString(buffer, 0, 2);
        while (!header.Contains(":"))
        {
            connection.Receive(buffer, 2, SocketFlags.Partial);
            header += Encoding.ASCII.GetString(buffer, 0, 2);
        }
        int size = int.Parse(header.Split(':')[0]);
        string mes0 = header.Split(':')[1];
        buffer = new byte[size];
        int b = 0;
        int s = (size >= 2048) ? 2048 : size;
        while (size - s > 0)
        {
            connection.Receive(buffer, b, s, SocketFlags.None);
            size -= s;
            s = (size >= 2048) ? 2048 : size;
            b += s;
        }
        connection.Receive(buffer, size, SocketFlags.None);
        string fullmes = mes0 + Encoding.ASCII.GetString(buffer);
        return fullmes;
    }
    catch (Exception e)
    {
        MessageBox.Show(e.ToString());
    }
    return "";
}

private string SendRecv(string a)
{
    Send(a);
    return Recv();
}
Python:
return base64.b64encode(self.finalResult.getContent())
And it throws this exception:
The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or a non-white space character among the padding characters

BluetoothLEAdvertisementDataSection (ArgumentException)

I'm trying to advertise an Eddystone beacon, but my code fails at advertisementData.Data with an ArgumentException:
Value does not fall within the expected range.
Any ideas as to what's happening?
// ...
using (var memoryStream = new MemoryStream())
{
    byte messageLengthByte = Convert.ToByte(message.Length);
    memoryStream.WriteByte(messageLengthByte);
    memoryStream.Write(message, 0, message.Length);
    fullMessage = memoryStream.ToArray();
}

while (fullMessage.Length < 32)
{
    byte[] newArray = new byte[fullMessage.Length + 1];
    fullMessage.CopyTo(newArray, 0);
    newArray[fullMessage.Length] = 0x00;
    fullMessage = newArray;
}

var writer = new DataWriter();
writer.WriteBytes(fullMessage);

var advertisementData = new BluetoothLEAdvertisementDataSection();
advertisementData.Data = writer.DetachBuffer(); // Error!

publisher.Advertisement.DataSections.Add(advertisementData);
publisher.Start();
Most likely you're trying to fit in more bytes than the BLE packet allows. The max size is 32 bytes, but that includes:
3 bytes for the "flags" data section, which I believe is mandatory and might be set automatically by the Windows 10 BLE API
for each additional section, 1 byte for the length of the section and 1 byte for the type of the section
If you only broadcast a single section, that leaves you with 27 bytes for that section's actual payload.
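A small sketch of that byte accounting (a back-of-the-envelope check mirroring the numbers above, not a real API call):

const int maxPacket = 32;        // total advertisement budget
const int flagsSection = 3;      // mandatory "flags" data section
const int sectionOverhead = 2;   // 1 length byte + 1 type byte per extra section
int maxPayload = maxPacket - flagsSection - sectionOverhead; // = 27

if (fullMessage.Length > maxPayload)
    Console.WriteLine("Payload is {0} bytes; only {1} fit.", fullMessage.Length, maxPayload);

Since the code above pads fullMessage up to 32 bytes, it is guaranteed to exceed that 27-byte budget, which matches the ArgumentException.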
