WinPcap equivalent to Wireshark 'dtls' filter - C#

I used to filter packets in Wireshark with the simple dtls argument as a filter. (DTLS is Datagram Transport Layer Security, essentially TLS over UDP.)
Now I want to do the same using C# and the Pcap.Net wrapper, which uses WinPcap filters.
Sadly, I can't find the equivalent anywhere: dtls is not recognised in the C# app, so it no longer captures any packets. (It simply crashes the interpreter, since the string is not recognised.)
using (PacketCommunicator communicator = selectedDevice.Open(65536, PacketDeviceOpenAttributes.None, 1000))
{
    using (BerkeleyPacketFilter filter = communicator.CreateFilter("dtls"))
    {
        communicator.SetFilter(filter);
        communicator.ReceivePackets(1, packetHandler);
    }
}
Is there any equivalent, please?
EDIT: It looks like dtls is only a DISPLAY filter, not a CAPTURE one. I could only capture-filter with udp port xx (xx being the port), but since the ports used are always random, I can't. So I would be glad to find another filtering workaround if you have one! I would rather capture only the desired packets than capture everything and then filter the data...
(Screenshot: Wireshark capture showing the DTLS packets.)
There are only two packets I would like to capture: the one containing the Server Hello Done message, or the one containing the handshake message (the one with the Record Layer):
EDIT 2: OK, I am close to finding what I need, but I need your help.
This answer from here must be the solution. tcp[((tcp[12] & 0xf0) >> 2)] = 0x16 looks for handshake type 22 (0x16), but DTLS runs over UDP, not TCP, so the offset 12 trick may not carry over. Can anyone help me figure out the correct formula to adapt it for DTLS instead of TCP TLS?
I tried to use this in Wireshark, but the filter is invalid and I don't really know why. If you could at least get it to work in Wireshark, I could experiment with different values myself and come back with a final answer. udp[((udp[12] & 0xf0) >> 2)] = 0x16 is not a valid filter in Wireshark.

So, I gave up on the dynamic way of finding the correct position of the data.
But this is what I ended up with:
using (PacketCommunicator communicator = selectedDevice.Open(65536, PacketDeviceOpenAttributes.None, 1000))
{
    using (BerkeleyPacketFilter filter = communicator.CreateFilter("udp && ((ether[62:4] = 0x16fefd00) || (ether[42:4] = 0x16fefd00))"))
    {
        communicator.SetFilter(filter);
        communicator.ReceivePackets(1, packetHandler);
    }
}
ether[62:4] is the position of 16fefd00 for IPv6 packets (14-byte Ethernet header + 40-byte IPv6 header + 8-byte UDP header = 62; with a 20-byte IPv4 header it is 42).
The 16 is the content type for the handshake protocol (22), and the following fefd is the DTLS 1.2 version field. The trailing 00 is only there because byte slices can be 1, 2 or 4 bytes long, not 3, so I had to take the extra byte into account.
This is absolutely not perfect, I know, but for now it works, since I couldn't find any other workaround yet.
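Addendum: for IPv4 there may be a simpler capture filter that avoids hard-coded Ethernet offsets. The tcp[12] trick above only exists to compute TCP's variable header length; a UDP header is always 8 bytes, so the DTLS content type is simply the first payload byte at a fixed offset. A minimal sketch (untested; note that libpcap's udp[...] indexing only follows IPv4, which is presumably why the ether[...] form is needed for IPv6):

using (PacketCommunicator communicator = selectedDevice.Open(65536, PacketDeviceOpenAttributes.None, 1000))
{
    // udp[8] is the first byte of the UDP payload; 22 (0x16) is the
    // handshake content type in a DTLS record header.
    using (BerkeleyPacketFilter filter = communicator.CreateFilter("udp[8] = 22"))
    {
        communicator.SetFilter(filter);
        communicator.ReceivePackets(1, packetHandler);
    }
}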

Related

Understanding and creating a protocol for Python

I have documentation on how to use a TCP/IP binary stream API. The attached images show the protocol. I have a working example of this in C#. I want to do this using Python instead, as I don't know C# or Windows.
I am assuming I would use Python sockets, and then send the API messages with payloads based on the docs.
Could you point me in the right direction to get this going with Python?
How would I make it know this is the authentication message (0x21), compress the data, etc.?
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
s.sendall(u"{'username':'username', 'password':'password'}".encode('utf8'))
In the OSI model you are above layer 4 (transport, TCP/IP, ...), on at least layer 5 (session). There are some examples of implementations of session-layer protocols like HTTP, FTP, ..., e.g. in http://github.com/python-git/python/blob/master/Lib/ftplib.py or http://github.com/python-git/python/blob/master/Lib/httplib.py
As your protocol includes headers, maybe HTTP is the better example.
To use the TCP/IP API in the HTTP protocol, see:
http://www.stackoverflow.com/questions/8315209/sending-http-headers-with-python
import socket
sock = socket.socket()
sock.bind(('', 8080))
sock.listen(5)
client, address = sock.accept()
print "Incoming:", address
print client.recv(1024)
print
client.send('HTTP/1.0 200 OK\r\n')
client.send("Content-Type: text/html\r\n\r\n")
client.send('<html><body><h1>Hello World</h1></body></html>')
client.close()
print "Answering ..."
print "Finished."
sock.close()
As far as I can see, you skipped the headers (version, sequence, type, encoding, ...) completely in your code; you have to add them whenever you send a frame.
So try:
self.socket.send(...headers...)
self.socket.send(u"{'username':'username', 'password':'password'}".encode('utf8'))  # has to be sent as JSON?
see also http://www.stackoverflow.com/questions/22083359/send-tetx-http-over-python-socket
FTP example (no headers...):
# Internal: send one line to the server, appending CRLF
def putline(self, line):
    line = line + CRLF
    if self.debugging > 1: print '*put*', self.sanitize(line)
    self.sock.sendall(line)
Also see scapy.
I tried using a raw TCP protocol once. You can send the authorization values (username, password) in every request, together with the other data, e.g. {'username':'username', 'password':'password','data':'value'}. There is no universal standard for the data format. Many TCP clients send the data like this: #A#:username;password;#D:value\n (A for authorization, D for data), for example.

C# SerialPort: how to send "0 byte" transfers (AKA ZLP: zero length packet)?

I've done some playing around with SerialPort (frustratingly so) and have finally hit a point where I absolutely have no idea why this isn't working. There's a USB CDC device that I'm trying to send hex commands to, over the COM port interface it exposes. I can handshake with the device: when I say HI it replies with HI back. But then I send another command which must be followed by a zero-byte packet, or else the device stops responding altogether. Keep in mind, this zero-byte packet has ABSOLUTELY nothing in it: no \0, no 0x00, no 0, not even a null (SerialPort throws an exception on null).
Now, one way I was able to circumvent this was to use LibUsbDotNet. I accessed the CDC device directly instead of via the COM interface, set the endpoints correctly, and sent hex commands that way. I'm able to successfully send "0 byte" packets using this method with the following C# code:
string zlpstring = "";
byte[] zlpbyte = Encoding.Default.GetBytes(zlpstring);
....snip
ecWrite = writer.SubmitAsyncTransfer(zlpbyte, 0, zlpbyte.Length, 100, out usbWriteTransfer);
zlpbyte is the buffer, 0 is the offset, zlpbyte.Length is the packet length in bytes, 100 is the timeout, and out usbWriteTransfer is the transfer context.
When I use this same method on the COM port:
string zlpstring = "";
byte[] zlpbyte = Encoding.Default.GetBytes(zlpstring);
_serialPort.Write(zlpbyte, 0, zlpbyte.Length);
the USB logger reports that absolutely nothing was sent. It's as if the COM port is ignoring the zero-byte transfer. Before anyone says "you cannot do this": there are various programs out there that can send a zero-byte packet to this exact device's COM port without doing ANY driver manipulation. That is what I'm going for, which is why I'm trying to ditch LibUsbDotNet and go straight to the COM port.
EDIT:
After some more toying around and a different USB logger, I don't find zero bytes being sent, but rather this:
IRP_MJ_DEVICE_CONTROL (IOCTL_SERIAL_WAIT_ON_MASK)
I think this may be the issue. If a 0 byte were being sent, then I assume it would show up as:
IRP_MJ_WRITE > UP > STATUS_SUCCESS > (blank) > (blank)
My program is sending back a response of 01 00 00 00; however, while logging another, successful program, it is SETTING the wait mask:
IRP_MJ_DEVICE_CONTROL (IOCTL_SERIAL_SET_WAIT_MASK) DOWN STATUS_SUCCESS 01 00 00 00
If my assumptions are right, this question might've just turned into: how do I set a serial/COM port's wait mask? There's absolutely nothing about this in the C# SerialPort class... which is why I can now see why so many articles called it "lacking". I also took a look around C++: https://msdn.microsoft.com/en-us/library/aa363214(v=vs.85).aspx — this also does not seem to cover the wait mask. Using the USB filter libusb is starting to look a lot more pleasing by the minute... (although I'm going to question myself forever why sending a zero byte works there but not over SerialPort).
SECOND EDIT:
I'm a moron. It was definitely a setting that the manufacturer probably figured no one would ever touch, nor know how to set:
#define EV_RXCHAR 0x0001
SetCommMask(hSerial, EV_RXCHAR);
I then saw this over the USB logs:
IRP_MJ_DEVICE_CONTROL (IOCTL_SERIAL_SET_WAIT_MASK) DOWN STATUS_SUCCESS 01 00 00 00
Bingo. The wait mask was originally set to 0x0002 (EV_RXFLAG). I couldn't find a way to change this from C# yet, so I had to make do with some C++ code for now. It totally works, and the device gets its "zero byte" like it's supposed to without me actually sending one from the code. This setting, I assume, was the "handshake" method between my device and whatever else it interacts with in flash mode. Hope this helps someone else out there whose COM/serial device is rejecting/discarding zero-byte packets yet requiring ZLP at the same time... how goofy?!
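For completeness, a hedged C# sketch of the same call (untested): SerialPort does not expose the comm event mask, but kernel32's SetCommMask can be invoked on the port's underlying handle. The _handle field name is an internal implementation detail of the .NET Framework's SerialStream and may change between versions:

using System.IO.Ports;
using System.Reflection;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

static class CommMaskHack
{
    const uint EV_RXCHAR = 0x0001; // same value the C++ fix used

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetCommMask(SafeFileHandle hFile, uint dwEvtMask);

    // Fishes the Win32 handle out of SerialPort's BaseStream by reflection,
    // then sets the event mask on it. Call this after port.Open().
    public static void SetRxCharMask(SerialPort port)
    {
        FieldInfo handleField = port.BaseStream.GetType()
            .GetField("_handle", BindingFlags.NonPublic | BindingFlags.Instance);
        var handle = (SafeFileHandle)handleField.GetValue(port.BaseStream);
        if (!SetCommMask(handle, EV_RXCHAR))
            throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
    }
}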
Have you tried concatenating an extra new line or carriage return (or both) at the end of the data?
I would say add a 0x0A (new line), a 0x0D (carriage return), or both to the end of your byte array and see if you get something.
byte[] zlpbyte = new byte[1] {0};
_serialPort.Write(zlpbyte, 0, 1);
[Update]
Based on our discussions, it appears that you are trying to take control of the serial port's control signals. I have not tried it before, but I can see that it is possible (if I understand the source properly) to set the control signals into certain states.
Try to set the Handshake property:
public enum Handshake
{
None,
XOnXOff,
RequestToSend,
RequestToSendXOnXOff,
}
I am not sure exactly how it affects the IOCTL settings, but I believe it should be able to affect them somehow.
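For example (illustrative only; whether any of these values helps depends on the device):

_serialPort.Handshake = Handshake.RequestToSend; // or Handshake.None, etc.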

Unable to transfer files from Windows to Linux via Bluetooth using 32feet.Net C# library

First, the Linux listening service is started using the following command:
obexpushd -B[00:15:83:3D:0A:57]:9 -d -o /home/myfolder
On Windows, the following code is used to perform the OBEX transfer:
InTheHand.Net.BluetoothAddress address = peerDevice.DeviceAddress;
System.Uri uri = new Uri("obex://" + address.ToString() + "/" + srcfile.Name);
request = new ObexWebRequest(uri);
startcopy = DateTime.Now;
request.ReadFile(file); // this performs the file read from the hard drive
try
{
    response = (ObexWebResponse)request.GetResponse(); // here file should be pushed to the listening service
}
catch (System.InvalidOperationException ex)
{
    if (response != null)
    {
        response.Close();
    }
    return;
}
The devices see each other, and their OBEX services are visible as well.
The transfer seems to be successful, but no data is actually transferred.
The code works between Windows and Windows without a problem.
The obexpushd process output shows:
obexpushd 0.10.2 Copyright (C) 2006-2010 Hendrik Sattler
This software comes with ABSOLUTELY NO WARRANTY.
This is free software, and you are welcome to redistribute it
under certain conditions.
Listening on bluetooth/[00:15:83:3D:0A:57]:9
OBEX_EV_ACCEPTHINT, OBEX_CMD_CONNECT
0: Connection from "bluetooth/[00:09:DD:50:94:0B]:9"
0: OBEX_EV_REQHINT, OBEX_CMD_CONNECT
0: OBEX_EV_REQ, OBEX_CMD_CONNECT
0: Sending response code 0
0: OBEX_EV_REQDONE, OBEX_CMD_CONNECT
0: OBEX_EV_REQHINT, OBEX_CMD_PUT
0.1: OBEX_EV_REQCHECK, OBEX_CMD_PUT
0.1: OBEX_EV_REQDONE, OBEX_CMD_PUT
0.1: OBEX_EV_REQHINT, OBEX_CMD_DISCONNECT
0.1: Sending response code 100
0.1: OBEX_EV_REQ, OBEX_CMD_DISCONNECT
0.1: OBEX_EV_REQDONE, OBEX_CMD_DISCONNECT
I have also tried disabling authentication in the C# code, but that did not help.
Does anyone have an idea of how to nail this problem down, or where to even start looking?
OK, so it seems that no one is much interested in this topic. :) Fortunately, I have found the solution myself.
However, it involved a lot of analysis (of both the OBEX transfer protocol and the 32feet library) and a bit of luck.
The difference in the Linux obexpushd implementation lies in its interpretation of OBEX transfer packets.
I found the OBEX specification on this page: OBEX specification.
After debugging the internals of the 32feet OBEX transfer, I found where the code sends the OBEX PUT command used to send the file to the receiver. The OBEX specification gives the following example of a PUT initialization packet:
PUT Command | length of packet | Name header | Length of Name header | name of object | Length header | Length of object | Object Body chunk header | Length of Body header | bytes of body
The 32feet library sends the first packet without the Body header, which causes the error in the Linux obexpushd command.
I am not sure if it is a bug in the 32feet library or in obexpushd, or if the OBEX specification is simply not precise enough, but adding the Body header to the first packet solved the problem. From my experiments it turns out that at least the first 2 bytes of the object must be sent in the first packet. Moreover, adding the header does not break anything else, and Windows<->Windows transfer still works very well.
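To make the layout concrete, here is an illustrative sketch (not the actual 32feet code) of how such a first PUT packet could be assembled, using header IDs from the OBEX specification (Name = 0x01, Length = 0xC3, Body = 0x48):

using System.Collections.Generic;
using System.Text;

static class ObexPacketSketch
{
    public static byte[] BuildFirstPutPacket(string name, uint objectLength, byte[] firstBodyChunk)
    {
        var p = new List<byte> { 0x02, 0x00, 0x00 }; // PUT opcode + 2-byte packet length (patched below)

        byte[] nameBytes = Encoding.BigEndianUnicode.GetBytes(name + "\0");
        p.Add(0x01);                                 // Name header (null-terminated UTF-16BE)
        ushort nameLen = (ushort)(3 + nameBytes.Length);
        p.Add((byte)(nameLen >> 8)); p.Add((byte)nameLen);
        p.AddRange(nameBytes);

        p.Add(0xC3);                                 // Length header (4-byte object length)
        p.Add((byte)(objectLength >> 24)); p.Add((byte)(objectLength >> 16));
        p.Add((byte)(objectLength >> 8));  p.Add((byte)objectLength);

        p.Add(0x48);                                 // Body header carrying the first chunk
        ushort bodyLen = (ushort)(3 + firstBodyChunk.Length);
        p.Add((byte)(bodyLen >> 8)); p.Add((byte)bodyLen);
        p.AddRange(firstBodyChunk);

        ushort total = (ushort)p.Count;              // patch the overall packet length
        p[1] = (byte)(total >> 8); p[2] = (byte)total;
        return p.ToArray();
    }
}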

How to fix issue sending text to Arduino over bluetooth serial port from C#

I have an Arduino Uno with a BlueSMiRF Silver module connected. My Arduino has a temperature sensor which records the temperature regularly. The Arduino listens for any string sent to it over Bluetooth and responds with the latest data.
I have written a C# application to fetch this data, but I am seeing some strange behaviour. I am using the following code to connect, send a string, and read the returned data:
mPort = new SerialPort(mPortName, 115200, Parity.None, 8, StopBits.One);
mPort.Open();
mPort.Write("download");
Thread.Sleep(1000);
while (mPort.BytesToRead > 0)
{
String data = mPort.ReadExisting();
this.BeginInvoke(new Action<String>(AddMessage), data);
}
The data I get back looks like this:
Line added locally within C# application:
Send: download
Lines added based on data received from Arduino:
Read: d???+?
GotData
------
Total Readings, 1069
Num Readings, 360
Lost Readings, 709
Reading Interval, 240000
------
350,19.34
351,19.34
352,19.34
353,20.31
....
All the text looks fine apart from the echoed-back string which I sent to the Arduino. Have I done something wrong with the way I sent the data?
FYI - The datasheet for the bluetooth module is here: http://www.sparkfun.com/datasheets/Wireless/Bluetooth/rn-bluetooth-um.pdf
#Jeff - This is the code which I use on my Arduino to receive data: https://github.com/mchr3k/arduino/blob/master/tempsensor/StringReader.cpp
#Jeff - stringDataLen defines the length and I call the overall function from this file: https://github.com/mchr3k/arduino/blob/master/tempsensor/tempsensor.ino
EDIT: Here is the complete source code
Arduino - https://github.com/mchr3k/arduino/tree/master/tempsensor
C# application - https://github.com/mchr3k/arduino/tree/master/serialdownload
The C# code was definitely getting the flow control wrong for some reason. I have switched to using the following code in C#, and this gets a string through without corruption.
private void write(SerialPort mPort, string str)
{
foreach (char c in str)
{
mPort.Write(new char[] {c}, 0, 1);
Thread.Sleep(10);
}
}
An incorrect encoding perhaps?
mPort = new SerialPort(mPortName, 115200, Parity.None, 8, StopBits.One);
mPort.Encoding = System.Text.Encoding.ASCII; // Or System.Text.Encoding.UTF8
mPort.Open();
mPort.Write("download");
Read byte-by-byte and check each byte one by one to debug lower-level problems. ReadExisting() converts bytes to a string based on the Encoding property.
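A minimal sketch of that idea (using the mPort variable from the question): dump the raw bytes as hex so any encoding damage becomes visible.

while (mPort.BytesToRead > 0)
{
    int b = mPort.ReadByte(); // one raw byte, no Encoding involved
    Console.Write("{0:X2} ", b);
}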
My issue was caused by using the SoftwareSerial class to communicate with my Bluetooth module on pins 2 & 3. I was using a baud rate of 115200, which is claimed to be supported on this page: http://arduino.cc/en/Reference/SoftwareSerial
However, this page (http://arduino.cc/en/Reference/SoftwareSerialBegin) states that the maximum supported baud rate is actually 9600. I'm not sure whether this is accurate, but reducing my baud rate to 9600 has fixed my issue.
I suggest you decrease the communication speed; there is no reason to use 115200 bps (unless your module demands that speed, in which case it's OK). Also, you are sending the string "download", which is not ideal; rather, use a marker such as "#D", which your Arduino code interprets internally as "send data to the computer". This way you send only two bytes instead of eight, you decrease the probability of an error, and the Arduino code will be cleaner.
Now, let's try to fix the problem. First, try something like this when you are reading data from the Arduino device:
ArrayList dataReceived = new ArrayList();
while (serialPort.BytesToRead > 0 && serialPort.IsOpen)
{
    dataReceived.Add(serialPort.ReadByte());
}
So I suggest you read byte by byte, in this or a similar way. Also, you should be careful if you are sending numbers from the Arduino device. If you are, then use something like this:
Serial.print(temperatureValue, BYTE);
With this code you explicitly say that the data you are sending is one byte long. (Note that the BYTE keyword was removed in Arduino 1.0; Serial.write(temperatureValue) is the modern equivalent.) If this does not help, please let me know, so we can try something else.

Does .NET 3.5 SP1 SerialPort class append extra 0's on transmission?

Update
It turns out the error was in the crypto processor code; that is fixed. But now we are running into what seems like it might be a handshaking issue.
On the first transmission, we get a single byte back from the device with value 0xFF (we don't know why; the engineer I'm working with isn't too experienced with RS-232 either). Then things run as normal (just sending the device one byte at a time and waiting for a matching echo). However, neither the device nor the .NET app can send more than a couple of bytes at a time before one of them locks up and refuses to send or receive.
At work I'm writing an app that interfaces over RS-232 with a crypto processor inside a device, in order to reprogram the device's flash modules.
Just to take things slow and make sure all our headers are right, we're writing one byte at a time with SerialPort.Write(). However, when we run the code against the crypto processor, it reads an extra NULL in between each byte. When I test the .NET code on my local machine with two serial ports and a crossover cable, capturing the output in HyperTerminal or PuTTY, there are no extra NULLs when I view the log in Notepad++.
However, to complicate things further, when we manually type messages byte-by-byte via HyperTerminal to the crypto processor, it reads the input as a single byte only, with no extra NULLs (unlike with the .NET app). Does anybody have any experience with .NET doing mysterious things when it writes to a SerialPort?
We're initializing a test chunk with this:
byte[] testBytes = new byte[] { (byte)'A', (byte)'B', (byte)'C', (byte)'D', (byte)'E', (byte)'F', (byte)'G', (byte)'H' };
byte[] byteArray = new byte[256];
for (int i = 0; i < 32; i++)
{
testBytes.CopyTo(byteArray, i * 8);
}
And sending it with this:
public void StutterSend(byte[] data, long delayMs)
{
bool interactive = false;
if (delayMs < 0)
interactive = true;
for (int i = 0; i < data.Length; i++)
{
serialPort.Write(data, i, 1);
if (interactive)
{
WriteLine("Sent byte " + (i + 1) + " of " + data.Length + ". Press any key to send moar.");
Console.ReadKey();
}
else
{
double timer = DateTime.Now.TimeOfDay.TotalMilliseconds;
do { } while ((DateTime.Now.TimeOfDay.TotalMilliseconds - timer) < delayMs);
}
}
WriteLine("Done sending bytes.");
}
Our SerialPort is configured with all the matching parameters (stop bits, data bits, parity, baud rate, port name), and our handshake is set to None (that's just how our UART driver works).
Regarding your update, it sounds like your crypto processor has some more problems. Getting a 0xff back can be the result of an unexpected glitch of <= 1 bit time on the Tx line of the RS232 port. This is interpreted as a start bit by the PC. After the glitch, the Tx line returns to the mark state and now that the UART on the PC has a start bit, it interprets the "data" bits as all ones (the value for the mark state). The mark state is also the correct value for the stop bit(s) so your PC's UART has received a valid byte with a value of 0xff. Note that the glitch can be very fast relative to the RS232 data rate and still be interpreted as a start bit so have your engineer look at this line with an oscilloscope in normal mode/single sequence trigger to confirm this.
What is the Encoding property of the serialPort set to? The docs for SerialPort.Write(byte[], int, int) say that it runs its data through an Encoder object (which doesn't really make sense to me for a byte[]). It's supposed to default to ASCIIEncoding, but it seems like it might be set to something else. Try explicitly setting it to ASCIIEncoding and see if that helps. I can't recall if this was an issue for me back when I did some serial port work in .NET to talk to an embedded board...
Note that even with ASCIIEncoding in use, you'll get some (probably unwanted) transformation of data: if you try to send a value above 127, the encoder will convert it to '?', since it's not a valid ASCII character. I can't recall off the top of my head how I got the serial port to simply leave my data alone; I'll have to dig around in some source code...
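One approach that may leave the data alone entirely (a sketch, not verified against .NET 3.5 SP1 specifically): write raw bytes through SerialPort.BaseStream, which does not go through the Encoding property at all.

// Raw bytes, bypassing any string/char encoding paths.
byte[] raw = new byte[] { 0x00, 0x7F, 0x80, 0xFF };
serialPort.BaseStream.Write(raw, 0, raw.Length);
serialPort.BaseStream.Flush();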
SerialPort sets the Parity property to Parity.None if you don't specify one. This means that if your receiver expects a parity bit, it will never get one unless you explicitly tell SerialPort to send a parity bit with the transmitted data.
The fact that it went well in HyperTerminal could mean that HyperTerminal uses a parity bit by default (I don't know HyperTerminal well).
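If parity turns out to be the culprit, the fix is just the constructor/property setting, e.g. (the port name and baud rate here are placeholders, and whether the device wants even, odd, or no parity is an assumption to verify against its datasheet):

serialPort = new SerialPort(portName, 9600, Parity.Even, 8, StopBits.One);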
