I'm using Jon Skeet's (excellent) port of Google's Protocol Buffers to C#/.Net.
For practice, I have written a dummy Instant Messenger app that sends some messages down a socket. I have a message definition as follows:-
message InstantMessage {
  required string Message = 1;
  required int64 TimeStampTicks = 2;
}
When the sender serialises the message, it sends it really elegantly:-
...
InstantMessage.Builder imBuild = new InstantMessage.Builder();
imBuild.Message = txtEnterText.Text;
imBuild.TimeStampTicks = DateTime.Now.Ticks;
InstantMessage im = imBuild.BuildPartial();
im.WriteTo(networkStream);
...
This works great. But at the other end, I'm having trouble getting the ParseFrom to work.
I want to use:-
InstantMessage im = InstantMessage.ParseFrom(networkStream);
But instead I have had to read it into a byte array and then parse it from there. This is obviously not ideal for a number of reasons. Current code is:-
while (true)
{
    Byte[] byteArray = new Byte[10000000];
    int intMsgLength;
    int runningMsgLength = 0;
    DateTime start = DateTime.Now;
    while (true)
    {
        runningMsgLength += networkStream.Read(byteArray, runningMsgLength, 10000000 - runningMsgLength);
        if (!networkStream.DataAvailable)
            break;
    }
    InstantMessage im = InstantMessage.ParseFrom(byteArray.Take(runningMsgLength).ToArray());
When I try to use ParseFrom, control does not return to the calling method even when I know a valid protocol buffer message is on the wire.
Any advice would be gratefully received,
PW
Sorry for taking a while to answer this. As Marc says, protocol buffers don't have a terminator, and they aren't length prefixed unless they're nested. However, you can put on the length prefix yourself. If you look at MessageStreamIterator and MessageStreamWriter, you'll see how I do this - basically I pretend that I'm in the middle of a message, writing a nested message as field 1. Unfortunately when reading the message, I have to use internal details (BuildImpl).
There's now another API to do this: IMessage.WriteDelimitedTo and IBuilder.MergeDelimitedFrom. This is probably what you want at the moment, but I seem to remember there's a slight issue with it in terms of detecting the end of the stream (i.e. when there isn't another message to read). I can't remember whether there's a fix for it at the moment - I have a feeling it's changed in the Java version and I may not have ported the change yet. Anyway, that's definitely the area to look at.
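As a rough sketch of what that looks like with the delimited API (assuming the WriteDelimitedTo / MergeDelimitedFrom members mentioned above and the builder pattern from the question):
// Sender: each message is written with a length prefix
im.WriteDelimitedTo(networkStream);
// Receiver: read exactly one length-prefixed message from the stream
InstantMessage received = InstantMessage.CreateBuilder()
    .MergeDelimitedFrom(networkStream)
    .Build();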
Protobuf has no terminator - so either close the stream, or use your own length prefix etc. Protobuf-net exposes this easily via SerializeWithLengthPrefix / DeserializeWithLengthPrefix.
Simply: without this, it can't know where each message ends, so keeps trying to read to the end of the stream.
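For comparison, a rough sketch of the protobuf-net calls mentioned here (protobuf-net works with its own [ProtoContract]-annotated classes rather than the builder-based port, so this assumes such a class; PrefixStyle.Base128 is one of its supported prefix formats):
// Sender: write the message with a length prefix
ProtoBuf.Serializer.SerializeWithLengthPrefix(networkStream, im, ProtoBuf.PrefixStyle.Base128);
// Receiver: read back exactly one length-prefixed message
var received = ProtoBuf.Serializer.DeserializeWithLengthPrefix<InstantMessage>(networkStream, ProtoBuf.PrefixStyle.Base128);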
I am trying to use an MSR606 Magstripe Reader/Writer with an Application I am coding in .NET Winforms.
After a significant amount of research and a few days of trial and error this is where I'm at.
*The reader is working through a COM port, and I am able to create a reference to it in the application.
*According to the Manual it expects ANSI encoding
*Commands are listed with a Command Code such as "<ESC><82>" and a hex code such as "1B 82"
This is the code so far
CardReader = new SerialPort("COM4");
CardReader.DataReceived += new SerialDataReceivedEventHandler(DataRecivedHandler);
CardReader.Encoding = System.Text.Encoding.GetEncoding(28591);
CardReader.Handshake = Handshake.None;
CardReader.RtsEnable = true;
CardReader.Open();
I haven't managed to successfully send a command to the reader. I've primarily tested by entering "CardReader.Write()" with commands as strings or byte arrays.
I have never worked with anything through a COM Port before so I've basically been blindly stumbling through this by looking up any guides I can. Any info on how to send commands of that format properly or just explaining where I've gone wrong here would be greatly appreciated.
I eventually figured out that I was formatting the commands wrong. Sending them as hex codes simply means prefixing each code with "0x" and sending the result as a byte[]:
CardReader.Write(new byte[] { 0x1b, 0x82 }, 0, 2);
I'm trying to manipulate some network captures (pcap format) using Pcap.net.
I'm opening the pcap file and creating the dumper with:
OfflinePacketDevice selectedDevice = new OfflinePacketDevice(pcapInFile);
using (PacketCommunicator PcapReader = selectedDevice.Open(655360, PacketDeviceOpenAttributes.Promiscuous, 1000))
{
    PacketDumpFile PcapWriter = PcapReader.OpenDump(pcapOutFile);
    PcapReader.ReceivePackets(count, PacketDispatcher);
}
And the PacketDispatcher would be something like:
private void PacketDispatcher(Packet packet)
{
    // Manipulate the packet
    PcapWriter.Dump(packet);
}
Everything is OK as long as the pcapInFile data link is the Ethernet type. But I have several captures without an Ethernet layer (raw IP), and I have to build a new Ethernet layer. For those captures the output file's data link type is the same as the pcapInFile's (raw IP), and I want to change it to Ethernet...
If I store all the manipulated packets in an IEnumerable and dump them with:
PacketDumpFile.Dump(pcapOutFile, new PcapDataLink(1), Packets.Count(), Packets);
It works fine... but this is not very useful if you are dealing with files of several gigabytes...
Any idea?
Thanks!
Assuming the concern is about having all packets in RAM, you can create an IEnumerable that doesn't hold everything in RAM by using yield.
That way, as the dumper writes packets out, it calls back into your iterator to produce the next item.
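A minimal sketch of that idea, assuming the Pcap.Net members already used in the question (PacketCommunicator.ReceivePacket reading from the offline device, and the PacketDumpFile.Dump overload shown above); the manipulation step is a placeholder:
private static IEnumerable<Packet> ManipulatedPackets(PacketCommunicator reader)
{
    Packet packet;
    // Pull packets from the capture one at a time instead of collecting them all first
    while (reader.ReceivePacket(out packet) == PacketCommunicatorReceiveResult.Ok)
    {
        // Manipulate the packet here (e.g. rebuild the link layer), then hand it to the dumper
        yield return packet;
    }
}
PacketDumpFile.Dump can then be given ManipulatedPackets(PcapReader) as its IEnumerable<Packet> argument, so only one packet is in memory at a time; the count argument would have to be treated as an upper bound, since the total isn't known up front for a lazy sequence.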
Well, although this is a workaround rather than an answer, and since nobody has written a better solution, this is how I fixed it:
Use SharpPcap instead of Pcap.Net; then you can declare a reader and a writer this way:
public CaptureFileReaderDevice PcapReader;
public CaptureFileWriterDevice PcapWriter;
PcapReader = new CaptureFileReaderDevice(fileIn);
PcapReader.OnPacketArrival += packetDispatcher;
PcapReader.Capture();
In the packetDispatcher function:
RawCapture raw = new RawCapture(LinkLayers.Ethernet, e.Packet.Timeval,RebuildNullLinkLayer(e, offset));
CaptureHandler.PcapWriter.Write(raw);
And in the RebuildNullLinkLayer function you can add the Ethernet layer, modify whatever you want, etc...
Note that when you call the RawCapture constructor you can choose the link layer (my original issue with Pcap.Net...), so if you are parsing a raw IP capture, you can convert the packets to Ethernet.
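As a rough illustration, RebuildNullLinkLayer could look something like this (hypothetical code: it assumes the older SharpPcap CaptureEventArgs handler signature used above, an IPv4 payload, and dummy all-zero MAC addresses):
private static byte[] RebuildNullLinkLayer(CaptureEventArgs e, int offset)
{
    // Skip the original (non-Ethernet) link-layer bytes to get at the raw IP payload
    byte[] ipPayload = e.Packet.Data.Skip(offset).ToArray(); // requires System.Linq
    // Build a synthetic 14-byte Ethernet II header: dst MAC (6), src MAC (6), EtherType (2)
    byte[] header = new byte[14];
    header[12] = 0x08; // EtherType 0x0800 = IPv4
    header[13] = 0x00;
    return header.Concat(ipPayload).ToArray();
}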
A Java client constructs a message according to this skeleton:
package tutorial;
option java_package = "com.example.sff.piki2";
option java_outer_classname = "MsgProtos";
message MSG {
  required string guid = 1;
  required int32 MsgCode = 2;
  required int32 From = 3; // sender
...
This message is sent to a C# program ( server side).
The server knows how to read the bytes (the first byte is the number of bytes to read, i.e. the size of the message that follows).
This is how the MSG is constructed from a byte array:
MSG Msg = MSG.CreateBuilder().MergeFrom(buffer).Build();
Where buffer is the byte array that was read from the socket.
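For illustration, a minimal sketch of that framing (the stream name is hypothetical; the question only says the bytes come from the socket):
// Read the one-byte length prefix, then exactly that many payload bytes
int size = networkStream.ReadByte();
byte[] buffer = new byte[size];
int read = 0;
while (read < size)
    read += networkStream.Read(buffer, read, size - read);
// buffer can now be handed to MergeFrom as shown above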
But now I'm in a situation where the client needs to send a "Heartbeat" message (another message) in order to check whether the server is alive or not (the server should respond: "yes, I'm alive").
Sure, I could add another field to the MSG class, but I don't want to, because the MSG class has a lot of fields that are unnecessary for a heartbeat operation.
Question:
The server reads n bytes. Is there any way I can know whether this is a MSG message or a "Heartbeat" message?
Is there any way I can know if this is a MSG message or a "Heartbeat" message?
No. Protocol buffer messages don't contain any such type information. Typically the way round this is to have a "wrapper" type with a field for each message you might want to send. You'd ideally want to express this as a oneof, but my port doesn't support that (yet).
The overhead is minimal - but it will be a protocol change, of course, so you'll need to think about any existing servers etc.
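A sketch of such a wrapper, with illustrative names (using optional fields rather than a oneof, since the port doesn't support that yet):
message Envelope {
  optional MSG Msg = 1;
  optional Heartbeat Heartbeat = 2;
}
message Heartbeat {
}
The server then parses the Envelope and checks which field is set (e.g. via the generated HasMsg / HasHeartbeat properties) to decide whether it received a normal message or a heartbeat.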
Update
Turns out the error was in the crypto processor code; that is fixed. But now we're running into what seems like it might be a handshaking issue.
On first transmission, we get a single byte back from the device with value 0xFF (we don't know why; the engineer I'm working with isn't too experienced with RS-232 either). Then things run as normal (just sending the device one byte at a time and waiting for a matching echo). However, neither the device nor the .NET app can send more than a couple of bytes at a time before one of them locks up and refuses to send or receive.
At work I'm writing an app that interfaces over RS232 with a crypto processor inside a device to reprogram flash modules inside the device.
To just take things slow and make sure all our headers are right, we're writing one byte at a time with SerialPort.Write(). However, when we run the code on the crypto processor, it reads an extra NULL in between each byte. When I test the .NET code on my local machine with two serial ports and a crossover cable, I capture the output in HyperTerminal or Putty and there are no extra NULLs when I view the log in Notepad++.
However, to further complicate things, when we manually type messages byte-per-byte via HyperTerminal to the crypto processor, it reads the input as a single byte only, no extra NULLs (as compared to the .NET app). Anybody have any experience with .NET doing mysterious things when it writes to a SerialPort?
We're initializing a test chunk with this:
byte[] testBytes = new byte[] { (byte)'A', (byte)'B', (byte)'C', (byte)'D', (byte)'E', (byte)'F', (byte)'G', (byte)'H' };
byte[] byteArray = new byte[256];
for (int i = 0; i < 32; i++)
{
    testBytes.CopyTo(byteArray, i * 8);
}
And sending it with this:
public void StutterSend(byte[] data, long delayMs)
{
    bool interactive = false;
    if (delayMs < 0)
        interactive = true;
    for (int i = 0; i < data.Length; i++)
    {
        serialPort.Write(data, i, 1);
        if (interactive)
        {
            WriteLine("Sent byte " + (i + 1) + " of " + data.Length + ". Press any key to send moar.");
            Console.ReadKey();
        }
        else
        {
            double timer = DateTime.Now.TimeOfDay.TotalMilliseconds;
            do { } while ((DateTime.Now.TimeOfDay.TotalMilliseconds - timer) < delayMs);
        }
    }
    WriteLine("Done sending bytes.");
}
Our SerialPort is configured with all the matching parameters (stop bits, data bits, parity, baud rate, port name), and our handshake is set to None (it's just how our UART driver works).
Regarding your update, it sounds like your crypto processor has some more problems. Getting a 0xff back can be the result of an unexpected glitch of <= 1 bit time on the Tx line of the RS232 port. This is interpreted as a start bit by the PC. After the glitch, the Tx line returns to the mark state and now that the UART on the PC has a start bit, it interprets the "data" bits as all ones (the value for the mark state). The mark state is also the correct value for the stop bit(s) so your PC's UART has received a valid byte with a value of 0xff. Note that the glitch can be very fast relative to the RS232 data rate and still be interpreted as a start bit so have your engineer look at this line with an oscilloscope in normal mode/single sequence trigger to confirm this.
What is the Encoding property of the serialPort set to? The docs for SerialPort.Write(byte[], int, int) say that it runs its data through an Encoder object (which doesn't really make sense to me for a byte[]). It's supposed to default to ASCIIEncoding, but it seems like it might be set to something else. Try explicitly setting it to ASCIIEncoding and see if that helps. I can't recall if this was an issue for me back when I did some serial port stuff in .NET to talk to an embedded board...
Note that even with ASCIIEncoding in use, you'll get some (probably unwanted) transformation of data - if you try to send something above value 127, the encoder will convert it to '?' since it's not a valid ASCII character. I can't recall off the top of my head how I got the serial port to simply leave my data alone - I'll have to dig around in some source code...
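One common way to make the port leave byte values alone (a general workaround sketch, not necessarily what I used back then) is to pick an 8-bit encoding that maps 0-255 one-to-one, and to prefer the raw byte[] overload for binary data:
// Latin-1 (code page 28591) round-trips every byte value 0-255 for string/char writes
serialPort.Encoding = System.Text.Encoding.GetEncoding(28591);
// For binary payloads, write the bytes directly rather than going through strings
serialPort.Write(new byte[] { 0x00, 0x7F, 0x80, 0xFF }, 0, 4);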
SerialPort sets the Parity property to Parity.None if you don't specify one. This means that if your receiver expects a parity bit, it will never get one unless you explicitly tell SerialPort to send a parity bit with the transmitted data.
And the fact that it went well in HyperTerminal could be because HyperTerminal uses a parity bit by default (I don't know HyperTerminal well).
I'm trying to get a C++ service to load an XML document from a MSMQ message generated by C#. I can't really change the C++ side of things because I'm trying to inject test messages into the queue. The C++ service is using the following to load the XML.
CComPtr<IXMLDOMDocument2> spDOM;
CComPtr<IXMLDOMNode> spNode;
CComBSTR bstrVal;
if(_FAILED(hr = spDOM.CoCreateInstance(CLSID_DOMDocument30)))
{
    g_infoLog->LogCOMError(hr, "CWorker::ProcessBody() Can't Create DOM");
    pWork->m_nFailure = WORKFAIL_BADXML;
    goto Exit;
}
hr = spDOM->loadXML(bstrBody, &vbResult);
The C# code to send the MSMQ message looks like this (just test code not pretty):
// open the queue
var mq = new MessageQueue(destinationQueue)
{
    // store message on disk at all intermediaries
    DefaultPropertiesToSend = { Recoverable = true },
    // set the formatter to Binary, default is XML
    Formatter = new BinaryMessageFormatter()
};
// send message
mq.Send(messageContent, "TestMessage");
mq.Close();
I tried to send the same message using BinaryMessageFormatter but it puts what I think are unicode characters at the top before the XML starts.
.....ÿÿÿ
ÿ.......
......À)
If I use the default XML formatter the message has the following top element. The C++ service doesn't seem to handle this.
<?xml version="1.0"?>..<string>&lt;
Do you know of a way I could easily clean up the unicode characters when using the binary formatter? If so I think it might work.
Have you tried the ActiveXMessageFormatter? It might not compile with it as the formatter - I have no way to test here - but it might.
EDIT: just tried it and it compiles OK; whether the result is any better I still couldn't say for sure.
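A minimal sketch of that suggestion, reusing the test code from the question (whether the C++ service then loads the body cleanly is untested, as noted above):
var mq = new MessageQueue(destinationQueue)
{
    // store message on disk at all intermediaries
    DefaultPropertiesToSend = { Recoverable = true },
    // ActiveXMessageFormatter writes the string body the way COM-based readers usually expect
    Formatter = new ActiveXMessageFormatter()
};
mq.Send(messageContent, "TestMessage");
mq.Close();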