c++ deserialize object sent from c# application via UDP - c#

I have an application in c# which connects to another application, written in c++, via UDP. I'd prefer a performant solution, as I am looking to measure the time between an event on the client side, some processing on the server side, and a "processing complete" response back on the client side. Essentially: round-trip delay.
I have an object in c#, for example:
public class Package
{
    public Package() {}

    public float DeliveryTime = 1.5f;
    public int NumberOfItems = 2;
}
I then serialize my package to a byte array:
Package package = new Package();
var serializedPackage = ObjectToByteArray(package);
Next, I send my serializedPackage via UDP (I know, no guarantee of delivery or order, but I don't need those guarantees).
client.Send(serializedPackage, serializedPackage.Length, ip);
My c++ application then receives the UDP message...
the question is:
How do I now convert my received byte array back into a usable object with property values maintained?
I need to deserialize into some class Package on the c++ side and then use the properties... for example, some pseudocode:
DeserializedPackage deserializedPackage = someDeserializeFunction(bytes);
if (deserializedPackage.NumberOfItems > 0) {
    cout << deserializedPackage.DeliveryTime;
}
Where do I start? Is this even possible?
I am well versed with C#, but absolutely new to c++, so no mixing c and c++, please. Also, I do not want to use Boost or another library.
I am also open to suggestions such as "convert your package object to JSON and send that. Then deserialize your JSON and populate a c++ object called package."

Your problem is the same either way: how to serialize types between two languages.
Protocol buffers are one way; here is a similar question about what to do between c++ and Java. Google's protocol buffers are suggested there, although there are others.
However, my favorite way to serialize objects in a language-agnostic way is JSON. It's easy to reason about and probably the quickest, cleanest thing you can implement, so I'd go with that first unless you have an evidenced performance objection. You seem to already know how to do this?

Related

Sending an integer array over tcp in c#

This may be a stupid question; I apologise if it is. But I'm rewriting some c code into c# to use as part of a GUI. The original c programs transmit data buffers back and forth to a microcontroller via:
n = write(sockfd, buf, sizeof(buf));
In the transmitter program and
n = read(sockfd, (void *)buf[idx]+numread,sizeof(buf[0])-numread);
In the receiver program. I am trying to find the c# equivalent of these functions, but the only one I have found only takes byte data.
The server on the microcontroller runs software I didn't design, which is why I can't simply serialise or convert to bytes etc. (as in previous similar questions, such as "How to send integer array over a TCP connection in c#") and then decompose at the other end. Or at least, I'd rather check if there's a solution before I get into trying to edit code I didn't write.
Any help greatly appreciated!
You can use NetworkStream.Read/Write to read and write byte arrays to a socket. BinaryReader/Writer are abstractions on top of that that make it easier to write in certain formats.

Exact object to byte conversion, similar to memcpy in C#

I have been looking for a solution for this for a little while, and none of the stuff I've found quite matches. It is great that C# has a built-in serialization library, but that is not quite what I am looking for. I need to serialize objects in such a way that I can append header data to the packet, i.e. ID number, timestamp, object type etc., and then be able to send it out without having to keep in mind the platform I am sending it to. In other words, I should be able to unwrap my packet in C++ or in Java without much more knowledge than what object type I am casting into and the order of the header data. The binary formatter in C# creates a problem because it is designed to be deserialized on the other end using the same library. It also creates bloated packets, which I would rather not have to deal with. I would rather format my packet like this:
|==========|==========|===========|=====|==================|
| packetID | datatype | timeStamp | etc | serializedObject |
|==========|==========|===========|=====|==================|
It would be nice if I had access to something along the lines of memcpy to achieve this, because then, as long as the data-type order of the object being deserialized into matches (yes, assuming the other language has the same byte sizes for its data types), it is easy to grab data from the server on a new platform (say I want to grab data on an Android (Java) or an iPhone (Obj-C)) with little hassle.
I've used Google's protocol buffers to good effect. It's small, fast, cross-platform, backwards compatible, and serializes to a binary format. It doesn't support the custom header information you're looking for, but if you can frame your own packets you can tack a custom header onto the binary stream as you see fit. Protobuf can compile down to C#, Objective-C (iOS), Java and C++.

Serialize/deserialize between MONO-client and ASP.NET server - possible or not?

I have different objects I would like to be able to load and save on an ASP.NET (IIS) webserver from my MONO client, so I have made them class objects.
As an example, I have [Score], which I will use to produce a high-score list on a server.
Score.cs
using System.Runtime.Serialization;

[System.Serializable]
public class Score
{
    public long Total;
    public string Name;

    public Score() {}

    public Score(string name, long total)
    {
        Total = total;
        Name = name;
    }
}
I want to use this code as a data structure (n-tier architecture) on both sides: client-side in MONO, and server-side in my ASP.NET server.
What I want to do is build a list like this:
List<Score> hiscores = new List<Score>();
hiscores.Add(new Score("Player 1", 10000));
hiscores.Add(new Score("Player 2", 20000));
hiscores.Add(new Score("Player 3", 30000));
hiscores.Add(new Score("Player 4", 40000));
... then I want to serialize it into a byte array so I can POST it as a binary file:
MemoryStream membytes = new MemoryStream();
BinaryFormatter binFormatter = new BinaryFormatter();
binFormatter.Serialize(membytes, hiscores );
...
BUT - on the webserver, in ASP.NET/C#, I get an error 500 telling me that it cannot load the assembly for deserializing the data?!
HUH?! I've read someone suggesting making the assembly external and then linking it on both platforms. Is that possible in my situation, or am I on the wrong path somehow?
I can easily transfer a simple byte array with values from 0x00 to 0xFF to my ASP.NET server and save it. But once I start using serialize/deserialize on my List objects and transfer them as a Byte[], things mess up.
QUESTION
Is deserialization not possible if the serializer isn't built on the same platform?
Is it possible to place the data objects (business objects) in a DLL project and then link/use it on both platforms, thereby getting the same assembly identity?
BinaryFormatter produces a non-interoperable binary format. Use it only if both the client and the server run the same version of the CLR. Otherwise you could use an interoperable format such as XML (XmlSerializer) or JSON (JavaScriptSerializer) to communicate between your client and server.
Your big mistake is using BinaryFormatter for this. This has many issues.
It requires you to trust the data you deserialize, which is rarely a safe assumption in client-server applications.
It uses some internal format, and I'm not sure how compatible that is between different versions of .net or even between .net and mono. Use a serializer with a well defined format.
Good alternatives for this use-case are JSON or XML serializers. Personally, I usually use Json.NET, but .NET also has a built-in JSON serializer.
BinaryFormatter might be the right choice if you serialize a complex object graph, don't need versioning and only read it back on the same system, so you can trust the data. And even in that scenario I'd only use it because I know no better alternative, not because I like BinaryFormatter.
Since you now state that you transfer a lot of data, I'd look into protobuf-net. It's a binary serializer that is very fast and produces compact data, though you do need to annotate all your properties with numeric IDs. Its author, Marc Gravell, is active here on SO, so you can expect good support.
Posting my own answer as I actually managed to get it working without extras
First of all, thanks for all the great suggestions for extra libraries, but my initial question was about if it was possible to make MONO and ASP.NET communicate through serialization.
The answer is: YES!
All I had to do was build a separate class-library project (DLL) where I put all the data structures I need to serialize.
Then I made it a reference on both platforms and voilà: same DLL = same assembly, and both platforms agree that serialization and deserialization are no problem.
Note on binary result/traffic/space
I agree, the binary data seems to be a bit "overkill" compared to some of the other suggested plugins/libraries in here. But it IS possible without any extra libs.
That was my initial goal.
I have used Mike Talbot's Silverlight Serializer http://whydoidoit.com/2010/04/08/silverlight-serialization/
which is very lightweight and should work with both MONO and .NET

send data struct over network between C/C++ server, C/C++ and C# clients

My server will communicate with clients using a struct/class called announcement. I suppose I will write the server in C++. There will be many different classes that inherit from announcement. My problem is how to send these classes over the network to a client.
I thought maybe I should use iovec. However I don't know if there is anything corresponding to iovec in C#.
The idea is:
I would send a struct with writev; a client would use readv and read the data from the socket as if it were an announcement class, check the uint16 variable in the received struct to deduce the exact type, and read the rest of the data with the appropriate iovec/readv.
C# can send whole objects as a byte array, but how do I interpret that in C/C++?
I want both C/C++ and C# clients to work with the same protocol but that is the first time I designed the protocol and I'm not sure all my ideas will work together.
Any hint'd be appreciated.
Don't try to define your protocol in terms of structs.
Define the format of an announcement, i.e. what bytes in what order constitute an announcement, and then define how multiple announcements are transmitted in sequence.
Once you've got this wire format defined, you can implement parsers and writers in each programming language, using the most appropriate programming language constructs, libraries and frameworks available in each.
A good example of this approach is protobuf: a format specification plus C++, Java and Python implementations, and a separate .NET implementation.
The question is a bit chaotic, but I have a feeling that you might be interested in Protocol Buffers for efficiently serializing data and transferring it over the network.
There are a number of options when it comes to serialization. This FAQ at parashift.com discusses serialization/deserialization in detail: http://www.parashift.com/c++-faq-lite/serialization.html.

Is it smart to output data from embedded device in xml format?

Our company makes many embedded devices that communicate with PCs via applications that I write in C#.net. I have been considering different ways of improving the data transfer so that the PC application can be more easily synchronized with the device's current state (which in some cases is continually changing).
I have been thinking about an approach where the device formats its description and state messages into an XML message before sending them across the serial port, USB, Ethernet socket, etc. I was thinking that it might make the process of getting all of this data into my C# classes simpler.
The alternative is an approach where the host application sends a command like GETSTATUS and the device responds with an array of bytes, each representing a different property, sensor reading, etc.
I don't have a great deal of experience with xml but from what I have seen can be done with LINQ to XML it seems like it might be a good idea. What do you guys think? Is this something that is done commonly? Is it a horrible idea?!?
First, which ever way you go, make sure the returned data has a version number embedded so that you can revise the data structure.
Is both an option? Seriously, there are always situations where sending data in a more readable form is preferable, and others where a denser representation is best (these are fewer than most people think, but I don't want to start a religious war about it). People will passionately argue for both, because they are optimizing for different things. Providing both options would satisfy both camps.
A nice, clear XML status could definitely lower the bar for people who are starting to work with your devices. You could also build a C# object that can be deserialized from the binary data that is returned.
It isn't a terrible idea, but it is probably over-design. I would prefer a format that the embedded device can generate more easily and quickly. Then, on the PC side, I would insert a layer to convert it to a convenient format. You can also use LINQ with plain objects. Why not send the data in binary form, or in a simple ASCII protocol, and then convert it to C# objects? You can use LINQ to access the data. In my opinion, XML introduces unnecessary complexity in this case.
There are tradeoffs either way, so the right choice depends on your application, how powerful your device is and who is going to be using this protocol.
You mention that the alternative is a binary-serialized, request-response approach. I think that there are two separate dimensions here: the serialization format (binary or XML) and the communication style. You can use whatever serialization format you want in either a push protocol or in a request-response protocol.
XML might be a good choice if
Readability is important
If there is variation between devices, i.e. if you have different devices that have different properties, since XML tends to be self-describing.
Or if you want to publish your device's data to the Internet.
Of course, XML is verbose, and there are certainly ways to accomplish all of the above with a binary protocol (e.g. tagged values can be used to make your binary protocol more self-describing).
One of the founders of this very site has some sane and amusing opinions on XML in XML: The Angle Bracket Tax
I did something very similar in a previous design with PC-to-microprocessor communications using an XML format. It worked very well on the PC side, since Adobe Flex (which we were using) could interpret XML very easily, and I suspect .NET can do the same thing very easily.
The more complicated part of it was on the microprocessor side. The XML parsing had to be done manually, which was not really that complicated, but just time intensive. Creating the XML string can also be quite a lot of code depending on what you're doing.
Overall - If I had to do it again, I still think XML was a good choice because it is a very flexible protocol. RAM was not that much of an issue with regards to storing a few packets in our FIFO buffer on the microprocessor side but that may be something to consider in your application.
It's a waste of precious embedded CPU time to generate and transmit XML files. Instead, I would just use an array of binary bytes to represent the data, but I would use structs to help interpret the data. The struct feature of C# lets you easily interpret an array of bytes as meaningful data. Here's an example:
[StructLayout(LayoutKind.Sequential, Pack = 1)]
public struct DeviceStatus
{
    public UInt16 position;    // Bytes 0 and 1
    public Byte counter;       // Byte 2
    public Fruit currentFruit; // Byte 3
};

enum Fruit : Byte
{
    Off = 0,
    Apple = 1,
    Orange = 2,
    Banana = 3,
}
Then you would have a function that converts your array of bytes to this struct:
public unsafe DeviceStatus getStatus()
{
    byte[] dataFromDevice = fetchStatusFromDevice();
    fixed (byte* pointer = dataFromDevice)
    {
        return *(DeviceStatus*)pointer;
    }
}
Compared to XML, this method will save CPU time on the device and on the PC, and it is easier to maintain than an XML schema with complementary functions for building and parsing the XML. All you have to do is make sure that the struct and enum definitions in your embedded device are the same as the definitions in your C# code, so that the C# program and the device agree on the protocol to use.
You'll probably want to use the "packed" attribute on both the C# and embedded side so that all the struct elements are positioned in a predictable way.
