Sending a hex string to a TCP socket - C#

I'm trying to send a hex string to a TCP socket. I have some problems with the format or conversion of this string, because I'm not sure which format it uses.
I've written a Windows Phone app based on the Socket class, and it works fine.
This app emulates requests that are normally sent from a desktop program to a device which hosts a web service.
Via Wireshark, I found out that the web service accepts an input stream (I think it's in hex) and returns a second hex stream which contains the data I need.
So the desktop app sends a stream, and Wireshark shows:
Data (8 bytes)
Data: 62ff03fff0057460
Length: 8
Now I've tried a lot to reproduce this stream. I thought it was a UTF-8 string and converted the stream to this format. But every time I send it, I see the following output in Wireshark: 62c3bf03c3bf00574600
As far as I've investigated, 62 = b, but ff is always sent as c3bf.
Does somebody know how to send this stream in the right format?
Cheers,
Jo

The socket transport shouldn't care; the content of a TCP packet is binary representing "whatever".
From the code you pointed to in the comments:
byte[] payload = Encoding.UTF8.GetBytes(data);
socketEventArg.SetBuffer(payload, 0, payload.Length);
...
response = Encoding.UTF8.GetString(e.Buffer, e.Offset, e.BytesTransferred);
response = response.Trim('\0');
At the end of the socket send/receive, (data == response) should hold. If that isn't occurring, you need to figure out where the problem is. The first step is to write some very simple code like so:
string source = "your problem text string";
byte[] encode = Encoding.UTF8.GetBytes(source);
string target = Encoding.UTF8.GetString(encode, 0, encode.Length);
Debug.Assert(source == target);
If that works, then dump the 'encode' array and check that it matches the packet data where it is being sent, then verify that the same bytes are being received. If you are sending the right data but receiving it corrupted, you have serious problems... I doubt you'll find that, but if so, write a very simple test program that sends and receives on the same machine (localhost) to see if it is repeatable.
If I had to guess, I would say that the characters being encoded are not valid Unicode, or that Windows Phone doesn't properly support them (proper Unicode support).

As long as you don't know the protocol / the encoding the server expects, you can only replay the known messages, like the bytes you provided in your question.
Therefore you just define the byte array directly, like this:
byte[] payload = new byte[] {0x62, 0xff, 0x03, 0xff, 0xf0, 0x05, 0x74, 0x60};
and send it over the socket as you did with the encoded string before. The server should now accept the message as if it had been sent by the client you sniffed.
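If you want to verify the raw bytes end-to-end without the real device, here is a minimal sketch using a loopback TcpListener as a stand-in for the device (the echo setup is just for illustration; only the byte values come from the question):

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class RawBytesSender
{
    static void Main()
    {
        // The exact bytes captured in Wireshark - no text encoding involved.
        byte[] payload = { 0x62, 0xff, 0x03, 0xff, 0xf0, 0x05, 0x74, 0x60 };

        // A loopback listener stands in for the real device.
        var listener = new TcpListener(IPAddress.Loopback, 0);
        listener.Start();
        int port = ((IPEndPoint)listener.LocalEndpoint).Port;

        using var client = new TcpClient();
        client.Connect(IPAddress.Loopback, port);
        using TcpClient server = listener.AcceptTcpClient();

        // Send the raw bytes, unmodified.
        client.GetStream().Write(payload, 0, payload.Length);

        // Read them back on the "device" side and dump them as hex.
        byte[] received = new byte[payload.Length];
        int read = 0;
        var serverStream = server.GetStream();
        while (read < received.Length)
            read += serverStream.Read(received, read, received.Length - read);

        Console.WriteLine(BitConverter.ToString(received)); // 62-FF-03-FF-F0-05-74-60
        listener.Stop();
    }
}
```

Because no Encoding class is involved, 0xff stays 0xff on the wire instead of being expanded to the UTF-8 pair c3bf.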

Related

How to read binary data from TCP stream?

I have a device that sends data to another device via TCP. When I receive the data and try the Encoding.Unicode.GetString() method on the byte array, it turns into unreadable text.
Only the first frame of the TCP packet (the preamble in the header) can be converted to text. (sender TCP docs, packet data).
This is my code so far. I have also tried encoding as ASCII, with no results either.
NetworkStream stream = tcpClient.GetStream();
int i;
byte[] buffer = new byte[1396];
string data;
while ((i = stream.Read(buffer, 0, buffer.Length)) != 0)
{
    data = System.Text.Encoding.Unicode.GetString(buffer, 0, i);
    data = data.ToUpper();
    Console.WriteLine($"Data: {data}");
}
This just prints the same unreadable string seen in the "packet data" link above. Why is this happening? The official device documentation says the data is encoded in little-endian. Am I missing something? I am new to handling TCP data transmission.
There is nothing in the linked documentation to indicate that there is any textual data at all, with the exception of the "preamble", which is a fixed four-letter ASCII string, or an integer with the equivalent value, whichever you prefer.
It specifies a binary header with a bunch of mostly 32-bit integers, followed by a sequence of frames, where each frame has three 32-bit numbers.
So I would suggest wrapping your buffer in a MemoryStream and using a BinaryReader to read the values according to the format specification.
Note that network communication typically uses big-endian byte order, but both Windows and your device use little-endian, so you should not have to bother with endianness.
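A sketch of that approach follows. The field layout here is hypothetical (a four-byte preamble, a frame count, then frames of three 32-bit integers); substitute the real field order from the device's documentation. BinaryReader reads little-endian, matching the device.

```csharp
using System;
using System.IO;
using System.Text;

class FrameParser
{
    static void Main()
    {
        // Build a sample buffer the way the device might send it
        // (BinaryWriter also writes little-endian).
        var ms = new MemoryStream();
        var w = new BinaryWriter(ms);
        w.Write(Encoding.ASCII.GetBytes("DATA")); // hypothetical 4-letter preamble
        w.Write(2);                               // hypothetical frame count
        w.Write(10); w.Write(20); w.Write(30);    // frame 1
        w.Write(40); w.Write(50); w.Write(60);    // frame 2
        ms.Position = 0;

        // Parse it back, field by field, instead of decoding it as text.
        var r = new BinaryReader(ms);
        string preamble = Encoding.ASCII.GetString(r.ReadBytes(4));
        int frames = r.ReadInt32();
        Console.WriteLine($"{preamble}: {frames} frames"); // DATA: 2 frames
        for (int i = 0; i < frames; i++)
            Console.WriteLine($"  {r.ReadInt32()}, {r.ReadInt32()}, {r.ReadInt32()}");
    }
}
```

In the real code, wrap the buffer returned by stream.Read in a MemoryStream the same way and read the integers the documentation describes.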

Decoding a string c#

I created a TCP server that distributes clients' messages, and I've run into a problem. When I send Cyrillic messages through the stream, they aren't decoded properly. Does anyone know how I can fix that?
Here's the code for sending the message:
var message = Console.ReadLine().ToCharArray().Select(x => (byte)x).ToArray();
stream.Write(message);
Here's the code for receiving:
var numberOfBytes = stream.Read(buffer,0,1024);
Console.WriteLine($"{numberOfBytes} bytes received");
var chars = buffer.Select(x=>(char)x).ToArray();
var message = new string(chars);
The problem is that a char in C# represents a 2-byte UTF-16 code unit. A Cyrillic character is larger than 255 in UTF-16, so you lose information when casting it to a byte.
To convert a string to a byte array, use the Encoding class:
byte[] buffer = System.Text.Encoding.UTF8.GetBytes(Console.ReadLine());
To convert it back to a string on the receiver's end, write:
string message = System.Text.Encoding.UTF8.GetString(buffer);
Another problem is that Stream.Read does not guarantee to read all bytes of your message at once (your stream does not know that you send packets of a certain size). It could happen, for example, that the last byte of the received byte array is only the first byte of a 2-byte character, and you receive the other byte the next time you call Stream.Read.
There are several solutions to this issue:
Wrap the Stream in a StreamWriter at the sender's end and in a StreamReader at the receiver's end. This is probably the simplest method if you transmit only text.
Transmit the length of your message as an integer at the beginning of your message. This number tells the receiver how many bytes to read.
To convert a string to bytes, use System.Text.Encoding.GetBytes(string). I suggest you change the sending code to:
// using System.Text;
var messageAsBytes = Encoding.UTF8.GetBytes(Console.ReadLine());
To convert bytes to a string, use System.Text.Encoding.GetString(byte[]). If you receive UTF-8-encoded bytes:
// using System.Text;
var messageAsString = Encoding.UTF8.GetString(buffer);
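The length-prefix approach from the list above can be sketched like this. The Send/Receive/ReadExactly names are illustrative, not part of any library, and a MemoryStream stands in for the NetworkStream:

```csharp
using System;
using System.IO;
using System.Text;

class LengthPrefixedMessages
{
    // Prefix each message with its byte length, so the receiver knows
    // exactly how many bytes belong to it, even across partial reads.
    static void Send(Stream stream, string message)
    {
        byte[] payload = Encoding.UTF8.GetBytes(message);
        stream.Write(BitConverter.GetBytes(payload.Length), 0, 4);
        stream.Write(payload, 0, payload.Length);
    }

    static string Receive(Stream stream)
    {
        int length = BitConverter.ToInt32(ReadExactly(stream, 4), 0);
        return Encoding.UTF8.GetString(ReadExactly(stream, length));
    }

    // Stream.Read may return fewer bytes than requested; loop until done.
    static byte[] ReadExactly(Stream stream, int count)
    {
        byte[] buffer = new byte[count];
        int read = 0;
        while (read < count)
        {
            int n = stream.Read(buffer, read, count - read);
            if (n == 0) throw new EndOfStreamException();
            read += n;
        }
        return buffer;
    }

    static void Main()
    {
        // A MemoryStream stands in for the network stream in this demo.
        var ms = new MemoryStream();
        Send(ms, "Привет");
        ms.Position = 0;
        Console.WriteLine(Receive(ms)); // round-trips "Привет" intact
    }
}
```

Because the decode happens only once the full payload has arrived, a 2-byte UTF-8 character can no longer be split across two reads.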
Some suggested reading:
https://www.joelonsoftware.com/2003/10/08/the-absolute-minimum-every-software-developer-absolutely-positively-must-know-about-unicode-and-character-sets-no-excuses/
https://learn.microsoft.com/en-us/dotnet/api/system.text.encoding?view=netframework-4.7.2

Play MPEG-2 TS using MseStreamSource

I need to display a live video stream in a UWP application.
The video stream comes from a GoPro. It is transported in UDP messages as an MPEG-2 TS stream. I can play it successfully using FFplay with the following command line:
ffplay -fflags nobuffer -f:v mpegts udp://:8554
I would like to play it with MediaPlayerElement without using a third party library.
According to the following page :
https://learn.microsoft.com/en-us/windows/uwp/audio-video-camera/supported-codecs
UWP should be able to play it. (I installed the "Microsoft DVD" application in the Windows Store).
I receive the MPEG-2 TS stream with a UdpClient. It works well.
In each UdpReceiveResult I receive a 12-byte header, followed by 4, 5, 6, or 7 MPEG-TS packets (each packet is 188 bytes, beginning with 0x47).
I created a MseStreamSource :
_mseStreamSource = new MseStreamSource();
_mseStreamSource.Opened += (_, __) =>
{
_mseSourceBuffer = _mseStreamSource.AddSourceBuffer("video/mp2t");
_mseSourceBuffer.Mode = MseAppendMode.Sequence;
};
_mediaPlayerElement.MediaSource = MediaSource.CreateFromMseStreamSource(_mseStreamSource);
This is how I send the messages to the MseStreamSource :
UdpReceiveResult receiveResult = await _udpClient.ReceiveAsync();
byte[] bytes = receiveResult.Buffer;
mseSourceBuffer.AppendBuffer(bytes.AsBuffer());
The MediaPlayerElement displays the message "video not supported or incorrect file name" (I'm not sure of the exact wording; my Windows is in French).
Is it a good idea to use the MseAppendMode.Sequence mode?
What should I pass to the AppendBuffer method: the raw UDP message including the 12-byte header, or each 188-byte MPEG-TS packet?
I finally got the video working!
Here are the steps I followed to extract the MPEG-TS packets and send them correctly to the MseStreamSource:
The MseSourceBuffer needs to be in "Sequence" mode :
_mseSourceBuffer.Mode = MseAppendMode.Sequence;
For each received UDP datagram, I extract the MPEG-TS packets. To do that, I ignore the first 12 bytes of the UDP datagram, then extract each 188-byte packet (each starting with 0x47) into a separate array.
I send each packet to a synchronized queue.
I dequeue the packets from the queue and send them to the MseSourceBuffer in groups, creating a new group for each PAT packet (PID = 0):
byte[] bytes;
// [...] combine the packets of the group
mseSourceBuffer.AppendBuffer(bytes.AsBuffer());
I tried to use a MemoryStream and call the AppendStream() method, but with no success.
Also take care with thread synchronization: the packet order must not be lost, which is the reason for the synchronized queue.
I hope this can help someone else.
This Wikipedia MPEG-TS page helped me a lot.
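The extraction step above can be sketched as follows. TsPacketExtractor, the 12-byte header constant, and the demo datagram are illustrative; only the 188-byte packet size, the 0x47 sync byte, and the 13-bit PID layout come from the MPEG-TS format:

```csharp
using System;
using System.Collections.Generic;

class TsPacketExtractor
{
    const int PacketSize = 188;  // fixed MPEG-TS packet size
    const byte SyncByte = 0x47;  // every TS packet starts with this
    const int HeaderSize = 12;   // bytes to skip at the start of each datagram

    // Split one UDP datagram into 188-byte TS packets, skipping the header.
    static List<byte[]> ExtractPackets(byte[] datagram)
    {
        var packets = new List<byte[]>();
        for (int offset = HeaderSize; offset + PacketSize <= datagram.Length; offset += PacketSize)
        {
            if (datagram[offset] != SyncByte)
                break; // lost sync; a real implementation would scan for 0x47

            byte[] packet = new byte[PacketSize];
            Array.Copy(datagram, offset, packet, 0, PacketSize);
            packets.Add(packet);
        }
        return packets;
    }

    // The PID is 13 bits spread over bytes 1 and 2 of the TS header;
    // PID 0 identifies a PAT packet, which starts a new group above.
    static int GetPid(byte[] packet) => ((packet[1] & 0x1F) << 8) | packet[2];

    static void Main()
    {
        // Fake datagram: 12-byte header + two zeroed TS packets (PID 0).
        byte[] datagram = new byte[HeaderSize + 2 * PacketSize];
        datagram[HeaderSize] = SyncByte;
        datagram[HeaderSize + PacketSize] = SyncByte;

        var packets = ExtractPackets(datagram);
        Console.WriteLine($"{packets.Count} packets, first PID = {GetPid(packets[0])}");
        // 2 packets, first PID = 0
    }
}
```

The GetPid check is what lets the grouping logic above detect a PAT packet and start a new group.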

What is the exact method to send an Image over network [An attempt using Sockets]

This is my Server side code:
public void ReceivingData(object sender, EventArgs e)
{
    while (mysocket.Connected)
    {
        buffer = new byte[accepted.SendBufferSize];
        int bytesRead = accepted.Receive(buffer);
        MemoryStream Data = new MemoryStream(buffer);
        if (picbox.InvokeRequired)
        {
            picbox.Invoke(new MethodInvoker(delegate { picbox.Image = Image.FromStream(Data); }));
        }
    }
}
The connection gets established and the file is received without any issue. However, the image gets distorted in transfer, and I do not understand why this is happening. Here is the screenshot:
I remember I had to format the strings I sent over sockets using Encoding.ASCII.GetString(StringToFormat). What do I need to do in the case of images?
In your ReceivingData callback you may not receive all the data in one go. Some of it can be received partially, with the rest arriving in one or more subsequent callbacks, and it will be your task to re-assemble the original message.
You will need to define a protocol to ensure that you have read all the necessary data.
You could, for example, use Base64 to encode the image on the server and decode it on the client. You would need to know how many bytes to anticipate. This can be done either by prefixing your response with the total number of bytes the client should expect, or by having a special marker (such as the byte value 0x00) to distinguish message boundaries.
Using Base64 also increases the transferred size by 33%, since Base64 encodes every 6 bits of the incoming stream as an 8-bit readable character: for every 3 'real' bytes you want to transfer, you need 4 encoded bytes.
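A minimal sketch of the Base64-with-delimiter variant described above (the class and method names are made up for illustration, and a MemoryStream stands in for the socket stream):

```csharp
using System;
using System.IO;

class Base64ImageTransfer
{
    // The sender encodes the image bytes as Base64 and ends the message
    // with a newline, so the receiver has an unambiguous boundary marker.
    static void SendImage(Stream stream, byte[] imageBytes)
    {
        var writer = new StreamWriter(stream);
        writer.WriteLine(Convert.ToBase64String(imageBytes));
        writer.Flush();
    }

    static byte[] ReceiveImage(Stream stream)
    {
        // ReadLine keeps reading until the newline marker arrives,
        // re-assembling partial receives for us.
        var reader = new StreamReader(stream);
        return Convert.FromBase64String(reader.ReadLine());
    }

    static void Main()
    {
        // A MemoryStream stands in for the network stream in this demo.
        byte[] fakeImage = { 1, 2, 3, 4, 5 };
        var ms = new MemoryStream();
        SendImage(ms, fakeImage);
        ms.Position = 0;
        Console.WriteLine(ReceiveImage(ms).Length); // 5
    }
}
```

Only once the complete byte array is re-assembled should it be handed to Image.FromStream; decoding a partially received buffer is what distorts the picture.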

Encoding problem between C# TCP server and Java TCP Client

I'm facing an encoding issue for which I'm not able to find the correct solution.
I have a C# TCP server, running as a Windows service, which receives and responds with XML. The problem comes down to passing special characters in the output, such as Spanish characters with accents (like á, é, í and others).
The server response is encoded as UTF-8, and the Java client reads using UTF-8, but when I print its output the character is totally different.
This problem only happens in the Java client (the C# TCP client works as expected).
The following snippet of the server code shows the encoding issue:
C# Server
byte[] destBytes = System.Text.Encoding.UTF8.GetBytes("á");
try
{
    clientStream.Write(destBytes, 0, destBytes.Length);
    clientStream.Flush();
}
catch (Exception ex)
{
    LogErrorMessage("Error en SendResponseToClient: Detalle::", ex);
}
Java Client:
socket.connect(new InetSocketAddress(param.getServerIp(), param.getPort()), 20000);
InputStream sockInp = socket.getInputStream();
InputStreamReader streamReader = new InputStreamReader(sockInp, Charset.forName("UTF-8"));
sockReader = new BufferedReader(streamReader);
String tmp = null;
while ((tmp = sockReader.readLine()) != null) {
    System.out.println(tmp);
}
For this simple test, the output shown is:
ß
I did some testing printing out the byte[] in each language. In C#, á is output as:
195, 161
In Java, the bytes print as:
-61, -95
Could this have to do with the signed (Java) vs. unsigned (C#) byte type?
Any feedback is greatly appreciated.
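On the signed/unsigned question: Java's byte is signed, but the underlying bit patterns are identical, so -61,-95 and 195,161 are the same two bytes. A quick C# sketch illustrating this:

```csharp
using System;
using System.Text;

class SignedBytes
{
    static void Main()
    {
        // The UTF-8 encoding of "á" is the two bytes 0xC3 0xA1.
        byte[] utf8 = Encoding.UTF8.GetBytes("á");
        Console.WriteLine(string.Join(",", utf8)); // 195,161

        // Reinterpreting the same bytes as signed (like Java's byte type)
        // prints the values the Java client showed:
        Console.WriteLine(string.Join(",", Array.ConvertAll(utf8, b => (sbyte)b))); // -61,-95
    }
}
```

So the wire data is correct on both sides; the mangled "ß" points at how the decoded string is printed, not at the bytes themselves.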
To me this seems like an endianness problem... you can check that by reversing the bytes in Java before printing the string.
This would usually be solved by including a BOM; see http://de.wikipedia.org/wiki/Byte_Order_Mark
Are you sure that's not a Unicode character you are attempting to encode to bytes as UTF-8 data?
I found that the link below has a useful way of testing whether the data in that string is correct UTF-8 before you send it:
How to test an application for correct encoding (e.g. UTF-8)
