I am trying to send a specific TCP packet to a server, but it doesn't seem to be sending the right data. How should I go about this?
I have tried the StreamWriter class over the NetworkStream, sending bytes, sending ASCII, and sending plain text.
TcpClient client = new TcpClient("game_server_ip", port);
NetworkStream stream = client.GetStream();
StreamWriter writer = new StreamWriter(stream);
writer.WriteLine("....T..hello");
writer.Flush();
I am trying to send this exact packet:
00 00 00 0c 54 00 05 68 65 6c 6c 6f
Which translates to the text above.
These are the raw bytes:
0000000c54000568656c6c6f
The expected result is that the in-game chat shows a message saying hello. I have made sure the connection is up and running, and it is. I have also tried sending the packet using Wireshark and WPE Pro, and there it works fine. (I got this packet from sniffing.)
For TCP you will need to connect to the remote endpoint.
Check out this example: Example
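Note that the StreamWriter approach can't produce those bytes: it applies a text encoding and WriteLine appends a line terminator, so the non-printable length bytes never reach the wire as intended. Here is a minimal sketch, mirroring the server address and port variables from the snippet above, that writes the exact bytes from the capture straight to the NetworkStream:
using System.Net.Sockets;

TcpClient client = new TcpClient("game_server_ip", port);
NetworkStream stream = client.GetStream();

// The exact bytes from the capture: 00 00 00 0c 54 00 05 68 65 6c 6c 6f
// (what looks like a 4-byte length prefix, 'T', a 2-byte length, then "hello")
byte[] packet = new byte[]
{
    0x00, 0x00, 0x00, 0x0c,
    0x54,
    0x00, 0x05,
    0x68, 0x65, 0x6c, 0x6c, 0x6f
};

stream.Write(packet, 0, packet.Length);
stream.Flush();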
I need to display a live video stream in a UWP application.
The video stream comes from a GoPro. It is transported in UDP messages and is an MPEG-2 TS stream. I can play it successfully using ffplay with the following command line:
ffplay -fflags nobuffer -f:v mpegts udp://:8554
I would like to play it with MediaPlayerElement without using a third party library.
According to the following page:
https://learn.microsoft.com/en-us/windows/uwp/audio-video-camera/supported-codecs
UWP should be able to play it. (I installed the "Microsoft DVD" application in the Windows Store).
I receive the MPEG-2 TS stream with a UdpClient. It works well.
I receive in each UdpReceiveResult a 12 bytes header, followed by 4, 5, 6, or 7 MPEGTS packets (each packet is 188 bytes, beginning with 0x47).
I created an MseStreamSource:
_mseStreamSource = new MseStreamSource();
_mseStreamSource.Opened += (_, __) =>
{
_mseSourceBuffer = _mseStreamSource.AddSourceBuffer("video/mp2t");
_mseSourceBuffer.Mode = MseAppendMode.Sequence;
};
_mediaPlayerElement.MediaSource = MediaSource.CreateFromMseStreamSource(_mseStreamSource);
This is how I send the messages to the MseStreamSource:
UdpReceiveResult receiveResult = await _udpClient.ReceiveAsync();
byte[] bytes = receiveResult.Buffer;
_mseSourceBuffer.AppendBuffer(bytes.AsBuffer());
The MediaPlayerElement displays the message "video not supported or incorrect file name" (not sure of the exact message, my Windows is in French).
Is it a good idea to use the MseAppendMode.Sequence mode?
What should I pass to the AppendBuffer method? The raw UDP message including the 12-byte header, or each 188-byte MPEG-TS packet?
I finally got the video working!
Here are the steps I follow to extract the MPEG-TS packets and correctly send them to the MseStreamSource:
The MseSourceBuffer needs to be in "Sequence" mode:
_mseSourceBuffer.Mode = MseAppendMode.Sequence;
For each received UDP datagram, I extract the MPEG-TS packets. To do that, I ignore the first 12 bytes of the UDP datagram, then I extract each 188-byte packet into a separate array (each packet starts with 0x47); see the sketch after the snippet below.
I send each packet to a synchronized queue.
I dequeue the packets from the queue and send them grouped to the MseSourceBuffer. I create a new group for each PAT packet (PID = 0):
byte[] bytes;
// [...] combine the packets of the group
_mseSourceBuffer.AppendBuffer(bytes.AsBuffer());
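As a concrete illustration of the extraction step above, here is a minimal sketch; the helper names (ExtractTsPackets, IsPat) are mine, not part of the original code. It skips the 12-byte header of the datagram, slices out the 188-byte packets, and lets the caller start a new group whenever a PAT (PID 0) packet comes by:
using System;
using System.Collections.Generic;

static IEnumerable<byte[]> ExtractTsPackets(byte[] datagram)
{
    const int TsPacketSize = 188;
    const int DatagramHeaderSize = 12;

    for (int offset = DatagramHeaderSize; offset + TsPacketSize <= datagram.Length; offset += TsPacketSize)
    {
        if (datagram[offset] != 0x47)
            continue; // not aligned on a sync byte, skip this slot

        byte[] packet = new byte[TsPacketSize];
        Array.Copy(datagram, offset, packet, 0, TsPacketSize);
        yield return packet;
    }
}

// The PID is the low 5 bits of byte 1 plus all of byte 2; PID 0 marks a PAT packet.
static bool IsPat(byte[] tsPacket) => (((tsPacket[1] & 0x1F) << 8) | tsPacket[2]) == 0;
Each extracted packet then goes to the synchronized queue; the dequeuing side concatenates the packets of the current group into one array before calling AppendBuffer.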
I tried to use a MemoryStream and call the AppendStream() method, but with no success.
Also take care with thread synchronization: packet order must not be lost. That is the reason for the synchronized queue.
Hope it can help someone else.
This wikipedia MPEG-TS page helped me a lot.
For some experimentation I was working with the Simple HTTP Server code here.
In one case I wanted it to serve some ANSI-encoded text configuration files. I am aware there are more issues with this code, but the only one I'm currently concerned with is that the Content-Length is wrong, and only for certain text files.
Example code:
Output stream initialisation:
outputStream = new StreamWriter(new BufferedStream(socket.GetStream()));
The handling of HTTP get:
public override void handleGETRequest(HttpProcessor p)
{
if (p.http_url.EndsWith(".pac"))
{
string filename = Path.Combine(Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location), p.http_url.Substring(1));
Console.WriteLine(string.Format("HTTP request for : {0}", filename));
if (File.Exists(filename))
{
FileInfo fi = new FileInfo(filename);
DateTime lastWrite = fi.LastWriteTime;
Stream fs = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.Read);
StreamReader sr = new StreamReader(fs);
string result = sr.ReadToEnd().Trim();
Console.WriteLine(fi.Length);
Console.WriteLine(result.Length);
p.writeSuccess("application/x-javascript-config",result.Length,lastWrite);
p.outputStream.Write(result);
// fs.CopyTo(p.outputStream.BaseStream);
p.outputStream.BaseStream.Flush();
fs.Close();
}
else
{
Console.WriteLine("404 - FILE not found!");
p.writeFailure();
}
}
}
public void writeSuccess(string content_type,long length,DateTime lastModified) {
outputStream.Write("HTTP/1.0 200 OK\r\n");
outputStream.Write("Content-Type: " + content_type + "\r\n");
outputStream.Write("Last-Modified: {0}\r\n", lastModified.ToUniversalTime().ToString("r"));
outputStream.Write("Accept-Range: bytes\r\n");
outputStream.Write("Server: FlakyHTTPServer/1.3\r\n");
outputStream.Write("Date: {0}\r\n", DateTime.Now.ToUniversalTime().ToString("r"));
outputStream.Write(string.Format("Content-Length: {0}\r\n\r\n", length));
}
For most files I've tested with, the Content-Length is correct. However, when testing with the HTTP debugging tool Fiddler, a protocol violation on Content-Length is sometimes reported.
For example fiddler says:
Request Count: 1
Bytes Sent: 303 (headers:303; body:0)
Bytes Received: 29,847 (headers:224; body:29,623)
So Content-Length should be 29623. But the HTTP header generated is
Content-Length: 29617
I saved the body of the HTTP content from Fiddler and visually compared the files; I couldn't notice any difference. Then I loaded them into Beyond Compare's hex compare, and there are several differences like this:
Original File: 2D 2D 96 20 2A 2F
HTTP Content : 2D 2D EF BF BD 20 2A 2F
Original File: 27 3B 0D 0A 09 7D 0D 0A 0D 0A 09
HTTP Content : 27 3B 0A 09 7D 0A 0A 09
I suspect the problem is related to encoding, but I'm not exactly sure. I'm only serving ANSI-encoded files, no Unicode.
I made the file serve correctly with the right Content-Length by modifying the byte sequence in parts of the file. I made this change in 3 places:
2D 2D 96 (--–) to 2D 2D 2D (---)
Based on the bytes you pasted, it looks like there are a couple things going wrong here. First, it seems that CRLF in your input file (0D 0A) is being converted to just LF (0A). Second, it looks like the character encoding is changing, either when reading the file into a string, or when writing the string to the HTTP client.
The HTTP Content-Length represents the number of bytes in the stream, whereas string.Length gives you the number of characters in the string. Unless your file is exclusively using the first 128 ASCII characters (which precludes non-English characters as well as special windows-1252 characters like the euro sign), it's unlikely that string.Length will exactly equal the length of the string encoded in either UTF-8 or ISO-8859-1.
If you convert the string to a byte[] before sending it to the client, you'll be able to get the "true" Content-Length. However, you'll still end up with mangled text if you didn't read the file using the proper encoding. (Whether you specify the encoding or not, a conversion is happening when reading the file into a string of Unicode characters.)
I highly recommend specifying the charset in the Content-Type header (e.g. application/x-javascript-config;charset=utf-8). It doesn't matter whether your charset is utf-8, utf-16, iso-8859-1, windows-1251, etc., as long as it's the same character encoding you use when converting your string into a byte[].
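A minimal sketch of that suggestion, reusing the HttpProcessor plumbing from the question: serve the raw file bytes and compute Content-Length from the byte count, so no string conversion can change the length (the charset parameter here assumes the files really are Windows-1252 "ANSI"):
byte[] body = File.ReadAllBytes(filename);
p.writeSuccess("application/x-javascript-config;charset=windows-1252", body.Length, lastWrite);
p.outputStream.Flush();                                   // flush the buffered header text first
p.outputStream.BaseStream.Write(body, 0, body.Length);    // then write the body bytes untouched
p.outputStream.BaseStream.Flush();
If you do need the content as a string first, convert it back to bytes with the same encoding you used to read it (for example Encoding.GetEncoding(1252).GetBytes(result)) and use that array's Length for the Content-Length header.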
I have a compressed file (a binary file / compressed string; I'm not sure exactly what it is),
and I'm trying to decompress this file with C#/VB.NET.
I tried to decompress it with GZip:
Private Shared Function gzuncompress(ByVal data() As Byte) As Byte()
Dim input As MemoryStream = New MemoryStream(data)
Dim gzip As GZipStream = New GZipStream(input, CompressionMode.Decompress)
Dim output As MemoryStream = New MemoryStream
gzip.CopyTo(output)
Return output.ToArray
End Function
gzuncompress(New System.Net.WebClient().DownloadData("http://haxball.com/list3"))
but there is an exception (at gzip.CopyTo(output)):
The magic number in GZip header is not correct
But when I tried to uncompress it with PHP, it worked:
<?php
header('Content-Type: text/html; charset=utf-8');
$list = file_get_contents('http://haxball.com/list3');
$list = gzuncompress($list);
$len = implode('', unpack('n*', $list));
$bytes = unpack('c*', $list);
$string = implode('', array_map('chr', $bytes));
echo $string;
You can check the code here:
http://www.compileonline.com/execute_php_online.php
Does someone have a C#/VB.NET alternative to PHP's gzuncompress?
Even an external exe file that does the same as PHP's gzuncompress function would be a very good answer,
something like:
Process.Start("c:\umcompress.exe -f c:\list3 -o c:\res.txt")
Note: A good example is better than an explanation.
Update:
The first 30 bytes of the file:
78 DA 8C BD 79 F4 5D D7 55 26 78 65 0D F1 24 0F 89 E3 98 4C 5C 47 21 71 E2 C8 B9 E7 9E E1
That is a zlib stream. The zlib format is described in RFC 1950, and consists of a two-byte header and a four-byte trailer around a deflate stream. You will need to write your own code to process the header and trailer, and you can use the DeflateStream class to decompress the deflate stream.
Or you can use DotNetZip which will process the zlib stream directly.
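For reference, a minimal C# sketch of the first suggestion, skipping the 2-byte zlib header and letting DeflateStream decompress the raw deflate data (the 4-byte Adler-32 trailer is simply not verified here):
using System.IO;
using System.IO.Compression;
using System.Net;

static byte[] ZlibUncompress(byte[] data)
{
    // Skip the 2-byte zlib header (CMF/FLG); DeflateStream stops at the end of
    // the deflate stream, so the Adler-32 trailer is ignored rather than checked.
    using (var input = new MemoryStream(data, 2, data.Length - 2))
    using (var deflate = new DeflateStream(input, CompressionMode.Decompress))
    using (var output = new MemoryStream())
    {
        deflate.CopyTo(output);
        return output.ToArray();
    }
}

// Usage, mirroring the original call:
// byte[] list = ZlibUncompress(new WebClient().DownloadData("http://haxball.com/list3"));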
I'm trying to send a hex string to a TCP socket. I have some problems with the format or conversion of this string, because I'm not very sure what format it's using.
I've written a Windows Phone app based on the Socket class, which is working fine.
This app emulates requests that are normally sent from a desktop program to a device which hosts a web service.
Via Wireshark, I found out that the web service will accept an input stream (I think it's in hex) and return a second hex stream which contains the data I need.
So the desktop app is sending a stream, and Wireshark shows:
Data (8 bytes)
Data: 62ff03fff00574600
Length: 8
Now I've tried a lot to reproduce this stream. I thought it was a UTF-8 string and converted the stream to that format. But every time I send it, I see the following output in Wireshark: 62c3bf03c3bf00574600
As far as I've investigated, 62 = 'b', but ff is always sent as c3 bf.
Does somebody know how to send this stream in the right format?
Cheers,
Jo
The socket transport shouldn't care; the content of a TCP packet is binary data representing "whatever".
From the code you pointed to in the comments:
byte[] payload = Encoding.UTF8.GetBytes(data);
socketEventArg.SetBuffer(payload, 0, payload.Length);
...
response = Encoding.UTF8.GetString(e.Buffer, e.Offset, e.BytesTransferred);
response = response.Trim('\0');
At the end of the socket send/receive, (data == response). If that isn't occurring, you need to figure out where the problem is. The first step is to write some very simple code like so:
string source = "your problem text string";
byte[] encode = Encoding.UTF8.GetBytes(source);
string target = Encoding.UTF8.GetString(encode, 0, encode.Length);
Debug.Assert(source == target);
If that works, then output the 'encode' array and check that it is contained in the packet data where it is being sent, then verify that that is what is being received. If you are sending the right data but receiving it corrupted, you have serious problems ... I doubt you will find that, but if so, write a very simple test program that sends and receives on the same machine (localhost) to see if it is repeatable.
If I had to guess, I would say that the characters being encoded are not Unicode, or that Windows Phone doesn't properly support it (proper Unicode support).
As long as you don't know the protocol / encoding the server expects, you can only replay the known messages, like the bytes you provided in your question.
Therefore you just define the byte array directly, like this:
byte[] payload = new byte[] {0x62, 0xff, 0x03, 0xff, 0xf0, 0x05, 0x74, 0x60};
and send it over the socket like you did with the encoded string before. The server should now accept the message as if it had been sent by the client you sniffed.
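For example, assuming the same SocketAsyncEventArgs send path quoted in the other answer, the only change is to put the literal bytes into the buffer instead of a UTF-8 encoded string:
byte[] payload = new byte[] { 0x62, 0xff, 0x03, 0xff, 0xf0, 0x05, 0x74, 0x60 };
socketEventArg.SetBuffer(payload, 0, payload.Length);   // send the bytes untouched, no Encoding involved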
I am writing a C# application for reading/writing RFID tags which are formatted by an Android application (NXP). I found the default keys (A0 A1 A2 A3 A4 A5 and D3 F7 D3 F7 D3 F7) and I can read all data from the tag, but the problem is that I can't write anything to it.
Another weird thing is that the key block looks like this:
[00 00 00 00 00 00 78 77 88 C1 00 00 00 00 00 00] and this block authenticates with key [A0 A1 A2 A3 A4 A5].
My question is: how do I authenticate the sector to have permission for writing?
Let me understand the problem: you wrote the NDEF/RTD information (URI or other) to some NFC tag (which one: MIFARE Ultralight, Topaz, or other?) using the Android phone with NFC. Now you have a USB NFC reader (which one?) connected to the desktop PC and you are trying to read data from the tag? Is that correct?
Please confirm and add the missing information.
BR
STeN