I'm trying to send packets from a client to a server over a TCP stream.
The client connects to the server and tries to send an image. However, the server only receives the image when I shut down the client (it arrives at the exact moment the client shuts down).
I use ProtoBuf-Net for serializing and deserializing.
Here's my relevant code:
This is my client code:
// Connect to the server
client.Connect(Client.SERVER_IP, 1729);
// Capture screenshot
Bitmap captureBitmap = new Bitmap(Screen.PrimaryScreen.Bounds.Width,
Screen.PrimaryScreen.Bounds.Height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
System.Drawing.Rectangle captureRectangle = Screen.PrimaryScreen.Bounds;
Graphics captureGraphics = Graphics.FromImage(captureBitmap);
captureGraphics.CopyFromScreen(captureRectangle.Left, captureRectangle.Top, 0, 0,
captureRectangle.Size);
// Serialize the screenshot to the socket's stream
ImageConverter img = new ImageConverter();
Packet<byte[]> packet = new Packet<byte[]> { value = (byte[])img.ConvertTo(captureBitmap, typeof(byte[])), type = PacketType.IMAGE };
Serializer.SerializeWithLengthPrefix(stream, packet.GetType().AssemblyQualifiedName, PrefixStyle.Base128);
Serializer.Serialize(stream, packet);
stream.Flush();
This is my server code:
ImageConverter imageConverter = new ImageConverter();
// Wait for client to connect
var client = new ExtendedTcpClient(listener.AcceptTcpClient());
currentClientControlling = client;
// Deserialize type
var typeName = Serializer.DeserializeWithLengthPrefix<string>(stream, PrefixStyle.Base128);
var type = Type.GetType(typeName);
// Register the type
var model = RuntimeTypeModel.Default;
model.Add(type, true);
// Deserialize the data
var bytes = model.Deserialize(stream, null, type);
var image = (Bitmap)imageConverter.ConvertFrom(bytes);
And this is my Packet model:
public enum PacketType
{
IMAGE
}
[ProtoContract]
public class Packet<T>
{
[ProtoMember(1)]
public PacketType type { get; set; }
[ProtoMember(2)]
public T value { get; set; }
}
I didn't figure out exactly what the problem was, but I solved it by replacing Serializer.Serialize(stream, packet); with Serializer.SerializeWithLengthPrefix(stream, packet, PrefixStyle.Base128);
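That fix makes sense: Serializer.Deserialize has no way of knowing where a message ends on a NetworkStream, so it keeps reading until the stream is closed, which is exactly why the image only arrived at client shutdown. Pairing SerializeWithLengthPrefix with DeserializeWithLengthPrefix on both sides marks each message's boundary. A sketch, under the assumption that the receiver knows the concrete Packet<byte[]> type:

```csharp
// Sender: both the type name and the payload carry a length prefix,
// so the receiver knows exactly where each message ends.
Serializer.SerializeWithLengthPrefix(stream, packet.GetType().AssemblyQualifiedName, PrefixStyle.Base128);
Serializer.SerializeWithLengthPrefix(stream, packet, PrefixStyle.Base128);

// Receiver: mirror the same prefix style on every read.
var typeName = Serializer.DeserializeWithLengthPrefix<string>(stream, PrefixStyle.Base128);
var received = Serializer.DeserializeWithLengthPrefix<Packet<byte[]>>(stream, PrefixStyle.Base128);
```

If the receiver must resolve the type at runtime, RuntimeTypeModel offers non-generic DeserializeWithLengthPrefix overloads that take a Type instead.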
I can capture plenty of desktop images, but I want to encode them with a video encoder and then send the video data over the network. Another application (the data receiver) will receive that data and play it on the screen.
How can I encode the images to video, and send them?
// There is a simple desktop capture class
public class DesktopCapture
{
public readonly Bitmap Frame;
public readonly Graphics FrameGraphics;
private readonly Rectangle CaptureArea;
public DesktopCapture(Rectangle captureArea)
{
CaptureArea = captureArea;
Frame = new Bitmap(captureArea.Width, captureArea.Height);
FrameGraphics = Graphics.FromImage(Frame);
}
public void Capture()
{
FrameGraphics.CopyFromScreen(CaptureArea.Location, Point.Empty, CaptureArea.Size);
}
}
What I want:
TcpClient client = new TcpClient();
client.Connect(new IPEndPoint(IPAddress.Loopback, 5555));
DesktopCapture desktopCap = new DesktopCapture(Screen.PrimaryScreen.Bounds);
NetworkStream clientStream = client.GetStream();
VideoStreamWriter videoStreamWriter = new VideoStreamWriter(Encoder.SomeEncoder, clientStream);
videoStreamWriter.StartEncoding();
while (true)
{
desktopCap.Capture();
videoStreamWriter.EncodeFrame(desktopCap.Frame);
}
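.NET has no built-in video encoder, so the usual answer is a real codec such as H.264 driven through an FFmpeg wrapper. A much simpler starting point, at the cost of bandwidth, is Motion JPEG: compress each frame to JPEG and write it length-prefixed so the receiver can split the stream back into frames. A sketch in the spirit of the desired API above (the class name and framing are my own, not an existing library):

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

// A minimal MJPEG-style writer: each frame is JPEG-compressed and
// written with a 4-byte length prefix so the receiver can split frames.
public class MjpegStreamWriter
{
    private readonly Stream output;

    public MjpegStreamWriter(Stream output) => this.output = output;

    public void EncodeFrame(Bitmap frame)
    {
        using (var ms = new MemoryStream())
        {
            frame.Save(ms, ImageFormat.Jpeg);   // lossy but fast to encode/decode
            byte[] data = ms.ToArray();
            output.Write(BitConverter.GetBytes(data.Length), 0, 4);
            output.Write(data, 0, data.Length);
        }
    }
}
```

MJPEG does no inter-frame compression, so it uses far more bandwidth than a true video codec; piping raw frames into an ffmpeg process and forwarding its output is the more scalable route.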
So I'm aiming to write a little multiplayer game, but first I'm working on getting a server/client architecture set up that can send simple messages to one another using Sockets and JSON.
Rather quickly I've encountered an issue: when a sender sends two messages in rapid succession, the receiver receives them as part of the same stream, even though they are separated by flushes.
The entire Connection class looks like this, it is used by both the sender and receiver:
namespace Networking
{
public class Connection
{
JsonSerializerSettings jsonSettings = new JsonSerializerSettings() { TypeNameHandling = TypeNameHandling.Auto };
private Socket socket;
private NetworkStream stream;
/// <summary>
/// Make a connection with a stream from a socket.
/// </summary>
public Connection(Socket socket)
{
this.socket = socket;
stream = new NetworkStream(socket);
}
public void Close()
{
stream.Close();
socket.Close();
stream.Dispose();
socket.Dispose();
}
private object writeLock = new object();
public void Write(Request request)
{
lock (writeLock)
{
Message message = new Message(null, Messaging.Enums.MessageType.Connect, request);
string json = JsonConvert.SerializeObject(message, jsonSettings);
stream.Write(Encoding.ASCII.GetBytes(json));
stream.Flush();
}
}
public Message Read()
{
byte[] data = new byte[1024];
using (MemoryStream ms = new MemoryStream())
{
do
{
int numBytesRead = stream.Read(data, 0, data.Length);
ms.Write(data, 0, numBytesRead);
}
while (stream.DataAvailable);
ms.Seek(0, SeekOrigin.Begin);
StreamReader sr = new StreamReader(ms);
string json = sr.ReadToEnd();
return JsonConvert.DeserializeObject<Message>(json, jsonSettings);
}
}
}
}
I suppose this question isn't 100% specific to C# and .NET, but is more about how to deal with this problem with sockets in general. Looking for some guidance, thanks!
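This is the classic TCP framing problem: TCP is a byte stream with no message boundaries, so two rapid writes can arrive in a single Read, and a single write can arrive split across several. The standard fix is to frame each message, for example with a 4-byte length prefix. A minimal sketch (the helper names are mine):

```csharp
using System;
using System.IO;

public static class Framing
{
    // Writes a 4-byte little-endian length prefix followed by the payload.
    public static void WriteFrame(Stream stream, byte[] payload)
    {
        stream.Write(BitConverter.GetBytes(payload.Length), 0, 4);
        stream.Write(payload, 0, payload.Length);
    }

    // Reads exactly one frame, blocking until all of its bytes arrive.
    public static byte[] ReadFrame(Stream stream)
    {
        byte[] lengthBytes = ReadExactly(stream, 4);
        int length = BitConverter.ToInt32(lengthBytes, 0);
        return ReadExactly(stream, length);
    }

    // Stream.Read may return fewer bytes than requested, so loop.
    static byte[] ReadExactly(Stream stream, int count)
    {
        var buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0) throw new EndOfStreamException();
            offset += read;
        }
        return buffer;
    }
}
```

Write would then frame Encoding.ASCII.GetBytes(json), and Read would deserialize the string decoded from ReadFrame, one message per call, regardless of how the bytes arrived on the wire.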
I am working on a software solution to stream a Windows Desktop to a Xamarin application, using TCP.
It works, but I cannot manage to get a better framerate.
Here's how I do it:
The client, a Xamarin app, listens on a TCP socket. The server (a C# console app) connects to it, captures the desktop, and sends the image. The app then shows it as an Image.
When sending a bitmap, the server first sends its size as a 4-byte array. The client reads those 4 bytes first, then reads the indicated number of bytes.
DesktopRecorder.cs (Server side)
private void CaptureScreen()
{
bitmap = new Bitmap(SIZE.Width, SIZE.Height, PixelFormat.Format32bppRgb);
graphics = Graphics.FromImage(bitmap);
graphics.CopyFromScreen(0, 0, 0, 0, SIZE, CopyPixelOperation.SourceCopy);
CaptureReady(BitmapToArray(bitmap));
}
private byte[] BitmapToArray(Bitmap bitmap)
{
using (var ms = new MemoryStream())
{
bitmap.Save(ms, ImageFormat.Png);
return ms.ToArray();
}
}
TcpNetworker.cs (Server side, connecting in TCP):
public override void SendData(byte[] data)
{
if (Tcp.Connected)
{
int size = data.Length;
var sizeArray = BitConverter.GetBytes(size);
if (BitConverter.IsLittleEndian)
sizeArray.Reverse();
Console.WriteLine($"Sending {size} bytes.");
Write(sizeArray);
Stream.Flush();
Write(data);
}
void Write(byte[] b)
{
Stream.Write(b, 0, b.Length);
}
}
ViewModel.cs (Client side, Xamarin):
private void Accept()
{
try
{
var client = Tcp.AcceptTcpClient();
Console.WriteLine("---------------- Client Connected ----------------");
Stream = client.GetStream();
byte[] sizeArray;
byte[] dataArray;
while (client.Connected)
{
sizeArray = new byte[4];
Stream.Read(sizeArray, 0, 4);
if (BitConverter.IsLittleEndian)
sizeArray.Reverse();
int size = BitConverter.ToInt32(sizeArray, 0);
Console.WriteLine($"Size: {size}");
dataArray = new byte[size];
Stream.Read(dataArray, 0, size);
Dispatcher.Invoke(() => ImageReceived?.Invoke(this, new MemoryStream(dataArray)));
}
}
catch(Exception e)
{
Console.WriteLine($"Error: {e.Message} {Environment.NewLine} {e.StackTrace}");
}
Accept();
}
MainWindow.cs (Page's code behind):
public MainWindow()
{
InitializeComponent();
var model = new MainWindowViewModel();
model.ImageReceived += Model_ImageReceived;
}
private void Model_ImageReceived(object sender, System.IO.MemoryStream memoryStream)
{
memoryStream.Position = 0;
//var image = System.Drawing.Image.FromStream(memoryStream);
var imageSource = new BitmapImage();
imageSource.BeginInit();
imageSource.CacheOption = BitmapCacheOption.OnLoad;
imageSource.StreamSource = memoryStream;
imageSource.EndInit();
MainImage.Source = imageSource;
memoryStream.Dispose();
}
The CaptureScreen() method of DesktopRecorder is called at a given framerate. It works fine at 10 FPS, but when I increase the framerate, the client runs into trouble: it can no longer find the correct size. After receiving some images, the 4 bytes read no longer contain the size. It is as if an offset had been added, and the server and client fall out of sync.
So, I am asking two things:
How can I keep the server and client synchronized after sending many TCP packets, given that I need 4 bytes for the size? I don't feel like using a fixed bitmap size or anything like that.
Currently, I am encoding the bitmap as a PNG image, which Xamarin can handle natively. Is there a better way to compress my image and send it faster, even if there is a loss of quality? I have already done this kind of thing between a C server and an Angular app, where I had to encode it as PNG and Base64. Can I do the same here?
P.S.: I know I could get better performance using UDP, but for some reasons I have to stick with TCP :/
Thanks for your help!
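The desynchronization is almost certainly caused by partial reads: Stream.Read returns as soon as any bytes are available, so at higher framerates Stream.Read(dataArray, 0, size) often fills only part of the buffer, and the next 4-byte read lands in the middle of image data. Both reads need a loop that insists on the full count. A sketch of such a helper (the class and method names are mine):

```csharp
using System;
using System.IO;

public static class StreamReading
{
    // Reads exactly count bytes, looping until they all arrive;
    // a single Stream.Read may return fewer bytes than requested.
    public static byte[] ReadExactly(Stream stream, int count)
    {
        var buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0) throw new EndOfStreamException("Connection closed mid-frame.");
            offset += read;
        }
        return buffer;
    }
}
```

In Accept(), both reads would become sizeArray = StreamReading.ReadExactly(Stream, 4); and dataArray = StreamReading.ReadExactly(Stream, size);. Two side notes: LINQ's sizeArray.Reverse() returns a new sequence without touching the array (Array.Reverse(sizeArray) reverses in place); the code only works today because both sides make the same no-op. And for the second question, JPEG (bitmap.Save(ms, ImageFormat.Jpeg)) typically encodes much faster and smaller than PNG for screen captures, at some quality cost; Base64 is unnecessary on a raw TCP stream and only inflates the payload.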
I know this has been asked numerous times... and I have searched and tried everything that I can, but I am still not sure why I get the "parameter is not valid" exception...
I have an Amazon EC2 instance running Windows Server 2012. On that machine, I am running Unity3D (Unity 5). That Unity application is sending frames (images) from the EC2 instance to my local laptop, via TCP. My client is running Windows 10, not that it is likely to make any difference.
To get my image data, I do the following:
byte[] GetScreenData() {
// Create a texture the size of the screen, RGB24 format
int width = Screen.width;
int height = Screen.height;
RenderTexture rt = new RenderTexture(width, height, 24);
Texture2D tex = new Texture2D(width, height, TextureFormat.RGB24, false);
Camera camera = GameObject.Find("Main Camera").GetComponent < Camera > ();
camera.targetTexture = rt;
camera.Render();
RenderTexture.active = rt;
// Read screen contents into the texture
tex.ReadPixels(new Rect(0, 0, width, height), 0, 0);
camera.targetTexture = null;
RenderTexture.active = null;
Destroy(rt);
// Encode texture into JPG
byte[] bytes = tex.EncodeToJPG();
Destroy(tex);
return bytes;
}
I then serialize my data using FlatBuffers:
public static byte[] FlatSerialize(this ServerToClientMessage message) {
var builder = new FlatBufferBuilder(1);
//Create an ID
var MessageId = builder.CreateString(message.MessageId.ToString());
//Start the vector...
//Loop over each byte and add it - my god, is there not a better way?
FlatServerToClientMessage.StartImagebytesVector(builder, message.ImageBytes.Length);
foreach(var imageByte in message.ImageBytes) {
builder.AddByte(imageByte);
}
var imagebytes = builder.EndVector();
//Start the FlatServerToClientMessage and add the MessageId and imagebytes
FlatServerToClientMessage.StartFlatServerToClientMessage(builder);
FlatServerToClientMessage.AddMessageid(builder, MessageId);
FlatServerToClientMessage.AddImagebytes(builder, imagebytes);
//End the FlatServerToClientMessage and finish it...
var flatMessage = FlatServerToClientMessage.EndFlatServerToClientMessage(builder);
FlatServerToClientMessage.FinishFlatServerToClientMessageBuffer(builder, flatMessage);
return builder.SizedByteArray();
}
Next, I send my data:
public void SendRaw(byte[] dataToSend) {
///We must send the length of the message before sending the actual message
var sizeInfo = new byte[4]; // = BitConverter.GetBytes(dataToSend.Length);
//Shift the bytes
sizeInfo[0] = (byte) dataToSend.Length;
sizeInfo[1] = (byte)(dataToSend.Length >> 8);
sizeInfo[2] = (byte)(dataToSend.Length >> 16);
sizeInfo[3] = (byte)(dataToSend.Length >> 24);
try {
var stream = Client.GetStream();
//Send the length of the data
stream.Write(sizeInfo, 0, 4);
//Send the data
stream.Write(dataToSend, 0, dataToSend.Length);
} catch (Exception ex) {
Debug.LogException(ex);
} finally {
//raise event to tell system that the client has disconnected and that listening must restart...
}
}
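The four shifts in SendRaw build a little-endian 32-bit length prefix, byte-for-byte what BitConverter.GetBytes produces on a little-endian machine, and what the receiver's BitConverter.ToInt32 expects. Extracted as a pair of helpers to make the intent explicit (the names are illustrative):

```csharp
using System;

public static class LengthPrefix
{
    // Encodes a length as 4 little-endian bytes, matching the
    // manual shifts in SendRaw.
    public static byte[] Encode(int length)
    {
        return new byte[]
        {
            (byte)length,
            (byte)(length >> 8),
            (byte)(length >> 16),
            (byte)(length >> 24),
        };
    }

    // Decodes the prefix back to an int; the receiver's
    // BitConverter.ToInt32 does the same on little-endian hardware.
    public static int Decode(byte[] prefix)
    {
        return prefix[0] | (prefix[1] << 8) | (prefix[2] << 16) | (prefix[3] << 24);
    }
}
```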
Back on my client device, I am listening for incoming data which deserializes and raises an event to alert the system to the arrival of a new image...
private void Run() {
try {
// ShutdownEvent is a ManualResetEvent signaled by
// Client when it's time to close the socket.
while (!ShutDownEvent.WaitOne(0)) {
try {
if (!_stream.DataAvailable) continue;
//Read the first 4 bytes which represent the size of the message, and convert from byte array to int32
var sizeinfo = new byte[4];
_stream.Read(sizeinfo, 0, 4);
var messageSize = BitConverter.ToInt32(sizeinfo, 0);
//create a new buffer for the data to be read
var buffer = new byte[messageSize];
var read = 0;
//Continue reading from the stream until we have read all bytes #messageSize
while (read != messageSize) {
read += _stream.Read(buffer, read, buffer.Length - read);
}
var message = new ServerToClientMessage().FlatDeserialize(buffer);
//raise data received event
OnDataReceived(message);
} catch (IOException ex) {
// Handle the exception...
}
}
} catch (Exception ex) {
// Handle the exception...
} finally {
_stream.Close();
}
}
To deserialize, I do the following:
public static ServerToClientMessage FlatDeserialize(this ServerToClientMessage message, byte[] bytes) {
var bb = new ByteBuffer(bytes);
var flatmessage = FlatServerToClientMessage.GetRootAsFlatServerToClientMessage(bb);
message.MessageId = new Guid(flatmessage.Messageid);
message.ImageBytes = new byte[flatmessage.ImagebytesLength];
for (var i = 0; i < flatmessage.ImagebytesLength; i++) {
message.ImageBytes[i] = flatmessage.GetImagebytes(i);
}
return message;
}
For clarity, here is the ServerToClientMessage class:
public class ServerToClientMessage : EventArgs
{
public Guid MessageId { get; set; }
public byte[] ImageBytes { get; set; }
}
Anyway, next, the OnDataReceived event gets raised and that in turn calls a function to convert from the ImageBytes array to a System.Drawing.Image. That function is here:
public Image byteArrayToImage(byte[] byteArrayIn) {
// SAME PROBLEM!
//var converter = new System.Drawing.ImageConverter();
// return (Image)converter.ConvertFrom(byteArrayIn); ;
using(var memoryStream = new MemoryStream(byteArrayIn)) {
return Image.FromStream(memoryStream, false, false);
}
}
Now, my image data being sent from the server is fine and dandy... I have validated it. This all works fine when I use JSON, too. I've tried many ways to convert from a byte array to an Image, but I always seem to get the Parameter is not valid exception. I've also tried sending my image in different formats like JPG and PNG, as well as raw pixel data.
Anyone have an idea?
Figured it out.
It turns out that the data is backwards... due to FlatBuffers serialization.
The solution is to reverse the order of my for-loop during serialization:
for (var i = message.ImageBytes.Length; i-- > 0;)
{
builder.AddByte(message.ImageBytes[i]);
}
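As an aside, for scalar vectors the flatc-generated C# code usually also emits a Create...Vector helper that accepts the whole array and handles the back-to-front write order internally. Assuming the generator produced one for this schema, the loop collapses to:

```csharp
// Replaces the manual byte loop; element ordering is handled internally.
var imagebytes = FlatServerToClientMessage.CreateImagebytesVector(builder, message.ImageBytes);
```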
I'm trying to stream a webcam from a client to a server, but I'm having difficulty converting the byte array back to a bitmap on the server.
Here's the code:
public void handlerThread()
{
Socket handlerSocket = (Socket)alSockets[alSockets.Count-1];
NetworkStream networkStream = new NetworkStream(handlerSocket);
int thisRead=0;
int blockSize=1024;
Byte[] dataByte = new Byte[blockSize];
lock(this)
{
// Only one process can access
// the same file at any given time
while(true)
{
thisRead=networkStream.Read(dataByte,0,blockSize);
pictureBox1.Image = byteArrayToImage(dataByte);
if (thisRead==0) break;
}
fileStream.Close();
}
lbConnections.Items.Add("File Written");
handlerSocket = null;
}
public Image byteArrayToImage(byte[] byteArrayIn)
{
MemoryStream ms = new MemoryStream(byteArrayIn); //here is my error
Image returnImage = Image.FromStream(ms);
return returnImage;
}
At the point marked above I get "Parameter is not valid" when trying to convert back to the image and crash. Any suggestions as to what I'm doing wrong?
Note this bit:
Image.Save(..) throws a GDI+ exception because the memory stream is closed
You can create an extension method, or drop the "this" for a traditional method. This looks the same as your code, so I wonder if you have some kind of encoding or other issue related to creating your underlying byte array?
public static Image ToImage(this byte[] bytes)
{
// You must keep the stream open for the lifetime of the Image.
// Image disposal does clean up the stream.
var stream = new MemoryStream(bytes);
return Image.FromStream(stream);
}