I am working on a software solution to stream a Windows desktop to a Xamarin application over TCP.
It works, but I cannot manage to get a better framerate.
Here is how I do it:
The client, a Xamarin app, listens on a TCP socket. The server (a C# console app) connects to it, captures the desktop and sends the image. The app then shows it as an Image.
When sending a bitmap, the server first sends its size as a 4-byte array. The client reads those 4 bytes, then reads the announced number of bytes.
DesktopRecorder.cs (Server side)
private void CaptureScreen()
{
    bitmap = new Bitmap(SIZE.Width, SIZE.Height, PixelFormat.Format32bppRgb);
    graphics = Graphics.FromImage(bitmap);
    graphics.CopyFromScreen(0, 0, 0, 0, SIZE, CopyPixelOperation.SourceCopy);
    CaptureReady(BitmapToArray(bitmap));
}

private byte[] BitmapToArray(Bitmap bitmap)
{
    using (var ms = new MemoryStream())
    {
        bitmap.Save(ms, ImageFormat.Png);
        return ms.ToArray();
    }
}
TcpNetworker.cs (Server side, connecting in TCP):
public override void SendData(byte[] data)
{
    if (Tcp.Connected)
    {
        int size = data.Length;
        var sizeArray = BitConverter.GetBytes(size);
        if (BitConverter.IsLittleEndian)
            sizeArray.Reverse();
        Console.WriteLine($"Sending {size} bytes.");
        Write(sizeArray);
        Stream.Flush();
        Write(data);
    }

    void Write(byte[] b)
    {
        Stream.Write(b, 0, b.Length);
    }
}
ViewModel.cs (Client side, Xamarin):
private void Accept()
{
    try
    {
        var client = Tcp.AcceptTcpClient();
        Console.WriteLine("---------------- Client Connected ----------------");
        Stream = client.GetStream();
        byte[] sizeArray;
        byte[] dataArray;
        while (client.Connected)
        {
            sizeArray = new byte[4];
            Stream.Read(sizeArray, 0, 4);
            if (BitConverter.IsLittleEndian)
                sizeArray.Reverse();
            int size = BitConverter.ToInt32(sizeArray, 0);
            Console.WriteLine($"Size: {size}");
            dataArray = new byte[size];
            Stream.Read(dataArray, 0, size);
            Dispatcher.Invoke(() => ImageReceived?.Invoke(this, new MemoryStream(dataArray)));
        }
    }
    catch (Exception e)
    {
        Console.WriteLine($"Error: {e.Message} {Environment.NewLine} {e.StackTrace}");
    }
    Accept();
}
MainWindow.cs (Page's code behind):
public MainWindow()
{
    InitializeComponent();
    var model = new MainWindowViewModel();
    model.ImageReceived += Model_ImageReceived;
}

private void Model_ImageReceived(object sender, System.IO.MemoryStream memoryStream)
{
    memoryStream.Position = 0;
    //var image = System.Drawing.Image.FromStream(memoryStream);
    var imageSource = new BitmapImage();
    imageSource.BeginInit();
    imageSource.CacheOption = BitmapCacheOption.OnLoad;
    imageSource.StreamSource = memoryStream;
    imageSource.EndInit();
    MainImage.Source = imageSource;
    memoryStream.Dispose();
}
The CaptureScreen() method of DesktopRecorder is called at a given framerate. It works fine at 10 FPS, but when I increase the framerate, the client gets into trouble: it can no longer find the correct size. After receiving a few images, the 4 bytes read no longer contain the size. It is as if an offset had been added, and the server and client desynchronize.
So I am asking two things:
How can I keep the server and client synchronized after sending many TCP packets, given that I need to reserve 4 bytes for the size? I would rather not use a fixed bitmap size or anything like that.
Currently I am encoding the bitmap as a PNG image, which Xamarin can handle natively. Is there a better way to compress the image and send it faster, even at some loss of quality? I have already done this kind of thing between a C server and an Angular app, where I had to encode the image as PNG and Base64. Can I do the same here?
P.S.: I know I could get better performance with UDP, but for some reasons I have to stick with TCP :/.
Thanks for your help!
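On the compression question: a common lossy alternative to PNG is JPEG with an explicit quality level, which System.Drawing exposes through `EncoderParameters`. This is only a sketch of the idea (the class and method names are mine, not from the project); lower quality values give smaller frames at the cost of visible artifacts.

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Linq;

static class JpegEncoderSketch
{
    // Encodes a bitmap as JPEG at the given quality (0-100).
    // Lower quality -> smaller payload -> faster transfer.
    public static byte[] ToJpeg(Bitmap bitmap, long quality)
    {
        ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
            .First(c => c.FormatID == ImageFormat.Jpeg.Guid);
        using (var parameters = new EncoderParameters(1))
        using (var ms = new MemoryStream())
        {
            parameters.Param[0] = new EncoderParameter(Encoder.Quality, quality);
            bitmap.Save(ms, jpegCodec, parameters);
            return ms.ToArray();
        }
    }
}
```

Replacing `ImageFormat.Png` in `BitmapToArray` with a call like this trades quality for bandwidth; Xamarin can decode JPEG natively just like PNG.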
Thanks for reading :]
When the current C# socket server is open and Android sends a message, the C# server sends an image as a byte array.
On Android, the size of the image is received correctly, but BitmapFactory.decodeByteArray() keeps returning null. Thanks for your comments.
C# code
byte[] data = ImageToByteArray(this.pictureBox1.Image);
try
{
    m_socket.Send(BitConverter.GetBytes(data.Length));
    m_socket.Send(data);
}
catch (Exception e)
{
    MessageBox.Show(e.ToString());
}

public byte[] ImageToByteArray(System.Drawing.Image image)
{
    MemoryStream ms = new MemoryStream();
    image.Save(ms, System.Drawing.Imaging.ImageFormat.Png);
    return ms.ToArray();
}
Android Code
BufferedInputStream bis = new BufferedInputStream(socket.getInputStream());
byte[] data = new byte[4];
bis.read(data); // Img Size
int ilen = byteToInt(data); // byte[] to Int
byte[] data2 = new byte[ilen];
Bitmap bit = BitmapFactory.decodeByteArray(data2, 0, data2.length); // Returns null
imageView.setImageBitmap(bit);
Also, logcat shows:
D/skia: --- png error bad adaptive filter value
D/skia: --- codec->getAndroidPixels() failed.
#_#....... Thank you!!
I have a camera (currently an Arducam OV2640) connected to an ESP8266. I am trying to build something that exchanges WebSocket data between a C# application and the ESP8266. The main purpose of this project is to build a robot whose camera can be viewed, and which can be controlled, from a C# application via WebSocket.
So far, the project can send movement commands to the robot, but when receiving data from the ESP8266, I am not able to piece the data together into a proper video stream to view in the Windows C# application. I have used the C# Bitmap library but could not get it to work properly. Any help is much appreciated. Below is what I have done so far:
//to send the data to the ESP8266 Wifi board
public async void webSocketOperation(String operation)
{
    ArraySegment<byte> bytesToSend = new ArraySegment<byte>(
        Encoding.UTF8.GetBytes(operation));
    await UserInterface.ws.SendAsync(
        bytesToSend, WebSocketMessageType.Text,
        true, CancellationToken.None);
}

//to receive the data sent from the ESP8266 board
public async Task receiveData()
{
    while (true)
    {
        ArraySegment<byte> buffer = new ArraySegment<byte>(new byte[819200]);
        WebSocketReceiveResult result = await ws.ReceiveAsync(buffer, CancellationToken.None);
        pbVideoFeed.Image = BytesToBitmap(buffer);
        if (ws.State != WebSocketState.Open)
        {
            break;
        }
    }
}
//to convert from bytes to image
private Bitmap BytesToBitmap(ArraySegment<byte> buffer)
{
    MemoryStream stream = null;
    byte[] image_data = new byte[409600];
    Bitmap resize_img = new Bitmap(pbVideoFeed.Width, pbVideoFeed.Height);
    Graphics graphic = Graphics.FromImage(resize_img);
    image_data = buffer.ToArray();
    try
    {
        stream = new MemoryStream(image_data);
        Bitmap result = new Bitmap(stream);
        graphic.InterpolationMode = InterpolationMode.High;
        graphic.DrawImage(result, new Rectangle(0, 0, pbVideoFeed.Width, pbVideoFeed.Height));
        graphic.Dispose();
        return new Bitmap((Image)resize_img);
        //return new Bitmap((Image)new Bitmap(stream));
    }
    catch (ArgumentException ex)
    {
        throw ex;
    }
    finally
    {
        stream.Close();
    }
}
I'm trying to send an image over a NetworkStream. This is my client code:
private void Form1_Load(object sender, EventArgs e)
{
    TcpClient client = new TcpClient();
    client.Connect("127.0.0.1", 10);
    NetworkStream ns = client.GetStream();
    Bitmap screen = GetDesktopImage(); // generate a screenshot
    MemoryStream ms = new MemoryStream();
    screen.Save(ms, ImageFormat.Png);
    byte[] byteCount = BitConverter.GetBytes((int)ms.Length);
    ms.Position = 0;
    ns.Write(byteCount, 0, byteCount.Length);
    ms.CopyTo(ns);
    ms.SetLength(0);
}
this is the server:
private void Start()
{
    TcpListener listen = new TcpListener(IPAddress.Any, 10);
    listen.Start();
    NetworkStream ns = new NetworkStream(listen.AcceptTcpClient().Client);
    byte[] temp = new byte[4];
    ns.Read(temp, 0, 4);
    int count = BitConverter.ToInt32(temp, 0);
    byte[] buff = new byte[count];
    pictureBox1.Image = Image.FromStream(ns);
}

private void Form1_Load(object sender, EventArgs e)
{
    Thread th = new Thread(Start);
    th.Start();
}
I don't see anything in the PictureBox, and I guess the program hangs at pictureBox1.Image = Image.FromStream(ns); I added a breakpoint there and it's not working.
Only when I close the client program and stop debugging do I see an image in the PictureBox on the server side.
Why is that? Any ideas?
My guess is that Image.FromStream does not know to stop reading once it has read the full image; maybe the PNG format does not even allow for that. You need to give Image.FromStream a stream of limited size. The easiest way is to use BinaryReader.ReadBytes(count) to read the exact number of bytes needed.
ns.Read(temp, 0, 4); is a bug because it assumes the read will return exactly 4 bytes, which is not guaranteed. Again, use BinaryReader.ReadInt32 to read an int safely.
Better yet, abandon custom serialization formats and use something like length-prefixed protobuf, or HTTP, or WCF.
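A minimal sketch of that suggestion (the names are mine): `BinaryReader.ReadInt32` reads exactly four little-endian bytes, and `ReadBytes(count)` loops internally until it has `count` bytes or the stream ends, which removes the partial-read hazard of calling `Stream.Read` once.

```csharp
using System.IO;

static class FrameReaderSketch
{
    // Reads one length-prefixed frame: a 4-byte little-endian size,
    // then exactly that many payload bytes.
    public static byte[] ReadFrame(Stream stream)
    {
        var reader = new BinaryReader(stream);
        int size = reader.ReadInt32();           // blocks until 4 bytes have arrived
        byte[] payload = reader.ReadBytes(size); // loops internally until 'size' bytes
        if (payload.Length != size)
            throw new EndOfStreamException("Connection closed mid-frame.");
        return payload;
    }
}
```

`Image.FromStream(new MemoryStream(payload))` then sees a stream of exactly the right length. Note that the sender must write the size little-endian (plain `BitConverter.GetBytes` on typical platforms) for `ReadInt32` to match.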
I wanted to create a project for streaming multiple images over the network. I wanted to start with small, functional code, but I already ran into a funky problem: the image is received, but it has graphical glitches, as if only part of it arrived.
Sadly I can't show images because of low reputation; here is a link:
http://img543.imageshack.us/img543/1508/buggedy.jpg
Of course, a million dollars for the answer. ^^
TcpListener listener;
TcpClient client;
TcpClient datatoclient;
NetworkStream stream;
Host part:
listener = new TcpListener(5000);
listener.Start();
datatoclient = listener.AcceptTcpClient();
NetworkStream nowystream = datatoclient.GetStream();
MemoryStream ms = new MemoryStream();
byte[] image = File.ReadAllBytes("default.jpg");
switch (trackBar1.Value)
{
    case 0:
        image = File.ReadAllBytes("mirrion.jpg");
        break;
    case 1:
        image = File.ReadAllBytes("tenis.jpg");
        break;
    case 2:
        image = File.ReadAllBytes("marisasold.jpg");
        break;
}
// get the image size in bytes
int numberOfBytes = image.Length;
// put the size into an array
byte[] numberOfBytesArray = BitConverter.GetBytes(numberOfBytes);
// send the image size
nowystream.Write(numberOfBytesArray, 0, numberOfBytesArray.Length);
// send the image
nowystream.Write(image, 0, numberOfBytes);
Client part:
client = new TcpClient("127.0.0.1", 5000);
stream = client.GetStream();
byte[] data = new byte[4];
// read the size
stream.Read(data, 0, data.Length);
int size = BitConverter.ToInt32(data, 0);
label1.Text = size.ToString();
// prepare buffer
data = new byte[size];
// load image
stream.Read(data, 0, data.Length);
// save image to file for test
File.WriteAllBytes("received.jpg", data);
MemoryStream MS = new MemoryStream(data);
pictureBox1.Image = Image.FromStream(MS);
stream.Read doesn't guarantee that it will read data.Length bytes. Instead it returns the number of bytes actually read, so you should check its return value and keep reading until you have all the bytes.
See http://msdn.microsoft.com/en-us/library/system.io.stream.read(v=vs.90).aspx (section "Return Value").
The read method can look something like this:
void Read(Stream stream, byte[] buffer, int offset, int len)
{
    int read = 0;
    while (read < len)
    {
        int n = stream.Read(buffer, offset + read, len - read);
        if (n == 0)
            throw new EndOfStreamException(); // connection closed before all bytes arrived
        read += n;
    }
}
I found some code here on Stack Overflow for sending a binary file (an image) through sockets, so I used it as a test in my project.
private void send_ss()
{
    byte[] data = new byte[1024];
    int sent;
    IPEndPoint ipep = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 306);
    Socket server = new Socket(AddressFamily.InterNetwork,
        SocketType.Stream, ProtocolType.Tcp);
    try
    {
        server.Connect(ipep);
    }
    catch (SocketException e)
    {
        //Console.WriteLine("Unable to connect to server.");
        //Console.WriteLine(e.ToString());
        //Console.ReadLine();
    }
    Bitmap bmp = new Bitmap("C:\\Windows\\Web\\Wallpaper\\Theme2\\img7.jpg");
    MemoryStream ms = new MemoryStream();
    // Save to memory using the Jpeg format
    bmp.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
    // read to end
    byte[] bmpBytes = ms.ToArray();
    bmp.Dispose();
    ms.Close();
    sent = SendVarData(server, bmpBytes);
    //Console.WriteLine("Disconnecting from server...");
    server.Shutdown(SocketShutdown.Both);
    server.Close();
}

private static int SendVarData(Socket s, byte[] data)
{
    int total = 0;
    int size = data.Length;
    int dataleft = size;
    int sent;
    byte[] datasize = new byte[4];
    datasize = BitConverter.GetBytes(size);
    sent = s.Send(datasize);
    while (total < size)
    {
        sent = s.Send(data, total, dataleft, SocketFlags.None);
        total += sent;
        dataleft -= sent;
    }
    return total;
}
So I tried to send this picture to one of my listening sockets on port 306 (listening with mIRC):
on *:socklisten:ac_img:{
  var %p = $ticks $+ $time(hhnnss) $+ $ctime
  sockaccept ac_img_ $+ %p
  echo -s [] Image Connection Established On -> ac_img_ $+ %p
}
on *:sockread:ac_img_*:{
  sockread &picture
  bwrite $qt($mIRCdir $+ $sockname $+ .jpg) -1 -1 &picture
}
So I'm getting files like ac_img_2920385501147471360792067.jpg and so on, the same size as the original, BUT the images just don't display. So I opened both files with WordPad and they were slightly different; I don't know why.
Any ideas why I'm facing this issue? I mean, I'm taking every single byte from my socket and saving it to the file. Maybe the file gets corrupted when read through C#?
The image is different because you read it, parse it into a Bitmap, and re-encode it. The WordPad comparison shows that both are JPEGs, but with different metadata (for example, the "Adobe" marker is missing).
Just use File.ReadAllBytes or another lossless method to read the image.
The sending code looks sound. I'm not sure why you're looping: as far as I know, Send never does partial I/O on blocking sockets.
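A sketch of the lossless path (the helper name is mine; it takes a `Stream` so the same code works against a `NetworkStream` or, for testing, a `MemoryStream`):

```csharp
using System;
using System.IO;

static class RawImageSenderSketch
{
    // Sends a file's bytes unmodified, prefixed with a 4-byte length,
    // so the receiver gets exactly the bytes that were on disk.
    public static void SendFile(Stream destination, string path)
    {
        byte[] data = File.ReadAllBytes(path); // no Bitmap round-trip, no re-encoding
        destination.Write(BitConverter.GetBytes(data.Length), 0, 4);
        destination.Write(data, 0, data.Length);
    }
}
```

Since nothing is decoded or re-encoded, the receiver's file is byte-for-byte identical to the original, metadata included.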