I can capture plenty of desktop images, but I want to encode them with a video encoder and send the video data over the network. Another application (the data receiver) will receive that data and play it on the screen.
How can I encode images to video, and send them?
// A simple desktop capture class
public class DesktopCapture
{
    public readonly Bitmap Frame;
    public readonly Graphics FrameGraphics;
    private readonly Rectangle CaptureArea;

    public DesktopCapture(Rectangle captureArea)
    {
        CaptureArea = captureArea;
        Frame = new Bitmap(captureArea.Width, captureArea.Height);
        FrameGraphics = Graphics.FromImage(Frame);
    }

    public void Capture()
    {
        FrameGraphics.CopyFromScreen(CaptureArea.Location, Point.Empty, CaptureArea.Size);
    }
}
What I want (VideoStreamWriter is an imaginary class):
TcpClient client = new TcpClient();
client.Connect(new IPEndPoint(IPAddress.Loopback, 5555));
DesktopCapture desktopCap = new DesktopCapture(Screen.PrimaryScreen.Bounds);
NetworkStream clientStream = client.GetStream();
VideoStreamWriter videoStreamWriter = new VideoStreamWriter(Encoder.SomeEncoder, clientStream);
videoStreamWriter.StartEncoding();
while (true)
{
    desktopCap.Capture();
    videoStreamWriter.EncodeFrame(desktopCap.Frame);
}
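One practical approach, offered as a sketch rather than the definitive answer, is to pipe raw frames into an external ffmpeg process and let it produce an H.264 stream that goes straight into the socket. Everything here is an assumption to be adapted: it requires ffmpeg on the PATH, picks MPEG-TS as the container, and the FfmpegStreamWriter class name simply mirrors the imaginary VideoStreamWriter above.

using System;
using System.Diagnostics;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

// Sketch: encode captured frames with an external ffmpeg process and forward
// the encoded stream to any destination stream (e.g. a NetworkStream).
public class FfmpegStreamWriter : IDisposable
{
    private readonly Process ffmpeg;

    public FfmpegStreamWriter(int width, int height, int fps, Stream output)
    {
        ffmpeg = Process.Start(new ProcessStartInfo
        {
            FileName = "ffmpeg",
            // Raw BGRA frames in on stdin; H.264 in an MPEG-TS container out on stdout.
            Arguments = $"-f rawvideo -pix_fmt bgra -s {width}x{height} -r {fps} -i - " +
                        "-codec:v libx264 -preset ultrafast -tune zerolatency -f mpegts -",
            UseShellExecute = false,
            RedirectStandardInput = true,
            RedirectStandardOutput = true,
        });
        // Pump ffmpeg's encoded output to the destination in the background.
        _ = ffmpeg.StandardOutput.BaseStream.CopyToAsync(output);
    }

    public void EncodeFrame(Bitmap frame)
    {
        var bounds = new Rectangle(0, 0, frame.Width, frame.Height);
        BitmapData data = frame.LockBits(bounds, ImageLockMode.ReadOnly, PixelFormat.Format32bppArgb);
        try
        {
            // Copy row by row, because the bitmap stride may include padding.
            byte[] row = new byte[data.Width * 4];
            for (int y = 0; y < data.Height; y++)
            {
                System.Runtime.InteropServices.Marshal.Copy(data.Scan0 + y * data.Stride, row, 0, row.Length);
                ffmpeg.StandardInput.BaseStream.Write(row, 0, row.Length);
            }
        }
        finally
        {
            frame.UnlockBits(data);
        }
    }

    public void Dispose()
    {
        ffmpeg.StandardInput.Close(); // end of input: ffmpeg flushes and exits
        ffmpeg.WaitForExit();
    }
}

The receiver can hand the resulting MPEG-TS stream to any H.264 decoder (ffplay for a quick test, or a wrapper library such as FFmpeg.AutoGen on the C# side).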
Related
I'm trying to send packets from a client to a server over a TCP stream.
The client connects to the server and tries to send an image. However, the server gets the image only when I shut down the client (at the exact same moment I shut it down).
I use ProtoBuf-Net for serializing and deserializing.
Here's my relevant code:
This is my client code:
// Connect to the server
client.Connect(Client.SERVER_IP, 1729);

// Capture a screenshot
Bitmap captureBitmap = new Bitmap(Screen.PrimaryScreen.Bounds.Width,
    Screen.PrimaryScreen.Bounds.Height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
System.Drawing.Rectangle captureRectangle = Screen.PrimaryScreen.Bounds;
Graphics captureGraphics = Graphics.FromImage(captureBitmap);
captureGraphics.CopyFromScreen(captureRectangle.Left, captureRectangle.Top, 0, 0,
    captureRectangle.Size);

// Serialize the screenshot to the socket's stream
ImageConverter img = new ImageConverter();
Packet<byte[]> packet = new Packet<byte[]>
{
    value = (byte[])img.ConvertTo(captureBitmap, typeof(byte[])),
    type = PacketType.IMAGE
};
Serializer.SerializeWithLengthPrefix(stream, packet.GetType().AssemblyQualifiedName, PrefixStyle.Base128);
Serializer.Serialize(stream, packet);
stream.Flush();
This is my server code:
ImageConverter imageConverter = new ImageConverter();

// Wait for a client to connect
var client = new ExtendedTcpClient(listener.AcceptTcpClient());
currentClientControlling = client;

// Deserialize the type name
var typeName = Serializer.DeserializeWithLengthPrefix<string>(stream, PrefixStyle.Base128);
var type = Type.GetType(typeName);

// Register the type
var model = RuntimeTypeModel.Default;
model.Add(type, true);

// Deserialize the data
var bytes = model.Deserialize(stream, null, type);
var image = (Bitmap)imageConverter.ConvertFrom(bytes);
And this is my Packet model :
public enum PacketType
{
IMAGE
}
[ProtoContract]
public class Packet<T>
{
[ProtoMember(1)]
public PacketType type { get; set; }
[ProtoMember(2)]
public T value { get; set; }
}
I didn't figure out what the problem was at first. However, I solved it by replacing Serializer.Serialize(stream, packet); with Serializer.SerializeWithLengthPrefix(stream, packet, PrefixStyle.Base128);. The explanation: protobuf messages are not self-delimiting, so a plain Deserialize reads until the stream ends, which on a TCP connection means it blocks until the client closes the socket. A length prefix tells the receiver exactly where the message stops.
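For reference, a minimal sketch of the symmetric pattern; the receiving side should use DeserializeWithLengthPrefix as well, so each message is framed on both ends (stream and Packet<T> as defined above):

// Sender: every message carries a Base128 length prefix.
Serializer.SerializeWithLengthPrefix(stream, packet, PrefixStyle.Base128);
stream.Flush();

// Receiver: reads exactly one framed message and returns,
// instead of blocking until the connection closes.
var received = Serializer.DeserializeWithLengthPrefix<Packet<byte[]>>(stream, PrefixStyle.Base128);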
I am working on a software solution to stream a Windows desktop to a Xamarin application over TCP.
It works, but I cannot manage to get a better framerate.
Here's how it works:
The client, a Xamarin app, listens on a TCP socket. The server (a C# console app) connects to it, captures the desktop, and sends the image. The app then shows it as an Image.
When sending a bitmap, the server first sends its size as a 4-byte array. The client reads those 4 bytes, then reads the indicated number of bytes.
DesktopRecorder.cs (Server side)
private void CaptureScreen()
{
    bitmap = new Bitmap(SIZE.Width, SIZE.Height, PixelFormat.Format32bppRgb);
    graphics = Graphics.FromImage(bitmap);
    graphics.CopyFromScreen(0, 0, 0, 0, SIZE, CopyPixelOperation.SourceCopy);
    CaptureReady(BitmapToArray(bitmap));
}

private byte[] BitmapToArray(Bitmap bitmap)
{
    using (var ms = new MemoryStream())
    {
        bitmap.Save(ms, ImageFormat.Png);
        return ms.ToArray();
    }
}
TcpNetworker.cs (Server side, connecting in TCP):
public override void SendData(byte[] data)
{
    if (Tcp.Connected)
    {
        int size = data.Length;
        var sizeArray = BitConverter.GetBytes(size);
        if (BitConverter.IsLittleEndian)
            sizeArray.Reverse(); // NB: LINQ Reverse() returns a new sequence; it does not modify sizeArray in place
        Console.WriteLine($"Sending {size} bytes.");
        Write(sizeArray);
        Stream.Flush();
        Write(data);
    }

    void Write(byte[] b)
    {
        Stream.Write(b, 0, b.Length);
    }
}
ViewModel.cs (Client side, Xamarin):
private void Accept()
{
    try
    {
        var client = Tcp.AcceptTcpClient();
        Console.WriteLine("---------------- Client Connected ----------------");
        Stream = client.GetStream();
        byte[] sizeArray;
        byte[] dataArray;
        while (client.Connected)
        {
            sizeArray = new byte[4];
            Stream.Read(sizeArray, 0, 4);
            if (BitConverter.IsLittleEndian)
                sizeArray.Reverse(); // same no-op as on the server side
            int size = BitConverter.ToInt32(sizeArray, 0);
            Console.WriteLine($"Size: {size}");
            dataArray = new byte[size];
            // NB: Read() may return fewer than 'size' bytes; see below
            Stream.Read(dataArray, 0, size);
            Dispatcher.Invoke(() => ImageReceived?.Invoke(this, new MemoryStream(dataArray)));
        }
    }
    catch (Exception e)
    {
        Console.WriteLine($"Error: {e.Message} {Environment.NewLine} {e.StackTrace}");
    }
    Accept();
}
MainWindow.cs (Page's code behind):
public MainWindow()
{
    InitializeComponent();
    var model = new MainWindowViewModel();
    model.ImageReceived += Model_ImageReceived;
}

private void Model_ImageReceived(object sender, System.IO.MemoryStream memoryStream)
{
    memoryStream.Position = 0;
    //var image = System.Drawing.Image.FromStream(memoryStream);
    var imageSource = new BitmapImage();
    imageSource.BeginInit();
    imageSource.CacheOption = BitmapCacheOption.OnLoad;
    imageSource.StreamSource = memoryStream;
    imageSource.EndInit();
    MainImage.Source = imageSource;
    memoryStream.Dispose();
}
The CaptureScreen() method of DesktopRecorder is called at a given framerate. It works fine at 10 FPS, but when I increase the framerate, the client gets into trouble: it can no longer find the correct size. After receiving some images, the 4 bytes read no longer contain the size. It is as if an offset had been added, and the server and client desynchronize.
So, I am asking two things:
How can I keep the server and client synchronized after sending many TCP packets, given that I need those 4 bytes for the size? I'd rather not use a fixed bitmap size or anything like that.
Currently, I am encoding the bitmap as a PNG image, which Xamarin can handle natively. Is there a better way to compress the image and send it faster, even at some loss of quality? I have already done this kind of thing between a C server and an Angular app, where I had to encode the image as PNG and Base64. Can I do the same here?
P.S.: I know I could get better performance using UDP, but for some reasons I have to stick with TCP :/
Thanks for your help!
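A likely culprit for the desynchronization, sketched here under the assumption that nothing else interferes with the stream: Stream.Read returns as soon as any data is available, so at higher framerates a single Read can deliver only part of an image, and the next 4-byte size read then lands in the middle of the previous frame. Reading in a loop until the expected count has arrived keeps the framing intact:

// Reads exactly 'count' bytes from the stream, or throws if it ends early.
private static void ReadExactly(Stream stream, byte[] buffer, int count)
{
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Connection closed mid-frame");
        offset += read;
    }
}

// Usage in the receive loop:
byte[] sizeArray = new byte[4];
ReadExactly(Stream, sizeArray, 4);
int size = BitConverter.ToInt32(sizeArray, 0); // both ends are little-endian in practice
byte[] dataArray = new byte[size];
ReadExactly(Stream, dataArray, size);

As for compression: PNG is lossless and comparatively slow to encode. If some quality loss is acceptable, saving each frame with ImageFormat.Jpeg (optionally lowering the quality via an EncoderParameter) usually yields much smaller frames that Xamarin still decodes natively, and there is no need for Base64 on a raw TCP socket.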
I have Java Android code that sends data (an image or text) to a C# application; to receive the data I'm using an async socket. The problem is that BeginReceive() does not receive the complete data when an image is sent. How can I make a kind of loop to receive the full data and then show the image in a PictureBox, for example?
Form
private Listener listener;
private Thread startListen;
private Bitmap _buffer;

public frmMain()
{
    InitializeComponent();
}

private void serverReceivedImage(Client client, byte[] image)
{
    try
    {
        byte[] newImage = new byte[image.Length - 6];
        Array.Copy(image, 6, newImage, 0, newImage.Length);
        using (var stream = new MemoryStream(newImage))
        {
            using (var msInner = new MemoryStream())
            {
                stream.Seek(2, SeekOrigin.Begin);
                using (DeflateStream z = new DeflateStream(stream, CompressionMode.Decompress))
                {
                    z.CopyTo(msInner);
                }
                msInner.Seek(0, SeekOrigin.Begin);
                var bitmap = new Bitmap(msInner);
                Invoke(new frmMain.ImageCompleteDelegate(ImageComplete), new object[] { bitmap });
            }
        }
    }
    catch (Exception)
    {
        System.Diagnostics.Process.GetCurrentProcess().Kill();
    }
}

private delegate void ImageCompleteDelegate(Bitmap bitmap);

private void ImageComplete(Bitmap bitmap)
{
    if (_buffer != null)
        _buffer.Dispose();
    _buffer = new Bitmap(bitmap);
    pictureBox1.Size = _buffer.Size;
    pictureBox1.Invalidate();
}

private void pictureBox1_Paint(object sender, PaintEventArgs e)
{
    if (_buffer == null) return;
    e.Graphics.DrawImage(_buffer, 0, 0);
}

private void startToolStripMenuItem_Click(object sender, EventArgs e)
{
    startListen = new Thread(listen);
    startListen.Start();
}

private void listen()
{
    listener = new Listener();
    listener.BeginListen(101);
    listener.receivedImage += new Listener.ReceivedImageEventHandler(serverReceivedImage);
    startToolStripMenuItem.Enabled = false;
}
Listener
class Listener
{
    private Socket s;
    public List<Client> clients;

    public delegate void ReceivedImageEventHandler(Client client, byte[] image);
    public event ReceivedImageEventHandler receivedImage;

    private bool listening = false;

    public Listener()
    {
        clients = new List<Client>();
        s = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
    }

    public bool Running
    {
        get { return listening; }
    }

    public void BeginListen(int port)
    {
        s.Bind(new IPEndPoint(IPAddress.Any, port));
        s.Listen(100);
        s.BeginAccept(new AsyncCallback(AcceptCallback), s);
        listening = true;
    }

    public void StopListen()
    {
        if (listening == true)
        {
            s.Close();
            listening = false;
        }
    }

    void AcceptCallback(IAsyncResult ar)
    {
        Socket handler = (Socket)ar.AsyncState;
        Socket sock = handler.EndAccept(ar);
        Client client = new Client(sock);
        clients.Add(client);
        sock.BeginReceive(client.buffer, 0, client.buffer.Length, SocketFlags.None, new AsyncCallback(ReadCallback), client);
        client.Send("REQUEST_PRINT" + Environment.NewLine);
        handler.BeginAccept(new AsyncCallback(AcceptCallback), handler);
    }

    void ReadCallback(IAsyncResult ar)
    {
        Client client = (Client)ar.AsyncState;
        try
        {
            int rec = client.sock.EndReceive(ar);
            if (rec != 0)
            {
                string data = Encoding.UTF8.GetString(client.buffer, 0, rec);
                if (data.Contains("SCREEN"))
                {
                    byte[] bytes = Encoding.UTF8.GetBytes(data);
                    receivedImage(client, bytes);
                }
                else // not an image, but text
                {
                    // prepare the text to show in a TextBox
                }
            }
            else
            {
                Disconnected(client);
                return;
            }
            client.sock.BeginReceive(client.buffer, 0, client.buffer.Length, SocketFlags.None, new AsyncCallback(ReadCallback), client);
        }
        catch
        {
            Disconnected(client);
            client.sock.Close();
            clients.Remove(client);
        }
    }
}
Client
class Client
{
    public Socket sock;
    public byte[] buffer = new byte[8192];

    public Client(Socket sock)
    {
        this.sock = sock;
    }

    public void Send(string data)
    {
        byte[] buffer = Encoding.ASCII.GetBytes(data);
        sock.BeginSend(buffer, 0, buffer.Length, SocketFlags.None, new AsyncCallback((ar) =>
        {
            sock.EndSend(ar);
        }), buffer);
    }
}
Android code
private byte[] compress(byte[] data) {
    Deflater deflater = new Deflater();
    deflater.setInput(data);
    ByteArrayOutputStream outputStream = new ByteArrayOutputStream(data.length);
    deflater.finish();
    byte[] buffer = new byte[1024];
    while (!deflater.finished()) {
        int count = deflater.deflate(buffer);
        outputStream.write(buffer, 0, count);
    }
    outputStream.close();
    byte[] output = outputStream.toByteArray();
    return output;
}

public static DataOutputStream dos;
public static byte[] array;

ByteArrayOutputStream bos = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 100, bos);
array = compress(bos.toByteArray());
//...
dos = new DataOutputStream(SocketBackgroundService.clientSocket.getOutputStream());
byte[] header = ("SCREEN").getBytes(StandardCharsets.UTF_8);
byte[] dataToSend = new byte[header.length + array.length];
System.arraycopy(header, 0, dataToSend, 0, header.length);
System.arraycopy(array, 0, dataToSend, header.length, array.length);
dos.writeInt(dataToSend.length);
dos.write(dataToSend, 0, dataToSend.length);
dos.flush();
EDIT
I am always getting an "Invalid Parameter" error on this line
var bitmap = new Bitmap(msInner);
and when using compression the same happens here
z.CopyTo(msInner);
(an InvalidDataException), both in the serverReceivedImage() method. Using
File.WriteAllBytes(Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyPictures), "image.png"), newImage);
I noted that only 15 KB are received (the size of the file when compression is not used).
I was writing a comment, but it did not give me enough space to express my frustration with your code.
My main points are:
You try to recompress a perfectly compressed image. PNG is Portable Network Graphics; it was designed for network transfer. If lossy compression is acceptable, you should use something like JPEG.
You decode the received buffer with UTF8.GetString and search for a text marker, then re-encode that string and try to decompress it and read an image from it, starting from index 6, which is fairly meaningless considering you added a two-byte size field to the start of the stream and you do not actually know the position of "SCREEN".
You do not check whether you have received ALL of the stream data.
All of the code looks like you scoured SO questions and answers and created a copy pasta.
Now my recommendations.
When transferring data over the network, do not reinvent the wheel. Try something like gRPC, which has both Android Java and C# packages.
If you do use raw data, please, please know your bytes.
I assume you will extend your code with new command pairs. Since you have no magic markers or any kind of signalling system, it will be very hard for you to distinguish data from header. For a simple implementation, add some magic bytes to your header and search for them, then read the header, and then read the data. You may need to read from the socket again and again until you have received all of the data; a sketch follows the sample below.
424A72 0600 53435245454E 008E0000 ..... 724A42
B J r   6   S C R E E N   36352   ..... r J B
This sample data shows that we have a valid stream, identified by the magic marker "BJr". Then read a 2-byte unsigned integer (little-endian here) for the command size, which is 6 for SCREEN. Read the command, then read a 4-byte unsigned length for the command data; for our sample it is 36352. Just to be safe, I've added an end-of-command marker, "rJB".
For bonus points, try reducing memory allocations and copies; have a look at System.Span<T>.
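A minimal sketch of a reader for that frame layout; the marker bytes, field sizes, and little-endian integers are the assumptions from the sample above, not a fixed standard:

using System;
using System.IO;
using System.Text;

static class FrameReader
{
    // Reads one framed command from a stream:
    // magic "BJr", 2-byte command length, command, 4-byte data length, data, magic "rJB".
    public static (string Command, byte[] Data) ReadFrame(Stream stream)
    {
        byte[] magic = ReadExactly(stream, 3);
        if (magic[0] != (byte)'B' || magic[1] != (byte)'J' || magic[2] != (byte)'r')
            throw new InvalidDataException("Bad frame marker");

        ushort cmdLen = BitConverter.ToUInt16(ReadExactly(stream, 2), 0);
        string command = Encoding.ASCII.GetString(ReadExactly(stream, cmdLen));

        uint dataLen = BitConverter.ToUInt32(ReadExactly(stream, 4), 0);
        byte[] data = ReadExactly(stream, checked((int)dataLen));

        byte[] end = ReadExactly(stream, 3); // trailing "rJB" marker
        if (end[0] != (byte)'r' || end[1] != (byte)'J' || end[2] != (byte)'B')
            throw new InvalidDataException("Bad end-of-command marker");

        return (command, data);
    }

    // Loops until 'count' bytes have actually arrived; a single Read may return less.
    private static byte[] ReadExactly(Stream stream, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0)
                throw new EndOfStreamException();
            offset += read;
        }
        return buffer;
    }
}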
I have a camera (currently an Arducam OV2640) connected to an ESP8266. I am trying to build something that exchanges WebSocket data between a C# application and the ESP8266. The main purpose of the project is to build a robot whose camera can be viewed, and which can be controlled, from a C# application via WebSocket.
So far, the project can send movement commands to the robot, but when receiving data from the ESP8266 I am not able to piece the data together into a proper video stream to view in the Windows C# application. I have used the C# Bitmap class but could not get it to work properly. Any help is much appreciated. Below is what I have done so far:
// Send a command to the ESP8266 WiFi board
public async void webSocketOperation(String operation)
{
    ArraySegment<byte> bytesToSend = new ArraySegment<byte>(
        Encoding.UTF8.GetBytes(operation));
    await UserInterface.ws.SendAsync(
        bytesToSend, WebSocketMessageType.Text,
        true, CancellationToken.None);
}

// Receive the data sent from the ESP8266 board
public async Task receiveData()
{
    while (true)
    {
        ArraySegment<byte> buffer = new ArraySegment<byte>(new byte[819200]);
        // NB: a single ReceiveAsync may return only a fragment of the message; see the sketch below
        WebSocketReceiveResult result = await ws.ReceiveAsync(buffer, CancellationToken.None);
        pbVideoFeed.Image = BytesToBitmap(buffer);
        if (ws.State != WebSocketState.Open)
        {
            break;
        }
    }
}

// Convert the received bytes to an image
private Bitmap BytesToBitmap(ArraySegment<byte> buffer)
{
    MemoryStream stream = null;
    byte[] image_data = new byte[409600];
    Bitmap resize_img = new Bitmap(pbVideoFeed.Width, pbVideoFeed.Height);
    Graphics graphic = Graphics.FromImage(resize_img);
    image_data = buffer.ToArray();
    try
    {
        stream = new MemoryStream(image_data);
        Bitmap result = new Bitmap(stream);
        graphic.InterpolationMode = InterpolationMode.High;
        graphic.DrawImage(result, new Rectangle(0, 0, pbVideoFeed.Width, pbVideoFeed.Height));
        graphic.Dispose();
        return new Bitmap((Image)resize_img);
        //return new Bitmap((Image)new Bitmap(stream));
    }
    catch (ArgumentException ex)
    {
        throw ex;
    }
    finally
    {
        stream.Close();
    }
}
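One likely issue, offered as a sketch rather than a confirmed diagnosis: ws.ReceiveAsync returns as soon as a fragment arrives, and a camera frame can easily span several fragments, so decoding each fragment as a whole bitmap fails. Accumulating fragments until result.EndOfMessage is set yields one complete frame per iteration (ws and pbVideoFeed as in the code above; the assumption that the ESP8266 sends one complete JPEG per WebSocket message is mine):

public async Task receiveData()
{
    var segment = new ArraySegment<byte>(new byte[16 * 1024]);
    while (ws.State == WebSocketState.Open)
    {
        using (var frame = new MemoryStream())
        {
            WebSocketReceiveResult result;
            do
            {
                // Collect fragments until the full message has arrived.
                result = await ws.ReceiveAsync(segment, CancellationToken.None);
                frame.Write(segment.Array, 0, result.Count);
            }
            while (!result.EndOfMessage);

            if (result.MessageType == WebSocketMessageType.Close)
                break;

            frame.Position = 0;
            using (var decoded = new Bitmap(frame))
            {
                var old = pbVideoFeed.Image;
                // Copy the bitmap so the backing stream can be disposed safely.
                pbVideoFeed.Image = new Bitmap(decoded);
                old?.Dispose();
            }
        }
    }
}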
I have set up a C# server which at present serves one test MP3 file over TCP. The code to send the file is as follows:
public void RunStreamer()
{
    log.MakeLog("Starting streamer");
    Run = true;
    // TODO: thread this so it does not block the main window
    TcpListener listen = new TcpListener(localAddr, _port);
    listen.Start(); // start listening for client requests
    // blocks until a client request comes in
    for (; ; )
    {
        Socket socket = listen.AcceptSocket();
        if (socket.Connected)
        {
            SendFileToClient(socket);
            socket.Disconnect(false);
        }
    }
}

void SendFileToClient(Socket socket)
{
    log.MakeLog("Connection made");
    NetworkStream netStream = new NetworkStream(socket);
    StreamWriter writer = new StreamWriter(netStream);
    // TODO: use the specified file - this file is just for testing
    FileStream filestream = File.Open(@"C:\MusicTest\Test.mp3", FileMode.Open, FileAccess.Read, FileShare.Read);
    filestream.CopyTo(netStream);
    netStream.Flush();
    netStream.Close();
}
In my test Android setup I am making a call to the server on a button click:
public void btngo_click(View v)
{
    final TcpClient client = new TcpClient();
    new Thread(new Runnable() {
        @Override
        public void run() {
            final MediaPlayer mediaPlayer = new MediaPlayer();
            client.GetStream();
            runOnUiThread(new Runnable() {
                public void run()
                {
                    int length = client.GetLength();
                    if (length > 0)
                    {
                        byte[] result = client.GetResult();
                        try {
                            // create a temp file to hold the byte array
                            File tempMp3 = File.createTempFile("test", "mp3", getCacheDir());
                            tempMp3.deleteOnExit();
                            FileOutputStream fos = new FileOutputStream(tempMp3);
                            fos.write(result);
                            fos.close();
                            mediaPlayer.reset();
                            FileInputStream fis = new FileInputStream(tempMp3);
                            mediaPlayer.setDataSource(fis.getFD());
                            mediaPlayer.prepare();
                            mediaPlayer.start();
                        } catch (IOException ex) {
                            String s = ex.toString();
                            ex.printStackTrace();
                        }
                    }
                }
            });
        }
    }).start();
}
The stream is received in the TcpClient class, which is as follows:
public class TcpClient {
    public final static String SERVER_ADDRESS = "127.0.0.1";
    public final static int SERVER_PORT = 65000;
    public String TotalResult;
    public int Length;
    byte[] result = new byte[21000000];

    public TcpClient()
    {
    }

    public int GetLength()
    {
        return Length;
    }

    public byte[] GetResult()
    {
        return result;
    }

    public void GetStream()
    {
        try
        {
            final Socket socket = new Socket(SERVER_ADDRESS, SERVER_PORT);
            final InputStream input = new BufferedInputStream(socket.getInputStream());
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            int nread;
            while ((nread = input.read(result, 0, result.length)) != -1)
            {
                buffer.write(result, 0, nread);
            }
            buffer.flush();
            //input.read(result);
            // NB: this is the allocated buffer size, not the number of bytes received
            Length = result.length;
            input.close();
            socket.close();
        } catch (UnknownHostException e) {
            String exc = e.getMessage();
            e.printStackTrace();
        } catch (IOException e) {
            String exc2 = e.getMessage();
            e.printStackTrace();
        }
    }
}
With apologies for all the code, here is my problem.
I am receiving the stream. The temp MP3 file is created and the media player starts. But I then only get a short snippet of the test MP3 file (which is a full song), and it jumps about a bit. The length is not the same, and the section of the song played is different, each time.
How do I receive the full file in an ordered way, so that it provides full playback of the song?
I have rooted around for this and have an idea that I need to tell the client what file size to expect and then loop until all data is received, although I have no idea how to implement this successfully if that is the correct solution.
Any pointers on where I am going wrong, or what I can do to rectify it, would be greatly appreciated!
Having received no answers to this, I dug around a bit more. Two things were wrong:
Firstly, I had not included the size of the stream as an int-sized header in my stream. I understand that for smaller files this is not a problem, but as file sizes grow it becomes necessary in order to make sure the whole stream has been received.
This in turn raised another issue. The int I was sending as a byte[] from C# did not produce the correct value in Java. It turns out Java's bytes are signed, with a range of -128 to 127, as opposed to C#'s unsigned bytes. This needed a bit of code to convert to an int. Then I could instruct the reader to readFully, passing in a byte[] buffer sized to the actual expected stream, and voila, it worked: the MP3 file is received and plays just fine.
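For illustration, a minimal sketch of the sending side under those assumptions; writing the 4-byte header in network byte order (which is what Java's DataInputStream.readInt expects) sidesteps the endianness part of the conversion:

void SendFileToClient(Socket socket)
{
    NetworkStream netStream = new NetworkStream(socket);
    using (FileStream filestream = File.Open(@"C:\MusicTest\Test.mp3",
        FileMode.Open, FileAccess.Read, FileShare.Read))
    {
        // Write the file length as a big-endian (network order) 4-byte header.
        int length = (int)filestream.Length;
        byte[] header = BitConverter.GetBytes(IPAddress.HostToNetworkOrder(length));
        netStream.Write(header, 0, header.Length);

        // Then stream the file itself.
        filestream.CopyTo(netStream);
        netStream.Flush();
    }
}

On the Java side, DataInputStream.readInt() then returns that length directly, and readFully(buffer, 0, length) blocks until exactly that many bytes have arrived.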