I have a camera (currently an Arducam OV2640) connected to an ESP8266. I am trying to build something that exchanges WebSocket data between a C# application and the ESP8266. The goal of this project is a robot whose camera feed can be viewed, and which can be controlled, from a C# application via WebSocket.
So far, the application can send movement commands to the robot, but when receiving data back from the ESP8266 I am not able to piece the bytes together into a proper video stream to display in the Windows C# application. I have tried the C# Bitmap class but cannot get it to work properly. Any help is much appreciated. Below is what I have done so far:
// to send data to the ESP8266 Wi-Fi board
public async void webSocketOperation(String operation)
{
    ArraySegment<byte> bytesToSend = new ArraySegment<byte>(
        Encoding.UTF8.GetBytes(operation));
    await UserInterface.ws.SendAsync(
        bytesToSend, WebSocketMessageType.Text,
        true, CancellationToken.None);
}
// to receive the data sent from the ESP8266 board
public async Task receiveData()
{
    while (ws.State == WebSocketState.Open)
    {
        ArraySegment<byte> buffer = new ArraySegment<byte>(new byte[819200]);
        WebSocketReceiveResult result = await ws.ReceiveAsync(buffer, CancellationToken.None);
        // Only result.Count bytes are valid; the rest of the buffer is garbage.
        // Note: this assumes a whole frame arrives in one message (check result.EndOfMessage).
        pbVideoFeed.Image = BytesToBitmap(new ArraySegment<byte>(buffer.Array, 0, result.Count));
    }
}
// to convert from bytes to an image
private Bitmap BytesToBitmap(ArraySegment<byte> buffer)
{
    Bitmap resized = new Bitmap(pbVideoFeed.Width, pbVideoFeed.Height);
    using (var stream = new MemoryStream(buffer.ToArray()))
    using (var source = new Bitmap(stream))
    using (var graphic = Graphics.FromImage(resized))
    {
        graphic.InterpolationMode = InterpolationMode.High;
        graphic.DrawImage(source, new Rectangle(0, 0, pbVideoFeed.Width, pbVideoFeed.Height));
        return resized;
    }
}
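For context on reassembly: WebSocket receives are not guaranteed to align with image boundaries, so one hedged approach (assuming the ESP8266 streams raw JPEG frames, which the OV2640 produces) is to buffer incoming bytes and cut complete images out at the JPEG start/end markers. A minimal sketch of the idea, shown in Java since the logic is language-agnostic:

```java
import java.io.ByteArrayOutputStream;
import java.util.Arrays;

// Minimal sketch: accumulate incoming bytes and extract complete JPEG
// frames delimited by the SOI (0xFFD8) and EOI (0xFFD9) markers.
// Assumes the camera streams raw JPEG data back-to-back.
public class JpegFrameAssembler {
    private final ByteArrayOutputStream pending = new ByteArrayOutputStream();

    // Feed a newly received chunk into the assembler.
    public void feed(byte[] chunk, int count) {
        pending.write(chunk, 0, count);
    }

    // Return the next complete frame, or null if none is buffered yet.
    public byte[] nextFrame() {
        byte[] buf = pending.toByteArray();
        int start = indexOfMarker(buf, (byte) 0xD8, 0);
        if (start < 0) return null;
        int end = indexOfMarker(buf, (byte) 0xD9, start + 2);
        if (end < 0) return null;
        byte[] frame = Arrays.copyOfRange(buf, start, end + 2);
        // Keep any bytes that belong to the following frame.
        pending.reset();
        pending.write(buf, end + 2, buf.length - (end + 2));
        return frame;
    }

    // Find 0xFF followed by the given second byte, starting at 'from'.
    private static int indexOfMarker(byte[] buf, byte second, int from) {
        for (int i = from; i + 1 < buf.length; i++) {
            if (buf[i] == (byte) 0xFF && buf[i + 1] == second) return i;
        }
        return -1;
    }
}
```

The same scan-and-split could be done in the C# receive loop over the bytes each ReceiveAsync call actually delivers.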
Thanks for reading :]
When the current C# socket server is open and Android sends a message, the C# server sends an image as a byte array.
On Android, the image size is received correctly, but BitmapFactory.decodeByteArray() keeps returning null. Thanks for your comments.
C# code
byte[] data = ImageToByteArray(this.pictureBox1.Image);
try
{
    m_socket.Send(BitConverter.GetBytes(data.Length));
    m_socket.Send(data);
}
catch (Exception e)
{
    MessageBox.Show(e.ToString());
}

public byte[] ImageToByteArray(System.Drawing.Image image)
{
    MemoryStream ms = new MemoryStream();
    image.Save(ms, System.Drawing.Imaging.ImageFormat.Png);
    return ms.ToArray();
}
Android Code
BufferedInputStream bis = new BufferedInputStream(socket.getInputStream());
byte[] data = new byte[4];
bis.read(data); // Img Size
int ilen = byteToInt(data); // byte[] to Int
byte[] data2 = new byte[ilen]; // note: data2 is never filled from the stream
Bitmap bit = BitmapFactory.decodeByteArray(data2, 0, data2.length); // Return null
imageView.setImageBitmap(bit);
Also, logcat shows:
D/skia: --- png error bad adaptive filter value
D/skia: --- codec->getAndroidPixels() failed.
#_#....... Thank you!!
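One likely culprit in the snippet above is that data2 is allocated but never filled from the stream before decoding. A minimal sketch of reading a length prefix and then exactly that many bytes (assuming the size is sent big-endian; BitConverter on the C# side produces little-endian bytes on x86, so the byte order would need reconciling):

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

// Minimal sketch: read a 4-byte big-endian length prefix, then block
// until exactly that many image bytes have arrived. A plain read() may
// return fewer bytes than requested; readFully() loops for you.
public class ImageReceiver {
    public static byte[] receiveImage(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        int len = din.readInt();   // 4-byte big-endian size header
        byte[] data = new byte[len];
        din.readFully(data);       // blocks until all len bytes are read
        return data;
    }
}
```

Once the buffer is actually filled, BitmapFactory.decodeByteArray has real image data to work with.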
I am working on a software solution to stream a Windows desktop to a Xamarin application over TCP.
It works, but I cannot manage to get a better framerate.
Here's how I do it:
The client, a Xamarin app, listens on a TCP socket. The server (a C# console app) connects to it, captures the desktop and sends the image. The app then shows it as an Image.
When sending a bitmap, the server first sends its size as a 4-byte array. The client reads 4 bytes, then the indicated number of bytes.
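One caveat with this framing: a single Read call on a TCP stream may return fewer bytes than requested, which silently shifts every subsequent header. A hedged sketch of a read-exact loop (shown in Java; the same idea applies to NetworkStream.Read in C#):

```java
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

// Minimal sketch: loop until exactly n bytes have been consumed.
// A single read() may deliver fewer bytes than requested, which is
// what throws a length-prefixed protocol out of sync.
public class ExactReader {
    public static byte[] readExact(InputStream in, int n) throws IOException {
        byte[] buf = new byte[n];
        int off = 0;
        while (off < n) {
            int r = in.read(buf, off, n - off);
            if (r < 0) throw new EOFException("stream closed mid-frame");
            off += r;
        }
        return buf;
    }
}
```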
DesktopRecorder.cs (Server side)
private void CaptureScreen()
{
    bitmap = new Bitmap(SIZE.Width, SIZE.Height, PixelFormat.Format32bppRgb);
    graphics = Graphics.FromImage(bitmap);
    graphics.CopyFromScreen(0, 0, 0, 0, SIZE, CopyPixelOperation.SourceCopy);
    CaptureReady(BitmapToArray(bitmap));
}

private byte[] BitmapToArray(Bitmap bitmap)
{
    using (var ms = new MemoryStream())
    {
        bitmap.Save(ms, ImageFormat.Png);
        return ms.ToArray();
    }
}
TcpNetworker.cs (Server side, connecting in TCP):
public override void SendData(byte[] data)
{
    if (Tcp.Connected)
    {
        int size = data.Length;
        var sizeArray = BitConverter.GetBytes(size);
        if (BitConverter.IsLittleEndian)
            sizeArray.Reverse(); // note: Enumerable.Reverse returns a new sequence; this does not modify sizeArray
        Console.WriteLine($"Sending {size} o.");
        Write(sizeArray);
        Stream.Flush();
        Write(data);
    }

    void Write(byte[] b)
    {
        Stream.Write(b, 0, b.Length);
    }
}
ViewModel.cs (Client side, Xamarin):
private void Accept()
{
    try
    {
        var client = Tcp.AcceptTcpClient();
        Console.WriteLine("---------------- Client Connected ----------------");
        Stream = client.GetStream();
        byte[] sizeArray;
        byte[] dataArray;
        while (client.Connected)
        {
            sizeArray = new byte[4];
            Stream.Read(sizeArray, 0, 4);
            if (BitConverter.IsLittleEndian)
                sizeArray.Reverse();
            int size = BitConverter.ToInt32(sizeArray, 0);
            Console.WriteLine($"Size: {size}");
            dataArray = new byte[size];
            // note: Read may return fewer than 'size' bytes on a TCP stream
            Stream.Read(dataArray, 0, size);
            Dispatcher.Invoke(() => ImageReceived?.Invoke(this, new MemoryStream(dataArray)));
        }
    }
    catch (Exception e)
    {
        Console.WriteLine($"Error: {e.Message} {Environment.NewLine} {e.StackTrace}");
    }
    Accept();
}
MainWindow.cs (Page's code behind):
public MainWindow()
{
    InitializeComponent();
    var model = new MainWindowViewModel();
    model.ImageReceived += Model_ImageReceived;
}

private void Model_ImageReceived(object sender, System.IO.MemoryStream memoryStream)
{
    memoryStream.Position = 0;
    //var image = System.Drawing.Image.FromStream(memoryStream);
    var imageSource = new BitmapImage();
    imageSource.BeginInit();
    imageSource.CacheOption = BitmapCacheOption.OnLoad;
    imageSource.StreamSource = memoryStream;
    imageSource.EndInit();
    MainImage.Source = imageSource;
    memoryStream.Dispose();
}
The CaptureScreen() method of DesktopRecorder is called at a given framerate. It works fine at 10 FPS, but when I increase the framerate, the client runs into trouble: it can no longer find the correct size. After receiving some images, the 4 bytes read no longer contain the size. It is as if an offset has been added and the server and client have desynchronized.
So I am asking two things:
How can I keep the server and client synchronized after sending many TCP packets, given that I need to keep the 4-byte size prefix? I would rather not use a fixed bitmap size or anything like that.
Currently I am encoding the bitmap as a PNG image, which Xamarin can handle natively. Is there a better way to compress the image and send it faster, even at some loss in quality? I have done this kind of thing between a C server and an Angular app, where I had to encode it as PNG and Base64. Can I do the same here?
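On the compression question: JPEG is usually a better fit than PNG for screen streaming when some quality loss is acceptable. A rough sketch using javax.imageio (the C# equivalent would be saving with ImageFormat.Jpeg plus an EncoderParameter for quality; the quality value here is an arbitrary example):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Iterator;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.MemoryCacheImageOutputStream;

// Minimal sketch: encode a frame as JPEG at a chosen quality instead of
// PNG. Lossy, but typically much smaller and faster for screen capture.
public class JpegEncoder {
    public static byte[] encode(BufferedImage frame, float quality) throws IOException {
        Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("jpg");
        ImageWriter writer = writers.next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);   // 0.0 = smallest, 1.0 = best
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        writer.setOutput(new MemoryCacheImageOutputStream(out));
        writer.write(null, new IIOImage(frame, null, null), param);
        writer.dispose();
        return out.toByteArray();
    }
}
```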
P.S.: I know I could get better performance using UDP, but for some reasons I have to stick with TCP :/.
Thanks for your help!
I know this has been asked numerous times... and I have searched and tried everything that I can, but I am still not sure why I get the "parameter is not valid" exception...
I have an Amazon EC2 instance running Windows Server 2012. On that machine, I am running Unity3D (Unity 5). That Unity application is sending frames (images) from the EC2 instance to my local laptop, via TCP. My client is running Windows 10, not that it is likely to make any difference.
To get my image data, I do the following:
byte[] GetScreenData() {
    // Create a texture the size of the screen, RGB24 format
    int width = Screen.width;
    int height = Screen.height;
    RenderTexture rt = new RenderTexture(width, height, 24);
    Texture2D tex = new Texture2D(width, height, TextureFormat.RGB24, false);
    Camera camera = GameObject.Find("Main Camera").GetComponent<Camera>();
    camera.targetTexture = rt;
    camera.Render();
    RenderTexture.active = rt;
    // Read screen contents into the texture
    tex.ReadPixels(new Rect(0, 0, width, height), 0, 0);
    camera.targetTexture = null;
    RenderTexture.active = null;
    Destroy(rt);
    // Encode texture into JPG
    byte[] bytes = tex.EncodeToJPG();
    Destroy(tex);
    return bytes;
}
I then serialize my data using FlatBuffers:
public static byte[] FlatSerialize(this ServerToClientMessage message) {
    var builder = new FlatBufferBuilder(1);
    // Create an ID
    var MessageId = builder.CreateString(message.MessageId.ToString());
    // Start the vector...
    // Loop over each byte and add it - my god, is there not a better way?
    FlatServerToClientMessage.StartImagebytesVector(builder, message.ImageBytes.Length);
    foreach (var imageByte in message.ImageBytes) {
        builder.AddByte(imageByte);
    }
    var imagebytes = builder.EndVector();
    // Start the FlatServerToClientMessage and add the MessageId and imagebytes
    FlatServerToClientMessage.StartFlatServerToClientMessage(builder);
    FlatServerToClientMessage.AddMessageid(builder, MessageId);
    FlatServerToClientMessage.AddImagebytes(builder, imagebytes);
    // End the FlatServerToClientMessage and finish it...
    var flatMessage = FlatServerToClientMessage.EndFlatServerToClientMessage(builder);
    FlatServerToClientMessage.FinishFlatServerToClientMessageBuffer(builder, flatMessage);
    return builder.SizedByteArray();
}
Next, I send my data:
public void SendRaw(byte[] dataToSend) {
    // We must send the length of the message before sending the actual message
    var sizeInfo = new byte[4]; // = BitConverter.GetBytes(dataToSend.Length);
    // Shift the bytes (little-endian: low byte first)
    sizeInfo[0] = (byte) dataToSend.Length;
    sizeInfo[1] = (byte)(dataToSend.Length >> 8);
    sizeInfo[2] = (byte)(dataToSend.Length >> 16);
    sizeInfo[3] = (byte)(dataToSend.Length >> 24);
    try {
        var stream = Client.GetStream();
        // Send the length of the data
        stream.Write(sizeInfo, 0, 4);
        // Send the data
        stream.Write(dataToSend, 0, dataToSend.Length);
    } catch (Exception ex) {
        Debug.LogException(ex);
    } finally {
        // raise event to tell system that the client has disconnected and that listening must restart...
    }
}
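The byte shifts above write the length little-endian (low byte first), matching what BitConverter.ToInt32 expects on x86. A small sketch of the equivalent encode/decode pair, useful when the receiving side is not .NET:

```java
// Minimal sketch: encode/decode the 4-byte little-endian length header
// that the byte shifts in SendRaw produce (low byte first).
public class LittleEndian {
    public static int toInt32(byte[] b) {
        return (b[0] & 0xFF)
             | (b[1] & 0xFF) << 8
             | (b[2] & 0xFF) << 16
             | (b[3] & 0xFF) << 24;
    }

    public static byte[] fromInt32(int v) {
        return new byte[] {
            (byte) v, (byte) (v >> 8), (byte) (v >> 16), (byte) (v >> 24)
        };
    }
}
```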
Back on my client device, I am listening for incoming data which deserializes and raises an event to alert the system to the arrival of a new image...
private void Run() {
    try {
        // ShutdownEvent is a ManualResetEvent signaled by
        // Client when it's time to close the socket.
        while (!ShutDownEvent.WaitOne(0)) {
            try {
                if (!_stream.DataAvailable) continue;
                // Read the first 4 bytes, which hold the size of the message, and convert them to an Int32
                var sizeinfo = new byte[4];
                _stream.Read(sizeinfo, 0, 4);
                var messageSize = BitConverter.ToInt32(sizeinfo, 0);
                // Create a new buffer for the data to be read
                var buffer = new byte[messageSize];
                var read = 0;
                // Continue reading from the stream until all messageSize bytes have arrived
                while (read != messageSize) {
                    read += _stream.Read(buffer, read, buffer.Length - read);
                }
                var message = new ServerToClientMessage().FlatDeserialize(buffer);
                // Raise the data-received event
                OnDataReceived(message);
            } catch (IOException ex) {
                // Handle the exception...
            }
        }
    } catch (Exception ex) {
        // Handle the exception...
    } finally {
        _stream.Close();
    }
}
To deserialize, I do the following:
public static ServerToClientMessage FlatDeserialize(this ServerToClientMessage message, byte[] bytes) {
    var bb = new ByteBuffer(bytes);
    var flatmessage = FlatServerToClientMessage.GetRootAsFlatServerToClientMessage(bb);
    message.MessageId = new Guid(flatmessage.Messageid);
    message.ImageBytes = new byte[flatmessage.ImagebytesLength];
    for (var i = 0; i < flatmessage.ImagebytesLength; i++) {
        message.ImageBytes[i] = flatmessage.GetImagebytes(i);
    }
    return message;
}
For clarity, here is the ServerToClientMessage class:
public class ServerToClientMessage : EventArgs
{
public Guid MessageId { get; set; }
public byte[] ImageBytes { get; set; }
}
Anyway, next, the OnDataReceived event gets raised and that in turn calls a function to convert from the ImageBytes array to a System.Drawing.Image. That function is here:
public Image byteArrayToImage(byte[] byteArrayIn) {
    // SAME PROBLEM!
    //var converter = new System.Drawing.ImageConverter();
    //return (Image)converter.ConvertFrom(byteArrayIn);
    using (var memoryStream = new MemoryStream(byteArrayIn)) {
        return Image.FromStream(memoryStream, false, false);
    }
}
Now, my image data being sent from the server is fine and dandy... I have validated it. This all works fine when I use JSON, too. I've tried many ways to convert from a byte array to an Image, but I always seem to get the Parameter is not valid exception. I've also tried sending my image in different formats like JPG and PNG, as well as raw pixel data.
Anyone have an idea?
Figured it out.
It turns out that the data is backwards...due to FlatBuffers serialization.
The solution is to reverse the order of my for-loop during serialization:
for (var i = message.ImageBytes.Length; i -->0;)
{
builder.AddByte(message.ImageBytes[i]);
}
I am trying to create a TCP/IP client in a WPF GUI (C#/.NET) for a Ubuntu server.
Problem: I can connect to the server machine, the connection works correctly, and the message is sent correctly; the Ubuntu console shows the client connected and commands such as "start video feed" arriving. But when it comes to reading the response, nothing happens: the client never reads the byte array that the Ubuntu server should return. On message 102 the server should start the video feed and return it as a byte array, which should then be read and displayed. I have not written any code to display the feed yet, since I cannot read it from the server; the client does send the commands correctly, as I can see on the Ubuntu server console. Please suggest, thanks!
Below is the code; please have a look and tell me what I am doing wrong:
namespace POC_TCP_Listener
{
    /// <summary>
    /// Interaction logic for MainWindow.xaml
    /// </summary>
    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();
        }

        private int WhichEventFired = 0;

        private void Button_Click_1(object sender, RoutedEventArgs e)
        {
            try
            {
                // string message = "{Site: 1}";
                WhichEventFired = 1; // Start Video Feed
                Thread ClientThread = new Thread(new ThreadStart(ConnectToServerAndRetrieveBytes));
                ClientThread.Start();
            }
            catch (Exception ex)
            {
                string st = ex.Message;
            }
        }

        private void ConnectToServerAndRetrieveBytes()
        {
            TcpClient TCP = new TcpClient();
            TCP.Connect("IPAddress", 5001);
            byte[] packet;
            var size = 9;
            var header = 102;
            var siteId = 1;
            var state = 1;
            if (WhichEventFired == 1)
            {
                header = 102; // Start Video Feed
            }
            else if (WhichEventFired == 2)
            {
                header = 114; // Stop Video Feed
            }
            else
            {
                header = 115; // Query Temperature
            }
            // <8> <115> <1>
            packet = BitConverter.GetBytes(size).Reverse()
                .Concat(BitConverter.GetBytes(header).Reverse())
                .Concat(BitConverter.GetBytes(siteId).Reverse())
                .Concat(BitConverter.GetBytes(state).Reverse())
                .ToArray();
            Byte[] data = packet;
            // Get a client stream for reading and writing.
            NetworkStream stream = TCP.GetStream();
            // Send the message to the connected TcpServer.
            stream.Write(data, 0, data.Length);
            byte[] buffer = new byte[64 * 1024];
            using (MemoryStream ms = new MemoryStream())
            {
                int read;
                // In the line below it stops and nothing happens after it - please suggest
                while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    ms.Write(buffer, 0, read);
                }
                // return ms.ToArray();
            }
            stream.Close();
            TCP.Close();
        }
    }
}
Please suggest why it stops working or let me know if I am doing anything wrong.
I think the problem is that if there are no more packets left, it escapes the while-loop. If you want to check endlessly, you should try using a different loop construct. But I am not sure.
Yeah, that should be it. If there are no packets at the moment the while-loop starts, it terminates because there is nothing to read. So you could check whether the Ubuntu server actually returns you something.
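For reference, the <size> <header> <siteId> <state> request in the client above is four big-endian 32-bit ints (that is what the Reverse() calls on the BitConverter output achieve). A hedged sketch of the same framing in Java, whose ByteBuffer is big-endian by default:

```java
import java.nio.ByteBuffer;

// Minimal sketch: build the <size><header><siteId><state> request as
// four big-endian 32-bit ints. ByteBuffer is big-endian by default,
// matching the Reverse() calls applied to BitConverter's output.
public class CommandPacket {
    public static byte[] build(int size, int header, int siteId, int state) {
        return ByteBuffer.allocate(16)
                .putInt(size)
                .putInt(header)
                .putInt(siteId)
                .putInt(state)
                .array();
    }
}
```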
I have set up a C# server which at present serves up one test mp3 file over TCP. The code to send the file is as follows
public void RunStreamer()
{
    log.MakeLog("Starting streamer");
    Run = true;
    // Need to look into how to thread this to not block the main window
    TcpListener listen = new TcpListener(localAddr, _port);
    listen.Start(); // start listening for client requests
    // blocks until a client request comes in
    for (; ; )
    {
        Socket socket = listen.AcceptSocket();
        if (socket.Connected)
        {
            SendFileToClient(socket);
            socket.Disconnect(false);
        }
    }
}
void SendFileToClient(Socket socket)
{
    log.MakeLog("Connection made");
    NetworkStream netStream = new NetworkStream(socket);
    StreamWriter writer = new StreamWriter(netStream);
    // Todo - set specified file - this file is just for testing
    FileStream filestream = File.Open(@"C:\MusicTest\Test.mp3", FileMode.Open, FileAccess.Read, FileShare.Read);
    filestream.CopyTo(netStream);
    netStream.Flush();
    netStream.Close();
}
In my test Android setup I am making a call to the server on a button click:
public void btngo_click(View v) {
    final TcpClient client = new TcpClient();
    new Thread(new Runnable() {
        @Override
        public void run() {
            final MediaPlayer mediaPlayer = new MediaPlayer();
            client.GetStream();
            runOnUiThread(new Runnable() {
                public void run() {
                    int length = client.GetLength();
                    if (length > 0) {
                        byte[] result = client.GetResult();
                        try {
                            // create temp file that will hold the byte array
                            File tempMp3 = File.createTempFile("test", "mp3", getCacheDir());
                            tempMp3.deleteOnExit();
                            FileOutputStream fos = new FileOutputStream(tempMp3);
                            fos.write(result);
                            fos.close();
                            mediaPlayer.reset();
                            FileInputStream fis = new FileInputStream(tempMp3);
                            mediaPlayer.setDataSource(fis.getFD());
                            mediaPlayer.prepare();
                            mediaPlayer.start();
                        } catch (IOException ex) {
                            String s = ex.toString();
                            ex.printStackTrace();
                        }
                    }
                }
            });
        }
    }).start();
}
The stream is received in the TcpClient class, which is as follows:
public class TcpClient {
    public final static String SERVER_ADDRESS = "127.0.0.1";
    public final static int SERVER_PORT = 65000;
    public String TotalResult;
    public int Length;
    byte[] result = new byte[21000000];

    public TcpClient() {
    }

    public int GetLength() {
        return Length;
    }

    public byte[] GetResult() {
        return result;
    }

    public void GetStream() {
        try {
            // note: 85000 is above the maximum TCP port (65535); SERVER_PORT was presumably intended
            final Socket socket = new Socket("192.0.0.5", 85000);
            final InputStream input = new BufferedInputStream(socket.getInputStream());
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            int nread;
            while ((nread = input.read(result, 0, result.length)) != -1) {
                buffer.write(result, 0, nread);
            }
            buffer.flush();
            //input.read(result);
            Length = result.length; // note: this is the fixed buffer size, not the bytes actually received
            input.close();
            socket.close();
        } catch (UnknownHostException e) {
            String exc = e.getMessage();
            e.printStackTrace();
        } catch (IOException e) {
            String exc2 = e.getMessage();
            e.printStackTrace();
        }
    }
}
With apologies for all the code here is my problem.
I am receiving the stream. The temp MP3 file is created and the media player starts. I then only get a short snippet of the test MP3 file (which is a full song). It also jumps about a bit. The length is not the same and the section of the song played is different each time.
How do I receive the full file in an ordered way so that it provides full playback of the song?
I have tried to dig around for this and have an idea that I need to tell my client what file size to expect and then loop until all data is received, although I have no idea how to implement this successfully, if that is even the correct solution.
Any pointers on where I am going wrong or what I can do to rectify would be greatly appreciated!!
Having received no answers on this, I dug around a bit more. Two things were wrong:
Firstly, I had not included the size of the stream as an int-sized header. I understand that for smaller files this may not be a problem, but as file sizes grow it is necessary to make sure the whole stream has been received.
This in turn raised another issue: the int I was sending as a byte[] from C# was not producing the correct value in Java. It turns out Java bytes are signed (range -128 to 127), as opposed to C#'s unsigned bytes. A bit of code was needed to convert to an int; then I could tell the reader to readFully, passing in a byte[] buffer of the actual expected stream size, and voila, it worked. The MP3 file is received and plays just fine.
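The fix described above can be sketched roughly as follows (assuming a 4-byte big-endian length header; Java bytes are signed, so each one is masked with 0xFF during conversion):

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

// Minimal sketch of the fix: read the 4-byte length header, convert it
// to an int (masking each signed Java byte with 0xFF), then readFully
// the exact number of bytes so the whole MP3 arrives intact.
public class FileReceiver {
    // Big-endian byte[4] -> int; the & 0xFF undoes Java's sign extension.
    public static int byteToInt(byte[] b) {
        return (b[0] & 0xFF) << 24
             | (b[1] & 0xFF) << 16
             | (b[2] & 0xFF) << 8
             | (b[3] & 0xFF);
    }

    public static byte[] receive(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        byte[] header = new byte[4];
        din.readFully(header);
        byte[] file = new byte[byteToInt(header)];
        din.readFully(file);   // blocks until the full file is read
        return file;
    }
}
```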