Let's say I have a wireless camera that I want to stream footage from to Unity in real time. Is there a way to achieve this?
Bonus questions:
What about cameras with wider angles (180, perhaps even 360 degrees)?
How much of an issue would the delay be if this is footage I want to interact with?
Is it possible to send more data, such as depth information (using a depth camera), alongside the regular footage?
Am I crazy or has this been done?
Thanks in advance
I assume this is a camera with an Ethernet port or Wi-Fi that you can connect to and stream images from live.
If so, then yes, it can be done with Unity.
How it is done without an external library:
Connecting to the Camera:
1. Connect to the same local network as the camera, or, if UPnP is supported, you can also connect to it over the internet. Usually, you need the IP address and the port of the camera to do this. Let's say that the camera IP address is 192.168.1.5 and the port number is 900. The URL to connect to is http://192.168.1.5:900.
Sometimes, it is simply a URL that ends with .mjpg or .bin, such as http://192.168.1.5/mjpg/video.mjpg or http://192.168.1.5/mjpg/video.bin.
Every camera is different. The only way to find the URL is to read its manual. If the manual is not available, connect to the camera with its official application, then use Wireshark to discover the URL of the camera's image stream. The username and password (if required) can also be found in the manual. If not, google the model number and everything you need should turn up.
Extracting the JPEG from the Camera:
Once connected, the camera will send an endless stream of data to you. You can scan this data and extract the images from it.
2. Search for the JPEG header, which is 0xFF followed by 0xD8. If these two bytes are next to each other, start reading the bytes and saving them to an array. You can use an index (int) variable to keep count of how many bytes you have received.
int counter = 0;
byte[] completeImageByte = new byte[500000];
byte[] receivedBytes = new byte[500000];
receivedBytes[counter] = byteFromCamera;
counter++;
3. While reading the data from the camera, check whether the next two bytes are the JPEG footer, which is 0xFF followed by 0xD9. If so, you have received a complete image (one frame).
Your image bytes should look something like:
0xFF 0xD8 someotherbytes(thousands of them)..... then 0xFF 0xD9
Copy receivedBytes to the completeImageByte variable so that it can be used to display the image later on, then reset the counter variable to 0.
Buffer.BlockCopy(receivedBytes, 0, completeImageByte, 0, counter);
counter = 0;
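Putting steps 2 and 3 together, here is a minimal sketch of the scanning logic, assuming bytes arrive one at a time through a hypothetical ProcessByte callback and reusing the counter, receivedBytes and completeImageByte variables from the snippets above:

bool collecting = false;
byte lastByte = 0;
bool updateFrame = false; // step 4 checks this flag

// Hypothetical per-byte callback; call it for every byte read from the camera.
void ProcessByte(byte byteFromCamera)
{
    if (!collecting)
    {
        // Step 2: look for the JPEG header 0xFF 0xD8.
        if (lastByte == 0xFF && byteFromCamera == 0xD8)
        {
            collecting = true;
            counter = 0;
            receivedBytes[counter++] = 0xFF;
            receivedBytes[counter++] = 0xD8;
        }
    }
    else if (counter < receivedBytes.Length)
    {
        receivedBytes[counter++] = byteFromCamera;

        // Step 3: the JPEG footer 0xFF 0xD9 ends one complete frame.
        if (lastByte == 0xFF && byteFromCamera == 0xD9)
        {
            Buffer.BlockCopy(receivedBytes, 0, completeImageByte, 0, counter);
            counter = 0;
            collecting = false;
            updateFrame = true; // tell step 4 a new frame is ready
        }
    }
    lastByte = byteFromCamera;
}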
Displaying the JPEG Image to the Screen:
4. Display the image on the screen.
Since you will be receiving many images per second, the most efficient way I found to display them is with the RawImage component. So don't use Image or SpriteRenderer for this if you want it to run on mobile devices.
public RawImage screenDisplay;

if (updateFrame)
{
    Texture2D camTexture = new Texture2D(2, 2);
    camTexture.LoadImage(completeImageByte);
    screenDisplay.texture = camTexture;
}
You only need to do camTexture = new Texture2D(2, 2); once, in the Start() function, and can then reuse the same texture for every frame.
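Put together, a minimal sketch of step 4 with that reuse (assuming the completeImageByte buffer and the updateFrame flag from the sketches above):

public RawImage screenDisplay;
Texture2D camTexture;

void Start()
{
    // Allocate once; LoadImage resizes the texture to fit each incoming JPEG.
    camTexture = new Texture2D(2, 2);
}

void Update()
{
    if (updateFrame)
    {
        camTexture.LoadImage(completeImageByte);
        screenDisplay.texture = camTexture;
        updateFrame = false;
    }
}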
5. Jump back to step 2 and keep going until you want to quit.
API for connecting to the camera:
Use HttpWebRequest if the camera requires authentication (username and password).
For cameras that do not require authentication, use UnityWebRequest. When using UnityWebRequest, you must derive your own class from DownloadHandlerScript, or your app will crash, since you will be receiving data non-stop. (Note that Send() was replaced by SendWebRequest() in Unity 2017.2 and later.)
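For the authenticated case, a minimal sketch with HttpWebRequest (assuming HTTP basic authentication, placeholder URL and credentials, and the hypothetical ProcessByte scanner from the sketch above):

using System.IO;
using System.Net;
using System.Threading;

void StartAuthenticatedStream()
{
    // Placeholder URL and credentials - substitute your camera's values.
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://192.168.1.5/mjpg/video.mjpg");
    request.Credentials = new NetworkCredential("username", "password");

    // Read the endless MJPEG stream on a background thread
    // so the Unity main thread is never blocked.
    new Thread(() =>
    {
        using (WebResponse response = request.GetResponse())
        using (Stream stream = response.GetResponseStream())
        {
            int b;
            while ((b = stream.ReadByte()) != -1)
            {
                ProcessByte((byte)b); // steps 2 and 3 from the sketch above
            }
        }
    }) { IsBackground = true }.Start();
}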
Example of deriving your own class from DownloadHandlerScript:
using UnityEngine;
using System.Collections;
using UnityEngine.Networking;

public class CustomWebRequest : DownloadHandlerScript
{
    // Standard scripted download handler - will allocate memory on each ReceiveData callback
    public CustomWebRequest() : base()
    {
    }

    // Pre-allocated scripted download handler
    // Will reuse the supplied byte array to deliver data.
    // Eliminates memory allocation.
    public CustomWebRequest(byte[] buffer) : base(buffer)
    {
    }

    // Required by DownloadHandler base class. Called when you address the 'bytes' property.
    protected override byte[] GetData() { return null; }

    // Called once per frame when data has been received from the network.
    protected override bool ReceiveData(byte[] byteFromCamera, int dataLength)
    {
        if (byteFromCamera == null || byteFromCamera.Length < 1)
        {
            //Debug.Log("CustomWebRequest :: ReceiveData - received a null/empty buffer");
            return false;
        }
        //Search for the JPEG image here
        return true;
    }

    // Called when all data has been received from the server and delivered via ReceiveData
    protected override void CompleteContent()
    {
        //Debug.Log("CustomWebRequest :: CompleteContent - DOWNLOAD COMPLETE!");
    }

    // Called when a Content-Length header is received from the server.
    protected override void ReceiveContentLength(int contentLength)
    {
        //Debug.Log(string.Format("CustomWebRequest :: ReceiveContentLength - length {0}", contentLength));
    }
}
Usage:
using UnityEngine;
using System.Collections;
using UnityEngine.Networking;

public class Test : MonoBehaviour
{
    CustomWebRequest camImage;
    UnityWebRequest webRequest;
    byte[] bytes = new byte[90000];

    void Start()
    {
        string url = "http://camUrl/mjpg/video.mjpg";
        webRequest = new UnityWebRequest(url);
        webRequest.downloadHandler = new CustomWebRequest(bytes);
        webRequest.Send();
    }
}
You can then perform steps 2, 3, 4 and 5 in the ReceiveData function of the CustomWebRequest script.
Controlling Camera:
Cameras have commands to pan, rotate, flip, mirror and perform other functions. The commands differ between cameras, but it is as simple as making a GET/POST request to a URL on the camera and providing the query parameters. These commands can be found in the camera manual.
For example: http://192.168.1.5?pan=50&rotate=90
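Sending such a command from Unity could be as simple as a coroutine doing a GET request; a minimal sketch (the pan/rotate query parameters here are made up - check your camera's manual for the real ones):

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

IEnumerator PanAndRotate()
{
    // Hypothetical command URL; the parameter names depend on your camera.
    using (UnityWebRequest cmd = UnityWebRequest.Get("http://192.168.1.5?pan=50&rotate=90"))
    {
        yield return cmd.SendWebRequest();
        Debug.Log("Command sent, HTTP status: " + cmd.responseCode);
    }
}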
Other Frameworks:
AForge - a free framework that can handle both JPEG/MJPEG and FFMPEG streams from the camera. You have to modify it to work with Unity, and you should do so if you can't manage steps 2, 3, 4 and 5 yourself.
Related
I am writing an Android app in C# (using Avalonia) to communicate with my embedded USB device. The device has an ATxmega microcontroller and utilises the Atmel-provided CDC USB driver. I have a desktop version of the app using System.IO.Ports.SerialPort. I use SerialPort.Write() to write data and SerialPort.ReadByte() in a loop to read data - using SerialPort.Read() to read multiple bytes at once very often fails, while the looped byte-by-byte reading basically never does. It works on both Windows and Mac PCs.
The communication is very straightforward - the host sends commands and expects a known length of data to come in from the device.
On the Android app I am using the android.hardware.usb class and adapted the code from several of the serial port wrappers for CDC devices. Here is my connect function:
public static bool ConnectToDevice()
{
    if (Connected) return true;
    UsbDevice myDevice;
    if ((myDevice = GetConnectedDevice()) == null) return false;
    if (!HasPermission(myDevice)) return false;

    var transferInterface = myDevice.GetInterface(1);
    var controlInterface = myDevice.GetInterface(0);
    writeEndpoint = transferInterface.GetEndpoint(1);
    readEndpoint = transferInterface.GetEndpoint(0);
    deviceConnection = manager.OpenDevice(myDevice);
    if (deviceConnection != null)
    {
        deviceConnection.ClaimInterface(transferInterface, true);
        deviceConnection.ClaimInterface(controlInterface, true);
        SendAcmControlMessage(0x20, 0, parameterMessage);
        SendAcmControlMessage(0x22, 0x03, null); //Dtr Rts true
        Connected = true;
        OnConnect?.Invoke();
        return true;
    }
    return false;
}
The two control transfers are what I borrowed from the serial port wrappers and I use them to:
Set the default parameters - 115200 baud rate, 1 stop bit, no parity, 8 data bits - the same as in my embedded USB config. This doesn't seem necessary, but I do it anyway. The response from the control transfer is 7, so I assume it works properly.
Set DTR and RTS to true. This doesn't seem to be necessary either - the desktop code works with both set to false as well as to true. The control transfer response is 0, so I assume it also works properly.
I checked all endpoints and parameters after connection and they all seem to be correct - both read and write Endpoints are bulk endpoints with correct data direction.
Writing to my device is not a problem. I can use either the BulkTransfer method or UsbRequest, and they both work perfectly.
Unfortunately everything falls apart when trying to read data. The initial exchange with the device is as follows:
Host sends a command and expects 4 bytes of data.
Host sends another command and expects 160 bytes of data.
The bulk transfer method looks like this:
public static byte[] ReadArrayInBulk(int length)
{
    byte[] result = new byte[length];
    int transferred = deviceConnection.BulkTransfer(readEndpoint, result, length, 500);
    if (transferred != length)
    {
        transferred = deviceConnection.BulkTransfer(readEndpoint, result, length, 500);
    }
    return result;
}
The first transfer almost always (99.9%) returns 0, and then the second returns the proper data, up to a point. For the first exchange it does receive the 4 bytes correctly. However, for the second exchange, when it expects 160 bytes, the second transfer returns 80, 77, 22 or other random lengths of data. I tried making a loop and performing bulk transfers until the number of bytes transferred equals the expected amount (see the sketch below), but unfortunately, after receiving a random length of data for the first time, the bulk transfer starts always returning 0.
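Such an accumulation loop could look like this sketch (reconstructed from the description above, not the original code; it uses the offset overload of BulkTransfer, available since API 18, so each chunk lands after the bytes already received):

public static byte[] ReadArrayAccumulating(int length)
{
    byte[] result = new byte[length];
    int total = 0;
    while (total < length)
    {
        // Offset overload: append each chunk after the previous one.
        int transferred = deviceConnection.BulkTransfer(
            readEndpoint, result, total, length - total, 500);
        if (transferred < 0) break; // -1 signals a timeout/error
        total += transferred;
    }
    return result;
}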
So I tried the UsbRequest way to get the data:
public static byte[] GetArrayRequest(int length)
{
    byte[] result = new byte[length];
    UsbRequest req = new UsbRequest();
    req.Initialize(deviceConnection, readEndpoint);
    ByteBuffer buff = ByteBuffer.Wrap(result);
    req.Queue(buff, length);
    var resulting = deviceConnection.RequestWait();
    if (resulting != null) buff.Get(result);
    req.Close();
    return result;
}
The wrapped buffer does get filled with the expected amount of data (the position equals the length); unfortunately, the data is all zeroes. It does work for a couple of transfers once in a while, but 99% of the time it just returns zeroes.
So, given that the desktop version also has trouble reading data in bulk using the SerialPort.Read() method, but works perfectly when reading bytes one by one in a loop (SerialPort.ReadByte(), which underneath uses SerialPort.Read() for a single byte), I decided to try doing that in my app as well.
Unfortunately, when looping with the UsbRequest method, not only does it take ages, but it also always returns zeroes. When looping with the BulkTransfer method for a single-byte array, it returns 0 on the first transfer and -1 on the second, indicating a timeout.
I have spent so much time on this that I am at my wits' end. I checked multiple implementations of USB serial for Android (mik3y, kai-morich, felHR85), but all they do differently from me are the two control transfers, which I have now added - to no avail.
Can anyone tell me what could cause the BulkTransfer to always return 0 in response to the first transfer (which supposedly indicates success, but what does it actually do with no data read?) and only return some data, or time out (in the case of a single-byte transfer), on the second one? If it reliably behaved this way I would just get used to it, but it only seems to work correctly for smaller transfers, and only up to a certain point: the initial couple dozen bytes are correct, but then it stops receiving any more at random intervals. And why does the UsbRequest method fill the buffers, but with only zeroes 99% of the time?
I am trying to show AR video in Unity by connecting a hand-tracking camera device (Leap Motion: Windows Device Manager list: 0) and a webcam (Windows Device Manager list: 1).
The Leap Motion connects using the manufacturer's SDK and package files, so there is no need for additional coding to connect that camera in Unity.
The problem occurs when I connect the webcam (Windows Device Manager list: 1) in Unity to show the AR video.
When the following code is applied to an object while both the Leap Motion and the webcam are connected, the Leap Motion is recognized and output as video, and video output from the webcam becomes impossible.
If only the webcam is connected, after unplugging the Leap Motion from the PC, video output from the webcam works.
I want to output video by selecting the webcam (Windows Device Manager list: 1) on the object while both the Leap Motion and the webcam are connected to the PC.
Since I am a beginner in Unity, I need a simple modification of the code below.
Waiting for help.
using UnityEngine;
using System.Collections;

public class WebCam : MonoBehaviour
{
    // Use this for initialization
    void Start()
    {
        WebCamTexture web = new WebCamTexture(1280, 720, 60);
        GetComponent<MeshRenderer>().material.mainTexture = web;
        web.Play();
    }

    // Update is called once per frame
    void Update()
    {
    }
}
There is a constructor of WebCamTexture that takes the parameter
deviceName: The name of the video input device to be used.
You can list all available devices via WebCamTexture.devices and get the name like e.g.
var devices = WebCamTexture.devices;
var webcamTexture = new WebCamTexture(devices[1].name);
You might then also be able to filter out the device you need, e.g.
using System.Linq;
...
var device = devices.Select(d => d.name).FirstOrDefault(n => !n.Contains("Leap"));
To find out what the cameras are called and to be able to filter by name, you could print them all, e.g.
Debug.Log(string.Join("\n", devices.Select(d => d.name)));
Theoretically you could even feed them into a dropdown and let the user decide which device to use before creating the WebCamTexture; then you wouldn't have to hardcode the name at all ;)
Also note:
Call Application.RequestUserAuthorization before creating a WebCamTexture.
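Putting that together with the script from the question, a minimal sketch (assuming the Leap Motion's device name actually contains "Leap" - verify with the Debug.Log line above):

using System.Linq;
using UnityEngine;

public class WebCam : MonoBehaviour
{
    void Start()
    {
        var devices = WebCamTexture.devices;

        // Pick the first device whose name does not mention the Leap Motion.
        var deviceName = devices.Select(d => d.name).FirstOrDefault(n => !n.Contains("Leap"));
        if (deviceName == null)
        {
            Debug.LogWarning("No non-Leap webcam found");
            return;
        }

        var web = new WebCamTexture(deviceName, 1280, 720, 60);
        GetComponent<MeshRenderer>().material.mainTexture = web;
        web.Play();
    }
}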
I am trying to receive some video from a Parrot Bebop 2 drone.
I am using this Bebop.sdp file, which is supplied by Parrot.
I previously got this working in Python by setting an environment variable OPENCV_FFMPEG_CAPTURE_OPTIONS to protocol_whitelist;file,rtp,udp.
Since then, we have mostly ported that project to C#. But when trying to connect to the stream, we get this error: Protocol 'rtp' not on whitelist 'file,crypto'!
I have seen other examples where -protocol_whitelist "file,rtp,udp" is passed through ffmpeg arguments, but that does not seem to be a solution here, since I cannot pass it on.
First, I started with a simple test:
VideoCapture videoCapture = new VideoCapture(0);
var frame = videoCapture.QueryFrame();
while (frame != null)
{
    using (frame)
    {
        CvInvoke.Imshow("frame", frame);
        CvInvoke.WaitKey(1);
    }
    frame = videoCapture.QueryFrame();
}
This code gets the stream from the webcam and works.
I get the error when I run it with the SDP file:
VideoCapture videoCapture = new VideoCapture(@"./bebop.sdp");
var frame = videoCapture.QueryFrame();
while (frame != null)
{
    using (frame)
    {
        CvInvoke.Imshow("frame", frame);
        CvInvoke.WaitKey(1);
    }
    frame = videoCapture.QueryFrame();
}
I have tried to add both:
Environment.SetEnvironmentVariable("OPENCV_FFMPEG_CAPTURE_OPTIONS", "protocol_whitelist;file,rtp,udp");
And for a more aggressive approach:
Environment.SetEnvironmentVariable("OPENCV_FFMPEG_CAPTURE_OPTIONS", "protocol_whitelist;file,rtp,udp", EnvironmentVariableTarget.User);
Neither of them seems to have an impact, since I get the same error.
I would expect that setting the environment variable would make OpenCV whitelist the needed protocols, so the stream would come through and be displayed in the frame.
The environment variable approach was in fact working; however, Visual Studio needs to restart to pick up updated environment variables, so this was not discovered until today.
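In other words, the first, process-level variant is sufficient, as long as it runs before OpenCV's FFmpeg backend is first touched and the process (or IDE) has been restarted so no stale value lingers. A minimal sketch, assuming the Emgu CV API from the snippets above:

using System;
using Emgu.CV;

class Program
{
    static void Main()
    {
        // Must run before the first VideoCapture is created,
        // so FFmpeg sees the whitelist when it initializes.
        Environment.SetEnvironmentVariable(
            "OPENCV_FFMPEG_CAPTURE_OPTIONS",
            "protocol_whitelist;file,rtp,udp");

        using (var videoCapture = new VideoCapture(@"./bebop.sdp"))
        {
            var frame = videoCapture.QueryFrame();
            while (frame != null)
            {
                using (frame)
                {
                    CvInvoke.Imshow("frame", frame);
                    CvInvoke.WaitKey(1);
                }
                frame = videoCapture.QueryFrame();
            }
        }
    }
}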
I have a C# solution that solves a dynamics problem: I want to simulate the motion of masses in real time. I prepared a Unity3D exe file with some objects that are controlled by Unity's internal scripts, and it works internally, which is not useful for me. So in my external C# code, after solving the problem, I launch the exe file using Process.Start().
The problem is that I can't pass the position values to the Unity3D process. How is it possible to send my C# parameters to the running Unity process?
I have read many tutorials in which Unity gets input from the keyboard, mouse or other devices, but I want to send my code's parameters to Unity as input.
Is this possible?
Use Environment.GetCommandLineArgs() to get the arguments.
In Unity
using System;
using UnityEngine;

class YourScript : MonoBehaviour
{
    void Start()
    {
        string[] args = Environment.GetCommandLineArgs();
    }
}
Your code
Process.Start("your_program.exe", "arg1 arg2 ...");
If you're looking to just parse data when you launch with arguments, using Environment.GetCommandLineArgs() is good enough.
If you need to send and receive data both ways (or one way), you can use memory mapped files (docs).
Example:
// Creates a memory-mapped file in system memory with 1024 bytes of space.
var mmf = MemoryMappedFile.CreateOrOpen("memory", 1024);
var accessor = mmf.CreateViewAccessor();

// Writes 3.14 as a double to the memory at position 8.
accessor.Write(8, 3.14);

// Reads the double back from position 8.
double pi = accessor.ReadDouble(8);
This is certainly doable; I use this method of communication between my network process and the game engine to avoid assembly reloads when the editor compiles scripts during run time, so that I don't lose any connections.
And there is also another problem! The code I am using in Unity is:
void Update()
{
    using (var mmf = MemoryMappedFile.OpenExisting("memory"))
    {
        using (var accessor = mmf.CreateViewAccessor())
        {
            // Reads a float from position 1
            accessor.Read(1, out float omegay);
            float omegayy = Convert.ToSingle(omegay);
            transform.Rotate(new Vector3(0, omegayy, 0) * Time.deltaTime);
        }
    }
}
My parameter (omegay) is a negative value, but Unity animates it as a positive number!
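A likely culprit, judging from the two snippets: the writer stores a double (8 bytes) at position 8, while the Unity side reads a 4-byte float at position 1, so the bytes being read are not the bytes that were written, and the sign can easily come out wrong. A minimal sketch of making both sides agree (the value and position here are illustrative):

using System.IO.MemoryMappedFiles;

// Writer side (the external C# process): write a float at position 0.
// Keep mmf/accessor alive for as long as Unity needs to read the value.
var mmf = MemoryMappedFile.CreateOrOpen("memory", 1024);
var accessor = mmf.CreateViewAccessor();
accessor.Write(0, -1.5f); // float, position 0

// Unity side: read the same type at the same position.
using (var reader = MemoryMappedFile.OpenExisting("memory"))
using (var readAccessor = reader.CreateViewAccessor())
{
    readAccessor.Read(0, out float omegay); // float, position 0 - matches the writer
    // omegay is now -1.5, with the sign preserved.
}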