Let's say I have a wireless camera whose footage I want to stream to Unity in real time. Is there a way to achieve this?
Thanks in advance
I assume this is a camera with an Ethernet port or Wi-Fi that you can connect to and stream images from live.
If so, then yes, it can be done with Unity.
Here is how it is done without an external library:
Connecting to the Camera:
1. Connect to the same local network as the camera, or, if UPnP is supported, you can also connect to it over the internet. Usually you need the camera's IP address and port to do this. Let's say the camera's IP address is 192.168.1.5 and the port number is 900. The URL to connect to is http://192.168.1.5:900.
Sometimes it is simply a URL that ends with .mjpg or .bin, such as http://192.168.1.5/mjpg/video.mjpg or http://192.168.1.5/mjpg/video.bin.
Every camera is different, and the only way to find the URL is to read its manual. If the manual is not available, connect to the camera with its official application, then use Wireshark to discover the URL of the camera image. The username and the password (if required) can also be found in the manual. If they are not available, search for the model number online and you should find everything you need.
Extracting the JPEG from the Camera:
Once connected, the camera will send you an endless stream of data. You can scan this data and extract images from it.
2. Search for the JPEG header, which is 0xFF followed by 0xD8. If these two bytes are next to each other, start reading the bytes and saving them to an array. You can use an index (int) variable to keep count of how many bytes you have received.
int counter = 0;
byte[] completeImageByte = new byte[500000];
byte[] receivedBytes = new byte[500000];

// For every byte read from the camera stream:
receivedBytes[counter] = byteFromCamera;
counter++;
3. While reading the data from the camera, check whether the next two bytes are the JPEG footer, which is 0xFF followed by 0xD9. If so, you have received a complete image (one frame).
Your image bytes should look something like:
0xFF 0xD8 someotherbytes(thousands of them)... 0xFF 0xD9
Copy receivedBytes to the completeImageByte variable so that it can be used to display the image later on, then reset the counter variable to 0.
Buffer.BlockCopy(receivedBytes, 0, completeImageByte, 0, counter);
counter = 0;
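Steps 2 and 3 together amount to a small byte-scanning routine. Here is a minimal sketch of the whole thing; ProcessByte is a hypothetical helper that you would call for every single byte read from the camera stream:

```csharp
int counter = 0;
bool insideImage = false;
byte previousByte = 0;
bool updateFrame = false;
byte[] receivedBytes = new byte[500000];
byte[] completeImageByte = new byte[500000];

void ProcessByte(byte current)
{
    if (!insideImage)
    {
        // Step 2: look for the JPEG header, 0xFF followed by 0xD8
        if (previousByte == 0xFF && current == 0xD8)
        {
            insideImage = true;
            counter = 0;
            receivedBytes[counter++] = 0xFF;
            receivedBytes[counter++] = 0xD8;
        }
    }
    else
    {
        receivedBytes[counter++] = current;

        // Step 3: look for the JPEG footer, 0xFF followed by 0xD9
        if (previousByte == 0xFF && current == 0xD9)
        {
            // One complete frame received - copy it out for display
            System.Buffer.BlockCopy(receivedBytes, 0, completeImageByte, 0, counter);
            counter = 0;
            insideImage = false;
            updateFrame = true;
        }
    }
    previousByte = current;
}
```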
Displaying the JPEG Image to the Screen:
4. Display the image on the screen.
Since you will be receiving many images per second, the most efficient way I have found to display them is with the RawImage component. Don't use Image or Sprite Renderer for this if you want it to run on mobile devices.
public RawImage screenDisplay;
Texture2D camTexture;

if (updateFrame)
{
    camTexture.LoadImage(completeImageByte);
    screenDisplay.texture = camTexture;
}
You only need to do camTexture = new Texture2D(2, 2); once, in the Start() function; LoadImage resizes the texture automatically for every frame.
5. Jump back to step 2 and keep going until you want to quit.
API for connecting to the Camera:
Use HttpWebRequest if the camera requires authentication (username and password). For cameras that do not require authentication, use UnityWebRequest. When using UnityWebRequest, you must derive your own class from DownloadHandlerScript, otherwise your app will crash, because you will be receiving data non-stop.
Example of deriving your own class from DownloadHandlerScript:
using UnityEngine;
using System.Collections;
using UnityEngine.Networking;

public class CustomWebRequest : DownloadHandlerScript
{
    // Standard scripted download handler - allocates memory on each ReceiveData callback
    public CustomWebRequest()
        : base()
    {
    }

    // Pre-allocated scripted download handler.
    // Reuses the supplied byte array to deliver data and eliminates memory allocation.
    public CustomWebRequest(byte[] buffer)
        : base(buffer)
    {
    }

    // Required by the DownloadHandler base class. Called when you access the 'bytes' property.
    protected override byte[] GetData() { return null; }

    // Called once per frame when data has been received from the network.
    protected override bool ReceiveData(byte[] byteFromCamera, int dataLength)
    {
        if (byteFromCamera == null || byteFromCamera.Length < 1)
        {
            //Debug.Log("CustomWebRequest :: ReceiveData - received a null/empty buffer");
            return false;
        }

        // Search for the JPEG image here (steps 2 and 3)
        return true;
    }

    // Called when all data has been received from the server and delivered via ReceiveData.
    protected override void CompleteContent()
    {
        //Debug.Log("CustomWebRequest :: CompleteContent - DOWNLOAD COMPLETE!");
    }

    // Called when a Content-Length header is received from the server.
    protected override void ReceiveContentLength(int contentLength)
    {
        //Debug.Log(string.Format("CustomWebRequest :: ReceiveContentLength - length {0}", contentLength));
    }
}
Usage:
using UnityEngine;
using System.Collections;
using UnityEngine.Networking;

public class Test : MonoBehaviour
{
    CustomWebRequest camImage;
    UnityWebRequest webRequest;
    byte[] bytes = new byte[90000];

    void Start()
    {
        string url = "http://camUrl/mjpg/video.mjpg";
        webRequest = new UnityWebRequest(url);
        webRequest.downloadHandler = new CustomWebRequest(bytes);
        webRequest.Send(); // use SendWebRequest() on Unity 2017.2 and later
    }
}
You can then perform steps 2, 3, 4 and 5 in the ReceiveData function of the CustomWebRequest script.
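For example, ReceiveData can simply feed every byte it receives into the scanning logic from steps 2 and 3. A minimal sketch; ProcessByte is a hypothetical helper that does the 0xFF 0xD8 / 0xFF 0xD9 search described above:

```csharp
protected override bool ReceiveData(byte[] byteFromCamera, int dataLength)
{
    if (byteFromCamera == null || dataLength < 1)
    {
        return false;
    }

    // Only the first dataLength bytes are valid when a
    // pre-allocated buffer is reused between callbacks
    for (int i = 0; i < dataLength; i++)
    {
        ProcessByte(byteFromCamera[i]); // hypothetical steps 2-3 scanner
    }
    return true; // keep the download running
}
```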
Controlling Camera:
Cameras have commands to pan, rotate, flip, mirror and perform other functions. This is different for every camera, but it is as simple as making a GET/POST request to a URL on the camera and providing the query parameters. These commands can be found in the camera manual.
For example: http://192.168.1.5?pan=50&rotate=90
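A command like that can be sent with a plain GET request. A minimal sketch; the URL and the pan/rotate parameter names are made-up examples (your camera's manual has the real ones), and on Unity versions before 2017.2 you would call Send() instead of SendWebRequest():

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class CameraControl : MonoBehaviour
{
    // Hypothetical command URL - take the real query names from the manual
    IEnumerator SendPanCommand()
    {
        using (UnityWebRequest cmd = UnityWebRequest.Get("http://192.168.1.5?pan=50&rotate=90"))
        {
            yield return cmd.SendWebRequest();
            if (!string.IsNullOrEmpty(cmd.error))
            {
                Debug.Log("Camera command failed: " + cmd.error);
            }
        }
    }
}
```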
Other Frameworks:
AForge - A free framework that can handle both JPEG/MJPEG and FFMPEG streams from the camera. You have to modify it to work with Unity, and you should use it if you can't do steps 2, 3, 4 and 5 yourself.