
How to synchronize two USB cameras to use them as Stereo Camera?

Tags:

opencv

camera

I am trying to implement object detection using stereo vision in OpenCV, with two Logitech C310 cameras. However, the frames from the two cameras are not synchronized, and the time difference between their captures is not even constant.

  • How can the two cameras be synchronized?

  • Do ready-made stereo cameras like the Bumblebee or Minoru need this kind of synchronization as well?


Thanks for your response.

I am trying to implement person tracking on a moving robotic platform. I am using cvQueryFrame(capture) to grab each frame from the two cameras one after the other in a loop. Here is the relevant part of my code:

CvCapture* capture_1 = cvCreateCameraCapture(0);
CvCapture* capture_2 = cvCreateCameraCapture(1);
IplImage* frame_1;
IplImage* frame_2;

for (int i = 1; i <= 20; i++)
{
    // capture one frame from each camera, one after the other
    frame_1 = cvQueryFrame(capture_1);
    frame_2 = cvQueryFrame(capture_2);

    // processing of frames
}

Even if someone moves at a moderate speed in front of the cameras, the difference between frame_1 and frame_2 is clearly visible.


Is this delay because of cvQueryFrame(capture)?

asked Feb 10 '14 by user3291650
1 Answer

TL;DR

See my last code snippet "A simple workaround". That's how I did it.


Although I worked with VideoCapture in Python rather than CvCapture in C++, my solution might still apply to your problem. I also wanted to capture synchronized stereo images with OpenCV.

A naive attempt might be:

import cv2

# open both cameras
vidStreamL = cv2.VideoCapture(0)
vidStreamR = cv2.VideoCapture(2)

# read() grabs and decodes a frame in one call, one camera after the other
_, imgL = vidStreamL.read()
_, imgR = vidStreamR.read()

vidStreamL.release()
vidStreamR.release()

Problem 1: The second camera is only triggered after the first image is captured and retrieved from the camera, which takes some time.

A better way is to grab the frame first (tell the cameras to pin down the current frame) and to retrieve it afterwards:

vidStreamL = cv2.VideoCapture(0)
vidStreamR = cv2.VideoCapture(2)

# grab (pin down) the current frame on both cameras first...
vidStreamL.grab()
vidStreamR.grab()
# ...then do the slower decode/retrieve step for each
_, imgL = vidStreamL.retrieve()
_, imgR = vidStreamR.retrieve()

vidStreamL.release()
vidStreamR.release()

Problem 2: I still measured differences of about 200 ms (filming a watch that displays milliseconds). The reason is an internal capture buffer, described here. Unfortunately, it cannot always be easily deactivated (at least in my OpenCV version).
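Depending on your OpenCV version and capture backend, you may be able to shrink this buffer via the CAP_PROP_BUFFERSIZE property; many backends simply ignore the request, so treat this as a best-effort sketch rather than a guaranteed fix:

import cv2

vidStreamL = cv2.VideoCapture(0)
vidStreamR = cv2.VideoCapture(2)

# Ask the backend to keep only one frame in its internal buffer.
# set() typically returns False when the backend does not support the property.
okL = vidStreamL.set(cv2.CAP_PROP_BUFFERSIZE, 1)
okR = vidStreamR.set(cv2.CAP_PROP_BUFFERSIZE, 1)
print("buffer size request honored:", okL, okR)

If the buffer cannot be disabled this way, the workaround below still applies.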

A simple workaround is to grab frames multiple times until the capture buffer is empty before retrieving the actual images:

vidStreamL = cv2.VideoCapture(0)
vidStreamR = cv2.VideoCapture(2)

# grab several frames from both cameras to drain their internal capture buffers
for i in range(10):
    vidStreamL.grab()
    vidStreamR.grab()

# retrieve only the most recently grabbed frame from each camera
_, imgL = vidStreamL.retrieve()
_, imgR = vidStreamR.retrieve()

vidStreamL.release()
vidStreamR.release()

This solution works well for my case. I couldn't see any measurable difference (< 10 ms).
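If you want a rough sanity check on the remaining offset without filming a clock, you can timestamp the two grab() calls. This only bounds how far apart the grabs are issued (not the sensors' exposure times), so take it as an approximation:

import time
import cv2

vidStreamL = cv2.VideoCapture(0)
vidStreamR = cv2.VideoCapture(2)

t0 = time.perf_counter()
vidStreamL.grab()
t1 = time.perf_counter()
vidStreamR.grab()
t2 = time.perf_counter()

# t1 - t0 is also the delay between issuing the two grabs,
# i.e. an upper bound on the trigger skew between the cameras
print("grab L: %.1f ms, grab R: %.1f ms" % ((t1 - t0) * 1e3, (t2 - t1) * 1e3))

vidStreamL.release()
vidStreamR.release()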

(Problem 3:) Technically, the cameras are still not synchronized; normal USB webcams simply cannot do that. More professional cameras often provide an external trigger input to control exactly when a frame capture starts. This is beyond the scope of this post.

answered Oct 16 '22 by Falko