Rotation and Translation from Essential Matrix incorrect

I currently have a stereo camera setup. I have calibrated both cameras and have the intrinsic matrices K1 and K2 for the two cameras.

K1 = [2297.311,      0,       319.498;
      0,       2297.313,      239.499;
      0,             0,       1];

K2 = [2297.304,      0,       319.508;
      0,       2297.301,      239.514;
      0,             0,       1];

I have also determined the fundamental matrix F between the two cameras using findFundamentalMat() from OpenCV. I have tested the epipolar constraint using a pair of corresponding points x1 and x2 (in pixel coordinates), and it is very close to 0.

F = [5.672563368940768e-10, 6.265600996978877e-06, -0.00150188302445251;
     6.766518121363063e-06, 4.758206104804563e-08,  0.05516598334827842;
     -0.001627120880791009, -0.05934224611334332,   1];

x1 = [133; 75; 1];          % homogeneous pixel coordinates
x2 = [124.661; 67.6607; 1];

transpose(x2)*F*x1          % = -0.0020
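
As a sanity check over more than one pair, the residual can be evaluated for every correspondence at once. A minimal sketch, where allPoints1 and allPoints2 are hypothetical N-by-2 arrays of matched pixel coordinates:

N = size(allPoints1, 1);
h1 = [allPoints1, ones(N,1)]';    % 3-by-N homogeneous points in image 1
h2 = [allPoints2, ones(N,1)]';    % 3-by-N homogeneous points in image 2
residuals = sum(h2 .* (F*h1), 1); % transpose(x2)*F*x1 for every pair; near 0 for inliers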

From F I am able to obtain the essential matrix E as E = K2'*F*K1. I decompose E using the MATLAB SVD function to get the 4 possibilities of rotation and translation of camera 2 with respect to camera 1.

E = transpose(K2)*F*K1;

[U,S,V] = svd(E);

diag_110 = [1 0 0; 0 1 0; 0 0 0];
newE = U*diag_110*transpose(V); % force the two non-zero singular values to be equal
[U,S,V] = svd(newE);            % perform a second decomposition to get S = diag(1,1,0)

W = [0 -1 0; 1 0 0; 0 0 1];

R1 = U*W*transpose(V);
R2 = U*transpose(W)*transpose(V);
t1 = U(:,3);  % norm = 1
t2 = -U(:,3); % norm = 1
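
One caveat not mentioned above (a standard fix, added here as an aside): depending on the signs svd() returns in U and V, U*W*transpose(V) can have determinant -1, i.e. be a reflection rather than a rotation, so the determinant should be checked before enumerating the four candidates:

if det(R1) < 0, R1 = -R1; end % a reflection is not a valid rotation
if det(R2) < 0, R2 = -R2; end

candidates = {R1, t1; R1, t2; R2, t1; R2, t2}; % the four (R, t) pairs to test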

Let's say that camera 1 defines the coordinate frame in which we make all measurements, so its center is C1 = (0,0,0). It should then be possible to apply the correct rotation R and translation t such that C2 = R*[0;0;0] + t = t (i.e. the center of camera 2 is measured with respect to the center of camera 1).

Now take my corresponding pair x1 and x2. Since I know the physical pixel size of both cameras, and the focal length from the intrinsic matrix, I should be able to determine two vectors v1 and v2, one per camera, that intersect at the same point, as seen below.

pixel_length = 7.4e-6; % in meters
focal_length = 17e-3;  % in meters

dx1 = (133-319.5)*pixel_length; % x-distance from the principal point of the 640x480 image
dy1 = (75-239.5) *pixel_length; % y-distance from the principal point of the 640x480 image
v1  = [dx1; dy1; focal_length]; % ray from the camera-1 center (the origin) through the image point

dx2 = (124.661-319.5)*pixel_length; % same idea
dy2 = (67.6607-239.5)*pixel_length; % same idea
v2  = R*[dx2; dy2; focal_length] + t; % apply R and t to measure v2 in the camera-1 frame
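
Incidentally, the camera-1 ray direction can also be obtained, up to scale, by back-projecting the homogeneous pixel through K1, which avoids hard-coding the pixel size (note that 17e-3 / 7.4e-6 ≈ 2297.3, matching the calibrated focal lengths):

ray1 = K1 \ [133; 75; 1]; % proportional to [dx1; dy1; focal_length], scaled by 1/focal_length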

With these vectors and the line equation in parametric form, we can equate the two lines to triangulate, solving for the two scalar parameters s and t (the line parameter t, not the translation) with MATLAB's left-divide operator.

C1 + s*v1 = C2 + t*v2
C2 - C1 = [v1, -v2]*[s; t] % rearranged into Ax = B form to solve for s and t
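
Since noisy rays almost never intersect exactly, the 3-by-2 system is best solved in the least-squares sense, which is what the backslash operator does for an overdetermined system. A minimal sketch of this step (tvec stands for the chosen translation t, renamed to avoid clashing with the line parameter t):

C1 = [0; 0; 0];
C2 = tvec;          % center of camera 2 in the camera-1 frame
A  = [v1, -v2];     % 3-by-2 system matrix
b  = C2 - C1;
st = A \ b;         % least-squares solution for [s; t]
P  = C1 + st(1)*v1; % triangulated point, plugged back into the line equation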

With s and t determined, we can find the triangulated point by plugging back into the line equation. However, my process has not been successful: I cannot find a single (R, t) solution for which the point is in front of both cameras and both cameras point forward.

Is there something wrong with my pipeline or thought process? Is it at all possible to obtain each individual pixel ray?

asked Feb 02 '16 by user1431515
1 Answer

When you decompose the essential matrix into R and t, you get 4 different solutions. Three of them project the points behind one or both cameras, and one of them is correct. You have to test which one is correct by triangulating some sample points.
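
A minimal sketch of that test (my own illustration, not code from the answer), using the candidates cell array built in the question and a hypothetical triangulatePair helper; it assumes the convention that (R, t) map camera-1 coordinates to camera-2 coordinates:

for k = 1:size(candidates, 1)
    [R, t] = candidates{k, :};
    X1 = triangulatePair(x1, x2, K1, K2, R, t); % hypothetical helper: 3D point in the camera-1 frame
    X2 = R*X1 + t;                              % the same point in the camera-2 frame
    if X1(3) > 0 && X2(3) > 0                   % positive depth in both frames
        fprintf('Candidate %d places the point in front of both cameras.\n', k);
    end
end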

There is a function in the Computer Vision System Toolbox in MATLAB called cameraPose, which will do that for you.
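
If I remember the signature correctly (check the documentation for your release; the function was later renamed relativeCameraPose), the call looks roughly like this, where cameraParams1/2 come from calibration and inlierPoints1/2 are the correspondences used to estimate F:

% Signature from memory; verify against the docs for your MATLAB release.
[orientation, location] = cameraPose(F, cameraParams1, cameraParams2, ...
                                     inlierPoints1, inlierPoints2);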

answered Sep 20 '22 by Dima