I am trying to recover the movement of a camera by using the fundamental matrix and the algorithm given on Wikipedia. For this I need to find the fundamental matrix, and I am using cv::findFundamentalMat for this.
Two unexpected behaviours: with FM_RANSAC, yFx is far from zero for every point pair, and with FM_8POINT the result is different again (an all-zero matrix; details below). Have I not understood something here? Is my example false, or what is going on? Can anyone suggest a better test example?
Below is a minimal example: create 12 artificial points, shift each of those points 10 pixels to the right, find the fundamental matrix from these two sets of points, and print yFx for each point.
Example:
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main(int argc, const char* argv[])
{
    // Create two sets of points. Points in pts2 are moved 10 pixels to the right of the points in pts1.
    std::vector<cv::Point2f> pts1, pts2;
    for(double y = 0; y < 460; y += 150)
    {
        for(double x = 0; x < 320; x += 150)
        {
            pts1.push_back(cv::Point2f(x, y));
            pts2.push_back(cv::Point2f(x + 10.0, y));
        }
    }

    cv::Mat F = cv::findFundamentalMat(pts1, pts2);
    for(std::size_t i = 0; i < pts1.size(); i++)
    {
        // Creating p1, p2, the two points. Please let me know if this can be done in fewer lines.
        cv::Mat p1(3, 1, CV_64FC1), p2(3, 1, CV_64FC1);
        p1.at<double>(0) = pts1.at(i).x;
        p1.at<double>(1) = pts1.at(i).y;
        p1.at<double>(2) = 1.0;
        p2.at<double>(0) = pts2.at(i).x;
        p2.at<double>(1) = pts2.at(i).y;
        p2.at<double>(2) = 1.0;
        // Print yFx for each pair of points. This should be 0 for all.
        std::cout << p1.t() * F * p2 << std::endl;
    }
    return 0;
}
For FM_RANSAC I get
[1.999], [2], [2], [1.599], [1.599], [1.599], [1.198], [1.198], [1.198], [0.798], [0.798], [0.798]
For FM_8POINT the fundamental matrix is zeros(3,3), and thus yFx is 0 for all y and x.
I only found T and R estimation from essential matrix, but that didn't help much.
Edit: yFx is the wrong way round (p1 and p2 are switched in the cout line). This example also does not work, because all the points lie on a plane.
I believe that the fundamental matrix solves the equation p2.t() * F * p1 = 0, i.e. you have p1 and p2 reversed in your code. As to why the 8-point algorithm is returning the zero matrix, I have no idea, sorry.
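In code, the check in the question's loop would then read as follows (same variables as in the question, only the order of p1 and p2 swapped):

// Epipolar constraint: for a point x in image 1 and its match y in image 2,
// y^T * F * x = 0, so with the question's naming the product is p2^T * F * p1.
std::cout << p2.t() * F * p1 << std::endl;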
Edit: Okay, I believe I recall why the 8-point algorithm is producing a bad result here. Your motion between the two sets of points is a pure translation without rotation, i.e. it only has three degrees of freedom. The fundamental matrix has 7 degrees of freedom, so it is impossible to estimate; this is called a degenerate case. See this paper for a further description of degenerate cases in fundamental/essential matrix estimation.
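To make the degeneracy concrete (a sketch of my own, not part of the original answer): the question's edit already notes that all points lie on a plane, and in fact pts1 and pts2 are related exactly by a single homography; correspondences explained by one homography are a classic degenerate input for fundamental-matrix estimation.

// Sketch: with the pts1/pts2 from the question, the pure-translation
// homography H = [1 0 10; 0 1 0; 0 0 1] maps every pts1[i] onto pts2[i],
// so the matches pin down H but leave the epipole of F unconstrained.
cv::Mat H = cv::findHomography(pts1, pts2);
std::cout << H << std::endl;  // expect approximately [1 0 10; 0 1 0; 0 0 1]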
It might also be the case that there is no rigid transformation between the two viewpoints you get by artificially moving pixel coordinates, and thus there is no fundamental matrix satisfying the requirements. A better test case might be to use a function such as cv::warpPerspective with a known warp matrix.
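As a sketch of one way to build such a non-degenerate test (note this is my own variation: instead of warping an image, it projects synthetic non-coplanar 3D points with cv::projectPoints; the intrinsics and pose values below are made up for illustration):

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main()
{
    // Hypothetical intrinsics; the numbers are arbitrary test values.
    cv::Mat K = (cv::Mat_<double>(3, 3) << 500, 0, 160,
                                             0, 500, 120,
                                             0,   0,   1);
    // Known relative pose of the second camera: a small rotation plus a
    // nonzero translation, so a proper fundamental matrix exists.
    cv::Mat rvec = (cv::Mat_<double>(3, 1) << 0.05, -0.03, 0.02);
    cv::Mat tvec = (cv::Mat_<double>(3, 1) << 0.2, 0.0, 0.1);

    // Non-coplanar 3D points: the varying depth breaks the planar degeneracy.
    std::vector<cv::Point3f> pts3d;
    for (int i = 0; i < 12; ++i)
        pts3d.push_back(cv::Point3f(i % 4 - 1.5f, i / 4 - 1.0f,
                                    5.0f + 0.3f * (i % 5)));

    // Project the points into both views.
    std::vector<cv::Point2f> pts1, pts2;
    cv::Mat zeroVec = cv::Mat::zeros(3, 1, CV_64FC1);
    cv::projectPoints(pts3d, zeroVec, zeroVec, K, cv::noArray(), pts1);
    cv::projectPoints(pts3d, rvec, tvec, K, cv::noArray(), pts2);

    cv::Mat F = cv::findFundamentalMat(pts1, pts2, cv::FM_8POINT);

    // Check the epipolar constraint p2^T * F * p1 for each correspondence;
    // the printed values should now be close to zero.
    for (std::size_t i = 0; i < pts1.size(); ++i)
    {
        cv::Mat p1 = (cv::Mat_<double>(3, 1) << pts1[i].x, pts1[i].y, 1.0);
        cv::Mat p2 = (cv::Mat_<double>(3, 1) << pts2[i].x, pts2[i].y, 1.0);
        std::cout << p2.t() * F * p1 << std::endl;
    }
    return 0;
}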