
How to undistort points in camera shot coordinates and obtain corresponding undistorted image coordinates?

I use OpenCV to undistort a set of points after camera calibration. The code follows.

const int npoints = 2; // number of point specified 

// Points initialization. 
// Only 2 points in this example; in real code they are read from a file.
float input_points[npoints][2] = {{0,0}, {2560, 1920}}; 

CvMat * src = cvCreateMat(1, npoints, CV_32FC2);
CvMat * dst = cvCreateMat(1, npoints, CV_32FC2);

// fill src matrix
float * src_ptr = (float*)src->data.ptr;
for (int pi = 0; pi < npoints; ++pi) {
    for (int ci = 0; ci < 2; ++ci) {
        *(src_ptr + pi * 2 + ci) = input_points[pi][ci];
    }
}

cvUndistortPoints(src, dst, &camera1, &distCoeffs1);

After the code above, dst contains the following numbers:

-8.82689655e-001 -7.05507338e-001 4.16228324e-001 3.04863811e-001

which are far too small compared with the numbers in src.

At the same time, if I undistort the whole image via the call:

cvUndistort2( srcImage, dstImage, &camera1, &distCoeffs1 );

I get a good undistorted image, which means the pixel coordinates are not modified nearly as drastically as the separate points are.

How can I obtain the same undistortion for specific points as for images? Thanks.

asked Dec 14 '11 by sergtk

2 Answers

The points should be "unnormalized" using the camera matrix.

More specifically, after the call to cvUndistortPoints the following transformation should also be applied:

double fx = CV_MAT_ELEM(camera1, double, 0, 0);
double fy = CV_MAT_ELEM(camera1, double, 1, 1);
double cx = CV_MAT_ELEM(camera1, double, 0, 2);
double cy = CV_MAT_ELEM(camera1, double, 1, 2);

float * dst_ptr = (float*)dst->data.ptr;
for (int pi = 0; pi < npoints; ++pi) {
    float& px = *(dst_ptr + pi * 2);
    float& py = *(dst_ptr + pi * 2 + 1);
    // Perform the transformation.
    // In fact this is equivalent to multiplication by the camera matrix.
    px = px * fx + cx;
    py = py * fy + cy;
}
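
In matrix form, the loop above is just the standard pinhole projection by the camera matrix K:

    (u, v, 1)^T = K * (x', y', 1)^T,   where   K = [ fx 0 cx ; 0 fy cy ; 0 0 1 ]

so u = fx * x' + cx and v = fy * y' + cy, with (x', y') being the normalized coordinates returned by cvUndistortPoints.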

More info on the camera matrix can be found in the OpenCV 'Camera Calibration and 3D Reconstruction' documentation.

UPDATE:

The following C++ function call should work as well:

std::vector<cv::Point2f> inputDistortedPoints = ...
std::vector<cv::Point2f> outputUndistortedPoints;
cv::Mat cameraMatrix = ...
cv::Mat distCoeffs = ...

cv::undistortPoints(inputDistortedPoints, outputUndistortedPoints, cameraMatrix, distCoeffs, cv::noArray(), cameraMatrix);
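
For reference, here is a minimal, self-contained sketch of that call. The intrinsics and distortion coefficients below are made-up placeholders standing in for the question's camera1 and distCoeffs1, not values from the question:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
    // Placeholder intrinsics -- substitute the values from your own calibration.
    cv::Mat cameraMatrix = (cv::Mat_<double>(3, 3) <<
        1000.0,    0.0, 1280.0,
           0.0, 1000.0,  960.0,
           0.0,    0.0,    1.0);
    cv::Mat distCoeffs = (cv::Mat_<double>(1, 5) << -0.2, 0.05, 0.0, 0.0, 0.0);

    std::vector<cv::Point2f> inputDistortedPoints = { {0.f, 0.f}, {2560.f, 1920.f} };
    std::vector<cv::Point2f> outputUndistortedPoints;

    // Passing cameraMatrix as the last argument (P) projects the undistorted
    // normalized coordinates back to pixel coordinates, so no manual
    // "unnormalization" loop is needed afterwards.
    cv::undistortPoints(inputDistortedPoints, outputUndistortedPoints,
                        cameraMatrix, distCoeffs, cv::noArray(), cameraMatrix);

    for (const cv::Point2f& p : outputUndistortedPoints)
        std::cout << p.x << " " << p.y << std::endl;
    return 0;
}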
answered Oct 16 '22 by sergtk

It may be your matrix size :)

OpenCV expects a vector of points - a column or a row matrix with two channels. But because your input matrix holds only 2 points, and therefore has 2 columns, OpenCV cannot figure out whether the input is a row of points or a column.

So, fill a longer input mat, padding it with bogus values, and use only the first results:

const int npoints = 4; // number of point specified 

// Points initialization. 
// Only 2 points in this example; in real code they are read from a file.
float input_points[npoints][2] = {{0,0}, {2560, 1920}}; // the rest will be set to 0

CvMat * src = cvCreateMat(1, npoints, CV_32FC2);
CvMat * dst = cvCreateMat(1, npoints, CV_32FC2);

// fill src matrix
float * src_ptr = (float*)src->data.ptr;
for (int pi = 0; pi < npoints; ++pi) {
    for (int ci = 0; ci < 2; ++ci) {
        *(src_ptr + pi * 2 + ci) = input_points[pi][ci];
    }
}

cvUndistortPoints(src, dst, &camera1, &distCoeffs1);

EDIT

While the OpenCV documentation specifies that undistortPoints accepts only 2-channel input, it actually accepts

  • a 1-column, multi-row, 2-channel mat, or
  • a 2-column, multi-row, 1-channel mat (this case is not documented), or
  • a 1-row, multi-column, 2-channel mat

(as seen in undistort.cpp, line 390)

But a bug inside (or a lack of available info) makes it wrongly mix the second case with the third one when the number of columns is 2. So your data is treated as a 2-column, 2-row, 1-channel matrix.
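
As a rough sketch (illustrative only, using the same C API as the question; the variable names are hypothetical), the three accepted layouts listed above could be created like this:

// Three input layouts that cvUndistortPoints accepts:
CvMat * colPts2ch = cvCreateMat(npoints, 1, CV_32FC2); // 1 column, multi-row, 2 channels
CvMat * colPts1ch = cvCreateMat(npoints, 2, CV_32FC1); // 2 columns, multi-row, 1 channel
CvMat * rowPts2ch = cvCreateMat(1, npoints, CV_32FC2); // 1 row, multi-column, 2 channels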

answered Oct 16 '22 by Sam