I have calibrated my camera with OpenCV (findChessboardCorners etc.), so I have:

- Camera distortion coefficients and intrinsics matrix
- Camera pose information (translation and rotation, computed separately via other means) as Euler angles and a 4x4 matrix
- 2D points within the camera frame
How can I convert these 2D points into 3D unit vectors pointing out into the world? I tried using cv::undistortPoints, but that doesn't seem to do it (it only returns remapped 2D points), and I'm not sure what matrix math to use to model the camera from the intrinsics I have.
Convert your 2D point into a homogeneous point (give it a third coordinate equal to 1) and then multiply by the inverse of your camera intrinsics matrix. For example:
cv::Matx31f hom_pt(point_in_image.x, point_in_image.y, 1);
hom_pt = camera_intrinsics_mat.inv() * hom_pt; // back-project into normalized camera coordinates
cv::Point3f origin(0, 0, 0);
cv::Point3f direction(hom_pt(0), hom_pt(1), hom_pt(2));
// To get a unit vector, direction just needs to be normalized
direction *= 1.0f / cv::norm(direction);
origin and direction now define the ray in camera coordinates corresponding to that image point. Note that the origin here is the camera center; you can use your camera pose to transform the ray to a different origin. The distortion coefficients map from your actual camera to the pinhole camera model and should be applied at the very beginning to find your true 2D coordinate. The steps then are:

1. Use the distortion coefficients to undistort the image point (cv::undistortPoints).
2. Convert the undistorted point to a ray as above.
3. Use the camera pose to transform the ray's origin and direction into world coordinates.
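Putting those steps together, here is a minimal sketch of the whole pipeline. The names K, dist_coeffs, and cam_to_world are placeholders for the intrinsics, distortion coefficients, and 4x4 camera-to-world pose you already have; adapt them to your own data.

#include <opencv2/core.hpp>
#include <opencv2/calib3d.hpp>
#include <vector>

cv::Point3f pixel_to_world_ray(const cv::Point2f& pixel,
                               const cv::Mat& K,
                               const cv::Mat& dist_coeffs,
                               const cv::Matx44f& cam_to_world)
{
    // 1. Undistort and normalize: with no P argument, undistortPoints
    //    already multiplies by K^-1, so the result is in normalized
    //    camera coordinates.
    std::vector<cv::Point2f> src{pixel}, dst;
    cv::undistortPoints(src, dst, K, dist_coeffs);

    // 2. Build the ray direction in camera coordinates.
    cv::Matx31f dir_cam(dst[0].x, dst[0].y, 1.0f);

    // 3. Rotate into world coordinates using the pose's 3x3 rotation block
    //    (translation only moves the ray origin, not its direction).
    cv::Matx33f R(cam_to_world(0,0), cam_to_world(0,1), cam_to_world(0,2),
                  cam_to_world(1,0), cam_to_world(1,1), cam_to_world(1,2),
                  cam_to_world(2,0), cam_to_world(2,1), cam_to_world(2,2));
    cv::Matx31f dir_world = R * dir_cam;

    // 4. Normalize to a unit vector.
    cv::Point3f direction(dir_world(0), dir_world(1), dir_world(2));
    direction *= 1.0f / cv::norm(direction);

    // The ray origin in world coordinates is the camera center, i.e. the
    // pose's translation column:
    // cv::Point3f origin(cam_to_world(0,3), cam_to_world(1,3), cam_to_world(2,3));
    return direction;
}

Because cv::undistortPoints (called without a P matrix) already returns normalized coordinates, the explicit multiplication by the inverse intrinsics from the earlier snippet is not repeated here.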