 

Doubts about camera calibration

I am working on a machine vision project. A wide-angle lens with a high-resolution pinhole camera is being used.

Working distance: the distance between the camera and the object.

The resolution will be nearly 10 MP; the image size may be 3656 pixels wide by 2740 pixels high. The project requirements are as follows:

  1. The working distance must be nearly 5 metres.
  2. The camera needs to be tilted at an angle of 13 degrees.

To correct for lens distortion, I perform camera calibration using OpenCV.
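
For context, here is a minimal sketch (Python) of the undistortion step, assuming the intrinsic matrix and distortion coefficients have already been produced by a calibration run and saved to the hypothetical files camera_matrix.npy and dist_coeffs.npy; gathering the checkerboard correspondences and running cv2.calibrateCamera is sketched further down this page.

```python
import cv2
import numpy as np

# Hypothetical file names: K (3x3 intrinsics) and dist ((k1, k2, p1, p2, k3))
# come from a one-time calibration run and are simply reloaded here.
K = np.load("camera_matrix.npy")
dist = np.load("dist_coeffs.npy")

img = cv2.imread("scene.png")          # an image taken at the 5 m working distance
h, w = img.shape[:2]

# alpha=1 keeps all source pixels (black borders); alpha=0 crops to valid pixels only.
new_K, roi = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), 1, (w, h))
undistorted = cv2.undistort(img, K, dist, None, new_K)
cv2.imwrite("scene_undistorted.png", undistorted)
```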

Below are my doubts about this camera calibration:

  1. Since the working distance is 5 metres, should the camera calibration also be done at the same distance?

  2. Since the camera is tilted by 13 degrees in the application, is it necessary to do the calibration with the camera tilted at the same angle?

asked Feb 14 '13 by dip


People also ask

Why is camera calibration important where it is applied?

Camera calibration aims to determine the geometric parameters of the image formation process [1]. This is a crucial step in many computer vision applications, especially when metric information about the scene is required.

How many points are needed for camera calibration?

To estimate the camera parameters, you need to have 3-D world points and their corresponding 2-D image points. You can get these correspondences using multiple images of a calibration pattern, such as a checkerboard.
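
As a rough sketch of how these correspondences can be gathered with OpenCV in Python (the 9×6 inner-corner pattern, the 25 mm square size and the calib/ folder are assumptions for illustration, not values from the question):

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)        # inner corners of the checkerboard (assumed)
square = 25.0           # square size in millimetres (assumed)

# 3-D coordinates of the corners in the board's own frame (Z = 0 plane).
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)

for path in glob.glob("calib/*.png"):           # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        continue
    corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
    obj_points.append(objp)        # same board geometry for every view
    img_points.append(corners)     # detected 2-D corners for this view

# One calibration over all views; rms is the reprojection error in pixels.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
np.save("camera_matrix.npy", K)
np.save("dist_coeffs.npy", dist)
```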

How many pictures does it take to calibrate a camera?

You need at least two images to calibrate each camera separately, just to get the intrinsics. If you have already calibrated each camera separately, then yes, you can use a single pair of checkerboard images to get R and t. However, you will not get very good accuracy.
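
For the stereo step mentioned above, a hedged sketch using cv2.stereoCalibrate; the per-camera intrinsics K1/d1 and K2/d2, the shared checkerboard correspondences obj_pts, img_pts1, img_pts2, and image_size are placeholders assumed to exist already:

```python
import cv2

# Placeholders: obj_pts, img_pts1, img_pts2 are lists of checkerboard points seen
# simultaneously by both cameras; K1, d1, K2, d2 are the intrinsics from the
# separate single-camera calibrations; image_size is (width, height).
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 1e-5)
rms, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, img_pts1, img_pts2, K1, d1, K2, d2, image_size,
    flags=cv2.CALIB_FIX_INTRINSIC,      # keep the already-estimated intrinsics
    criteria=criteria)
# R, T give the pose of the second camera relative to the first.
```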

Why do I need to calibrate my camera?

As such, calibration helps to determine the internal camera parameters, including the focal length, the center of the image sensor, the rectangularity of a pixel, and the aspect ratio. It also helps to determine the camera's external parameters, such as its position and orientation relative to a calibration target (for example, AprilTags).
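
As a small illustration, the internal parameters can be collected into a 3×3 intrinsic matrix and the external parameters into a rotation and translation; the numbers below are invented, only loosely matching the 3656×2740 sensor mentioned in the question:

```python
import numpy as np

# Internal (intrinsic) parameters: focal lengths fx, fy in pixels (their ratio is
# the pixel aspect ratio), principal point (cx, cy), and skew s for non-rectangular
# pixels. All values here are illustrative only.
fx, fy, cx, cy, s = 3200.0, 3200.0, 1828.0, 1370.0, 0.0
K = np.array([[fx, s,  cx],
              [0., fy, cy],
              [0., 0., 1.]])

# External (extrinsic) parameters: rotation R and translation t of the camera
# relative to a world/target frame, e.g. a camera looking at a target 5 m away.
R = np.eye(3)
t = np.array([[0.0], [0.0], [5.0]])
P = K @ np.hstack([R, t])      # 3x4 projection matrix mapping world points to pixels
```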

What is accurate camera calibration?

Accurate camera calibration and orientation procedures are a necessary prerequisite for the extraction of precise and reliable 3D metric information from images. A camera is considered calibrated if the principal distance, principal point offset and lens distortion parameters are known.

What are the distortion parameters in camera calibration?

The radial distortion can be modeled using three parameters as follows:

x_distorted = x (1 + k₁r² + k₂r⁴ + k₃r⁶)
y_distorted = y (1 + k₁r² + k₂r⁴ + k₃r⁶)

The tangential distortion can be modeled using two parameters as follows:

x_distorted = x + [2p₁xy + p₂(r² + 2x²)]
y_distorted = y + [p₁(r² + 2y²) + 2p₂xy]

Here (x, y) are normalized image coordinates and r² = x² + y². As we can see, there are a total of five distortion parameters: k₁, k₂, k₃, p₁ and p₂. Finding the distortion parameters is the final purpose of camera calibration.
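
A small Python sketch applying that five-parameter model to normalized image coordinates; the coefficient values in the example call are invented for illustration:

```python
def distort(x, y, k1, k2, k3, p1, p2):
    """Apply the radial (k1, k2, k3) and tangential (p1, p2) model to
    normalized image coordinates, i.e. after dividing out the intrinsics."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

# A point near the image corner moves noticeably under made-up coefficients.
print(distort(0.5, 0.4, -0.25, 0.05, 0.0, 0.001, -0.0005))
```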

How to calibrate depth map with uncalibrated camera?

Unfortunately, with uncalibrated cameras you can only get the depth map up to scale, i.e. in unknown units. For that you would need to estimate the fundamental matrix from pairs of matching points, rectify the images, and compute the disparity map.
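
A rough OpenCV (Python) sketch of that uncalibrated pipeline; the file names left.png/right.png and the ORB/SGBM settings are illustrative assumptions:

```python
import cv2
import numpy as np

imgL = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical stereo pair
imgR = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# 1. Match features and estimate the fundamental matrix with RANSAC.
orb = cv2.ORB_create(4000)
kpL, desL = orb.detectAndCompute(imgL, None)
kpR, desR = orb.detectAndCompute(imgR, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(desL, desR)
ptsL = np.float32([kpL[m.queryIdx].pt for m in matches])
ptsR = np.float32([kpR[m.trainIdx].pt for m in matches])
F, mask = cv2.findFundamentalMat(ptsL, ptsR, cv2.FM_RANSAC, 3.0, 0.99)

# 2. Projective rectification without intrinsics.
h, w = imgL.shape
ok, H1, H2 = cv2.stereoRectifyUncalibrated(
    ptsL[mask.ravel() == 1], ptsR[mask.ravel() == 1], F, (w, h))
rectL = cv2.warpPerspective(imgL, H1, (w, h))
rectR = cv2.warpPerspective(imgR, H2, (w, h))

# 3. Disparity map; without calibration the depth is only known up to scale.
sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = sgbm.compute(rectL, rectR).astype(np.float32) / 16.0
```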


2 Answers

My answer is "maybe" to the first question, and "no" to the second.

While it is true that it is not strictly necessary to calibrate with the target at the same or a nearby distance as the subject, in practice this is possible only if you have enough depth of field (in particular, if you are focused at infinity) and use a fixed iris.

The reason is the Second Rule of Camera Calibration: "Thou shalt not touch the lens during or after calibration". In particular, you may not refocus nor change the f-stop, because both focusing and iris affect the nonlinear lens distortion and (albeit less so, depending on the lens) the field of view. Of course, you are completely free to change the exposure time, as it does not affect the lens geometry at all.

See also, for general comment, this other answer of mine.

answered Oct 03 '22 by Francesco Callari


The answer is no to both questions. Camera calibration essentially finds the relationship between the focal length and the pixel plane under the pinhole camera model, and optionally (as you will require because of your wide-angle lens) the radial distortion. These relationships are independent of the position of the camera in the world.

By the way, I see you tagged this as matlab: I can recommend the Camera Calibration Toolbox for MATLAB as a nice, easy way of calibrating cameras. It guides you through the process nicely.

answered Oct 03 '22 by devrobf