I am working on a machine vision project. A wide-angle lens with a high-resolution pinhole camera is being used.
Working distance: the distance between the camera and the object.
The resolution will be nearly 10 MP; the image size may be 3656 pixels wide and 2740 pixels high. The project requirements are as mentioned below.
To avoid lens distortion, I do camera calibration using OpenCV.
Below are my doubts pertaining to this camera calibration:
Since the working distance is 5 meters, should the camera calibration also be done at the same distance?
Since the camera is tilted by 13° in the application, is it necessary to do the calibration with the camera tilted at the same angle?
Camera calibration aims to determine the geometric parameters of the image formation process [1]. This is a crucial step in many computer vision applications, especially when metric information about the scene is required.
To estimate the camera parameters, you need 3-D world points and their corresponding 2-D image points. You can get these correspondences using multiple images of a calibration pattern, such as a checkerboard.
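Here is a minimal sketch of that correspondence-gathering and calibration step with OpenCV's Python bindings; the 9×6 inner-corner pattern, the 25 mm square size, and the calib_*.png file pattern are assumptions for illustration:

    import glob
    import cv2
    import numpy as np

    pattern_size = (9, 6)   # inner corners of the checkerboard (assumed)
    square_size = 25.0      # square edge length in mm (assumed)

    # 3-D corner coordinates in the board's own frame (Z = 0 plane).
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
    objp *= square_size

    obj_points, img_points = [], []
    for fname in glob.glob("calib_*.png"):
        gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if found:
            # Refine detections to sub-pixel accuracy.
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_points.append(objp)
            img_points.append(corners)

    # Estimate the intrinsics K, distortion coefficients, and per-view extrinsics.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print("RMS reprojection error (pixels):", rms)

An RMS reprojection error well under a pixel is a reasonable sanity check for a calibration of this kind.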
You need at least two images to calibrate each camera separately, just to get the intrinsics. If you have already calibrated each camera separately, then yes, you can use a single pair of checkerboard images to recover the relative pose R and t between the two cameras; however, you will not get very good accuracy.
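For that stereo case, a hedged sketch using cv2.stereoCalibrate, continuing the snippet above; K1, dist1 and K2, dist2 are assumed to come from per-camera calibrations, image_size is the common image size, and img_points_left/img_points_right hold matched board detections from the two cameras:

    # obj_points, img_points_left, img_points_right, K1, dist1, K2, dist2,
    # and image_size are assumed to come from per-camera calibrations as above.
    flags = cv2.CALIB_FIX_INTRINSIC  # keep the already-estimated intrinsics fixed
    rms, K1, dist1, K2, dist2, R, T, E, F = cv2.stereoCalibrate(
        obj_points, img_points_left, img_points_right,
        K1, dist1, K2, dist2, image_size, flags=flags)
    # R, T express the pose of the second camera relative to the first.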
As such, calibration helps to determine the internal camera parameters, including the focal length, the center of the image sensor (the principal point), pixel skew (rectangularity), and aspect ratio. It also helps to determine the camera's external parameters, i.e. its position and orientation, for example relative to fiducial markers such as AprilTags.
Accurate camera calibration and orientation procedures are a necessary prerequisite for the extraction of precise and reliable 3D metric information from images. A camera is considered calibrated if the principal distance, principal point offset and lens distortion parameters are known.
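For reference, these internal parameters are conventionally collected into the intrinsic matrix K, with fx and fy the focal lengths in pixels, (cx, cy) the principal point offset, and s the skew:

        [ fx   s  cx ]
    K = [  0  fy  cy ]
        [  0   0   1 ]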
The radial distortion can be modeled using three parameters as follows, where (x, y) are the normalized image coordinates and r² = x² + y²:

    x_distorted = x(1 + k₁r² + k₂r⁴ + k₃r⁶)
    y_distorted = y(1 + k₁r² + k₂r⁴ + k₃r⁶)

The tangential distortion can be modeled using two parameters as follows:

    x_distorted = x + [2p₁xy + p₂(r² + 2x²)]
    y_distorted = y + [p₁(r² + 2y²) + 2p₂xy]

As we can see, there are a total of five distortion parameters: k₁, k₂, k₃, p₁ and p₂. Finding the distortion parameters is the final purpose of camera calibration.
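Once K and these five coefficients are known (OpenCV stores them in the order k₁, k₂, p₁, p₂, k₃), undistorting an image is straightforward; a minimal sketch, reusing K and dist from the calibration sketch above, with a placeholder input image:

    img = cv2.imread("scene.png")  # placeholder distorted input image
    h, w = img.shape[:2]
    # Optionally refine K for the undistorted view; alpha=0 crops to valid pixels.
    new_K, roi = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), alpha=0)
    undistorted = cv2.undistort(img, K, dist, None, new_K)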
Unfortunately, with uncalibrated cameras you can only get the depth map up to scale, i.e. in unknown units. For that you would need to estimate the fundamental matrix from pairs of matching points, rectify the images, and compute the disparity map.
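A hedged sketch of that uncalibrated pipeline; pts1/pts2 (matched Nx2 point arrays, e.g. from a feature matcher) and left/right (the two grayscale images) are assumed inputs:

    # pts1, pts2: Nx2 float32 arrays of matched points (assumed given);
    # left, right: the two grayscale images (assumed given).
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC)

    # Uncalibrated rectification derived from the fundamental matrix.
    h, w = left.shape[:2]
    ok, H1, H2 = cv2.stereoRectifyUncalibrated(pts1, pts2, F, (w, h))
    left_r = cv2.warpPerspective(left, H1, (w, h))
    right_r = cv2.warpPerspective(right, H2, (w, h))

    # Block-matching disparity; depth derived from it is only known up to scale.
    stereo = cv2.StereoBM_create(numDisparities=128, blockSize=15)
    disparity = stereo.compute(left_r, right_r)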
My answer is "maybe" to the first question, and "no" to the second.
While it is true that it is not strictly necessary to calibrate with the target at the same or nearby distance as the subject, in practice this is possible only if you have enough depth of field (in particular, if you are focused at infinity) and use a fixed iris.
The reason is the Second Rule of Camera Calibration: "Thou shalt not touch the lens during or after calibration". In particular, you may not refocus nor change the f-stop, because both focusing and iris affect the nonlinear lens distortion and (albeit less so, depending on the lens) the field of view. Of course, you are completely free to change the exposure time, as it does not affect the lens geometry at all.
See also this other answer of mine for general comments.
The answer is no to both questions. Camera calibration essentially finds the relationship between the focal length and the pixel plane under the pinhole camera model, and optionally (as you will require due to your wide-angle lens) the radial distortion. These relationships are independent of the position of the camera in the world.
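One way to see this separation in OpenCV: cv2.projectPoints takes the intrinsics (K, dist) and the per-view pose (rvec, tvec) as independent arguments, so intrinsics calibrated at one distance and tilt can, in principle, be reused at another. A minimal sketch, assuming K and dist come from a prior calibration; the points and pose values below are made up for illustration:

    import cv2
    import numpy as np

    # Hypothetical 3-D points (in metres) in the world frame.
    pts3d = np.array([[0.0, 0.0, 5.0],
                      [0.1, 0.0, 5.0],
                      [0.0, 0.1, 5.0]], np.float32)

    # Pose for this particular view: 13 degree tilt about the x-axis at a
    # 5 m working distance. Changing rvec/tvec does not touch K or dist.
    rvec = np.array([np.deg2rad(13.0), 0.0, 0.0], np.float32)
    tvec = np.zeros(3, np.float32)

    img_pts, _ = cv2.projectPoints(pts3d, rvec, tvec, K, dist)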
By the way, I see you tagged this as matlab: I can recommend the Camera Calibration Toolbox for MATLAB as a nice, easy way of calibrating cameras. It guides you through the process nicely.