I got the camera intrinsic matrix and distortion parameters from camera calibration.
The unit of the focal length is pixels, I guess.
Then, how can I calculate the field of view (along y)?
Is this formula right?
double fov_y = 2*atan(height/2/fy)*180/CV_PI;
I'll use it as a parameter of gluPerspective().
The intrinsic matrix (commonly represented in equations as K) allows you to transform 3D coordinates to 2D coordinates on an image plane using the pinhole camera model. The values fx and fy are the focal lengths in pixels; they are identical for square pixels.
The intrinsic parameters represent the optical center and focal length of the camera. World points are first transformed to camera coordinates using the extrinsic parameters, and the camera coordinates are then mapped onto the image plane using the intrinsic parameters.
Principal point offset, x0, y0: the camera's "principal axis" is the line perpendicular to the image plane that passes through the pinhole. Its intersection with the image plane is referred to as the "principal point".
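To make the mapping concrete, here is a minimal numpy sketch of the pinhole projection; the intrinsic values and the 3D point below are made up purely for illustration:

import numpy as np

# Hypothetical intrinsics: focal lengths and principal point, all in pixels
fx, fy = 1027.3, 1026.9
cx, cy = 640.0, 360.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# A 3D point already expressed in camera coordinates (i.e. after the extrinsics)
X_cam = np.array([0.2, -0.1, 2.5])

# Pinhole projection: multiply by K, then divide by depth to get pixel coordinates
uvw = K @ X_cam
u, v = uvw[:2] / uvw[2]
print(f"pixel coordinates: ({u:.1f}, {v:.1f})")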
OpenCV has a function that does this: cv::calibrationMatrixValues. Looking at the implementation (available on GitHub), given an image with dimensions w x h and a camera matrix

K = | fx  0   cx |
    | 0   fy  cy |
    | 0   0   1  |

the equations for the field of view (in degrees) are:

fov_x = (atan2(cx, fx) + atan2(w - cx, fx)) * 180 / pi
fov_y = (atan2(cy, fy) + atan2(h - cy, fy)) * 180 / pi
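If OpenCV is available in Python, the same values can be obtained directly from that function; a minimal sketch, with a hypothetical camera matrix, resolution and sensor size:

import numpy as np
import cv2

w, h = 1280, 720  # hypothetical image resolution (pixels)
K = np.array([[1027.3, 0.0, 640.0],
              [0.0, 1026.9, 360.0],
              [0.0, 0.0, 1.0]])

# Hypothetical physical sensor size in mm; it only affects the focal-length
# and principal-point outputs, not the field-of-view outputs.
aperture_w_mm, aperture_h_mm = 6.4, 4.8

fov_x, fov_y, focal_mm, principal_pt, aspect = cv2.calibrationMatrixValues(
    K, (w, h), aperture_w_mm, aperture_h_mm)
print(f"fov_x = {fov_x:.1f} deg, fov_y = {fov_y:.1f} deg")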
In continuation of @mallwright's answer, here is a bit of Python/numpy code to compute the field of view from the image resolution and focal lengths (in pixels):
import numpy as np
# Image resolution and focal lengths (all in pixels)
w, h = 1280, 720
fx, fy = 1027.3, 1026.9
# Horizontal and vertical field of view, converted to degrees
fov_x = np.rad2deg(2 * np.arctan2(w, 2 * fx))
fov_y = np.rad2deg(2 * np.arctan2(h, 2 * fy))
print("Field of View (degrees):")
print(f" {fov_x = :.1f}\N{DEGREE SIGN}")
print(f" {fov_y = :.1f}\N{DEGREE SIGN}")
Output:
Field of View (degrees):
fov_x = 63.8°
fov_y = 38.6°
Note that this assumes that the principal point is at the center of the image and that there is no distortion; for the general formula that accounts for the principal point, see the equations above.
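To plug the result into gluPerspective() as the question intends, here is a minimal sketch using PyOpenGL (assumed here); the clip planes and viewport size are made-up values:

from OpenGL.GL import glMatrixMode, glLoadIdentity, GL_PROJECTION
from OpenGL.GLU import gluPerspective

def setup_projection(fov_y_deg, width, height, z_near=0.1, z_far=100.0):
    # Requires a current OpenGL context; applies the calibrated vertical FOV
    # to the fixed-function projection matrix.
    glMatrixMode(GL_PROJECTION)
    glLoadIdentity()
    gluPerspective(fov_y_deg, width / height, z_near, z_far)

# Example: use the fov_y computed above for a 1280x720 viewport
# setup_projection(fov_y, 1280, 720)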