I'm using an ARSCNView from ARKit to display a live video feed from the camera on the iPad. I have the ARSCNView object set up exactly as in Xcode's Augmented Reality App template. I was wondering if there is a way to get the field of view of the camera?
@IBOutlet var sceneView: ARSCNView!

func start() {
    sceneView.delegate = self
    sceneView.session.run(ARWorldTrackingConfiguration())
    // Retrieve camera FOV here
}
There are a couple of ways to go here, and a possible false start to beware of.
If you're already working with ARKit via SceneKit (ARSCNView), you might assume that ARKit is automatically updating the SceneKit camera (the view's pointOfView's camera) to match the projection transform used by ARKit. This is correct.
However, ARKit directly sets the SCNCamera's projectionTransform. When you work with geometric properties of SCNCamera like zNear, zFar, and fieldOfView, SceneKit derives a projection matrix for use in rendering. But if you set projectionTransform directly, there's no math that can recover the near/far and xFov/yFov values, so the corresponding SCNCamera properties are invalid. That is, sceneView.pointOfView.camera.fieldOfView and similar APIs always return bogus results for an ARKit app.
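If you want to see this for yourself, here's a quick check (a sketch, assuming the sceneView outlet from the question and a session that's already running). It prints the stale SceneKit value alongside the FOV implied by ARKit's actual projection matrix, using the math derived below:

func logFieldOfView() {
    guard let frame = sceneView.session.currentFrame,
          let camera = sceneView.pointOfView?.camera else { return }
    // SceneKit's property isn't derived from the projectionTransform
    // ARKit sets, so it just reports a stale default:
    print("SCNCamera.fieldOfView:", camera.fieldOfView)
    // The FOV actually in use, recovered from the projection matrix:
    let yScale = frame.camera.projectionMatrix[1,1]
    print("actual vertical FOV (degrees):", 2 * atan(1/yScale) * 180 / Float.pi)
}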
So, what can you do instead? Read on...
Projection Matrix

An AR session continually vends ARFrame objects through its delegate, or you can request the currentFrame from it. Each frame has an ARCamera attached that describes the imaging parameters, one of which is a projectionMatrix that's dependent on field of view. (There's also the aforementioned SceneKit projectionTransform, which is the same matrix.)
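For instance, you could read the camera from the delegate callback like this (a minimal sketch; SessionWatcher is a hypothetical class name, not part of the template — you can equivalently poll session.currentFrame?.camera whenever you need it):

import ARKit

class SessionWatcher: NSObject, ARSessionDelegate {
    // Called once per ARFrame the session vends while running.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let projection = frame.camera.projectionMatrix
        print(projection) // solve for FOV as shown below
    }
}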
A standard 3D projection matrix includes scaling terms that are based on the vertical field of view and aspect ratio. Specifically, the matrix looks like this:
[ xScale    0      0    0  ]    xScale = yScale / aspectRatio (aspectRatio = width/height)
[   0    yScale    0    0  ]    yScale = 1 / tan(yFov/2)
[   0       0     nf1  nf2 ]    nf1 and nf2 relate to near and far clip planes,
[   0       0     -1    0  ]      so aren't relevant to field of view
So you should be able to get yFov by solving the yScale equation:
let projection = session.currentFrame!.camera.projectionMatrix
let yScale = projection[1,1]
let yFov = 2 * atan(1/yScale) // in radians
let yFovDegrees = yFov * 180/Float.pi
And for horizontal field of view, you can multiply by the aspect ratio (specifically, the width/height ratio):
let imageResolution = session.currentFrame!.camera.imageResolution
let xFov = yFov * Float(imageResolution.width / imageResolution.height)
Note: Here, "horizontal" and "vertical" are with respect to the camera image, which is natively in landscape orientation regardless of how your device or AR view UI are oriented.
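Strictly speaking, multiplying the angle itself by the aspect ratio is an approximation, because FOV scales with the tangent of the half-angle rather than with the angle: the exact relationship is tan(xFov/2) = tan(yFov/2) * aspectRatio. If you want the exact horizontal angle, you can read the xScale term off the same matrix diagonal instead (a sketch, using the matrix layout shown above):

let projection = session.currentFrame!.camera.projectionMatrix
let xScale = projection[0,0] // = 1/tan(xFov/2)
let xFov = 2 * atan(1/xScale) // in radians
let xFovDegrees = xFov * 180/Float.pi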
If you look closely, though, you might notice that the aspect ratio between xFov/yFov here (and the aspect ratio of imageResolution) doesn't necessarily match that of your device screen (especially on iPhone X) or the view you're drawing AR content in. That's because you've measured the FOV angles of the camera image, not those of your app's AR view. Don't worry, there's an API for that, too...
Projection Matrix with Viewport

ARCamera offers two APIs for getting a projection matrix. Besides the one we just went over, there's also projectionMatrix(for:viewportSize:zNear:zFar:), which takes presentation into account. If you want to match not the FOV of the camera, but the FOV of how ARSCNView or ARSKView (or Unity or Unreal, probably?) render your AR scene, use this, passing the orientation and size of your view. Then do all the same math as above:
let viewSize = sceneView.bounds.size
let zNear: CGFloat = 0.001 // arbitrary; see note below
let zFar: CGFloat = 1000   // arbitrary; see note below
let projection = session.currentFrame!.camera.projectionMatrix(for: .portrait,
    viewportSize: viewSize, zNear: zNear, zFar: zFar)
let yScale = projection[1,1] // = 1/tan(yFov/2)
let yFovDegrees = 2 * atan(1/yScale) * 180/Float.pi
let xFovDegrees = yFovDegrees * Float(viewSize.width / viewSize.height)
What you pass for zNear and zFar doesn't matter, since we're not using the parts of the matrix that depend on them. (You might still need to ensure zNear < zFar, and that neither is zero.)
Note: Now the width/height are based on your view (or rather, the attributes of your view that you pass to projectionMatrix(for:...)). In this example, yFov is vertical with respect to the UI because the orientation is portrait, so xFov comes out narrower than yFov. Because the bounds you pass already reflect the interface orientation (width < height in portrait, width > height in landscape), multiplying by width/height works in both cases.
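Wrapped up as a reusable function (a sketch; viewportFieldOfView is a hypothetical helper, not an ARKit API), using the exact per-axis math from the matrix diagonal rather than the aspect-ratio shortcut:

import ARKit
import UIKit

/// Returns (horizontal, vertical) FOV in degrees for content rendered into
/// a viewport of the given size and orientation, or nil before the first frame.
func viewportFieldOfView(session: ARSession,
                         orientation: UIInterfaceOrientation,
                         viewportSize: CGSize) -> (x: Float, y: Float)? {
    guard let camera = session.currentFrame?.camera else { return nil }
    // zNear/zFar are arbitrary valid values; the FOV terms don't depend on them.
    let projection = camera.projectionMatrix(for: orientation,
                                             viewportSize: viewportSize,
                                             zNear: 0.001, zFar: 1000)
    let xFov = 2 * atan(1/projection[0,0]) * 180/Float.pi
    let yFov = 2 * atan(1/projection[1,1]) * 180/Float.pi
    return (xFov, yFov)
}

You'd call it as viewportFieldOfView(session: sceneView.session, orientation: .portrait, viewportSize: sceneView.bounds.size), swapping the orientation when the interface rotates.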
Keen observers may have noticed that the above calculations ignore parts of the projection matrix. That's because the definition of FOV angle is an optical property of the camera, not anything to do with 3D projection, so a whole projection matrix is an intermediate result you might not really need.
Camera Intrinsics

ARCamera also exposes an intrinsics matrix that describes optical properties of the camera. The first and second values along its diagonal are the horizontal and vertical focal lengths, expressed in pixels of the camera image. If you have the focal length and the image width/height, you can compute FOV per the definition of FOV angle:
let imageResolution = session.currentFrame!.camera.imageResolution
let intrinsics = session.currentFrame!.camera.intrinsics
let xFovDegrees = 2 * atan(Float(imageResolution.width)/(2 * intrinsics[0,0])) * 180/Float.pi
let yFovDegrees = 2 * atan(Float(imageResolution.height)/(2 * intrinsics[1,1])) * 180/Float.pi
Note: Like the version that uses projectionMatrix, this is based on the size and always-landscape orientation of the camera image, not the device screen or the view you're displaying AR content in. If you need something based on the viewport instead, scroll back up to "Projection Matrix with Viewport".
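As a sanity check (again just a sketch, assuming a running session): the intrinsics route and the raw projection-matrix route both describe the same landscape camera image, so the two results should agree closely:

if let camera = sceneView.session.currentFrame?.camera {
    let viaProjection = 2 * atan(1/camera.projectionMatrix[1,1]) * 180/Float.pi
    let viaIntrinsics = 2 * atan(Float(camera.imageResolution.height)
                                 / (2 * camera.intrinsics[1,1])) * 180/Float.pi
    print(viaProjection, viaIntrinsics) // vertical FOV of the camera image, twice
}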