
Is it possible to access multiple cameras from ARFrame?

I have an ARSession using a ARWorldTrackingConfiguration as part of its configuration. I've also enabled face tracking via:

configuration.userFaceTrackingEnabled = true

In the func session(_ session: ARSession, didUpdate frame: ARFrame) delegate method, I can successfully get the frame.capturedImage from the world-facing camera, but it doesn't seem like there's a way to access the frame from the face-facing camera.

Am I correct in this assumption?

If so, is there some other way to access the frames of both cameras when using face and world tracking together?
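
For context, a minimal sketch of the setup described above (the view-controller class name, outlet name, and variable names are illustrative, not taken from the original question):

import ARKit
import UIKit

// Hedged sketch of the described setup: world tracking with user face
// tracking enabled, reading the rear camera's pixel buffer in the delegate.
final class WorldAndFaceViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var sceneView: ARSCNView!   // assumed outlet

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.session.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true
        }
        sceneView.session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedImage is the pixel buffer of the world-facing (rear) camera;
        // ARFrame exposes no analogous image property for the front camera.
        let rearCameraImage: CVPixelBuffer = frame.capturedImage
        _ = rearCameraImage
    }
}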

asked Jul 07 '20 at 21:07 by narner


1 Answer

About two simultaneous ARConfigurations

As a rule, one ARSession can run only one ARConfiguration at a time. There is an exception, though: Face tracking can be enabled inside a World tracking session. In that case ARWorldTrackingConfiguration is the "main" configuration, and Face tracking acts as a "supplemental" one.

Both cameras (rear and front) produce 60 ARFrames per second, containing RGB data, depth data, anchors, feature points, etc. Each camera has its own ARFrames, which can be used to obtain the intrinsic and extrinsic ARCamera parameters (such as the 3x3 camera matrix or the 4x4 transform matrix).

@NSCopying var currentFrame: ARFrame? { get }
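
The latest frame of a running session is published through the currentFrame property above. As a small illustration of the intrinsic and extrinsic parameters mentioned earlier, here is a hedged sketch that reads them from that frame (it assumes an ARSCNView named sceneView, as in the snippet further below):

// Sketch: reading the rear camera's intrinsic and extrinsic parameters
// from the session's currentFrame.
if let frame = sceneView.session.currentFrame {
    let camera = frame.camera
    let intrinsics: simd_float3x3 = camera.intrinsics   // 3x3 camera matrix
    let extrinsics: simd_float4x4 = camera.transform    // 4x4 camera-to-world transform
    print(intrinsics)
    print(extrinsics.columns.3)   // the fourth column holds the camera's position
}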

However, in ARKit 5.0, when you run a World tracking configuration with the userFaceTrackingEnabled instance property activated, you only get access to the ARFrames coming from the rear camera – at the moment there is no way to access the simultaneous ARFrames coming from the front camera.

let config = ARWorldTrackingConfiguration()

// Face tracking can only be enabled as a supplement to the world-tracking
// configuration, and only on devices that support it.
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    config.userFaceTrackingEnabled = true
}
sceneView.session.run(config, options: [])

// Everything below comes from the rear (world-facing) camera's frame.
let currentFrame = sceneView.session.currentFrame
let rearCameraTransform = currentFrame?.camera.transform
let rearCameraAnchors = currentFrame?.anchors

print(rearCameraTransform?.columns.3 as Any)
print(rearCameraAnchors as Any)

But, of course, you can still access and manage all ARFaceAnchors in the World tracking environment, as the sketch below shows.
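
For instance, a minimal sketch (an assumed ARSessionDelegate callback, not taken from the original answer) that collects those face anchors:

// Sketch: filtering the ARFaceAnchors that the world-tracking session
// delivers when userFaceTrackingEnabled is true.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let faceAnchors = frame.anchors.compactMap { $0 as? ARFaceAnchor }
    for faceAnchor in faceAnchors where faceAnchor.isTracked {
        // The face pose is expressed in the same world space the rear camera tracks.
        print(faceAnchor.transform.columns.3)
    }
}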


Tip:

In ARKit 5.0 you can use ARFaceTrackingConfiguration config on the following devices:

TrueDepth sensor | iOS version     | CPU                     | Depth data
YES              | iOS 11 – iOS 15 | A11, A12, A13, A14, A15 | true
NO               | iOS 13 – iOS 15 | A12, A13, A14, A15      | false

So, as a developer, you need to check whether the current device supports the Face tracking configuration or not:

import ARKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        
        // Fall back to an informational screen when face tracking isn't supported.
        if !ARFaceTrackingConfiguration.isSupported {
            let storyboard = UIStoryboard(name: "Main", bundle: nil)
            window?.rootViewController = storyboard.instantiateViewController(withIdentifier: "unsupportedDeviceMessage")
        }
        return true
    }
}
answered Oct 18 '22 at 19:10 by Andy Jazz