
Record video from the front-facing camera during an ARKit ARSession on iPhone X

I'm using an ARSession combined with an ARFaceTrackingConfiguration to track my face. At the same time, I would like to record video from the front-facing camera of my iPhone X. To do so I'm using AVCaptureSession, but as soon as I start recording, the ARSession gets interrupted.

These are two snippets of code:

// Face tracking
let configuration = ARFaceTrackingConfiguration()
configuration.isLightEstimationEnabled = false
let session = ARSession()
session.run(configuration, options: [.removeExistingAnchors, .resetTracking])

// Video recording
let captureSession = AVCaptureSession()
let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front)!
let input = try! AVCaptureDeviceInput(device: camera)
let output = AVCaptureMovieFileOutput()
captureSession.addInput(input)
captureSession.addOutput(output)

Does anybody know how to do both at the same time? Apps like Snapchat let users record video while using the TrueDepth sensor, so I imagine what I'm asking is perfectly feasible. Thanks!

asked Jun 15 '18 by lucamegh

People also ask

Does ARKit work with front camera?

According to Unity Technologies, it is possible to access both the front and rear cameras at the same time, but not through ARKit.
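
For completeness, a minimal sketch of that non-ARKit route, assuming iOS 13+ and a multicam-capable device; the real API is AVCaptureMultiCamSession, and the comment marks the configuration this sketch omits:

import AVFoundation

// Sketch: simultaneous front + rear capture outside ARKit (iOS 13+).
// Support is device-dependent, so check before creating the session.
if AVCaptureMultiCamSession.isMultiCamSupported {
    let multiCamSession = AVCaptureMultiCamSession()
    // ...add one AVCaptureDeviceInput per camera position here...
}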

How can I record my iPhone camera while using another app?

Camera access is usually limited to apps running in full-screen mode. If your app enters a multitasking mode like Split View, the system disables the camera. Starting in iOS 16, your app can use the camera while multitasking by setting the isMultitaskingCameraAccessEnabled property to true on supported systems.
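
A minimal sketch of that opt-in, assuming a capture session you have already configured elsewhere:

import AVFoundation

// iOS 16+: opt in to camera use while multitasking, where supported.
let captureSession = AVCaptureSession()
if captureSession.isMultitaskingCameraAccessSupported {
    captureSession.isMultitaskingCameraAccessEnabled = true
}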

Can you face track on iPhone?

The facial-tracking capability of the iPhone has proven its accuracy and performance with its entertainingly impressive Animoji. Thanks to the built-in TrueDepth camera, the iPhone's face tracking is highly accurate under most lighting conditions and provides a solid source of facial motion-capture data.


1 Answer

ARKit runs its own AVCaptureSession, and there can be only one capture session running at a time — if you run a capture session, you preempt ARKit’s, which prevents ARKit from working.

However, ARKit does provide access to the camera pixel buffers it receives from its capture session, so you can record video by feeding those sample buffers to an AVAssetWriter. (It's essentially the same workflow you'd use when recording from AVCaptureVideoDataOutput, a lower-level way of recording video than AVCaptureMovieFileOutput.)
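
As a rough sketch of that approach (the class and method names here are mine, not Apple's): adopt ARSessionDelegate, time-stamp each frame with its ARFrame.timestamp, and append frame.capturedImage through an AVAssetWriterInputPixelBufferAdaptor.

import ARKit
import AVFoundation

// Sketch only: records ARKit's camera feed by writing each ARFrame's
// pixel buffer to an AVAssetWriter. Error handling is minimal on purpose.
final class ARFrameRecorder: NSObject, ARSessionDelegate {
    private var writer: AVAssetWriter?
    private var input: AVAssetWriterInput?
    private var adaptor: AVAssetWriterInputPixelBufferAdaptor?
    private var startedSession = false

    func startRecording(to outputURL: URL, width: Int, height: Int) throws {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input, sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
        (self.writer, self.input, self.adaptor) = (writer, input, adaptor)
    }

    // ARSessionDelegate: ARKit hands us every captured frame here.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let writer = writer, let input = input, let adaptor = adaptor,
              writer.status == .writing else { return }
        let time = CMTime(seconds: frame.timestamp, preferredTimescale: 600)
        if !startedSession {
            writer.startSession(atSourceTime: time)
            startedSession = true
        }
        if input.isReadyForMoreMediaData {
            adaptor.append(frame.capturedImage, withPresentationTime: time)
        }
    }

    func finishRecording(completion: @escaping () -> Void) {
        input?.markAsFinished()
        writer?.finishWriting(completionHandler: completion)
    }
}

One caveat worth noting: capturedImage arrives as a YCbCr (420f) buffer in landscape orientation, so you may need to set a rotation transform on the writer input if you want portrait output.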

You can also feed the ARKit camera pixel buffers (see ARFrame.capturedImage) to other technologies that work with live camera imagery, like the Vision framework. Apple has a sample code project demonstrating such usage.
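
One way that hookup could look, with VNDetectFaceLandmarksRequest as my example request (Apple's sample may use a different one):

import ARKit
import Vision

// Hedged sketch: handing ARKit's captured pixel buffer to Vision
// from inside your ARSessionDelegate.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let request = VNDetectFaceLandmarksRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        print("Vision sees \(faces.count) face(s)")
    }
    // .right assumes portrait device orientation; adjust as needed.
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                        orientation: .right,
                                        options: [:])
    try? handler.perform([request])
}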

answered Oct 05 '22 by rickster