 

How to record video in RealityKit?

I have a RealityKit project in Xcode and I want to record the ARView. I considered ReplayKit, but that is for screen recording, I want to record only the ARView with its camera feed. I considered the open source project ARVideoKit by AFathi but that doesn't support RealityKit... something about different rendering paths. I have found a Medium article which describes how to implement a recording feature in an ARKit app, but the problem is that it requires the method: func renderer(_ renderer: SCNSceneRenderer) which is not available in RealityKit because it is specifically a SceneKit method.

Tadreik asked Dec 30 '19

People also ask

How do I download RealityKit?

RealityKit is available in Xcode starting with version 11. You can download the beta of Xcode 11 from the Apple Developer website under the Applications tab. Reality Composer is available within Xcode 11 on the Mac by selecting Xcode from the menu bar | Open Developer Tool | Reality Composer.

What is ReplayKit?

The ReplayKit framework provides classes that allow screen recording of the developer's application. Additionally, it provides a standard RPPreviewViewController view controller that allows the user to preview, trim, and share the recording. Developers use the shared RPScreenRecorder singleton to create replays.
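A minimal sketch of that flow in Swift, assuming a plain UIViewController hosting the recording (the class and method names here are illustrative, not from the original post):

```swift
import ReplayKit
import UIKit

final class RecordingViewController: UIViewController, RPPreviewViewControllerDelegate {
    // RPScreenRecorder is the shared recorder singleton.
    private let recorder = RPScreenRecorder.shared()

    func startRecording() {
        guard recorder.isAvailable else { return }
        recorder.startRecording { error in
            if let error = error {
                print("Could not start recording: \(error)")
            }
        }
    }

    func stopRecording() {
        recorder.stopRecording { [weak self] previewController, _ in
            // RPPreviewViewController lets the user preview, trim, and share.
            if let previewController = previewController {
                previewController.previewControllerDelegate = self
                self?.present(previewController, animated: true)
            }
        }
    }

    func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
        previewController.dismiss(animated: true)
    }
}
```

Note that, as the question points out, this records the whole screen, not just an ARView.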

What is Apple RealityKit?

RealityKit is an augmented reality authoring framework introduced in 2019 focused on realistic rendering and making it easy to create AR apps. Leveraging ARKit to read the device's sensor data, RealityKit allows you to place 3D content in the real-world environment and make that content look as realistic as possible.
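As a quick illustration of what "placing 3D content in the real-world environment" looks like, here is a minimal RealityKit sketch that drops a box on the first detected horizontal plane (the function name is illustrative):

```swift
import ARKit
import RealityKit

// Place a simple box on the first detected horizontal plane.
func setUpScene(in arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config)

    let box = ModelEntity(
        mesh: .generateBox(size: 0.1),
        materials: [SimpleMaterial(color: .systemBlue, isMetallic: false)]
    )
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```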


1 Answer

My answer assumes you are familiar with recording video and audio using AVAssetWriter.

ARKit delivers a captured frame with every call to the ARSessionDelegate method session(_:didUpdate:). The ARFrame object it passes has a CVPixelBuffer property named capturedImage. Handle that buffer as you would in a regular video recording session, except instead of receiving it in the captureOutput(_:didOutput:from:) method, you receive it here. You will still need a captureOutput(_:didOutput:from:) implementation for audio if you intend to record audio from the microphone, too.
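A rough sketch of that delegate method feeding an AVAssetWriter, assuming the writer, input, and pixel buffer adaptor are configured elsewhere (those property names, and the FrameRecorder class itself, are my own, not from the answer):

```swift
import ARKit
import AVFoundation

final class FrameRecorder: NSObject, ARSessionDelegate {
    // Assumed to be configured elsewhere; the adaptor's source pixel buffer
    // attributes must match capturedImage's biplanar YCbCr format.
    var writerInput: AVAssetWriterInput!
    var pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor!
    var isRecording = false

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard isRecording, writerInput.isReadyForMoreMediaData else { return }
        // ARFrame vends the camera image as a CVPixelBuffer.
        let pixelBuffer: CVPixelBuffer = frame.capturedImage
        let time = CMTime(seconds: frame.timestamp, preferredTimescale: 600)
        pixelBufferAdaptor.append(pixelBuffer, withPresentationTime: time)
    }
}
```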

In my case, I converted the captured frame into a MTLTexture and used Metal to process my video frames before passing them to an AVAssetWriter, because I wanted to draw on top of the camera frames before recording. Unfortunately, doing this is very complicated and not a quick copy-and-paste answer, I'm afraid. Hopefully pointing you to the capturedImage buffer that ARKit provides gives you a good place to start.
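The buffer-to-texture step itself can be sketched like this, assuming a CVMetalTextureCache created once per Metal device (the function name is illustrative; only the luma plane is shown, since capturedImage is a biplanar YCbCr buffer):

```swift
import CoreVideo
import Metal

// Wrap plane 0 (luma) of an ARFrame's capturedImage in a MTLTexture
// without copying, via a CVMetalTextureCache.
func makeTexture(from pixelBuffer: CVPixelBuffer,
                 cache: CVMetalTextureCache) -> MTLTexture? {
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, nil,
        .r8Unorm,   // single-channel format for the luma plane
        width, height,
        0,          // plane index
        &cvTexture)
    guard status == kCVReturnSuccess, let cvTexture = cvTexture else { return nil }
    return CVMetalTextureGetTexture(cvTexture)
}
```

The cache itself comes from CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &cache); the chroma plane works the same way with .rg8Unorm and plane index 1.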

An example of how to record video using AVAssetWriter: https://programmersought.com/article/80131041234/;jsessionid=38CBA6743FB3C440DE9D2B25A6854B28

You will also need to be well versed in Metal if you want to draw your 3D models into the capture feed before it is encoded to video: https://developer.apple.com/documentation/metalkit/

JCutting8 answered Oct 21 '22