I am playing with ARKit, and I would like to create a video from the ARSKView frames. I tried ReplayKit, but its behaviour is not what I want:
- I don't want to record the whole screen.
- I don't want the user to be prompted that we are recording the screen.

Also, how can I combine microphone input and video? I guess the audio is not streamed in ARSKView. Here is the code (from the Apple example):
import UIKit
import SpriteKit
import ARKit

class ViewController: UIViewController, ARSKViewDelegate {

    @IBOutlet var sceneView: ARSKView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Set the view's delegate
        sceneView.delegate = self

        // Show statistics such as fps and node count
        sceneView.showsFPS = true
        sceneView.showsNodeCount = true

        // Load the SKScene from 'Scene.sks'
        if let scene = SKScene(fileNamed: "Scene") {
            sceneView.presentScene(scene)
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Create a session configuration
        // (ARWorldTrackingSessionConfiguration was renamed to
        // ARWorldTrackingConfiguration in later ARKit releases)
        let configuration = ARWorldTrackingConfiguration()

        // Run the view's session
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // Pause the view's session
        sceneView.session.pause()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Release any cached data, images, etc. that aren't in use.
    }

    // MARK: - ARSKViewDelegate

    func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
        // Create and configure a node for the anchor added to the view's session.
        let labelNode = SKLabelNode(text: "👾")
        labelNode.horizontalAlignmentMode = .center
        labelNode.verticalAlignmentMode = .center
        return labelNode
    }

    func session(_ session: ARSession, didFailWithError error: Error) {
        // Present an error message to the user
    }

    func sessionWasInterrupted(_ session: ARSession) {
        // Inform the user that the session has been interrupted, for example, by presenting an overlay
    }

    func sessionInterruptionEnded(_ session: ARSession) {
        // Reset tracking and/or remove existing anchors if consistent tracking is required
    }
}
And in case it's necessary, the Scene class:
import SpriteKit
import ARKit

class Scene: SKScene {

    override func didMove(to view: SKView) {
        // Setup your scene here
    }

    override func update(_ currentTime: TimeInterval) {
        // Called before each frame is rendered
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let sceneView = self.view as? ARSKView else {
            return
        }

        // Create anchor using the camera's current position
        if let currentFrame = sceneView.session.currentFrame {

            // Create a transform with a translation of 0.2 meters in front of the camera
            var translation = matrix_identity_float4x4
            translation.columns.3.z = -0.2
            let transform = simd_mul(currentFrame.camera.transform, translation)

            // Add a new anchor to the session
            let anchor = ARAnchor(transform: transform)
            sceneView.session.add(anchor: anchor)
        }
    }
}
If you need to record only the camera frames (as with `AVCaptureSession`, not the "real" 3D scene with `SCNNode`s), just grab them as `ARFrame.capturedImage` in the `renderer(_:updateAtTime:)` delegate method of `SCNSceneRenderer`:
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    createMovieWriterOnce(frame: session.currentFrame)
    appendFrameWithMetadaToMovie(frame: session.currentFrame)
}
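Since the question uses an ARSKView (SpriteKit), where there is no `SCNSceneRenderer`, the same per-frame callback can be obtained from `ARSessionDelegate`, which delivers every `ARFrame` independently of the renderer. A sketch, assuming the two helper methods above are in scope on the question's `ViewController`:

```swift
import ARKit

extension ViewController: ARSessionDelegate {
    // session(_:didUpdate:) fires once per camera frame, for ARSKView too.
    // Remember to set sceneView.session.delegate = self (e.g. in viewDidLoad).
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        createMovieWriterOnce(frame: frame)
        appendFrameWithMetadaToMovie(frame: frame)
    }
}
```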
I haven't found a way to get the frame size from `ARSession`, so the `MovieWriter` waits for the first frame to set up its size:
func createMovieWriterOnce(frame: ARFrame?) {
    guard let frame = frame else { return }
    // DispatchQueue.once is a custom helper that runs the block only once per token
    DispatchQueue.once(token: "SimplestMovieWriter.constructor") {
        movieWriter = SimplestMovieWriter(frameWidth: CVPixelBufferGetWidth(frame.capturedImage),
                                          frameHeight: CVPixelBufferGetHeight(frame.capturedImage))
    }
}
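Note that `DispatchQueue.once` is not part of Foundation; it is a commonly used custom extension. A minimal thread-safe sketch of what it might look like:

```swift
import Foundation

extension DispatchQueue {
    private static var onceTokens = Set<String>()
    private static let onceLock = NSLock()

    /// Executes `block` at most once per `token` for the lifetime of the process.
    static func once(token: String, block: () -> Void) {
        onceLock.lock()
        defer { onceLock.unlock() }
        guard !onceTokens.contains(token) else { return }
        onceTokens.insert(token)
        block()
    }
}
```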
Each subsequent `CVPixelBuffer` is then fed to the `MovieWriter`:
func appendFrameWithMetadaToMovie(frame: ARFrame?) {
    guard isVideoRecording, let frame = frame else { return }
    let interestingPoints = frame.rawFeaturePoints?.points
    movieWriter.appendBuffer(buffer: frame.capturedImage, withMetadata: interestingPoints)
}
`MovieWriter` is a custom class built on `AVAssetWriter`, `AVAssetWriterInput` and `AVAssetWriterInputPixelBufferAdaptor`.
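The answer doesn't show `SimplestMovieWriter` itself. A minimal sketch of such a class, assuming H.264 output to a temporary file and a nominal 60 fps timeline (a real implementation would carry `ARFrame.timestamp` through instead, and would actually persist the metadata):

```swift
import AVFoundation
import simd

final class SimplestMovieWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var sessionStarted = false
    private var frameCount = 0

    init?(frameWidth: Int, frameHeight: Int) {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("arkit-capture.mp4")
        try? FileManager.default.removeItem(at: url)
        guard let w = try? AVAssetWriter(outputURL: url, fileType: .mp4) else { return nil }
        writer = w
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: frameWidth,
            AVVideoHeightKey: frameHeight
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
    }

    func appendBuffer(buffer: CVPixelBuffer, withMetadata metadata: [vector_float3]?) {
        // Sketch only: assume a fixed 60 fps presentation timeline.
        let time = CMTime(value: CMTimeValue(frameCount), timescale: 60)
        frameCount += 1
        if !sessionStarted {
            writer.startSession(atSourceTime: time)
            sessionStarted = true
        }
        guard input.isReadyForMoreMediaData else { return }  // drop the frame if the encoder is busy
        adaptor.append(buffer, withPresentationTime: time)
        // `metadata` (the raw feature points) could be written via an
        // AVAssetWriterInputMetadataAdaptor; it is ignored in this sketch.
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```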
You can save the video without audio first, then use `AVAssetExportSession` to add anything you want (audio, subtitles, metadata):
let composition = AVMutableComposition()
...
let trackVideo = composition.addMutableTrack(withMediaType: .video,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)
let videoFileAsset = AVURLAsset(url: currentURL!, options: nil)
let videoFileAssetTrack = videoFileAsset.tracks(withMediaType: .video)[0]
// add audio track here
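The remaining steps, inserting both tracks into the composition and exporting, might look like the sketch below. The function name `mergeAudio` and the `videoURL`/`audioURL`/`outputURL` parameters are hypothetical; the audio file is assumed to have been captured separately, e.g. with `AVAudioRecorder` recording the microphone:

```swift
import AVFoundation

// Sketch: combine a video-only file with a separately recorded audio file.
func mergeAudio(videoURL: URL, audioURL: URL, outputURL: URL,
                completion: @escaping (Error?) -> Void) {
    let composition = AVMutableComposition()
    let videoAsset = AVURLAsset(url: videoURL)
    let audioAsset = AVURLAsset(url: audioURL)

    guard
        let videoTrack = videoAsset.tracks(withMediaType: .video).first,
        let audioTrack = audioAsset.tracks(withMediaType: .audio).first,
        let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid),
        let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return }

    // Trim both tracks to the video's duration.
    let range = CMTimeRange(start: .zero, duration: videoAsset.duration)
    do {
        try compVideo.insertTimeRange(range, of: videoTrack, at: .zero)
        try compAudio.insertTimeRange(range, of: audioTrack, at: .zero)
    } catch {
        completion(error)
        return
    }

    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetHighestQuality)
    else { return }
    export.outputURL = outputURL
    export.outputFileType = .mp4
    export.exportAsynchronously {
        completion(export.error)
    }
}
```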