How to simultaneously record and play captured video using AVFoundation with a few seconds delay?

I'm looking into making my Swift iOS app record video and play it back on the same screen with a 30-second delay.

I've been using an official example to record a video. Then I added a button that triggers playback of self.movieFileOutput?.outputFileURL with an AVPlayer in a separate view on the screen. It's close to what I want, but it obviously stops playing once it reaches the current end of the file written to disk, and it doesn't resume when the next buffered chunk is written.
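
For reference, my playback side is roughly this (a minimal sketch; recordedURL stands in for self.movieFileOutput?.outputFileURL, and startDelayedPlayback is just a name I made up):

import AVFoundation
import UIKit

// Plays whatever has been written to disk so far, then stops at the
// current end of file; it never picks up bytes appended later.
func startDelayedPlayback(of recordedURL: URL, in containerView: UIView) {
    let player = AVPlayer(url: recordedURL)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = containerView.bounds
    containerView.layer.addSublayer(playerLayer)
    player.play()
}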

I could stop the recording every 30 seconds and save the URL of each file so I can play them back, but that would mean interruptions in both capture and playback; the idea is sketched below.
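
Roughly, that rejected approach would look like this (a sketch only; ChunkedRecorder, chunkURLs and nextChunkURL() are names I invented, and maxRecordedDuration would be set to 30 s on the output elsewhere):

import AVFoundation

// Stop-every-30s idea: restart recording in the didFinish callback.
// Frames between the old file and the new one are dropped, which is
// exactly the interruption I want to avoid.
class ChunkedRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let movieFileOutput = AVCaptureMovieFileOutput()
    var chunkURLs: [URL] = []   // saved for delayed playback
    var chunkIndex = 0

    func nextChunkURL() -> URL {
        chunkIndex += 1
        return FileManager.default.temporaryDirectory
            .appendingPathComponent("chunk\(chunkIndex).mov")
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        chunkURLs.append(outputFileURL)
        movieFileOutput.startRecording(to: nextChunkURL(), recordingDelegate: self)
    }
}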

How can I keep the recording running continuously and keep playback on the screen with whatever delay I choose?

I've seen a similar question, and all the answers pointed at the AVFoundation docs. I couldn't find how to make AVFoundation write predictable chunks of video from memory to disk while recording.

asked Aug 26 '17 by Maklaus

1 Answer

You can achieve what you want by recording 30-second chunks of video and enqueueing them on an AVQueuePlayer for seamless playback. Recording the chunks would be very easy with AVCaptureFileOutput on macOS, but sadly, on iOS you cannot start a new chunk without dropping frames, so you have to use the wordier, lower-level AVAssetWriter API:

import UIKit
import AVFoundation

// TODO: delete old videos
// TODO: audio

class ViewController: UIViewController {
    // capture
    let captureSession = AVCaptureSession()

    // playback
    let player = AVQueuePlayer()
    var playerLayer: AVPlayerLayer! = nil

    // output. sadly not AVCaptureMovieFileOutput
    var assetWriter: AVAssetWriter! = nil
    var assetWriterInput: AVAssetWriterInput! = nil

    var chunkNumber = 0
    var chunkStartTime: CMTime! = nil
    var chunkOutputURL: URL! = nil

    override func viewDidLoad() {
        super.viewDidLoad()

        playerLayer = AVPlayerLayer(player: player)
        view.layer.addSublayer(playerLayer)

        // inputs
        let videoCaptureDevice = AVCaptureDevice.default(for: .video)!
        let videoInput = try! AVCaptureDeviceInput(device: videoCaptureDevice)
        captureSession.addInput(videoInput)

        // outputs
        // iOS AVCaptureFileOutput/AVCaptureMovieFileOutput still don't support dynamically
        // switching files (?) so we have to re-implement with AVAssetWriter
        let videoOutput = AVCaptureVideoDataOutput()
        // TODO: a dedicated serial queue would probably be better than main here
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
        captureSession.addOutput(videoOutput)

        captureSession.startRunning()
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        playerLayer.frame = view.layer.bounds
    }

    func createWriterInput(for presentationTimeStamp: CMTime) {
        let fileManager = FileManager.default
        chunkOutputURL = fileManager.urls(for: .documentDirectory, in: .userDomainMask)[0].appendingPathComponent("chunk\(chunkNumber).mov")
        try? fileManager.removeItem(at: chunkOutputURL)

        assetWriter = try! AVAssetWriter(outputURL: chunkOutputURL, fileType: .mov)
        // TODO: get dimensions from the image buffer, CMSampleBufferGetImageBuffer(sampleBuffer)
        let outputSettings: [String: Any] = [AVVideoCodecKey: AVVideoCodecType.h264, AVVideoWidthKey: 1920, AVVideoHeightKey: 1080]
        assetWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
        assetWriterInput.expectsMediaDataInRealTime = true
        assetWriter.add(assetWriterInput)

        chunkNumber += 1
        chunkStartTime = presentationTimeStamp

        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: chunkStartTime)
    }
}

extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        let presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

        if assetWriter == nil {
            createWriterInput(for: presentationTimeStamp)
        } else {
            let chunkDuration = CMTimeGetSeconds(CMTimeSubtract(presentationTimeStamp, chunkStartTime))

            if chunkDuration > 30 {
                assetWriter.endSession(atSourceTime: presentationTimeStamp)

                // make a copy, as finishWriting is asynchronous
                let newChunkURL = chunkOutputURL!
                let chunkAssetWriter = assetWriter!

                chunkAssetWriter.finishWriting {
                    print("finishWriting says: \(chunkAssetWriter.status.rawValue, chunkAssetWriter.error)")
                    print("queuing \(newChunkURL)")
                    self.player.insert(AVPlayerItem(url: newChunkURL), after: nil)
                    self.player.play()
                }
                createWriterInput(for: presentationTimeStamp)
            }
        }

        if !assetWriterInput.append(sampleBuffer) {
            print("append says NO: \(assetWriter.status.rawValue, assetWriter.error)")
        }
    }
}
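
One caveat the code above glosses over: on iOS 10 and later none of this produces frames unless the app has camera permission, so you need NSCameraUsageDescription in Info.plist and, ideally, to gate startRunning() on an explicit grant. A sketch of what would replace the direct startRunning() call in viewDidLoad:

// Sketch: assumes NSCameraUsageDescription is set in Info.plist.
AVCaptureDevice.requestAccess(for: .video) { granted in
    guard granted else { return }
    DispatchQueue.main.async {
        self.captureSession.startRunning()
    }
}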

P.S. It's very curious to see what you were doing 30 seconds ago. What exactly are you making?

answered Nov 18 '22 by Rhythmic Fistman