I'm trying to record captured frames as video while simultaneously performing image-processing tasks on those frames.
I have a single AVCaptureSession to which I have added two separate outputs:
My delegate class conforms to both AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureFileOutputRecordingDelegate.
I use captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)
for frame capture and analysis,
and func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) for video recording.
Each output works on its own, but as soon as I add both outputs to the session, only the video recording works and captureOutput is never called.
Any thoughts on why this is happening? What am I doing wrong, or what should I check when setting up and configuring the session? A simplified sketch of my setup follows.
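Roughly, my configuration looks like this (simplified and illustrative; the class name CaptureController, the queue label, and the input setup are placeholders rather than my exact code):

```swift
import AVFoundation

final class CaptureController: NSObject,
        AVCaptureVideoDataOutputSampleBufferDelegate,
        AVCaptureFileOutputRecordingDelegate {

    let session = AVCaptureSession()
    private let videoDataOutput = AVCaptureVideoDataOutput()
    private let movieFileOutput = AVCaptureMovieFileOutput()
    private let frameQueue = DispatchQueue(label: "video.frames")

    func configure() {
        session.beginConfiguration()
        // (camera input is added here)
        if session.canAddOutput(videoDataOutput) {
            videoDataOutput.setSampleBufferDelegate(self, queue: frameQueue)
            session.addOutput(videoDataOutput)
        }
        if session.canAddOutput(movieFileOutput) {
            session.addOutput(movieFileOutput)
        }
        session.commitConfiguration()
    }

    // Expected once per frame for analysis; stops being called when the
    // movie file output is also attached.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) { }

    // Called when file recording starts.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didStartRecordingTo fileURL: URL,
                    from connections: [AVCaptureConnection]) { }

    // Required by AVCaptureFileOutputRecordingDelegate.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) { }
}
```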
These two outputs (AVCaptureVideoDataOutput and AVCaptureMovieFileOutput) will not work with each other on the same session: once the movie file output is attached, the sample buffer delegate stops receiving frames. Instead, you can use captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) both for getting the frames to analyse and for recording, by writing the same sample buffers to a file with AVAssetWriter. You can find sample code here.
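A minimal sketch of that approach, assuming a session that is already configured with an AVCaptureVideoDataOutput delivering frames to this delegate (FrameRecorder, analyze(_:), the output settings, and the output URL are illustrative placeholders):

```swift
import AVFoundation

final class FrameRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private var assetWriter: AVAssetWriter?
    private var writerInput: AVAssetWriterInput?
    private var sessionStarted = false

    // Call before starting capture; outputURL is a placeholder.
    func startRecording(to outputURL: URL) throws {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1920,
            AVVideoHeightKey: 1080
        ]
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        writer.startWriting()
        assetWriter = writer
        writerInput = input
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // 1. Analyse the frame (placeholder for your image-processing code).
        analyze(sampleBuffer)

        // 2. Append the same frame to the asset writer for recording.
        guard let writer = assetWriter, let input = writerInput else { return }
        if !sessionStarted {
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            sessionStarted = true
        }
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    // Call when capture stops to finalize the movie file.
    func finishRecording(completion: @escaping () -> Void) {
        writerInput?.markAsFinished()
        assetWriter?.finishWriting(completionHandler: completion)
    }

    private func analyze(_ sampleBuffer: CMSampleBuffer) {
        // Image-processing work goes here.
    }
}
```

With this setup the AVCaptureMovieFileOutput (and AVCaptureFileOutputRecordingDelegate) is no longer needed; the single video data output feeds both the analysis and the recording.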