I record video (an .mp4 file) using AVAssetWriter with CMSampleBuffer data (from the video and audio inputs).
While recording, I want to process the frames, so I convert the CMSampleBuffer to a CIImage and process that.
But how do I update the CMSampleBuffer with the newly processed image buffer from my CIImage?
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if output == videoOutput {
        let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        let ciimage: CIImage = CIImage(cvPixelBuffer: imageBuffer)
        ... // my code to process the CIImage (for example, add augmented reality)
        // but how to convert it back to a CMSampleBuffer?
        // because AVAssetWriterInput needs a CMSampleBuffer to encode video/audio into the file
        ...
    }
    ...
}
You need to render your CIImage into a CVPixelBuffer using CIContext's render(_:to:bounds:colorSpace:) method.
Then you can create a CMSampleBuffer from the CVPixelBuffer using, e.g., CMSampleBufferCreateReadyWithImageBuffer(_:_:_:_:_:).
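Putting those two steps together, a minimal sketch might look like this. It assumes you already have a reusable CIContext, a destination CVPixelBuffer to render into, and the original sample buffer to copy timing from; the function and parameter names here are illustrative, not from any Apple API:

```swift
import AVFoundation
import CoreImage

// A sketch: render a processed CIImage into a CVPixelBuffer and wrap it
// in a new CMSampleBuffer that carries the original frame's timing.
// `processed`, `original`, `ciContext`, and `pixelBuffer` are assumed
// to be supplied by your capture/processing pipeline.
func makeSampleBuffer(from processed: CIImage,
                      timingFrom original: CMSampleBuffer,
                      using ciContext: CIContext,
                      into pixelBuffer: CVPixelBuffer) -> CMSampleBuffer? {
    // 1. Render the filtered CIImage into the destination pixel buffer.
    ciContext.render(processed, to: pixelBuffer)

    // 2. Build a video format description that matches the pixel buffer.
    var formatDescription: CMFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    // 3. Copy the presentation timing from the original sample buffer,
    //    so the written frame keeps its place on the timeline.
    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(original, at: 0, timingInfoOut: &timing)

    // 4. Wrap everything in a new, ready-to-append CMSampleBuffer.
    var newSampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescription: format,
        sampleTiming: &timing,
        sampleBufferOut: &newSampleBuffer)
    return newSampleBuffer
}
```

The returned buffer can then be passed to your AVAssetWriterInput's append(_:), subject to isReadyForMoreMediaData.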
You may need to use a pool of CVPixelBuffers for efficiency; an example of this is shown in Apple's AVCamPhotoFilter sample code. In particular, see the RosyCIRenderer class.
Also see this answer, which may help you: Applying a CIFilter to a Video File and Saving it.
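For reference, a pixel buffer pool can be set up roughly like this. This is a sketch assuming BGRA frames of a known size; adjust the pixel format and dimensions to match your capture output, and see RosyCIRenderer for a production-quality version:

```swift
import CoreVideo

// A minimal sketch of creating a CVPixelBufferPool so each frame
// reuses a buffer instead of allocating a fresh one.
// The 32BGRA format and fixed width/height are assumptions here.
func makePixelBufferPool(width: Int, height: Int) -> CVPixelBufferPool? {
    let bufferAttributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: width,
        kCVPixelBufferHeightKey as String: height,
        kCVPixelBufferIOSurfacePropertiesKey as String: [:]
    ]
    var pool: CVPixelBufferPool?
    CVPixelBufferPoolCreate(kCFAllocatorDefault, nil,
                            bufferAttributes as CFDictionary, &pool)
    return pool
}

// Per frame, grab a destination buffer from the pool to render into:
// var pixelBuffer: CVPixelBuffer?
// CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
```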