 

Using CIFilter with AVFoundation (iOS)

I am trying to apply filters to a video composition created with AVFoundation on iOS (filters could be, e.g., blur, pixelate, sepia, etc.). I need to both apply the effects in real time and be able to render the composited video out to disk, but I'm happy to start with just one or the other.

Unfortunately, I can't seem to figure this one out. Here's what I can do:

  • I can add an animation layer to the UIView that's playing the movie, but it's not clear to me whether I can process the incoming video image this way.
  • I can add an array of CIFilters to the AVPlayerLayer, but it turns out these are ignored on iOS (they only work on Mac OS X).
  • I can add an AVVideoCompositionCoreAnimationTool to the AVVideoComposition, but I'm not sure this would accomplish video processing (rather than animation), and it crashes with a message about not being designed for real-time playback anyway. I believe this is the solution for rendering animation when rendering to disk (see the sketch after this list).
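
For reference, here's roughly how I'm wiring up the animation tool for the export path (asset is my source AVAsset; the layer names are placeholders of mine):

// Build a video composition from the asset and attach a Core Animation tool.
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoComposition.renderSize.width, videoComposition.renderSize.height);
videoLayer.frame = parentLayer.frame;
[parentLayer addSublayer:videoLayer];
// The tool composites the video frames into videoLayer inside parentLayer at export time.
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];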

Other apps do this (I think), so I assume I'm missing something obvious.

Note: I've looked into GPUImage and I'd love to use it, but it just doesn't work well with movies, especially movies with audio. See, for example:

  • GPUImage filters in runtime on AVMutableComposition
  • https://github.com/BradLarson/GPUImage/issues/1339
asked Dec 17 '13 by Bjorn Roche


2 Answers

You could use the AVVideoCompositing protocol and the AVAsynchronousVideoCompositionRequest class to implement a custom compositor:

// `request` is the AVAsynchronousVideoCompositionRequest passed to the compositor.
CVPixelBufferRef pixelBuffer = [request sourceFrameByTrackID:trackID];
CIImage *theImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
// Filter the frame with Core Image.
CIImage *motionBlurredImage = [[CIFilter filterWithName:@"CIMotionBlur" keysAndValues:kCIInputImageKey, theImage, nil] valueForKey:kCIOutputImageKey];
// A GPU-backed context renders the result into the output buffer.
CIContext *someCIContext = [CIContext contextWithEAGLContext:eaglContext];
[someCIContext render:motionBlurredImage toCVPixelBuffer:outputBuffer];
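
For context, here is a minimal skeleton of such a compositor, assuming a hypothetical class name MyFilterCompositor; the buffer handling is a sketch, not part of the original answer:

#import <AVFoundation/AVFoundation.h>

@interface MyFilterCompositor : NSObject <AVVideoCompositing>
@end

@implementation MyFilterCompositor

// Pixel formats this compositor accepts for source frames.
- (NSDictionary *)sourcePixelBufferAttributes {
    return @{(NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
}

// Pixel formats required for the output buffers handed back to AVFoundation.
- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{(NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
}

- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
    // Cache the render context here if you need its size or pixel buffer pool.
}

- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    // Get an output buffer from the render context's pool, filter the source
    // frame into it (e.g. with the Core Image code above), and hand it back.
    CVPixelBufferRef outputBuffer = [request.renderContext newPixelBuffer];
    CMPersistentTrackID trackID = [request.sourceTrackIDs.firstObject intValue];
    CVPixelBufferRef pixelBuffer = [request sourceFrameByTrackID:trackID];
    // ... render the filtered contents of pixelBuffer into outputBuffer here ...
    [request finishWithComposedVideoFrame:outputBuffer];
    CVBufferRelease(outputBuffer);
}

@end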

Then render the pixel buffer using OpenGL as described in Apple's documentation. This would allow you to implement any number of transitions or filters that you want. You can then set AVAssetExportSession.videoComposition and you will be able to export the composited video to disk.
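
A hedged sketch of the export step, assuming the hypothetical MyFilterCompositor class from above (the output URL is a placeholder):

// Attach the custom compositor to the video composition and export.
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
videoComposition.customVideoCompositorClass = [MyFilterCompositor class];

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = outputURL; // placeholder destination file URL
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    // Inspect exporter.status and exporter.error here.
}];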

answered Nov 16 '22 by Jonathan


You can read an AVComposition (it's an AVAsset subclass) with AVAssetReader. Get the pixel buffers, pass them to a CIFilter (setting the context up so that it uses the GPU for rendering, with no color management, etc.), and render the result to the screen or to an output buffer, depending on your needs. I do not think that blur can be achieved in real time unless you use the GPU directly.
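
A minimal sketch of that reader loop, assuming a sepia filter and 32BGRA output settings (the filter choice and the eaglContext are illustrative, not from the original answer):

// Read decoded frames from the composition and filter each one with Core Image.
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:composition error:NULL];
AVAssetTrack *videoTrack = [[composition tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSDictionary *settings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
AVAssetReaderTrackOutput *output = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:settings];
[reader addOutput:output];
[reader startReading];

// A GPU-backed context with color management disabled, per the advice above.
CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext options:@{kCIContextWorkingColorSpace : [NSNull null]}];
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];

CMSampleBufferRef sampleBuffer;
while ((sampleBuffer = [output copyNextSampleBuffer])) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    [sepia setValue:[CIImage imageWithCVPixelBuffer:pixelBuffer] forKey:kCIInputImageKey];
    CIImage *filtered = sepia.outputImage;
    // Draw `filtered` to the screen or render it into an output pixel buffer with ciContext.
    CFRelease(sampleBuffer);
}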

You can read about applying CIFilters to video in the "Applying Filter to Video" section here:

https://developer.apple.com/library/ios/documentation/graphicsimaging/conceptual/CoreImaging/ci_tasks/ci_tasks.html#//apple_ref/doc/uid/TP30001185-CH3-BAJDAHAD

answered Nov 16 '22 by Laz