What's the right way to add an image overlay to a video created with AVAssetWriter?
It's possible to do this with AVAssetExportSession, but this question is about how to do it with AVAssetWriter, which gives more control over the quality and output.
There are two scenarios:
1) Simple: Add single overlay that is present the entire duration of the video (similar to a watermark).
2) Complex: Add different overlays that animate in and out of the video at different times (similar to using AVVideoCompositionCoreAnimationTool).
There are a lot of different approaches to this, and the correct answer will depend on exactly what your use case is.
At a high level, here are three approaches:
- You appear to be already familiar with AVVideoCompositionCoreAnimationTool. You CAN use this with AVAssetWriter. Check out https://github.com/rs/SDAVAssetExportSession, which is a drop-in replacement for AVAssetExportSession that lets you pass the AVAssetWriter settings you're seeking (because it uses AVAssetWriter internally). There's a sketch of this combination after the list.
- If you want to composite something like a watermark into live video (as in this question: Simulate AVLayerVideoGravityResizeAspectFill: crop and center video to mimic preview without losing sharpness), you can modify the sample buffer that is passed to the captureOutput function by the AVCaptureVideoDataOutputSampleBufferDelegate. The typical approach is to convert the CMSampleBuffer to a CIImage, do whatever manipulation you like, then convert the CIImage BACK to a CMSampleBuffer and write it out. In the question linked, the CMSampleBuffer is simply passed on without any manipulation. NB: the step from CIImage back to a writable buffer is relatively low level; there are lots of examples on StackOverflow, although not many in Swift. Here's one implementation (for OSX, however): Adding filters to video with AVFoundation (OSX) - how do I write the resulting image back to AVWriter? A rough per-frame sketch follows this list.
- Depending on just HOW complex what you need to do is, you could look at implementing your own custom compositor: create a class that conforms to https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVVideoCompositing_Protocol/ and reference it from your AVVideoComposition. This is complex and (probably) overkill - if you don't know why you need it, then you probably don't. If you start struggling with problems like "how can I have multiple animation layers on different tracks in my video and not all on one track" or "how can I rotate, scale and animate moving video within an image frame - like a polaroid that spins in while the video is playing in the frame"... well, this is what you need to look into. A bare-bones skeleton is sketched after the list.
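For the first approach, here's a minimal sketch of a static watermark composited via AVVideoCompositionCoreAnimationTool and exported through SDAVAssetExportSession. The overlay position, bitrate and file type are placeholders, and the Swift spellings assume the class is bridged from its Objective-C header, so check the exact names against the header in the repo.

```swift
import AVFoundation
import UIKit

// Hypothetical helper - watermark position, bitrate and output settings are placeholders.
func exportWithWatermark(asset: AVAsset, watermark: UIImage, outputURL: URL) {
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }
    let renderSize = videoTrack.naturalSize

    // Layer tree: the video renders into videoLayer, the overlay sits above it.
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: renderSize)

    let overlayLayer = CALayer()
    overlayLayer.contents = watermark.cgImage
    overlayLayer.frame = CGRect(x: 20, y: 20, width: 160, height: 60) // placeholder position
    overlayLayer.opacity = 0.8

    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: renderSize)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(overlayLayer)

    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    // SDAVAssetExportSession drives an AVAssetWriter internally, so you control the settings.
    let exporter = SDAVAssetExportSession(asset: asset)
    exporter.videoComposition = videoComposition
    exporter.outputURL = outputURL
    exporter.outputFileType = AVFileType.mp4.rawValue
    exporter.videoSettings = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: renderSize.width,
        AVVideoHeightKey: renderSize.height,
        AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: 6_000_000] // placeholder bitrate
    ]
    exporter.exportAsynchronously(withCompletionHandler: {
        // Inspect exporter.status / exporter.error here.
    })
}
```

For an animated overlay (scenario 2 in the question) you'd add CAAnimations to overlayLayer with beginTime offsets instead of a static frame.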
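For the second (live capture) approach, here's a rough per-frame sketch. It assumes the AVAssetWriter, its AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor have already been configured and the writer session started elsewhere; only the overlay-and-append step is shown, and the watermark CIImage is a placeholder.

```swift
import AVFoundation
import CoreImage

class WatermarkingRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Assumed to be configured elsewhere (writer started, adaptor attached to the video input).
    var writerInput: AVAssetWriterInput!
    var pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor!
    var watermark: CIImage!          // placeholder, e.g. built from a logo image
    let ciContext = CIContext()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard writerInput.isReadyForMoreMediaData,
              let cameraBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // 1. Wrap the camera frame in a CIImage.
        let frame = CIImage(cvPixelBuffer: cameraBuffer)

        // 2. Composite the watermark over the frame (position it via a transform as needed).
        let composited = watermark.composited(over: frame)

        // 3. Render the result into a fresh pixel buffer from the adaptor's pool.
        guard let pool = pixelBufferAdaptor.pixelBufferPool else { return }
        var outputBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &outputBuffer)
        guard let outBuffer = outputBuffer else { return }
        ciContext.render(composited, to: outBuffer)

        // 4. Append with the original presentation timestamp.
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        pixelBufferAdaptor.append(outBuffer, withPresentationTime: pts)
    }
}
```

The adaptor's pixelBufferPool is only available once the writer session has started, which is why the setup is assumed to have happened before frames arrive.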
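For the third approach, this is a bare-bones skeleton of a class conforming to AVVideoCompositing. It just passes the source frame through; the marked spot is where your per-frame compositing (Core Image, Metal, etc.) would go, and the single-track lookup and error handling are deliberately simplified.

```swift
import AVFoundation
import CoreVideo

class CustomCompositor: NSObject, AVVideoCompositing {

    let sourcePixelBufferAttributes: [String: Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    let requiredPixelBufferAttributesForRenderContext: [String: Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        // Cache render size / pixel buffer details here if you need them.
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // Pull the source frame for the first track in this instruction (simplified).
        guard let trackID = request.sourceTrackIDs.first?.int32Value,
              let sourceFrame = request.sourceFrame(byTrackID: trackID) else {
            request.finish(with: NSError(domain: "CustomCompositor", code: -1))
            return
        }

        // Your per-frame compositing goes here. Typically you'd ask
        // request.renderContext.newPixelBuffer() for a destination buffer,
        // draw the source frame plus overlays into it, and finish with that.
        // This sketch simply passes the source frame through untouched.
        request.finish(withComposedVideoFrame: sourceFrame)
    }

    func cancelAllPendingVideoCompositionRequests() {
        // Cancel any in-flight work here if you dispatch rendering to a queue.
    }
}
```

You then set customVideoCompositorClass on your AVMutableVideoComposition to this class so AVFoundation calls it for every frame.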
If you need further info, add some clarification on what you're trying to do and I may be able to expand this answer with more detail on the appropriate approach.