I have now fully set up the ability to record video using the AVFoundation framework, and this all works fine, but now I am looking to add an overlay during recording (also visible on the AVCaptureVideoPreviewLayer).
I can add this overlay UIView object to the video preview layer, but I am struggling to get the same view onto the recorded video. This UIView could contain anything from UILabels to UIImageViews.
You want to overlay UIViews, but if you don't mind using CALayers you could add your overlay after export using AVAssetExportSession's AVVideoComposition property. It has a property, AVVideoCompositionCoreAnimationTool *animationTool, which lets you add animating CALayers to your output, although I think you're out of luck if your overlay's appearance can't be described by a CABasicAnimation. Your example of a display heading may be possible, although I imagine something as simple as a current time counter would not. If you can live with this restriction, the WWDC 2010 code sample 'AVEditDemo' is a good starting point.
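As a rough sketch of that first approach, the layer tree and animation tool can be wired up like this at export time (names such as videoSize, asset, and exportSession are assumptions standing in for objects you already have):

```objectivec
// Build a parent layer holding the video layer plus an overlay layer.
CALayer *overlayLayer = [CALayer layer];
overlayLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
overlayLayer.contents = (id)[UIImage imageNamed:@"watermark"].CGImage; // hypothetical image

CALayer *videoLayer = [CALayer layer];
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);

CALayer *parentLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];

// The animation tool composites the overlay onto every output frame.
AVMutableVideoComposition *videoComposition =
    [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                           inLayer:parentLayer];
exportSession.videoComposition = videoComposition;
```

Anything you want animated on the overlay layer would be added as a CAAnimation with beginTime AVCoreAnimationBeginTimeAtZero, since the export runs on the movie's timeline rather than the wall clock.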
If you need more control, you could manually render the overlay UIView onto the captured frames using [view.layer renderInContext:contextToThenRenderToFrame] and then write these frames to file using AVAssetWriter (once you capture frames to memory, you can no longer use AVCaptureMovieFileOutput).
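A minimal sketch of that per-frame compositing, assuming it runs inside your captureOutput:didOutputSampleBuffer:fromConnection: delegate method and that overlayView is your overlay (a BGRA pixel format is also assumed):

```objectivec
// Wrap the captured pixel buffer in a bitmap context and draw the overlay on top.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(
    CVPixelBufferGetBaseAddress(pixelBuffer),
    CVPixelBufferGetWidth(pixelBuffer),
    CVPixelBufferGetHeight(pixelBuffer),
    8,
    CVPixelBufferGetBytesPerRow(pixelBuffer),
    colorSpace,
    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

// Render the overlay view's layer directly into the camera frame.
[overlayView.layer renderInContext:context];

CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

// The modified buffer can now be appended via an
// AVAssetWriterInputPixelBufferAdaptor attached to your AVAssetWriter.
```

Note that renderInContext: happens on the CPU, so this can become a bottleneck at high frame rates.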
Warning: the frames you are capturing may not arrive at a uniform rate; the rate depends on ambient lighting and even system load. If your overlay changes at a higher rate than the captured video, then you will need to repeat frames in the second solution. This is handled for you by AVVideoComposition in the first solution.
P.S. Solution two is fiddly, but without going into details, iOS 7 seems to have made this a lot easier.
I am not sure if this is what you are looking for, but I guess you can use Brad Larson's GPUImage library; there is a class called GPUImageUIElement which lets you add overlays and views. Please check out the examples, especially the one called FilterShowcase, and scroll to the filter called UIElement.
Here is some sample code:
else if (filterType == GPUIMAGE_UIELEMENT)
{
    GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
    blendFilter.mix = 1.0;

    NSDate *startTime = [NSDate date];

    // Build the label that will be rendered over the video.
    UILabel *timeLabel = [[UILabel alloc] initWithFrame:CGRectMake(0.0f, 0.0f, 240.0f, 320.0f)];
    timeLabel.font = [UIFont systemFontOfSize:17.0f];
    timeLabel.text = @"Time: 0.0 s";
    timeLabel.textAlignment = NSTextAlignmentCenter;
    timeLabel.backgroundColor = [UIColor clearColor];
    timeLabel.textColor = [UIColor whiteColor];

    uiElementInput = [[GPUImageUIElement alloc] initWithView:timeLabel];

    // Blend the camera filter output with the rendered UI element.
    [filter addTarget:blendFilter];
    [uiElementInput addTarget:blendFilter];
    [blendFilter addTarget:filterView];

    __unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
    [filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
        timeLabel.text = [NSString stringWithFormat:@"Time: %f s", -[startTime timeIntervalSinceNow]];
        // Re-render the label for every processed frame.
        [weakUIElementInput update];
    }];
}
You can find it in the AVFoundation Programming Guide, in the editing part. There is a section that deals with overlaying images. I will try to put up a sample code/project.