I'm recording video and audio using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. In the captureOutput:didOutputSampleBuffer:fromConnection: delegate method, I want to draw text onto each individual sample buffer I receive from the video connection. The text changes roughly every frame (it's a stopwatch label), and I want it recorded on top of the captured video data.
Here's what I've been able to come up with so far:
//1.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
//2.
UIImage *textImage = [self createTextImage];
CIImage *maskImage = [CIImage imageWithCGImage:textImage.CGImage];
//3.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)colorSpace forKey:kCIImageColorSpace];
CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:options];
//4.
CIFilter *filter = [CIFilter filterWithName:@"CIBlendWithMask"];
[filter setValue:inputImage forKey:@"inputImage"];
[filter setValue:maskImage forKey:@"inputMaskImage"];
CIImage *outputImage = [filter outputImage];
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
//5.
[self.renderContext render:outputImage toCVPixelBuffer:pixelBuffer bounds:[outputImage extent] colorSpace:CGColorSpaceCreateDeviceRGB()];
//6.
[self.pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:timestamp];
To explain the numbered steps:

1. I get the pixel buffer from the sample buffer.
2. I create a UIImage with the text drawn on it; that's what createTextImage does. I was able to verify that this step works; I saved an image with text drawn to it to my photos.
3./4. I create a CIImage from the original pixel buffer and apply CIBlendWithMask, setting the input image as the one created from the original pixel buffer and the input mask as the CIImage made from the image with text drawn on it.
5. I render the filter output back into the pixel buffer. The CIContext (self.renderContext) was created beforehand with [CIContext contextWithOptions:nil];.
6. I append the pixel buffer to the pixelBufferAdaptor with the appropriate timestamp.

The video that's saved at the end of recording has no visible changes to it, i.e. no mask image has been drawn onto the pixel buffers.
Anyone have any idea where I'm going wrong here? I've been stuck on this for days, any help would be so appreciated.
EDIT:
- (UIImage *)createTextImage {
UIGraphicsBeginImageContextWithOptions(CGSizeMake(self.view.bounds.size.width, self.view.bounds.size.height), NO, 1.0);
NSMutableAttributedString *timeStamp = [[NSMutableAttributedString alloc]initWithString:self.timeLabel.text attributes:@{NSForegroundColorAttributeName:self.timeLabel.textColor, NSFontAttributeName: self.timeLabel.font}];
NSMutableAttributedString *countDownString = [[NSMutableAttributedString alloc]initWithString:self.cDownLabel.text attributes:@{NSForegroundColorAttributeName:self.cDownLabel.textColor, NSFontAttributeName:self.cDownLabel.font}];
[timeStamp drawAtPoint:self.timeLabel.center];
[countDownString drawAtPoint:self.view.center];
UIImage *blank = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return blank;
}
Do you want it to look like the overlay below? Instead of using CIBlendWithMask, you should use CISourceOverCompositing: CIBlendWithMask uses the mask to blend between an input image and a background image, neither of which is your text, whereas CISourceOverCompositing composites the text image (via its alpha channel) directly over the video frame. Try this:
//4.
CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
[filter setValue:maskImage forKey:kCIInputImageKey];
[filter setValue:inputImage forKey:kCIInputBackgroundImageKey];
CIImage *outputImage = [filter outputImage];
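For context, here is how the corrected filter step might fit into the full delegate method. This is a sketch, not a drop-in implementation: it reuses the question's self.renderContext, self.pixelBufferAdaptor, and createTextImage, and it assumes the text image has a transparent background (the question's UIGraphicsBeginImageContextWithOptions call already passes opaque = NO) so the frame shows through around the text.

```objectivec
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    // The text image must come from a non-opaque context so the
    // unpainted areas stay transparent.
    UIImage *textImage = [self createTextImage];
    CIImage *overlay = [CIImage imageWithCGImage:textImage.CGImage];
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Composite the text (foreground) over the video frame (background).
    CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
    [filter setValue:overlay forKey:kCIInputImageKey];
    [filter setValue:frame forKey:kCIInputBackgroundImageKey];
    CIImage *outputImage = [filter outputImage];

    // Render the composited result back into the same pixel buffer,
    // then hand it to the writer.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    [self.renderContext render:outputImage
               toCVPixelBuffer:pixelBuffer
                        bounds:[outputImage extent]
                    colorSpace:colorSpace];
    CGColorSpaceRelease(colorSpace); // avoid leaking a color space per frame

    [self.pixelBufferAdaptor appendPixelBuffer:pixelBuffer
                          withPresentationTime:timestamp];
}
```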