I'm working on scanning the front camera input for faces, detecting them, and getting them as UIImage objects. I'm using AVFoundation to capture video and detect the faces.
Like this:
let input = try AVCaptureDeviceInput(device: captureDevice)
captureSession = AVCaptureSession()
captureSession!.addInput(input)
output = AVCaptureMetadataOutput()
captureSession?.addOutput(output)
output.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
output.metadataObjectTypes = [AVMetadataObjectTypeFace]
videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
videoPreviewLayer?.frame = view.layer.bounds
view.layer.addSublayer(videoPreviewLayer!)
captureSession?.startRunning()
In the delegate method didOutputMetadataObjects I get the face as an AVMetadataFaceObject and highlight it with a red frame like this:
let metadataObj = metadataObjects[0] as! AVMetadataFaceObject
let faceObject = videoPreviewLayer?.transformedMetadataObjectForMetadataObject(metadataObj)
faceFrame?.frame = faceObject!.bounds
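For reference, that snippet lives in the metadata delegate callback, roughly like this (Swift 2-era signature; the empty-array guard is just a safety check added here):
func captureOutput(captureOutput: AVCaptureOutput!,
                   didOutputMetadataObjects metadataObjects: [AnyObject]!,
                   fromConnection connection: AVCaptureConnection!) {
    // Nothing to do if no face was detected in this frame.
    guard let metadataObj = metadataObjects.first as? AVMetadataFaceObject else { return }
    // Convert the metadata object's coordinates into the preview layer's coordinate space.
    let faceObject = videoPreviewLayer?.transformedMetadataObjectForMetadataObject(metadataObj)
    faceFrame?.frame = faceObject!.bounds
}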
The question is: how can I get the faces as UIImage objects?
I've tried to hook into didOutputSampleBuffer, but it isn't called at all.
I did the same thing using didOutputSampleBuffer, in Objective-C. It looks like this:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:(__bridge NSDictionary *)attachments];
    if (attachments)
        CFRelease(attachments);

    // The EXIF orientation attached to the sample buffer, used by the detector and for the UIImage below.
    NSNumber *orientation = (__bridge NSNumber *)(CMGetAttachment(sampleBuffer, kCGImagePropertyOrientation, NULL));
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                               context:nil
                                               options:@{ CIDetectorAccuracy: CIDetectorAccuracyHigh }];
    NSArray *features = [detector featuresInImage:ciImage
                                          options:@{ CIDetectorImageOrientation: orientation }];
    if (features.count == 1) {
        CIFaceFeature *faceFeature = [features firstObject];
        CGRect faceRect = faceFeature.bounds;
        CGImageRef tempImage = [[CIContext contextWithOptions:nil] createCGImage:ciImage fromRect:ciImage.extent];
        UIImage *image = [UIImage imageWithCGImage:tempImage scale:1.0 orientation:orientation.intValue];
        UIImage *face = [image extractFace:faceRect];
        // createCGImage returns a +1 reference, so release it once the UIImage has been created.
        CGImageRelease(tempImage);
    }
}
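One thing to watch out for: CIFaceFeature.bounds is expressed in Core Image's coordinate space, whose origin is at the bottom-left, while CGImageCreateWithImageInRect expects a rect with a top-left origin, so depending on the buffer's orientation you may need to flip the Y coordinate of faceRect before cropping.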
where extractFace: is a category method on UIImage:
- (UIImage *)extractFace:(CGRect)rect {
    // CGImageCreateWithImageInRect works in pixels, so convert the rect from points first.
    rect = CGRectMake(rect.origin.x * self.scale,
                      rect.origin.y * self.scale,
                      rect.size.width * self.scale,
                      rect.size.height * self.scale);
    CGImageRef imageRef = CGImageCreateWithImageInRect(self.CGImage, rect);
    UIImage *result = [UIImage imageWithCGImage:imageRef scale:self.scale orientation:self.imageOrientation];
    CGImageRelease(imageRef);
    return result;
}
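If you want the same helper on the Swift side (to match the question's code), a rough Swift 2-era equivalent would be:
extension UIImage {
    // Crop a region given in points out of the image, converting it to pixel coordinates first.
    func extractFace(rect: CGRect) -> UIImage? {
        let pixelRect = CGRectMake(rect.origin.x * self.scale,
                                   rect.origin.y * self.scale,
                                   rect.size.width * self.scale,
                                   rect.size.height * self.scale)
        guard let imageRef = CGImageCreateWithImageInRect(self.CGImage, pixelRect) else { return nil }
        return UIImage(CGImage: imageRef, scale: self.scale, orientation: self.imageOrientation)
    }
}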
Creating the video output (an AVCaptureVideoDataOutput has to be added to the session with a sample buffer delegate, otherwise didOutputSampleBuffer is never called):
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCMPixelFormat_32BGRA] };
videoOutput.alwaysDiscardsLateVideoFrames = YES;
self.videoOutputQueue = dispatch_queue_create("OutputQueue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:self.videoOutputQueue];
[self.session addOutput:videoOutput];
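In Swift 2, to match the question's code, the same setup looks roughly like this; note that the class has to adopt AVCaptureVideoDataOutputSampleBufferDelegate and the output should be added before startRunning():
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
videoOutput.alwaysDiscardsLateVideoFrames = true
// didOutputSampleBuffer is delivered on this queue once the delegate is set and the output is attached.
let videoOutputQueue = dispatch_queue_create("OutputQueue", DISPATCH_QUEUE_SERIAL)
videoOutput.setSampleBufferDelegate(self, queue: videoOutputQueue)
if captureSession!.canAddOutput(videoOutput) {
    captureSession!.addOutput(videoOutput)
}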
An alternative is to snapshot what is on screen and cut the face out of it:
- (UIImage *)screenshot {
    CGSize size = CGSizeMake(faceFrame.frame.size.width, faceFrame.frame.size.height);
    UIGraphicsBeginImageContextWithOptions(size, NO, [UIScreen mainScreen].scale);
    CGRect rec = CGRectMake(faceFrame.frame.origin.x, faceFrame.frame.origin.y,
                            faceFrame.frame.size.width, faceFrame.frame.size.height);
    [_viewController.view drawViewHierarchyInRect:rec afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
Then, taking a cue from the code above, crop the face out of the screenshot:
let contextImage: UIImage = <<screenshot>>!
// x, y, width, height: the face rect in pixels (multiply the point values by the image scale, as in extractFace: above).
let cropRect = CGRectMake(x, y, width, height)
if let imageRef = CGImageCreateWithImageInRect(contextImage.CGImage, cropRect) {
    let image = UIImage(CGImage: imageRef, scale: contextImage.scale, orientation: contextImage.imageOrientation)
}
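Putting that together with the question's faceFrame, a sketch could look like this (it assumes a screenshot() helper that snapshots the whole view, i.e. a Swift port of the approach above drawn over the full view bounds):
// Hypothetical helper: a full-view snapshot rendered at screen scale.
let contextImage = screenshot()
let scale = contextImage.scale
// Convert the face frame from points to pixels before cropping.
let cropRect = CGRectMake(faceFrame!.frame.origin.x * scale,
                          faceFrame!.frame.origin.y * scale,
                          faceFrame!.frame.size.width * scale,
                          faceFrame!.frame.size.height * scale)
if let imageRef = CGImageCreateWithImageInRect(contextImage.CGImage, cropRect) {
    let faceImage = UIImage(CGImage: imageRef, scale: scale, orientation: contextImage.imageOrientation)
}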