I'm trying to implement capturing images and video in my app, and as of iOS 10 AVCaptureStillImageOutput is deprecated.
Please help me implement AVCapturePhotoOutput in Objective-C.
Here is my sample code:
_avCaptureOutput = [[AVCapturePhotoOutput alloc] init];
_avSettings = [AVCapturePhotoSettings photoSettings];

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
[captureSession startRunning];

AVCaptureConnection *connection = [self.movieOutput connectionWithMediaType:AVMediaTypeVideo];

if (connection.active)
{
    // connection is active
    NSLog(@"Connection is active");

    id previewPixelType = _avSettings.availablePreviewPhotoPixelFormatTypes.firstObject;
    NSDictionary *format = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : previewPixelType,
                             (NSString *)kCVPixelBufferWidthKey : @160,
                             (NSString *)kCVPixelBufferHeightKey : @160};
    _avSettings.previewPhotoFormat = format;

    [_avCaptureOutput capturePhotoWithSettings:_avSettings delegate:self];
}
else
{
    NSLog(@"Connection is not active");
    // connection is not active
    // try to change self.captureSession.sessionPreset,
    // or change videoDevice.activeFormat
}
Create the photo output and the settings, then trigger the capture (the full session setup is sketched below):

_avCaptureOutput = [[AVCapturePhotoOutput alloc] init];
_avSettings = [AVCapturePhotoSettings photoSettings];

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
[captureSession startRunning];

[self.avCaptureOutput capturePhotoWithSettings:self.avSettings delegate:self];
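The snippet above leaves out the session configuration. As a rough sketch (assuming the default camera device and the same _avCaptureOutput ivar; adapt to your own setup), the device input and the photo output have to be added to the session before capturePhotoWithSettings:delegate: can succeed:

// Minimal session setup sketch -- error handling abbreviated.
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

// Use the default video (camera) device; pick front/back as needed.
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (input && [captureSession canAddInput:input]) {
    [captureSession addInput:input];
}

// The photo output must be attached to the session before capturing.
_avCaptureOutput = [[AVCapturePhotoOutput alloc] init];
if ([captureSession canAddOutput:_avCaptureOutput]) {
    [captureSession addOutput:_avCaptureOutput];
}

[captureSession startRunning];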
self must conform to the AVCapturePhotoCaptureDelegate protocol.
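For example, assuming the capture code lives in a view controller (CameraViewController is just a placeholder name, and the property names mirror the ivars used above), the conformance might be declared like this:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface CameraViewController () <AVCapturePhotoCaptureDelegate>

@property (nonatomic, strong) AVCapturePhotoOutput *avCaptureOutput;
@property (nonatomic, strong) AVCapturePhotoSettings *avSettings;

@end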
#pragma mark - AVCapturePhotoCaptureDelegate
- (void)captureOutput:(AVCapturePhotoOutput *)captureOutput
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
     resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
      bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
                error:(NSError *)error
{
    if (error) {
        NSLog(@"error : %@", error.localizedDescription);
    }

    if (photoSampleBuffer) {
        NSData *data = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
                                                                   previewPhotoSampleBuffer:previewPhotoSampleBuffer];
        UIImage *image = [UIImage imageWithData:data];
    }
}
Now you have the image and can do whatever you want with it.
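For example, inside the delegate callback you could hand the image to the UI or save it. This is only a sketch: imageView is a hypothetical UIImageView outlet, and because the delegate callbacks are not guaranteed to arrive on the main queue, dispatch to it before touching UIKit:

if (image) {
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;   // imageView is a hypothetical outlet, not part of the original code
    });

    // Or write it straight to the photo album (requires photo library permission).
    UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);
}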
Note: since iOS 11, -captureOutput:didFinishProcessingPhotoSampleBuffer:... is deprecated; use -captureOutput:didFinishProcessingPhoto:error: instead:
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(nullable NSError *)error
{
    NSData *imageData = [photo fileDataRepresentation];
    UIImage *image = [UIImage imageWithData:imageData];
    // ...
}
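On iOS 11 and later you can also ask for a particular codec when building the settings. A small sketch that requests JPEG explicitly, checking availablePhotoCodecTypes first (AVVideoCodecTypeJPEG requires iOS 11; adapt the property names to your own code):

AVCapturePhotoSettings *settings;
if ([self.avCaptureOutput.availablePhotoCodecTypes containsObject:AVVideoCodecTypeJPEG]) {
    // Request JPEG output explicitly.
    settings = [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey : AVVideoCodecTypeJPEG}];
} else {
    settings = [AVCapturePhotoSettings photoSettings];
}
[self.avCaptureOutput capturePhotoWithSettings:settings delegate:self];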