I was wondering if it's possible to add multiple AVCaptureVideoDataOutput objects to an AVCaptureSession with a single camera device input.
My experiments indicate that adding a second AVCaptureVideoDataOutput causes canAddOutput: to return NO. But I couldn't find anything in Apple's documentation that says multiple data outputs are disallowed.
We cannot use a single AVCaptureSession for multiple AVCaptureVideoDataOutput objects.
What you can do is create multiple AVCaptureVideoDataOutput objects, each paired with its own AVCaptureSession. You can create two different setups of AVCaptureVideoDataOutput and AVCaptureSession, use them one after another in the app, and achieve the goal that way.
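For example, a minimal sketch like the following (not part of my app code; it assumes a single back-camera input) illustrates the behavior: the session accepts the first AVCaptureVideoDataOutput, but canAddOutput: returns NO for the second one.

AVCaptureSession *session = [[AVCaptureSession alloc] init];

NSError *error = nil;
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
if ([session canAddInput:input]) {
    [session addInput:input];
}

AVCaptureVideoDataOutput *firstOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:firstOutput]) {
    [session addOutput:firstOutput]; // the first video data output is accepted
}

AVCaptureVideoDataOutput *secondOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:secondOutput]) {
    [session addOutput:secondOutput];
} else {
    NSLog(@"canAddOutput: returned NO for the second AVCaptureVideoDataOutput");
}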
In my case, I had to capture a front and a back image using the camera, one after the other.
I created two different sets of AVCaptureSession and output objects (AVCaptureStillImageOutput here, since I was capturing still photos), as given below.
/* Back camera settings */
@property BOOL isBackRecording;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInputBack;
@property (strong, nonatomic) AVCaptureStillImageOutput *imageOutputBack;
@property (strong, nonatomic) AVCaptureSession *sessionBack;

/* Front camera settings */
@property BOOL isFrontRecording;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInputFront;
@property (strong, nonatomic) AVCaptureStillImageOutput *imageOutputFront;
@property (strong, nonatomic) AVCaptureSession *sessionFront;
Now, in viewDidLoad, I initially set up the back camera and initialize the flags that track whether each session is currently capturing, as follows.
- (void)viewDidLoad {
    [super viewDidLoad];

    [self setupBackAVCapture];

    self.isFrontRecording = NO;
    self.isBackRecording = NO;
}
- (void)setupBackAVCapture
{
    NSError *error = nil;

    self.sessionBack = [[AVCaptureSession alloc] init];
    self.sessionBack.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    self.videoInputBack = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    [self.sessionBack addInput:self.videoInputBack];

    self.imageOutputBack = [[AVCaptureStillImageOutput alloc] init];
    [self.sessionBack addOutput:self.imageOutputBack];
}
Now, whenever the user taps the capture button, we capture the back photo first using the code below.
- (IBAction)buttonCapture:(id)sender {
    [self takeBackPhoto];
}
- (void)takeBackPhoto
{
    [self.sessionBack startRunning];

    if (!self.isBackRecording) {
        self.isBackRecording = YES;

        // AudioServicesPlaySystemSound requires <AudioToolbox/AudioToolbox.h>.
        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

        AVCaptureConnection *videoConnection = [self.imageOutputBack connectionWithMediaType:AVMediaTypeVideo];
        if (videoConnection == nil) {
            self.isBackRecording = NO;
            return;
        }

        [self.imageOutputBack
         captureStillImageAsynchronouslyFromConnection:videoConnection
         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
             // Reset the flag once the capture has finished (or failed).
             self.isBackRecording = NO;

             if (imageDataSampleBuffer == NULL) {
                 return;
             }

             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
             UIImage *image = [[UIImage alloc] initWithData:imageData];

             UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
             [self.imageView setImage:image];

             [self.sessionBack stopRunning];

             // Set up the front camera session and capture the front photo.
             [self setupFrontAVCapture];
             [self takeFrontPhoto];
         }];
    }
}
As soon as the back image is captured, we set up the session for the front camera using the setupFrontAVCapture method and then capture the front image using the takeFrontPhoto method, as given below.
- (void)setupFrontAVCapture
{
    NSError *error = nil;

    self.sessionFront = [[AVCaptureSession alloc] init];
    self.sessionFront.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *camera = [self cameraWithPosition:AVCaptureDevicePositionFront];

    self.videoInputFront = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    [self.sessionFront addInput:self.videoInputFront];

    self.imageOutputFront = [[AVCaptureStillImageOutput alloc] init];
    [self.sessionFront addOutput:self.imageOutputFront];
}
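The cameraWithPosition: helper used above is not shown in this answer; a minimal sketch of such a helper, written against the same pre-iOS 10 API style as the rest of this code, could look like this.

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    // Walk the available video devices and return the one at the requested position.
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}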
- (void)takeFrontPhoto
{
    [self.sessionFront startRunning];

    if (!self.isFrontRecording) {
        self.isFrontRecording = YES;

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

        AVCaptureConnection *videoConnection = [self.imageOutputFront connectionWithMediaType:AVMediaTypeVideo];
        if (videoConnection == nil) {
            self.isFrontRecording = NO;
            return;
        }

        [self.imageOutputFront
         captureStillImageAsynchronouslyFromConnection:videoConnection
         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
             // Reset the flag once the capture has finished (or failed).
             self.isFrontRecording = NO;

             if (imageDataSampleBuffer == NULL) {
                 return;
             }

             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
             UIImage *image = [[UIImage alloc] initWithData:imageData];

             UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
             [self.imageViewBack setImage:image];

             [self.sessionFront stopRunning];
         }];
    }
}
This way you can use two different sets of AVCaptureSession and AVCaptureStillImageOutput objects to achieve your goal.
Please let me know if anything is unclear.