How can I get autofocus to work in a second AVCaptureSession without recreating the sessions?

Autofocus stops working on the first AVCaptureSession once I create a second AVCaptureSession. The session created second is the one where autofocus works; the session created first never autofocuses.

I would expect either session to be able to autofocus when it is started after the other one has been stopped, in the same way that auto white balance and auto exposure work for both sessions. If you watch the log window while running the sample code below, you can see the key-value observing messages coming through, but never the adjustingFocus message while the top session is running.
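
For reference, the device does report the expected focusMode (see the commented-out log in observeValueForKeyPath: below). A sketch of explicitly re-requesting continuous autofocus on the shared device would look roughly like the following; this is shown only for illustration and is not part of the test code:

// Illustration only: ask the shared device for continuous autofocus again.
NSError *error = nil;
if ([_device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
    if ([_device lockForConfiguration:&error]) {
        [_device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [_device unlockForConfiguration];
    }
    else {
        NSLog(@"Could not lock device for configuration: %@", [error localizedDescription]);
    }
}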

Sidenote: unfortunately, a bug in a third-party library I am using prevents me from simply recreating the sessions entirely as I switch between them (it leaks its AVCaptureSessions, which eventually gets the app killed). The full story is that this library creates one of the capture sessions for me and exposes a public API to start and stop that session, and I want to create another session of my own. The code below demonstrates the problem without using the third-party library.

I've created a test application with the code listed below and a XIB file containing two views, one above the other, plus a button hooked up to the switchSessions method; together they demonstrate the problem.

It may be related to the problem described here, Focus (Autofocus) not working in camera (AVFoundation AVCaptureSession), although no mention is made of two capture sessions.

Header file:

#import <UIKit/UIKit.h>

@class AVCaptureSession;
@class AVCaptureStillImageOutput;
@class AVCaptureVideoPreviewLayer;
@class AVCaptureDevice;
@class AVCaptureDeviceInput;

@interface AVCaptureSessionFocusBugViewController : UIViewController {

    IBOutlet UIView *_topView;
    IBOutlet UIView *_bottomView;

    AVCaptureDevice *_device;

    AVCaptureSession *_topSession;

    AVCaptureStillImageOutput *_outputTopSession;
    AVCaptureVideoPreviewLayer *_previewLayerTopSession;
    AVCaptureDeviceInput *_inputTopSession;

    AVCaptureSession *_bottomSession;

    AVCaptureStillImageOutput *_outputBottomSession;
    AVCaptureVideoPreviewLayer *_previewLayerBottomSession;
    AVCaptureDeviceInput *_inputBottomSession;
}

- (IBAction)switchSessions:(id)sender;

@end

Implementation file:

#import "AVCaptureSessionFocusBugViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface AVCaptureSessionFocusBugViewController ()

- (void)setupCaptureSession:(AVCaptureSession **)session
                     output:(AVCaptureStillImageOutput **)output
               previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                      input:(AVCaptureDeviceInput **)input
                       view:(UIView *)view;

- (void)tearDownSession:(AVCaptureSession **)session
                 output:(AVCaptureStillImageOutput **)output
           previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                  input:(AVCaptureDeviceInput **)input
                   view:(UIView *)view;

@end

@implementation AVCaptureSessionFocusBugViewController

- (IBAction)switchSessions:(id)sender
{
    if ([_topSession isRunning]) {
        [_topSession stopRunning];
        [_bottomSession startRunning];
        NSLog(@"Bottom session now running.");
    }
    else {
        [_bottomSession stopRunning];
        [_topSession startRunning];
        NSLog(@"Top session now running.");
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath 
                      ofObject:(id)object 
                        change:(NSDictionary *)change 
                       context:(void *)context
{
    NSLog(@"Observed value for key at key path %@.", keyPath);
    // Enable to confirm that the focusMode is set correctly.
    //NSLog(@"Autofocus for the device is set to %d.", [_device focusMode]);
}

- (void)viewDidLoad {
    [super viewDidLoad];

    _device = [[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] retain];

    [self setupCaptureSession:&_topSession 
                       output:&_outputTopSession
                 previewLayer:&_previewLayerTopSession
                        input:&_inputTopSession
                         view:_topView];

    [self setupCaptureSession:&_bottomSession 
                       output:&_outputBottomSession
                 previewLayer:&_previewLayerBottomSession
                        input:&_inputBottomSession
                         view:_bottomView];

    // NB: We only need to observe one device, since the top and bottom sessions use the same device.
    [_device addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:nil];
    [_device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:nil];
    [_device addObserver:self forKeyPath:@"adjustingWhiteBalance" options:NSKeyValueObservingOptionNew context:nil];

    [_topSession startRunning];
    NSLog(@"Starting top session.");
}


- (void)setupCaptureSession:(AVCaptureSession **)session
                     output:(AVCaptureStillImageOutput **)output
               previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                      input:(AVCaptureDeviceInput **)input
                       view:(UIView *)view
{    
    *session = [[AVCaptureSession alloc] init];

    // Create the preview layer.
    *previewLayer = [[AVCaptureVideoPreviewLayer layerWithSession:*session] retain];

    [*previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    [*previewLayer setFrame:[view bounds]];

    [[view layer] addSublayer:*previewLayer];

    // Configure the inputs and outputs.
    [*session setSessionPreset:AVCaptureSessionPresetMedium];

    NSError *error = nil;

    *input = [[AVCaptureDeviceInput deviceInputWithDevice:_device error:&error] retain];

    if (!*input) {
        NSLog(@"Error creating input device:%@", [error localizedDescription]);
        return;
    }

    [*session addInput:*input];

    *output = [[AVCaptureStillImageOutput alloc] init];

    [*session addOutput:*output];

    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];

    [*output setOutputSettings:outputSettings];

    [outputSettings release];
}

- (void)viewDidUnload {
    // Stop observing the device before releasing it.
    [_device removeObserver:self forKeyPath:@"adjustingFocus"];
    [_device removeObserver:self forKeyPath:@"adjustingExposure"];
    [_device removeObserver:self forKeyPath:@"adjustingWhiteBalance"];

    [self tearDownSession:&_topSession
                   output:&_outputTopSession
             previewLayer:&_previewLayerTopSession
                    input:&_inputTopSession
                     view:_topView];

    [self tearDownSession:&_bottomSession
                   output:&_outputBottomSession
             previewLayer:&_previewLayerBottomSession
                    input:&_inputBottomSession
                     view:_bottomView];

    [_device release];
    _device = nil;

    [_topView release];
    _topView = nil;

    [_bottomView release];
    _bottomView = nil;

    [super viewDidUnload];
}

- (void)tearDownSession:(AVCaptureSession **)session
                 output:(AVCaptureStillImageOutput **)output
           previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                  input:(AVCaptureDeviceInput **)input
                   view:(UIView *)view
{
    if ([*session isRunning]) {
        [*session stopRunning];
    }

    [*session removeOutput:*output];

    [*output release];
    *output = nil;

    [*session removeInput:*input];

    [*input release];
    *input = nil;

    [*previewLayer removeFromSuperlayer];

    [*previewLayer release];
    *previewLayer = nil;

    [*session release];
    *session = nil;
}

@end

asked Mar 25 '11 by Shane

1 Answer

Apple technical support has confirmed that creating two simultaneous capture sessions is not supported. You must tear down one session and then create the other.
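
No code was included with the answer, but a sketch of what that could look like for the switchSessions: method in the question, reusing its setup and teardown helpers, is below. It assumes viewDidLoad sets up and starts only the top session, and it introduces a hypothetical _usingTopSession BOOL ivar (initialised to YES in viewDidLoad) to track which view is active; it is illustrative rather than a confirmed fix.

// Sketch only: keep a single session alive at a time, tearing the current one
// down and rebuilding the other from scratch on every switch.
- (IBAction)switchSessions:(id)sender
{
    if (_usingTopSession) {
        [self tearDownSession:&_topSession
                       output:&_outputTopSession
                 previewLayer:&_previewLayerTopSession
                        input:&_inputTopSession
                         view:_topView];

        [self setupCaptureSession:&_bottomSession
                           output:&_outputBottomSession
                     previewLayer:&_previewLayerBottomSession
                            input:&_inputBottomSession
                             view:_bottomView];
        [_bottomSession startRunning];
        NSLog(@"Bottom session now running.");
    }
    else {
        [self tearDownSession:&_bottomSession
                       output:&_outputBottomSession
                 previewLayer:&_previewLayerBottomSession
                        input:&_inputBottomSession
                         view:_bottomView];

        [self setupCaptureSession:&_topSession
                           output:&_outputTopSession
                     previewLayer:&_previewLayerTopSession
                            input:&_inputTopSession
                             view:_topView];
        [_topSession startRunning];
        NSLog(@"Top session now running.");
    }
    _usingTopSession = !_usingTopSession;
}

In the asker's real app the third-party library owns one of the sessions, so that session would have to be started and stopped through the library's API rather than rebuilt like this.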

answered Nov 15 '22 by Shane