 

Focus (Autofocus) not working in camera (AVFoundation AVCaptureSession)

I am using the standard AVFoundation classes to capture video and show a preview (http://developer.apple.com/library/ios/#qa/qa1702/_index.html)

Here is my code:

- (void)setupCaptureSession {       
    NSError *error = nil;

    [self setCaptureSession: [[AVCaptureSession alloc] init]]; 

    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                        error:&error];
    if (!input) {
        // TODO: Handle the error when the device input cannot be created
    }
    [[self captureSession] addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];

    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    output.videoSettings = 
    [NSDictionary dictionaryWithObject:
     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];


    output.minFrameDuration = CMTimeMake(1, 15);

    [[self captureSession] startRunning];

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    captureVideoPreviewLayer.frame = previewLayer.bounds;
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
    [previewLayer setHidden:NO];

    mutex = YES;
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection { 
    if (mutex && ![device isAdjustingFocus] && ![device isAdjustingExposure] && ![device isAdjustingWhiteBalance]) {
        // something
    }
}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

Everything works OK, but sometimes it has issues:

  • Camera focus is not working. It's random: sometimes it works, sometimes it doesn't. I tried it on different devices, both iPhone 4 and 3GS, and I tried to google it, but found no results. People only mention broken devices, but I checked three iPhone 4s and one iPhone 3GS, and the problem occurs on all of them.
  • The camera takes quite a long time to load. I am using the ScannerKit API, which also uses the camera for the same purpose, and it loads about twice as fast as my implementation.

Any ideas what the problem could be? The first issue is definitely the more important one.

asked Mar 22 '11 by woojtekr

2 Answers

Old question, but this may save somebody hours of frustration. It's important to set the focus point of interest before calling setFocusMode; otherwise the camera will focus on the previously set focus point. Think of setFocusMode as a commit. The same applies to setExposureMode.

AVCam sample by Apple is totally wrong and broken.
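
Roughly, something like this (a minimal sketch; device is the active AVCaptureDevice and point is a tap location already converted to the 0..1 device coordinate space, both names illustrative):

NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // 1. Set the point of interest first...
    if ([device isFocusPointOfInterestSupported])
        [device setFocusPointOfInterest:point];
    // 2. ...then set the focus mode, which commits the new point.
    if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus])
        [device setFocusMode:AVCaptureFocusModeAutoFocus];

    // Same ordering for exposure: point of interest first, then mode.
    if ([device isExposurePointOfInterestSupported])
        [device setExposurePointOfInterest:point];
    if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
        [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];

    [device unlockForConfiguration];
}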

answered Nov 14 '22 by Rob Zombie


A few points: I've noticed that video presets take longer to initialise than the photo preset.

Are you recording video or taking photos?

I've noticed you have a medium-quality preset but with 32BGRA output. It could work out better to set the session preset to Photo and downsample the image after capture. Also set AVVideoCodecJPEG instead of 32BGRA, as sketched below:

[stillImageOutput setOutputSettings:[NSDictionary dictionaryWithObject:AVVideoCodecJPEG forKey:AVVideoCodecKey]];

Instead of:

[output setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
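
A rough sketch of the photo-preset approach (the stillImageOutput name above and below is just for illustration; memory management follows the MRC style of the question's code):

self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureStillImageOutput *stillImageOutput = [[[AVCaptureStillImageOutput alloc] init] autorelease];
stillImageOutput.outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                                               forKey:AVVideoCodecKey];
if ([self.captureSession canAddOutput:stillImageOutput])
    [self.captureSession addOutput:stillImageOutput];

// When you need a frame, grab a JPEG still and downsample it as needed.
AVCaptureConnection *connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                               completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    UIImage *image = [UIImage imageWithData:jpegData];
    // Downsample or process the image here.
}];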

You also might want to enable subject area change monitoring and register for AVCaptureDeviceSubjectAreaDidChangeNotification so you can force a refocus, if you're changing the focus mode to AVCaptureFocusModeAutoFocus at any point.
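
A rough sketch of that, assuming device is the configured AVCaptureDevice (the subjectAreaDidChange: and refocus names are illustrative):

// During session setup, inside a lockForConfiguration/unlockForConfiguration pair:
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    device.subjectAreaChangeMonitoringEnabled = YES;
    [device unlockForConfiguration];
}
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(subjectAreaDidChange:)
                                             name:AVCaptureDeviceSubjectAreaDidChangeNotification
                                           object:device];

// Elsewhere in the class:
- (void)subjectAreaDidChange:(NSNotification *)notification {
    // The scene in front of the camera changed; force a fresh focus run
    // (see the helper sketched below).
    [self refocus];
}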

You also might want to add code that manually triggers autofocus and then resets it back to automatic, as sometimes this is required.
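
A sketch of such a helper, assuming the device's adjustingFocus key path is observed via KVO so the mode can be switched back once the one-shot run finishes (the refocus name is illustrative):

// Register once during setup so the focus mode can be restored afterwards.
[device addObserver:self forKeyPath:@"adjustingFocus" options:0 context:NULL];

- (void)refocus {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        if ([device isFocusPointOfInterestSupported])
            [device setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
        if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus])
            [device setFocusMode:AVCaptureFocusModeAutoFocus];   // one-shot autofocus
        [device unlockForConfiguration];
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    // Once the one-shot run has finished, go back to continuous autofocus.
    if ([keyPath isEqualToString:@"adjustingFocus"] && ![device isAdjustingFocus]) {
        NSError *error = nil;
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] &&
            [device lockForConfiguration:&error]) {
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
            [device unlockForConfiguration];
        }
    }
}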

I've amended the code to set a focus point of interest and to report camera-configuration errors to a delegate method.

- (void)setupCaptureSession {       
    NSError *error = nil;

    [self setCaptureSession: [[AVCaptureSession alloc] init]]; 

    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]){
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        if ([device isFocusPointOfInterestSupported])
            [device setFocusPointOfInterest:CGPointMake(0.5f,0.5f)];
        [device unlockForConfiguration];
    }else {
        if ([[self delegate]   
              respondsToSelector:@selector(captureManager:didFailWithError:)]) {
                 [[self delegate] captureManager:self didFailWithError:error];
        }
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                        error:&error];
    if (!input) {
        // TODO: Handle the error when the device input cannot be created
    }
    [[self captureSession] addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];

    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    output.videoSettings = 
    [NSDictionary dictionaryWithObject:
     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];


    output.minFrameDuration = CMTimeMake(1, 15);

    [[self captureSession] startRunning];

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    captureVideoPreviewLayer.frame = previewLayer.bounds;
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
    [previewLayer setHidden:NO];

    mutex = YES;
}

answered Nov 14 '22 by Adam Roberts