The Mac OS X app I'm coding takes a photo capture using the MacBook's built-in FaceTime camera. On MacBookAir3,2, MacBookPro8,2, and MacBookPro10,2 it works fine, but on newer MacBooks it takes "dark" photos. I understand this is caused by auto exposure, but I'm having trouble getting it to work: the AVCaptureDevice's adjustingExposure is NO, yet the captured photo is still completely dark.
The code: setupCamera is called once during app launch:
-(void) setupCamera
{
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetPhoto;
    sessionInitialized = YES;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // lockForConfiguration: returns a BOOL; configure only if the lock was acquired
    if ([device lockForConfiguration:NULL]) {
        if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
            [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus])
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance])
            [device setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
        [device unlockForConfiguration];
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input == nil) {  // a nil return, not the error variable, signals failure
        // ...
    }
    if ([session canAddInput:input]) {
        [session addInput:input];
    } else {
        // ...
    }

    output = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
    [output setOutputSettings:outputSettings];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    } else {
        // ...
    }
}
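For completeness: the MyAdjustingExposureObservationContext, MyAdjustingFocusObservationContext and MyAdjustingWhiteBalanceObservationContext tokens used in the KVO calls below are plain context pointers. Their declarations aren't part of the original snippet; a minimal sketch of one common way to define them:

// Assumed declarations (not shown in the original code): unique KVO context markers.
static void *MyAdjustingExposureObservationContext = &MyAdjustingExposureObservationContext;
static void *MyAdjustingFocusObservationContext = &MyAdjustingFocusObservationContext;
static void *MyAdjustingWhiteBalanceObservationContext = &MyAdjustingWhiteBalanceObservationContext;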
... then each click on the snap button in the UI calls the shootPhoto method:
-(void) shootPhoto
{
    [session startRunning];
    if ([device lockForConfiguration:NULL]) {
        if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
            [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus])
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance])
            [device setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
        [device unlockForConfiguration];
    }
    if (device.adjustingFocus == NO && device.adjustingExposure == NO && device.adjustingWhiteBalance == NO) {
        [self actuallyCapture];
    } else {
        [device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:MyAdjustingExposureObservationContext];
        [device addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:MyAdjustingFocusObservationContext];
        [device addObserver:self forKeyPath:@"adjustingWhiteBalance" options:NSKeyValueObservingOptionNew context:MyAdjustingWhiteBalanceObservationContext];
    }
}
-(void) actuallyCapture
{
    if ([session isRunning] == NO)
        return;
    connection = [output connectionWithMediaType:AVMediaTypeVideo];
    [output captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        // save file etc ...
    }];
}
The idea is to check whether the camera device is adjusting focus, exposure, or white balance. If it isn't, call actuallyCapture right away. If it is, add observers and call actuallyCapture from observeValueForKeyPath:ofObject:change:context: instead.
The problem is that the addObserver branch is never reached, because the device reports all the adjustingX properties as NO - and still, the captured photo is dark.
What might be the reason? Am I waiting for the white balance and exposure adjustments properly? This is hard for me to debug because I only own devices on which it works fine.
I managed to solve this issue myself. Here's how I did it:
Set observers for adjustingExposure, adjustingFocus and adjustingWhiteBalance:
[self.device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:MyAdjustingExposureObservationContext];
[self.device addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:MyAdjustingFocusObservationContext];
[self.device addObserver:self forKeyPath:@"adjustingWhiteBalance" options:NSKeyValueObservingOptionNew context:MyAdjustingWhiteBalanceObservationContext];
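When tearing the session down, these registrations should be balanced with matching removals; a minimal sketch, using the same contexts:

[self.device removeObserver:self forKeyPath:@"adjustingExposure" context:MyAdjustingExposureObservationContext];
[self.device removeObserver:self forKeyPath:@"adjustingFocus" context:MyAdjustingFocusObservationContext];
[self.device removeObserver:self forKeyPath:@"adjustingWhiteBalance" context:MyAdjustingWhiteBalanceObservationContext];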
To capture a photo, initialize an AVCaptureSession, but set a 1-second delay timer and only actually capture after it fires:
-(void) shootPhoto
{
    dispatch_async(self.sessionQueue, ^{
        if ([self setupCamera]) {
            self.sessionInitialized = YES;
            [self.session startRunning];
            self.isWaitingToCaptureImage = YES;
            dispatch_async(dispatch_get_main_queue(), ^{
                self.captureDelayTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                                           target:self
                                                                         selector:@selector(actuallyCapture)
                                                                         userInfo:nil
                                                                          repeats:NO];
            });
        }
    });
}
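self.sessionQueue above is a private serial dispatch queue for all session work. Its creation isn't shown here; a minimal sketch, assuming it is set up once (the queue label is a hypothetical placeholder):

// Assumed one-time setup, e.g. in init; a serial queue keeps session calls ordered.
self.sessionQueue = dispatch_queue_create("com.example.camera.session", DISPATCH_QUEUE_SERIAL);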
In observeValueForKeyPath:ofObject:change:context:, check whether all three adjustments have already finished; if they have, cancel the timer set above and shoot the photo:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (!self.sessionInitialized || !self.isWaitingToCaptureImage) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }
    if (context != MyAdjustingExposureObservationContext &&
        context != MyAdjustingFocusObservationContext &&
        context != MyAdjustingWhiteBalanceObservationContext) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }
    if (self.device.adjustingExposure || self.device.adjustingFocus || self.device.adjustingWhiteBalance) {
        NSLog(@"not ready to capture yet");
    } else {
        NSLog(@"ready to capture");
        if (self.captureDelayTimer && self.captureDelayTimer.isValid) {
            [self.captureDelayTimer invalidate];
            self.captureDelayTimer = nil;
        }
        [self actuallyCaptureDispatch];
    }
}
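actuallyCaptureDispatch isn't shown above; a minimal sketch of what it could look like, assuming it simply hops back onto the session queue before running the capture:

-(void) actuallyCaptureDispatch
{
    // Hypothetical implementation - the original only shows the call site.
    dispatch_async(self.sessionQueue, ^{
        self.isWaitingToCaptureImage = NO;
        [self actuallyCapture];
    });
}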