I've implemented a camera using the AVFoundation framework in iOS 4 and 5, but I've been experiencing an intermittent issue where the completion handler block passed to captureStillImageAsynchronouslyFromConnection:completionHandler: is never called. When this happens, the shutter sound does not fire and the preview freezes.
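For reference, the capture method in my camera-handling class looks roughly like this (a simplified sketch; the connection lookup is the standard iOS 4-compatible pattern from Apple's AVCam sample, stillImageOutput and stillImage are ivars, and ARC is assumed):

- (void)captureStillImage {
    // Find the video connection on the still image output.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [stillImageOutput connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) break;
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:
        ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            // This is the block that intermittently never executes.
            NSData *imageData = [AVCaptureStillImageOutput
                jpegStillImageNSDataRepresentation:imageSampleBuffer];
            stillImage = [[UIImage alloc] initWithData:imageData];
            [[NSNotificationCenter defaultCenter]
                postNotificationName:@"FTW_imageCaptured" object:nil];
        }];
}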
I've followed both Apple guides (the WWDC 2010 and 2011 videos) and non-Apple guides (blog and SO posts) to implement still image capture at Photo resolution, with little to no improvement. I've been able to reproduce it somewhat consistently under the following conditions:

- Set flashMode to AVCaptureFlashModeAuto and take a photo in low light: it hangs and never runs the completion block. The flash fires, but the shutter sound does not. With the same code I can take a photo in normal light, where the shutter sound fires, the flash does not, and the completion block runs.
- Leave flashMode unset, or set it to off: I can take some photos, but after a small number (usually between 1 and 5) the completion block stops firing. For clarification, a scenario might be: take a photo (works), take another (works), take a third (the completion block never runs), and every capture after that hangs the same way.
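The flash mode itself is set the standard way (sketch below, where device is the back-camera AVCaptureDevice and error handling is trimmed):

NSError *error = nil;
if ([device isFlashModeSupported:AVCaptureFlashModeAuto] &&
    [device lockForConfiguration:&error]) {
    device.flashMode = AVCaptureFlashModeAuto; // or AVCaptureFlashModeOff
    [device unlockForConfiguration];
}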
My Implementation
The interface is handled in a ViewController, and all the AVFoundation work lives in a separate class; the ViewController holds an instance of that class. This is different from the example implementations, but I don't see why it should cause this kind of occasional bug.
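Roughly, the shape of that class is as follows (a sketch; the class name and everything beyond captureStillImage and stillImage are hypothetical):

@interface CameraController : NSObject {
    AVCaptureSession *session;           // configured with AVCaptureSessionPresetPhoto
    AVCaptureStillImageOutput *stillImageOutput;
    UIImage *stillImage;                 // last captured image
}
- (void)startCamera;                     // hypothetical: starts the session / preview
- (void)stopCamera;                      // hypothetical: tears down the session
- (void)captureStillImage;               // posts @"FTW_imageCaptured" when done
- (UIImage *)stillImage;
@end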
More Observations
When it hangs, checking [stillImageOutput isCapturingStillImage] continues to return NO, and I can keep calling the captureStillImageAsynchronouslyFromConnection:completionHandler: function; it just never fires its completion block.
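In other words, a guard like this (inside the camera class) doesn't prevent it:

// Once it hangs, this keeps returning NO, so the capture call is
// happily accepted, but its completion handler still never runs.
if (![stillImageOutput isCapturingStillImage]) {
    [self captureStillImage];
}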
I would love to know if there's something I could be missing, a property left unset, or a known issue with a workaround.
Thanks.
(I saw a similar post, but none of the answers solved my problem, and I need Photo resolution in this application. For reference: iPhone SDK 4 AVFoundation - How to use captureStillImageAsynchronouslyFromConnection correctly?)
I should have added code to the question; it probably would have helped anyone viewing this. Anyway:
It turns out that the camera flash has a significant enough impact on how long the image takes to process to cause issues when capturing asynchronously. I had accidentally ended the capture session before the notification was sent from the completionHandler block.
- (void)saveImageFromCamera {
    // Listen for the notification the capture completion handler posts.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(imageCapturedFromCamera)
                                                 name:@"FTW_imageCaptured"
                                               object:nil];
    [cameraController captureStillImage];
    // WRONG: I was previously stopping the capture session here, before
    // the completion handler had a chance to run.
}

- (void)imageCapturedFromCamera {
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:@"FTW_imageCaptured"
                                                  object:nil];
    [sharedAppController setBackgroundImage:[cameraController stillImage]];
    sharedAppController.imageFromCamera = YES;
    // CORRECT: end the camera (and stop the session) here, once the
    // capture has fully completed.
    [self endCamera];
    [self updateBackgroundImage];
}
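A variation I considered (an untested sketch, using the same hypothetical ivars as the earlier capture sketch) is to sidestep the timing problem entirely and stop the session from inside the completion handler itself, once the sample buffer has been consumed:

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                              completionHandler:
    ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *imageData = [AVCaptureStillImageOutput
            jpegStillImageNSDataRepresentation:imageSampleBuffer];
        stillImage = [[UIImage alloc] initWithData:imageData];
        // Safe to stop here: the image has already been captured and
        // copied out of the sample buffer.
        [session stopRunning];
        [[NSNotificationCenter defaultCenter]
            postNotificationName:@"FTW_imageCaptured" object:nil];
    }];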
I hope this can help anyone else that runs into something like this.