I'm using the AVFoundation classes to capture the live video stream from the camera and to process the video samples. This works nicely. However, I do have problems properly releasing the AVFoundation instances (capture session, preview layer, input and output) once I'm done.
When I no longer need the session and all associated objects, I stop the capture session and release it. This works most of the time. However, sometimes the app crashes with an EXC_BAD_ACCESS signal raised in the second thread that was created by the dispatch queue (and where the video samples are processed). The crash is mainly caused by my own class instance, which serves as the sample buffer delegate and is freed after I've stopped the capture session.
The Apple documentation mentions the problem: stopping the capture session is an asynchronous operation, i.e. it doesn't happen immediately. In particular, the second thread continues to process video samples and to access instances such as the capture session or the input and output devices.
So how do I properly release the AVCaptureSession and all related instances? Is there a notification that reliably tells me that the AVCaptureSession has finished?
Here's my code:
Declarations:
AVCaptureSession* session;
AVCaptureVideoPreviewLayer* previewLayer;
UIView* view;
Setup of instances:
AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
session = [[AVCaptureSession alloc] init];
AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice: camera error: &error];
[session addInput: input];
AVCaptureVideoDataOutput* output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
[session addOutput: output];
dispatch_queue_t queue = dispatch_queue_create("augm_reality", NULL);
[output setSampleBufferDelegate: self queue: queue];
dispatch_release(queue);
previewLayer = [[AVCaptureVideoPreviewLayer layerWithSession: session] retain];
previewLayer.frame = view.bounds;
[view.layer addSublayer: previewLayer];
[session startRunning];
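For context, the sample buffer delegate callback that runs on this dispatch queue (and where the crash surfaces) looks roughly like this; this is only a minimal sketch, the actual processing is omitted:
// Delegate method of AVCaptureVideoDataOutputSampleBufferDelegate; called on the
// "augm_reality" queue for every captured frame (sketch only).
- (void) captureOutput: (AVCaptureOutput*) captureOutput
 didOutputSampleBuffer: (CMSampleBufferRef) sampleBuffer
        fromConnection: (AVCaptureConnection*) connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... process the frame; if "self" (the delegate) has already been released
    // on the main thread, this is where the EXC_BAD_ACCESS shows up.
}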
Cleanup:
[previewLayer removeFromSuperlayer];
[previewLayer release];
[session stopRunning];
[session release];
I've posted a very similar question in the Apple Developer Forum and got an answer from an Apple employee. He says it's a known problem:
This is a problem with the AVCaptureSession / VideoDataOutput in iOS 4.0-4.1 that has been fixed and will appear in a future update. For the time being, you can work around it by waiting for a short period after stopping the AVCaptureSession, e.g. half a second, before disposing of the session and data output.
He proposes the following code:
dispatch_after(
    dispatch_time(0, 500000000),
    dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), // or main queue, or your own
    ^{
        // Do your work here.
        [session release];
        // etc.
    }
);
I still like the approach with the dispatch queue finalizer better because this code just guesses when the second thread might have finished.
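For reference, here's a minimal sketch of that finalizer approach (the capture_cleanup function name is my own): the dispatch queue retains the session via its context and releases it only once the queue itself has been drained and destroyed, i.e. after the last sample buffer callback has finished.
// Called by libdispatch when the "augm_reality" queue is finally deallocated,
// i.e. after the AVCaptureVideoDataOutput has released it and all pending
// sample buffer callbacks have completed.
static void capture_cleanup(void* p)
{
    AVCaptureSession* session = (AVCaptureSession*)p;
    [session release];  // safe here: no callback can still be running
}

// During setup:
dispatch_queue_t queue = dispatch_queue_create("augm_reality", NULL);
dispatch_set_context(queue, [session retain]);     // the queue keeps the session alive
dispatch_set_finalizer_f(queue, capture_cleanup);  // invoked when the queue is destroyed
[output setSampleBufferDelegate: self queue: queue];
dispatch_release(queue);                           // our reference; the output still holds one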
As per the current Apple docs, [AVCaptureSession stopRunning] is a synchronous operation which blocks until the receiver has completely stopped running. So all these issues shouldn't happen any more.
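Under that newer behaviour, a plain teardown should suffice; a minimal sketch under manual reference counting, matching the question's code:
[previewLayer removeFromSuperlayer];
[previewLayer release];
[session stopRunning];   // blocks until the session has completely stopped
[session release];       // per the docs quoted above, no callbacks should still be in flight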
Solved! Perhaps it is the sequence of actions when initializing the session. This one works for me:
NSError *error = nil;
if (session)
    [session release];
// Create the session
session = [[AVCaptureSession alloc] init];
// Configure the session to produce lower resolution video frames, if your
// processing algorithm can cope. We'll specify medium quality for the
// chosen device.
session.sessionPreset = AVCaptureSessionPresetMedium;
// Find a suitable AVCaptureDevice
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                     error:&error];
if (!input) {
    // Handling the error appropriately.
}
[session addInput:input];
// Create a VideoDataOutput and add it to the session
AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
[session addOutput:output];
// Configure your output.
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
// Specify the pixel format
output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.
output.minFrameDuration = CMTimeMake(1, 15);
previewLayer = [[AVCaptureVideoPreviewLayer layerWithSession:session] retain];
[delegate layerArrived:previewLayer];
NSNotificationCenter *notify = [NSNotificationCenter defaultCenter];
[notify addObserver: self
           selector: @selector(onVideoError:)
               name: AVCaptureSessionRuntimeErrorNotification
             object: session];
[notify addObserver: self
           selector: @selector(onVideoStart:)
               name: AVCaptureSessionDidStartRunningNotification
             object: session];
[notify addObserver: self
           selector: @selector(onVideoStop:)
               name: AVCaptureSessionDidStopRunningNotification
             object: session];
[notify addObserver: self
           selector: @selector(onVideoStop:)
               name: AVCaptureSessionWasInterruptedNotification
             object: session];
[notify addObserver: self
           selector: @selector(onVideoStart:)
               name: AVCaptureSessionInterruptionEndedNotification
             object: session];
// Start the session running to start the flow of data
[session startRunning];
Btw this sequence seems to resolve the synchronous notifications problem :)
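The observer selectors registered above (onVideoStart:, onVideoStop:, onVideoError:) aren't shown in the answer; hypothetical stubs might look like this:
// Hypothetical notification handlers matching the selectors registered above.
- (void) onVideoStart: (NSNotification*) note
{
    NSLog(@"Capture session started or interruption ended: %@", note.name);
}

- (void) onVideoStop: (NSNotification*) note
{
    NSLog(@"Capture session stopped or was interrupted: %@", note.name);
}

- (void) onVideoError: (NSNotification*) note
{
    NSError* error = [note.userInfo objectForKey: AVCaptureSessionErrorKey];
    NSLog(@"Capture session runtime error: %@", error);
}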