I am trying to embed a simple view in my iPhone application to take quick snapshots. Everything works fine, but I am having trouble with the camera's startup time. In an Apple sample project, AVCaptureSession's -startRunning is not executed on the main thread, which seems to be necessary. I set up the capture session during the view's initialization and start it on a separate thread, then add the AVCaptureVideoPreviewLayer in -didMoveToSuperview. Without multithreading everything is fine (the UI is just blocked for about a second), but with GCD the UI sometimes works, and sometimes it takes far too long for the UI to unfreeze or for the preview to appear.
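Roughly, the flow looks like this (simplified sketch of the approach described above, inside my UIView subclass; -setupCaptureSession and the global queue are just placeholders for my actual configuration code):

- (instancetype)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // configure inputs/outputs (placeholder for the actual setup code)
        [self setupCaptureSession];
        // keep the blocking -startRunning call off the main thread
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self.captureSession startRunning];
        });
    }
    return self;
}

- (void)didMoveToSuperview {
    [super didMoveToSuperview];
    AVCaptureVideoPreviewLayer *prevLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    prevLayer.frame = self.bounds;
    [self.layer insertSublayer:prevLayer atIndex:0];
}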
How can I deal with the camera's startup delay in a reliable way, without blocking the main thread (the delay itself is not the problem)?
I hope you guys understand my problem :D
Thanks in advance!
BTW: Here is my proof-of-concept project (without GCD), which I am now reusing for another app: http://github.com/dariolass/QuickShotView
So I figured it out myself. The following code works for me and produces the least UI freezing:
- (void)willMoveToSuperview:(UIView *)newSuperview {
    // capture session setup
    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.rearCamera error:nil];

    AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                    AVVideoCodecJPEG, AVVideoCodecKey,
                                    nil];
    [newStillImageOutput setOutputSettings:outputSettings];

    AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];

    if ([newCaptureSession canAddInput:newVideoInput]) {
        [newCaptureSession addInput:newVideoInput];
    }
    if ([newCaptureSession canAddOutput:newStillImageOutput]) {
        [newCaptureSession addOutput:newStillImageOutput];
        self.stillImageOutput = newStillImageOutput;
        self.captureSession = newCaptureSession;
    }

    // -startRunning will only return once the session has started (i.e. the camera is ready),
    // so keep it off the main thread
    dispatch_queue_t layerQ = dispatch_queue_create("layerQ", NULL);
    dispatch_async(layerQ, ^{
        [self.captureSession startRunning];

        AVCaptureVideoPreviewLayer *prevLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
        prevLayer.frame = self.previewLayerFrame;
        prevLayer.masksToBounds = YES;
        prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        prevLayer.cornerRadius = PREVIEW_LAYER_EDGE_RADIUS;

        // to make sure we're not modifying the UI on a thread other than the main thread,
        // use dispatch_async with dispatch_get_main_queue
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.layer insertSublayer:prevLayer atIndex:0];
        });
    });
}
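Once the session is running, a still image can then be grabbed from the configured stillImageOutput roughly like this (illustrative sketch, not from the original project; -takeSnapshot and the logging are placeholders for whatever the app does with the snapshot):

- (void)takeSnapshot {
    AVCaptureConnection *videoConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                       completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (!imageDataSampleBuffer) {
            return;
        }
        // convert the JPEG sample buffer into a UIImage
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // hand the result back to the main thread before touching the UI
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"captured snapshot: %@", NSStringFromCGSize(image.size));
        });
    }];
}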