I'm using an AVCaptureVideoPreviewLayer to allow the user to frame a shot from the iPhone camera. So I have an AVCaptureSession with the input as an AVCaptureDeviceInput, and the output as an AVCaptureStillImageOutput.
I also have animations and controls on top of the video feed, but these are slow and jerky because the video behind is running at maximum frame rate and tying up the CPU/GPU.
I'd like to cap the frame rate of the AVCaptureVideoPreviewLayer. I see there's the minFrameDuration property on AVCaptureVideoDataOutput, but I can't find anything similar on AVCaptureVideoPreviewLayer.
I don't think the problem is actually the frame rate, so instead I'll suggest some tips to improve the performance of your app:
1) AVCaptureVideoPreviewLayer is just a CALayer subclass that displays the output from the camera, so it is not possible to cap its frame rate directly.
2) Check that you have positioned your animations in the right place. It depends on what kind of animations you have: if it is a CALayer, the animation layer should be a sublayer of your main canvas view's layer (NOT of the AVCaptureVideoPreviewLayer!); if it is a UIView, it must be a subview of your main canvas view.
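For example, a minimal sketch of the intended hierarchy (the view and layer names here are hypothetical):
// The preview layer and the animated content are siblings under the
// canvas view; nothing is ever added to the preview layer itself.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:previewLayer];
// An animated CALayer goes on the canvas view's layer, added after the
// preview layer so it draws on top.
CALayer *spinnerLayer = [CALayer layer];
spinnerLayer.frame = CGRectMake(20.0, 20.0, 44.0, 44.0);
[self.view.layer addSublayer:spinnerLayer];
// An animated UIView goes on the canvas view itself, which likewise
// places it above the preview layer.
UIView *overlayView = [[UIView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:overlayView];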
3) You can improve the performance of your app by setting the session preset:
[captureSession setSessionPreset:AVCaptureSessionPresetLow];
By default it is set to AVCaptureSessionPresetHigh. The preset only controls the video quality, and at the highest setting performance may suffer, so pick the lowest preset that still looks acceptable for your use case.
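If you only need the lower preset while your animations are running, you can also change it at runtime. This is just a sketch, assuming the session is stored in self.captureSession; beginConfiguration/commitConfiguration batch the change so it is applied atomically:
// Hypothetical helper: lower the preset while heavy UI animation runs.
- (void)lowerPresetForAnimations {
    [self.captureSession beginConfiguration];
    if ([self.captureSession canSetSessionPreset:AVCaptureSessionPresetMedium]) {
        [self.captureSession setSessionPreset:AVCaptureSessionPresetMedium];
    }
    [self.captureSession commitConfiguration];
}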
4) I made my own test app in which a random animation overlays the video preview layer (but as a subview of my main view!), and everything ran smoothly even on my old iPod. Here is my code for initializing the capture session:
// Create a capture session
AVCaptureSession *captureSession = [AVCaptureSession new];
self.captureSession = captureSession;
if ([captureSession canSetSessionPreset:AVCaptureSessionPresetHigh]) {
    [captureSession setSessionPreset:AVCaptureSessionPresetHigh];
}
else {
    // HANDLE ERROR
}
// Find a suitable capture device (the default back camera)
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
// Create and add a device input
NSError *error = nil;
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&error];
if ([captureSession canAddInput:videoInput]) {
    [captureSession addInput:videoInput];
}
else {
    // HANDLE ERROR (inspect `error`)
}
// Create and add a still image output. The KVO context
// (AVCaptureStillImageIsCapturingStillImageContext) is a static void *
// defined elsewhere in the class; observing capturingStillImage lets you
// react (e.g. flash the screen) when a photo is being taken.
AVCaptureStillImageOutput *stillImageOutput = [AVCaptureStillImageOutput new];
[stillImageOutput addObserver:self
                   forKeyPath:@"capturingStillImage"
                      options:NSKeyValueObservingOptionNew
                      context:AVCaptureStillImageIsCapturingStillImageContext];
if ([captureSession canAddOutput:stillImageOutput]) {
    [captureSession addOutput:stillImageOutput];
}
else {
    // HANDLE ERROR
}
// Set up the preview layer for the camera and add it as a sublayer of
// the camera canvas view's layer
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:previewLayer];
// Start the session
[captureSession startRunning];
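Since the session above adds an AVCaptureStillImageOutput but never captures from it, here is a sketch of actually taking the photo, assuming you keep the output in a property named self.stillImageOutput (a name I'm introducing for illustration):
// Hypothetical capture method; `self.stillImageOutput` is assumed to be
// the still image output created during session setup.
- (void)captureStillImage {
    AVCaptureConnection *connection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer == NULL) {
                // HANDLE ERROR (inspect `error`)
                return;
            }
            NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:jpegData];
            // Use `image`: save it, show it, etc.
        }];
}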
5) And finally, as of iOS 5 you can set minimum and maximum video frame rates, which can also improve performance; I suspect this is what you were asking for. See "Setting minimum and maximum video frame rate" in the AVFoundation release notes:
http://developer.apple.com/library/mac/#releasenotes/AudioVideo/RN-AVFoundation/_index.html
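As a sketch of that iOS 5 API (it lives on AVCaptureConnection and was later superseded by AVCaptureDevice's activeVideoMin/MaxFrameDuration; here it is applied to the still image output's connection for illustration, and whether it throttles the on-screen preview as well is worth verifying on a device):
// Lock a connection to roughly 15 fps. videoMinFrameDuration is the
// minimum time between frames, so it caps the maximum frame rate;
// videoMaxFrameDuration bounds the frame rate from the other side.
AVCaptureConnection *connection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
CMTime frameDuration = CMTimeMake(1, 15);
if (connection.supportsVideoMinFrameDuration) {
    connection.videoMinFrameDuration = frameDuration;
}
if (connection.supportsVideoMaxFrameDuration) {
    connection.videoMaxFrameDuration = frameDuration;
}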
Hope that my answer was clear.
Best wishes,
Artem