 

iOS: capturing images using AVFoundation

I'm capturing images using this code:

#pragma mark - image capture

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input)
    {
        NSLog(@"PANIC: no media input");
        return; // Don't add a nil input to the session.
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];

    // Configure your output. Sample buffers are delivered on this serial queue.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format
    output.videoSettings =
        [NSDictionary dictionaryWithObject:
            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.

    // Start the session running to start the flow of data
    [session startRunning];

    // Assign session to an ivar.
    [self setSession:session];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"captureOutput: didOutputSampleBufferFromConnection");

    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // < Add your code here that uses the image >
    // This delegate runs on the background queue we created above,
    // so hop to the main thread before touching UIKit.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.imageView setImage:image];
        [self.view setNeedsDisplay];
    });
}

// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    NSLog(@"imageFromSampleBuffer: called");

    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

- (void)setSession:(AVCaptureSession *)session
{
    NSLog(@"setting session...");
    self.captureSession = session;
}

The capturing code works. But I need to change two things: I want to show the live video stream from the camera in my view, and grab an image from it at a fixed interval (for example, every 5 seconds). How can this be done?

asked Jan 19 '12 by Oleg


1 Answer

Add the following line

output.minFrameDuration = CMTimeMake(5, 1); // 5/1 = 5 seconds per frame, i.e. one frame every 5 seconds

below the comment

// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.

but above the

[session startRunning]; 
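
Putting it together, the relevant stretch of -setupCaptureSession from the question looks like this (a sketch; note that minFrameDuration on the output was later deprecated, see Edit 2 below):

    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.
    output.minFrameDuration = CMTimeMake(5, 1); // one frame every 5 seconds

    // Start the session running to start the flow of data
    [session startRunning];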

Edit

Use the following code to preview the camera output.

AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
UIView *aView = self.view;
CGRect videoRect = CGRectMake(0.0, 0.0, 320.0, 150.0);
previewLayer.frame = videoRect; // Assume you want the preview layer to fill the view.
[aView.layer addSublayer:previewLayer];
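
Since the 320x150 rect doesn't match the camera's aspect ratio, the preview may look stretched. If that matters, you can also set the layer's videoGravity (an optional addition, not in the original answer):

// Crop to fill the rect instead of stretching the video.
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;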

Edit 2: OK, fine.

Apple has deprecated minFrameDuration on the output and instead provides a way to set the frame duration on the AVCaptureConnection.

So now, use the following code to set the frame duration:

AVCaptureConnection *conn = [output connectionWithMediaType:AVMediaTypeVideo];

if (conn.supportsVideoMinFrameDuration)
    conn.videoMinFrameDuration = CMTimeMake(5, 1);
if (conn.supportsVideoMaxFrameDuration)
    conn.videoMaxFrameDuration = CMTimeMake(5, 1);
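
If you'd rather keep the preview running at full frame rate and only grab a still every 5 seconds, an alternative (a sketch, not part of the original answer) is to leave the frame duration alone and rate-limit inside the sample buffer delegate. Here lastCaptureTime is an assumed CMTime ivar, initialized to kCMTimeInvalid:

// Sketch: replace the delegate method from the question with a
// version that only converts a frame every 5 seconds.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMTime now = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    // Take a frame if this is the first one, or 5+ seconds have elapsed.
    if (CMTIME_IS_INVALID(lastCaptureTime) ||
        CMTimeGetSeconds(CMTimeSubtract(now, lastCaptureTime)) >= 5.0)
    {
        lastCaptureTime = now;
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.imageView setImage:image];
        });
    }
}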
answered Sep 21 '22 by Ilanchezhian