
AVCaptureSession only getting one frame on iPhone 3GS

I have a piece of code that sets up a capture session from the camera, processes the frames using OpenCV, and then sets the image property of a UIImageView with a UIImage generated from each frame. When the app starts, the image view's image is nil, and no frames show up until I push another view controller onto the stack and then pop it off. Then the image stays the same until I do it again. NSLog statements show that the callback is called at approximately the correct frame rate. Any ideas why the frames don't show up? I reduced the frame rate all the way down to 2 frames per second. Is it not processing fast enough?

Here's the code:

- (void)setupCaptureSession {
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your 
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetLow;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    output.alwaysDiscardsLateVideoFrames = YES;
    [session addOutput:output];

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format
    output.videoSettings = 
    [NSDictionary dictionaryWithObject:
     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];


    // If you wish to cap the frame rate to a known value, set minFrameDuration.
    // CMTimeMake(1, 1) limits capture to at most 1 frame per second.
    output.minFrameDuration = CMTimeMake(1, 1);

    // Start the session running to start the flow of data
    [session startRunning];

    // Assign session to an ivar.
    [self setSession:session];
}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer,0);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    if (!colorSpace) {
        NSLog(@"CGColorSpaceCreateDeviceRGB failure");
        // Unlock the pixel buffer before bailing out
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        return nil;
    }

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the data size for contiguous planes of the pixel buffer.
    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer); 

    // Create a Quartz direct-access data provider that uses data we supply
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize, 
                                                              NULL);
    // Create a bitmap image from data supplied by our data provider
    CGImageRef cgImage = 
    CGImageCreate(width,
                  height,
                  8,
                  32,
                  bytesPerRow,
                  colorSpace,
                  kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                  provider,
                  NULL,
                  true,
                  kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);

    // Create and return an image object representing the specified Quartz image
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return image;
}


// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection {
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    [self.delegate cameraCaptureGotFrame:image];
}
asked Aug 30 '10 by Matt Williamson


2 Answers

This could be related to threading. Try:

[self.delegate performSelectorOnMainThread:@selector(cameraCaptureGotFrame:)
                                withObject:image
                             waitUntilDone:NO];
answered by Art Gillespie


This looks like a threading issue. You cannot update your views from any thread other than the main thread. In your setup, which is otherwise good, the delegate method captureOutput:didOutputSampleBuffer:fromConnection: is called on a secondary thread, so you cannot set the image view's image from there. Art Gillespie's answer is one way of solving it, provided you can get rid of the bad access error.

Another way is to modify the sample buffer directly in captureOutput:didOutputSampleBuffer: and have it shown by adding an AVCaptureVideoPreviewLayer instance to your capture session, as sketched below. That's certainly the preferred way if you only modify a small part of the image, such as highlighting something.
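A minimal sketch of wiring up such a preview layer, assuming a hypothetical previewView in your view hierarchy, the session ivar set in setupCaptureSession, and an illustrative method name:

- (void)attachPreviewLayerToView:(UIView *)previewView
{
    // The preview layer renders the session's video itself, so no manual
    // frame-to-UIImage conversion is needed just to display the camera feed.
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:[self session]];
    previewLayer.frame = previewView.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [previewView.layer addSublayer:previewLayer];
}

The video data output delegate keeps firing as before, so the OpenCV processing can continue on the secondary queue while the preview layer handles display.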

BTW: Your bad access error could arise because you don't retain the created image in the secondary thread, so it is freed before cameraCaptureGotFrame: is called on the main thread.

Update: To properly retain the image, increase the reference count in captureOutput:didOutputSampleBuffer: (in the secondary thread) and decrement it in cameraCaptureGotFrame: (in the main thread).

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // increment ref count
    [image retain];
    [self.delegate performSelectorOnMainThread:@selector(cameraCaptureGotFrame:)
                                    withObject:image
                                 waitUntilDone:NO];
}

- (void) cameraCaptureGotFrame:(UIImage*)image
{
    // whatever this function does, e.g.:
    imageView.image = image;

    // decrement ref count
    [image release];
}

If you don't increment the reference count, the image is freed by the autorelease pool of the secondary thread before cameraCaptureGotFrame: is called on the main thread. If you don't decrement it in the main thread, the images are never freed and you run out of memory within a few seconds.
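As a third option, a GCD block dispatched to the main queue retains the Objective-C objects it captures, so the explicit retain/release pair can be dropped. A minimal sketch, assuming the same delegate and imageFromSampleBuffer: as above:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // dispatch_async copies the block, and the copy retains 'image' until
    // the block has run on the main queue, so no manual retain is needed.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.delegate cameraCaptureGotFrame:image];
    });
}

With this variant, cameraCaptureGotFrame: must not release the image it receives.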

answered by Codo