UIImage created from CMSampleBufferRef not displayed in UIImageView?

I'm trying to display a UIImage in real-time coming from the camera, and it seems that my UIImageView is not displaying the image properly. This is the method which an AVCaptureVideoDataOutputSampleBufferDelegate has to implement:

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection
{ 
    // Create a UIImage from the sample buffer data
    UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];
//    NSLog(@"Got an image! %f %f", theImage.size.width, theImage.size.height);
//    NSLog(@"The image view is %@", imageView);
//    UIImage *theImage = [[UIImage alloc] initWithData:[NSData 
//        dataWithContentsOfURL:[NSURL 
//        URLWithString:@"http://farm4.static.flickr.com/3092/2915896504_a88b69c9de.jpg"]]];
    [self.session stopRunning];
    [imageView setImage: theImage];
}

To get the easy problems out of the way:

  • Using UIImagePickerController is not an option (eventually we will actually do things with the image)
  • I know the handler is being called (the NSLog calls are made, and I see the output)
  • I know I have the IBOutlet declarations set up correctly. If I use the commented code above to load an arbitrary image from the web instead of simply sending setImage:theImage to the imageView, the image is loaded correctly (and the second call to NSLog reports a non-nil object).
  • At least to a basic extent, the image I get from imageFromSampleBuffer: is fine, since NSLog reports the size to be 360x480, which is the size I expected.

The code I'm using is the recently-posted AVFoundation snippet from Apple available here.

In particular, that is the code I use which sets up the AVCaptureSession object and friends (of which I understand very little), and creates the UIImage object from the Core Video buffers (that's the imageFromSampleBuffer method).

Finally, I can get the application to crash if I try to send drawInRect: to a plain UIView subclass with the UIImage returned by imageFromSampleBuffer:, while it doesn't crash if I use a UIImage from a URL as above. Here is the stack trace from the debugger inside the crash (I get an EXC_BAD_ACCESS signal):

#0  0x34a977ee in decode_swap ()
#1  0x34a8f80e in decode_data ()
#2  0x34a8f674 in img_decode_read ()
#3  0x34a8a76e in img_interpolate_read ()
#4  0x34a63b46 in img_data_lock ()
#5  0x34a62302 in CGSImageDataLock ()
#6  0x351ab812 in ripc_AcquireImage ()
#7  0x351a8f28 in ripc_DrawImage ()
#8  0x34a620f6 in CGContextDelegateDrawImage ()
#9  0x34a61fb4 in CGContextDrawImage ()
#10 0x321fd0d0 in -[UIImage drawInRect:blendMode:alpha:] ()
#11 0x321fcc38 in -[UIImage drawInRect:] ()

EDIT: Here's some more information about the UIImage being returned by that bit of code.

Using the method described here, I can get to the pixels and print them, and they look OK at first glance (every value in the alpha channel is 255, for example). However, there's something slightly off with the buffer sizes. The image I get from Flickr from that URL is 375x500, and its [pixelData length] gives me 750000 = 375*500*4, which is the expected value. However, the pixel data of the image returned from imageFromSampleBuffer: has size 691208 = 360*480*4 + 8, so there are 8 extra bytes in the pixel data. CVPixelBufferGetDataSize itself returns this off-by-8 value. I thought for a moment that it could be due to allocating buffers at aligned positions in memory, but 691200 is a multiple of 256, so that doesn't explain it either. This size discrepancy is the only difference I can tell between the two UIImages, and it could be causing the trouble. Still, there's no reason allocating extra memory for the buffer should cause an EXC_BAD_ACCESS violation.
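One thing worth checking in this situation (a sketch, assuming it runs inside the capture delegate with a valid sampleBuffer) is whether the pixel buffer carries per-row padding, since any drawing code that assumes a tightly packed buffer will read past the end of each row:

```objectivec
// Sketch: log the buffer geometry to see whether rows are padded.
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);

size_t width       = CVPixelBufferGetWidth(imageBuffer);
size_t height      = CVPixelBufferGetHeight(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t dataSize    = CVPixelBufferGetDataSize(imageBuffer);

// If bytesPerRow != width * 4, each row carries padding, and code that
// computes offsets as (y * width + x) * 4 will index into the wrong pixels.
NSLog(@"w=%zu h=%zu bytesPerRow=%zu (tight would be %zu) dataSize=%zu",
      width, height, bytesPerRow, width * 4, dataSize);

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
```

This is purely diagnostic; it doesn't fix anything by itself, but it tells you whether the 8-byte discrepancy comes from row geometry or from something appended at the end of the buffer.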

Thanks a lot for any help, and let me know if you need more information.

Carlos Scheidegger asked Jul 22 '10



5 Answers

I had the same problem ... but I found this old post, and its method of creating the CGImageRef works!

http://forum.unity3d.com/viewtopic.php?p=300819

Here's a working sample:

The app has a UIImageView member, theImage:

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer  fromConnection:(AVCaptureConnection *)connection
{
    // ... just an example of how to get an image out of this ...
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    theImage.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
}

- (CGImageRef) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer // Create a CGImageRef from sample buffer data
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    CVPixelBufferLockBaseAddress(imageBuffer,0);        // Lock the image buffer 

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);   // Get the base address of the pixel data 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    CGImageRef newImage = CGBitmapContextCreateImage(newContext); 
    CGContextRelease(newContext); 

    CGColorSpaceRelease(colorSpace); 
    CVPixelBufferUnlockBaseAddress(imageBuffer,0); 
    /* CVBufferRelease(imageBuffer); */  // do not call this!

    return newImage;
}
Ken Pletzer answered Sep 22 '22


Live capture of video frames is now well explained by Apple's Technical Q&A QA1702:

https://developer.apple.com/library/ios/#qa/qa1702/_index.html

matt answered Sep 25 '22


It is also important to set the right output format. I had a problem with image capturing when I used the default format settings. It should be:

[videoDataOutput setVideoSettings:
    [NSDictionary dictionaryWithObject:
         [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                 forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey]];
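For context, here is a hedged sketch of how that setting fits into the capture setup (the variable names are illustrative; `session` is assumed to be an already-configured AVCaptureSession):

```objectivec
// Sketch: wiring the BGRA pixel format into the video data output.
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

videoDataOutput.videoSettings =
    [NSDictionary dictionaryWithObject:
         [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                 forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];

if ([session canAddOutput:videoDataOutput]) {
    [session addOutput:videoDataOutput];
}
```

The 32BGRA format matters here because the CGBitmapContextCreate call in the accepted answer describes the data as kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst, i.e. BGRA; with a mismatched pixel format the bitmap context interprets the bytes incorrectly.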
Vladimir answered Sep 24 '22


Ben Loulier has a good write-up on how to do this.

I am using his example app as a starting point, and it is working for me. Along with replacing the imageFromSampleBuffer function with something that creates a CGImageRef with CGBitmapContextCreate, he is using the main dispatch queue (via dispatch_get_main_queue()) when setting the output sample buffer delegate. This isn't the best solution: Apple recommends a dedicated serial queue, so that frame processing doesn't contend with everything else running on the main queue. Still, it seems to work for me so far :)
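A minimal sketch of using a dedicated serial queue instead of the main queue when setting the delegate (the queue label is illustrative; this assumes manual retain/release, as in the era of this code):

```objectivec
// Passing NULL for the attributes yields a serial queue, so frames are
// delivered to the delegate one at a time, in order.
dispatch_queue_t captureQueue =
    dispatch_queue_create("com.example.captureQueue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:captureQueue];
dispatch_release(captureQueue);
```

From the delegate callback you would then hop back to the main queue only for the UI update, as the last answer below does.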

sroske answered Sep 24 '22


Another thing to look out for is whether you're actually updating your UIImageView on the main thread: if you aren't, chances are it won't reflect any changes.

The captureOutput:didOutputSampleBuffer:fromConnection: delegate method is often called on a background thread, so you want to do this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{   
    CFRetain(sampleBuffer);

    [[NSOperationQueue mainQueue] addOperationWithBlock:^{

        //Now we're definitely on the main thread, so update the imageView:
        UIImage *capturedImage = [self imageFromSampleBuffer:sampleBuffer];

        //Display the image currently being captured:
        imageView.image = capturedImage;

        CFRelease(sampleBuffer);
    }];
}
Eric answered Sep 21 '22