I'm trying to display a UIImage in real time, coming from the camera, and it seems that my UIImageView is not displaying the image properly. This is the method that an AVCaptureVideoDataOutputSampleBufferDelegate has to implement:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];
    // NSLog(@"Got an image! %f %f", theImage.size.width, theImage.size.height);
    // NSLog(@"The image view is %@", imageView);
    // UIImage *theImage = [[UIImage alloc] initWithData:[NSData
    //     dataWithContentsOfURL:[NSURL
    //     URLWithString:@"http://farm4.static.flickr.com/3092/2915896504_a88b69c9de.jpg"]]];

    [self.session stopRunning];
    [imageView setImage:theImage];
}
To get the easy problems out of the way: if I instead use the commented-out code above to load a UIImage from a URL and send setImage:theImage to the imageView, the image is loaded correctly (and the second call to NSLog reports a non-nil object). imageFromSampleBuffer: seems fine, since NSLog reports the size to be 360x480, which is the size I expected. The code I'm using is the recently posted AVFoundation snippet from Apple, available here.
In particular, that is the code I use to set up the AVCaptureSession object and friends (of which I understand very little) and to create the UIImage object from the Core Video buffers (that's the imageFromSampleBuffer: method).
Finally, I can get the application to crash if I try to send drawInRect: to a plain UIView subclass with the UIImage returned by imageFromSampleBuffer:, while it doesn't crash if I use a UIImage from a URL as above. Here is the stack trace from the debugger inside the crash (I get an EXC_BAD_ACCESS signal):
#0 0x34a977ee in decode_swap ()
#1 0x34a8f80e in decode_data ()
#2 0x34a8f674 in img_decode_read ()
#3 0x34a8a76e in img_interpolate_read ()
#4 0x34a63b46 in img_data_lock ()
#5 0x34a62302 in CGSImageDataLock ()
#6 0x351ab812 in ripc_AcquireImage ()
#7 0x351a8f28 in ripc_DrawImage ()
#8 0x34a620f6 in CGContextDelegateDrawImage ()
#9 0x34a61fb4 in CGContextDrawImage ()
#10 0x321fd0d0 in -[UIImage drawInRect:blendMode:alpha:] ()
#11 0x321fcc38 in -[UIImage drawInRect:] ()
EDIT: Here's some more information about the UIImage being returned by that bit of code.
Using the method described here, I can get to the pixels and print them, and they look OK at first glance (every value in the alpha channel is 255, for example). However, there's something slightly off with the buffer sizes. The image I get from Flickr at that URL is 375x500, and its [pixelData length] gives me 750000 = 375*500*4, which is the expected value. However, the pixel data of the image returned from imageFromSampleBuffer: has size 691208 = 360*480*4 + 8, so there are 8 extra bytes in the pixel data. CVPixelBufferGetDataSize itself returns this off-by-8 value. I thought for a moment that it could be due to allocating buffers at aligned positions in memory, but 691200 is already a multiple of 256, so that doesn't explain it either. This size discrepancy is the only difference I can tell between the two UIImages, and it could be causing the trouble. Still, there's no reason allocating extra memory for the buffer should cause an EXC_BAD_ACCESS violation.
Thanks a lot for any help, and let me know if you need more information.
UIImage contains the data for an image; UIImageView is the view used to display a UIImage. For example (note that setImage: belongs to UIImageView, not UIImage):

UIImageView *imgView = [[UIImageView alloc] init];
[imgView setImage:[UIImage imageNamed:@"anyImageName"]];

Apple's documentation describes UIImageView as "an object that displays a single image or a sequence of animated images in your interface."

If you ever need a deep copy of a UIImage:

UIImage *newImage = [UIImage imageWithData:UIImagePNGRepresentation(oldImage)];

This copies the data but requires setting the orientation property before handing it to something like UIImageView for proper display. Another way to deep copy is to draw into a context and grab the result.
I had the same problem ... but I found this old post, and its method of creating the CGImageRef works!
http://forum.unity3d.com/viewtopic.php?p=300819
Here's a working sample:
the app has a member UIImageView *theImage; (it must be a UIImageView, since the code below sets its image property)
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // ... just an example of how to get an image out of this ...
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    theImage.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);   // imageFromSampleBuffer: returns a +1 reference
}
// Create a CGImageRef from sample buffer data; the caller is responsible
// for releasing the returned image with CGImageRelease()
- (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);   // Lock the image buffer

    // Get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                    colorSpace,
                                                    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    /* CVBufferRelease(imageBuffer); */ // do not call this!

    return newImage;
}
Live capture of video frames is now well explained by Apple's Technical Q&A QA1702:
https://developer.apple.com/library/ios/#qa/qa1702/_index.html
It is also important to set the right output format. I had a problem with image capturing when I used the default format settings; it should be:

[videoDataOutput setVideoSettings:
    [NSDictionary dictionaryWithObject:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey]];
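For context, this is roughly where that call fits during session setup (a sketch; the names session and videoDataOutput are placeholders I'm introducing, not from the original post):

```objc
// Sketch: wiring a BGRA-configured video data output into a capture session.
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoDataOutput setVideoSettings:
    [NSDictionary dictionaryWithObject:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey]];

// BGRA matches the kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst
// flags used in imageFromSampleBuffer:, so the channels line up.
if ([session canAddOutput:videoDataOutput]) {
    [session addOutput:videoDataOutput];
}
```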
Ben Loulier has a good write-up on how to do this. I am using his example app as a starting point, and it is working for me. Along with replacing the imageFromSampleBuffer: function with something that creates a CGImageRef with CGBitmapContextCreate, he uses the main dispatch queue (via dispatch_get_main_queue()) when setting the output sample buffer delegate. The delegate does require a serial queue, and the main queue is in fact serial, so frames arrive in order; the drawback is that doing per-frame work on the main queue can stall UI updates, so a dedicated serial queue is usually the better choice.
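If you want a serial queue other than the main one, creating a dedicated dispatch queue is a one-liner (a sketch; videoDataOutput is a placeholder name for your AVCaptureVideoDataOutput):

```objc
// A NULL attribute creates a serial queue, so frames are delivered one
// at a time and in order, without tying up the main thread.
dispatch_queue_t frameQueue = dispatch_queue_create("com.example.framequeue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:frameQueue];

// Do per-frame processing on frameQueue; hop back to the main thread
// only for the actual UIImageView update.
```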
Another thing to look out for is whether you're actually updating your UIImageView on the main thread: if you aren't, chances are it won't reflect any changes. The captureOutput:didOutputSampleBuffer:fromConnection: delegate method is often called on a background thread, so you want to do this:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFRetain(sampleBuffer);
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        // Now we're definitely on the main thread, so update the imageView:
        UIImage *capturedImage = [self imageFromSampleBuffer:sampleBuffer];

        // Display the image currently being captured:
        imageView.image = capturedImage;

        CFRelease(sampleBuffer);
    }];
}