What is the most efficient way to display CVImageBufferRef on iOS

I have CMSampleBufferRef(s) which I decode using VTDecompressionSessionDecodeFrame, which results in a CVImageBufferRef after decoding of a frame has completed. So my question is:

What would be the most efficient way to display these CVImageBufferRefs in a UIView?

I have succeeded in converting the CVImageBufferRef to a CGImageRef and displaying it by setting the CGImageRef as a CALayer's contents, but that requires the DecompressionSession to be configured with @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
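
For reference, the session is created along these lines (a sketch only; `formatDescription` and `didDecompress` are placeholders for my real CMVideoFormatDescriptionRef and output callback):

    // Sketch: create a VTDecompressionSession that emits 32BGRA pixel buffers
    // (requires the VideoToolbox framework)
    NSDictionary *destinationAttributes =
        @{ (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA) };
    VTDecompressionOutputCallbackRecord callbackRecord = {
        .decompressionOutputCallback = didDecompress,  // placeholder callback
        .decompressionOutputRefCon   = NULL
    };
    VTDecompressionSessionRef session = NULL;
    OSStatus status = VTDecompressionSessionCreate(kCFAllocatorDefault,
        formatDescription,                               // CMVideoFormatDescriptionRef
        NULL,                                            // let VideoToolbox pick a decoder
        (__bridge CFDictionaryRef)destinationAttributes,
        &callbackRecord,
        &session);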

Here is example code showing how I've converted a CVImageBufferRef to a CGImageRef (note: the CVPixelBuffer data has to be in 32BGRA format for this to work):

    CVPixelBufferLockBaseAddress(cvImageBuffer, 0);

    // get image properties
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(cvImageBuffer);
    size_t bytesPerRow   = CVPixelBufferGetBytesPerRow(cvImageBuffer);
    size_t width         = CVPixelBufferGetWidth(cvImageBuffer);
    size_t height        = CVPixelBufferGetHeight(cvImageBuffer);

    // create a CGImageRef from the CVImageBufferRef
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef    cgContext  = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(cgContext);

    // release the context and color space, and unlock the buffer
    CGContextRelease(cgContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(cvImageBuffer, 0);

    // now cgImage can be displayed either by setting it as a CALayer's
    // contents or by creating a [UIImage imageWithCGImage:cgImage] that
    // can be displayed in a UIImageView ...
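
For example, a frame can then be pushed to the screen like this (`videoLayer` is a placeholder for whatever CALayer is used; the update has to happen on the main thread):

    // hand the CGImageRef over to the main thread for display;
    // the layer retains the image, so it can be released afterwards
    dispatch_async(dispatch_get_main_queue(), ^{
        self.videoLayer.contents = (__bridge id)cgImage;
        CGImageRelease(cgImage);
    });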

The #WWDC14 session 513 (https://developer.apple.com/videos/wwdc/2014/#513) hints that the YUV -> RGB colorspace conversion (on the CPU?) can be avoided if YUV-capable GLES magic is used. I wonder what that might be and how it could be accomplished?

Apple's iOS sample code GLCameraRipple shows an example of displaying a YUV CVPixelBufferRef captured from the camera, using OpenGL ES with two separate textures for the Y and UV components and a fragment shader program that does the YUV to RGB colorspace conversion on the GPU. Is all of that really required, or is there some more straightforward way this can be done?
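
My understanding of the GLCameraRipple approach boils down to something like this (a rough sketch, not a complete renderer; it assumes `textureCache` was created once via CVOpenGLESTextureCacheCreate() against the current EAGLContext and that `pixelBuffer` is in a biplanar 420 YpCbCr format):

    CVOpenGLESTextureRef lumaTexture   = NULL;
    CVOpenGLESTextureRef chromaTexture = NULL;
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    // plane 0: full-resolution luma (Y) as a single-channel texture
    glActiveTexture(GL_TEXTURE0);
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
        pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE,
        (GLsizei)width, (GLsizei)height,
        GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &lumaTexture);
    glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture),
                  CVOpenGLESTextureGetName(lumaTexture));

    // plane 1: half-resolution interleaved chroma (CbCr) as a two-channel texture
    glActiveTexture(GL_TEXTURE1);
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
        pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE_ALPHA,
        (GLsizei)width / 2, (GLsizei)height / 2,
        GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &chromaTexture);
    glBindTexture(CVOpenGLESTextureGetTarget(chromaTexture),
                  CVOpenGLESTextureGetName(chromaTexture));

    // a fragment shader then samples both textures and applies the
    // YUV -> RGB matrix on the GPU, so the CPU never touches the pixels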

NOTE: In my use case I'm unable to use AVSampleBufferDisplayLayer, due to the way the input to the decompression becomes available.

Asked Sep 29 '15 by user2690268

1 Answer

Update: The original answer below does not work because kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey is unavailable on iOS.


UIView is backed by a CALayer whose contents property supports multiple types of images. As detailed in my answer to a similar question for macOS, it is possible to use CALayer to render a CVPixelBuffer’s backing IOSurface. (Caveat: I have only tested this on macOS.)
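
The core of that approach looks roughly like this (a macOS-only sketch, per the caveat above; `pixelBuffer` and `layer` stand in for your decoded buffer and target CALayer, and the buffer must be IOSurface-backed, e.g. created with (id)kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey: @YES in its attributes):

    // hand the pixel buffer's backing IOSurface directly to the layer,
    // avoiding any CPU-side copy or colorspace conversion
    IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
    if (surface != NULL) {
        [CATransaction begin];
        [CATransaction setDisableActions:YES]; // no implicit animation per frame
        layer.contents = (__bridge id)surface;
        [CATransaction commit];
    }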

Answered Oct 03 '22 by fumoboy007