
iOS 6: How to use the YUV to RGB conversion feature from CVPixelBufferRef to CIImage

From iOS 6, Apple has provided a way to create a CIImage directly from native YUV data through this call:

initWithCVPixelBuffer:options:

In the Core Image Programming Guide, they mention this feature:

Take advantage of the support for YUV images in iOS 6.0 and later. Camera pixel buffers are natively YUV but most image processing algorithms expect RGBA data. There is a cost to converting between the two. Core Image supports reading YUV from CVPixelBuffer objects and applying the appropriate color transform.

options = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };

But I am unable to use it properly. I have raw YUV data, so this is what I did:

    void *YUV[3]                = { data[0], data[1], data[2] };
    size_t planeWidth[3]        = { width,  width / 2,  width / 2 };
    size_t planeHeight[3]       = { height, height / 2, height / 2 };
    size_t planeBytesPerRow[3]  = { stride, stride / 2, stride / 2 };

    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn ret = CVPixelBufferCreateWithPlanarBytes(kCFAllocatorDefault,
                       width,
                       height,
                       kCVPixelFormatType_420YpCbCr8PlanarFullRange,
                       NULL,                 // dataPtr
                       width * height * 1.5, // dataSize
                       3,                    // numberOfPlanes
                       YUV,
                       planeWidth,
                       planeHeight,
                       planeBytesPerRow,
                       NULL,                 // releaseCallback
                       NULL,                 // releaseRefCon
                       NULL,                 // pixelBufferAttributes
                       &pixelBuffer);

    NSDictionary *opt = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                               @(kCVPixelFormatType_420YpCbCr8PlanarFullRange) };

    CIImage *image = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:opt];

I am getting nil for image. Any idea what I am missing?

EDIT: I added lock and unlock of the base address around the call. I also dumped the contents of the pixel buffer to ensure it properly holds the data, so it looks like something is wrong with the init call only. The CIImage object still returns nil.

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    CIImage *image = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:opt];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
Rugger asked Nov 08 '13

1 Answer

There should be an error message in the console: initWithCVPixelBuffer failed because the CVPixelBufferRef is not IOSurface backed. See Apple's Technical Q&A QA1781 for how to create an IOSurface-backed CVPixelBuffer.

Calling CVPixelBufferCreateWithBytes() or CVPixelBufferCreateWithPlanarBytes() will result in CVPixelBuffers that are not IOSurface-backed...

...To do that, you must specify kCVPixelBufferIOSurfacePropertiesKey in the pixelBufferAttributes dictionary when creating the pixel buffer using CVPixelBufferCreate().

NSDictionary *pixelBufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSDictionary dictionary], (id)kCVPixelBufferIOSurfacePropertiesKey,
    nil];
// You may add other keys as appropriate, e.g. kCVPixelBufferPixelFormatTypeKey,
// kCVPixelBufferWidthKey, kCVPixelBufferHeightKey, etc.

CVPixelBufferRef pixelBuffer;
CVPixelBufferCreate(... (CFDictionaryRef)pixelBufferAttributes, &pixelBuffer);

Alternatively, you can make IOSurface-backed CVPixelBuffers using CVPixelBufferPoolCreatePixelBuffer() from an existing pixel buffer pool, if the pixelBufferAttributes dictionary provided to CVPixelBufferPoolCreate() includes kCVPixelBufferIOSurfacePropertiesKey.
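Putting the pieces together for the raw-YUV case in the question, a minimal sketch might look like the following. This is not a drop-in fix but an illustration of the approach QA1781 describes: create an IOSurface-backed buffer with CVPixelBufferCreate(), copy the planes in, then wrap it in a CIImage. The variables `data`, `width`, `height`, and `stride` are assumed to be the same as in the question.

```objc
// Assumption: three separate 8-bit YUV planes in data[0..2], Y stride `stride`.
NSDictionary *attrs = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };

CVPixelBufferRef pixelBuffer = NULL;
CVReturn ret = CVPixelBufferCreate(kCFAllocatorDefault,
                                   width,
                                   height,
                                   kCVPixelFormatType_420YpCbCr8PlanarFullRange,
                                   (__bridge CFDictionaryRef)attrs,
                                   &pixelBuffer);
if (ret != kCVReturnSuccess) {
    // handle error
}

// Copy each plane row by row: the buffer's bytes-per-row may differ
// from the source stride, so a single memcpy of the whole plane is unsafe.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
for (size_t plane = 0; plane < 3; plane++) {
    uint8_t *dst       = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, plane);
    size_t dstStride   = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, plane);
    size_t rows        = CVPixelBufferGetHeightOfPlane(pixelBuffer, plane);
    size_t copyBytes   = CVPixelBufferGetWidthOfPlane(pixelBuffer, plane); // 1 byte/sample
    size_t srcStride   = (plane == 0) ? stride : stride / 2;
    const uint8_t *src = data[plane];
    for (size_t row = 0; row < rows; row++) {
        memcpy(dst + row * dstStride, src + row * srcStride, copyBytes);
    }
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

CIImage *image = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer];
CVPixelBufferRelease(pixelBuffer);
```

Because the buffer is now IOSurface backed, initWithCVPixelBuffer: should return a valid CIImage instead of nil.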

Quotation answered Nov 05 '22