 

CVPixelBuffer to NSData And Back

I'm new to video processing and have been stuck here for a few days.

I have a CVPixelBufferRef that is in YUV (YCbCr 4:2:0) format. I grab the base address using CVPixelBufferGetBaseAddress.

How do I take the bytes at the base address and create a new CVPixelBufferRef, one that is also in the same YUV format?

I tried:

CVPixelBufferCreateWithBytes(CFAllocatorGetDefault(), 1440, 900, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, currentFrame, 2208, NULL, NULL, (pixelBufferAttributes), &imageBuffer);

This creates a CVPixelBufferRef, but I can't do anything with it (i.e. convert it to a CIImage, render it, etc.).

Ultimately, my goal is to take the bytes I receive that are from the base address call and to just display them on the screen. I know I can do that directly without the base address call, but I have a limitation that only allows me to receive the base address bytes.

Justin Time asked Oct 15 '14

1 Answer

For reference,

The reason I could not get a CIImage from the CVPixelBuffer is that it is not IOSurface backed. To ensure it is IOSurface backed, create the buffer with CVPixelBufferCreate (passing IOSurface properties in the attributes dictionary), lock it with CVPixelBufferLockBaseAddress, get the destination pointer with CVPixelBufferGetBaseAddress (or CVPixelBufferGetBaseAddressOfPlane for planar data), and memcpy your bytes into that address.

Hope this helps someone in the future.

Justin Time answered Sep 19 '22