 

Pixel formats, CVPixelBufferRefs and glReadPixels

I'm using glReadPixels to read data into a CVPixelBufferRef. I use the CVPixelBufferRef as the input into an AVAssetWriter. Unfortunately the pixel formats seem to be mismatched.

I think glReadPixels is returning pixel data in RGBA format while AVAssetWriter wants pixel data in ARGB format. What's the best way to convert RGBA to ARGB?

Here's what I've tried so far:

  • bit manipulation along the lines of argb = (rgba >> 8) | (rgba << 24)
  • using a CGImageRef as an intermediate step

The bit manipulation didn't work because a CVPixelBufferRef can't be indexed directly like an array — you have to lock the buffer and go through its base address. The CGImageRef intermediate step does work... but I'd prefer not to have 50 extra lines of code that could be a performance hit.
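For what it's worth, the shift trick itself is sound once you get at the raw bytes. Here's a minimal sketch of that idea on a plain buffer of 32-bit pixels (the function name is mine; on a real CVPixelBufferRef you'd lock the buffer first and walk it row by row):

```c
#include <stdint.h>
#include <stddef.h>

/* Swizzle a tightly packed buffer of 32-bit pixels from RGBA to ARGB,
 * treating each pixel as a uint32 with R in the most significant byte
 * (which is what the (rgba >> 8) | (rgba << 24) trick assumes).
 * On a real CVPixelBufferRef you would first call
 * CVPixelBufferLockBaseAddress(), operate on the pointer returned by
 * CVPixelBufferGetBaseAddress(), and honor CVPixelBufferGetBytesPerRow()
 * for each row rather than assuming tight packing. */
static void swizzle_rgba_to_argb(uint32_t *pixels, size_t count)
{
    for (size_t i = 0; i < count; i++) {
        uint32_t rgba = pixels[i];
        /* RGB slides down one byte, A wraps around to the top. */
        pixels[i] = (rgba >> 8) | (rgba << 24);
    }
}
```

For example, 0x11223344 (R=0x11, G=0x22, B=0x33, A=0x44) becomes 0x44112233.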

MrDatabase asked Oct 25 '11 21:10


1 Answer

Better than using the CPU to swap the components would be to write a simple fragment shader to efficiently do it on the GPU as you render the image.
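A minimal sketch of that shader, assuming a GLES 2.0 pipeline where the varying and uniform names (`v_texCoord`, `u_texture`) come from your own vertex shader setup: sampling with a `.bgra` swizzle means the bytes that glReadPixels later returns as GL_RGBA land in memory in BGRA order.

```glsl
// Fragment shader sketch: emit the sampled color with channels reordered.
precision mediump float;
varying vec2 v_texCoord;      // assumed varying name from your vertex shader
uniform sampler2D u_texture;  // assumed uniform holding the rendered scene

void main()
{
    gl_FragColor = texture2D(u_texture, v_texCoord).bgra;
}
```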

The best way of all is to remove the copying stage entirely by using the iOS 5 Core Video CVOpenGLESTextureCache, which lets you render straight into the CVPixelBufferRef and eliminates the glReadPixels call altogether.
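The texture-cache flow looks roughly like this (a sketch, not a compilable standalone program — it assumes an existing EAGLContext, a framebuffer, and a kCVPixelFormatType_32BGRA pixel buffer created with the IOSurface properties attribute, as in Apple's RosyWriter sample):

```c
// Create a texture cache bound to your EAGLContext, then wrap the
// pixel buffer in a texture and attach it as the framebuffer's color
// target; everything drawn afterwards lands directly in pixelBuffer.
CVOpenGLESTextureCacheRef cache;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, eaglContext,
                             NULL, &cache);

CVOpenGLESTextureRef texture;
CVOpenGLESTextureCacheCreateTextureFromImage(
    kCFAllocatorDefault, cache, pixelBuffer, NULL,
    GL_TEXTURE_2D, GL_RGBA, width, height,
    GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);

glBindTexture(CVOpenGLESTextureGetTarget(texture),
              CVOpenGLESTextureGetName(texture));
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(texture), 0);
```

Once the frame is rendered (and glFinish/glFlush has completed), the same pixelBuffer can be handed to the AVAssetWriter input with no readback copy.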

P.S. I'm pretty sure AVAssetWriter wants data in BGRA format (actually it probably wants YUV, but that's another story).

UPDATE: as for links, the documentation still seems to be under NDA, but there are two pieces of freely downloadable sample code available:

GLCameraRipple and RosyWriter

The header files themselves are well documented, and the Mac equivalent (CVOpenGLTextureCache) is very similar, so you should have plenty to get you started.

Rhythmic Fistman answered Sep 21 '22 01:09