I'm trying to use Cocoa to grab images from a webcam. I'm able to get the image in RGBA format using QTKit and the didOutputVideoFrame delegate call, by converting the CVImageBuffer to a CIImage and then to an NSBitmapImageRep.
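In the delegate callback I do roughly the following (simplified sketch, error handling omitted):

    // QTKit delegate callback: wrap the CVImageBuffer in a CIImage,
    // then convert it to an NSBitmapImageRep (which hands back RGBA data).
    - (void)captureOutput:(QTCaptureOutput *)captureOutput
      didOutputVideoFrame:(CVImageBufferRef)videoFrame
         withSampleBuffer:(QTSampleBuffer *)sampleBuffer
           fromConnection:(QTCaptureConnection *)connection
    {
        CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];
        NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCIImage:ciImage];

        // ... display / use the RGBA bitmap ...

        [rep release];
    }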
I know my camera captures natively in YUV; what I want is to get the YUV data directly from the CVImageBuffer and process the YUV frame before displaying it.
My question is: How can I get the YUV data from the CVImageBuffer?
Thanks.
You might be able to create a CIImage from the buffer using +[CIImage imageWithCVImageBuffer:] and then render that CIImage into a CGBitmapContext of the desired pixel format. Note that I have not tested this solution.
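An untested sketch of that idea, assuming a 32-bit BGRA bitmap context as the target format (as far as I know, CGBitmapContext only accepts RGB/grayscale-style layouts, not planar YUV) and assuming videoFrame is the CVImageBufferRef from your delegate callback:

    // Wrap the frame in a CIImage and render it into a CGBitmapContext.
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];

    size_t width  = CVPixelBufferGetWidth(videoFrame);
    size_t height = CVPixelBufferGetHeight(videoFrame);
    size_t bytesPerRow = width * 4;
    void *bitmapData = malloc(bytesPerRow * height);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef cgContext = CGBitmapContextCreate(bitmapData, width, height, 8,
                                 bytesPerRow, colorSpace,
                                 kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);

    CIContext *ciContext = [CIContext contextWithCGContext:cgContext options:nil];
    [ciContext drawImage:ciImage
                  inRect:CGRectMake(0, 0, width, height)
                fromRect:[ciImage extent]];

    // bitmapData now holds the rendered BGRA pixels.

    CGContextRelease(cgContext);
    CGColorSpaceRelease(colorSpace);
    free(bitmapData);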