I want to convert a UIImage object to a CVPixelBufferRef, but I have absolutely no idea how, and I can't find any example code that does anything like this.
Can someone please help me? Thanks in advance!
You can use Core Image to create a CVPixelBuffer from a UIImage.
// 1. Create a CIImage with the underlying CGImage encapsulated by the UIImage (referred to as 'image'):
CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
// 2. Create a CIContext:
CIContext *ciContext = [CIContext contextWithOptions:nil];
// 3. Render the CIImage to a CVPixelBuffer (referred to as 'outputBuffer'):
[ciContext render:inputImage toCVPixelBuffer:outputBuffer];
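Note that the snippet above assumes outputBuffer already exists; you have to create it first with CVPixelBufferCreate. Here is a minimal, self-contained sketch putting the pieces together (the method name pixelBufferFromUIImage: is made up for illustration, and 32BGRA plus the compatibility attributes are one reasonable choice, assuming ARC):

- (CVPixelBufferRef)pixelBufferFromUIImage:(UIImage *)image {
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];

    size_t width  = CGImageGetWidth(image.CGImage);
    size_t height = CGImageGetHeight(image.CGImage);

    // These attributes keep the buffer compatible with CGImage/CGBitmapContext.
    NSDictionary *attributes = @{
        (NSString *)kCVPixelBufferCGImageCompatibilityKey : @YES,
        (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES
    };

    CVPixelBufferRef outputBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          width, height,
                                          kCVPixelFormatType_32BGRA,
                                          (__bridge CFDictionaryRef)attributes,
                                          &outputBuffer);
    if (status != kCVReturnSuccess) {
        return NULL;
    }

    // Let Core Image render the CIImage into the pixel buffer.
    CIContext *ciContext = [CIContext contextWithOptions:nil];
    [ciContext render:inputImage toCVPixelBuffer:outputBuffer];

    return outputBuffer; // the caller must CVPixelBufferRelease() this
}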
If your source is a video file, AVFoundation provides classes that read assets and deliver their frames as pixel buffers, as does the output of other AVFoundation objects that handle (or have already read) assets. If that is your only concern, you'll find what you're looking for in the Sample Photo Editing Extension sample code.
If your source is generated from a series of UIImage objects (perhaps there was no source file, and you are creating a new file from user-generated content), then the sample code provided above will suffice.
NOTE: This is by far the easiest way to convert a UIImage into a CVPixelBuffer, but it is neither the most efficient nor the only way. Using Core Graphics instead requires considerably more code to set up attributes such as the pixel buffer size and color space, which Core Image takes care of for you.
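For comparison, here is a hedged sketch of that Core Graphics route (again, the method name is made up for illustration); notice that the pixel format, bytes per row, color space, and alpha setup all become your responsibility:

- (CVPixelBufferRef)cgPixelBufferFromUIImage:(UIImage *)image {
    size_t width  = CGImageGetWidth(image.CGImage);
    size_t height = CGImageGetHeight(image.CGImage);

    CVPixelBufferRef buffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                          kCVPixelFormatType_32ARGB, NULL, &buffer);
    if (status != kCVReturnSuccess) {
        return NULL;
    }

    // Wrap the buffer's memory in a bitmap context and draw the image into it.
    CVPixelBufferLockBaseAddress(buffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                                 width, height, 8,
                                                 CVPixelBufferGetBytesPerRow(buffer),
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image.CGImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(buffer, 0);

    return buffer; // the caller must CVPixelBufferRelease() this
}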