
How to determine and interpret the pixel format of a CGImage

I'm loading this (very small) image using:

UIImage* image = [UIImage imageNamed:@"someFile.png"];

The image is 4x1 and it contains a red, green, blue and white pixel from left to right, in that order.

Next, I get the pixel data out of the underlying CGImage:

NSData* data = (NSData*)CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage));

Now, for some reason, the pixel data is laid out differently depending on the iOS device.

When I run the app in the simulator or on my iPhone 4, the pixel data looks like this:

(255,0,0),(0,255,0),(0,0,255),(255,255,255)

So the pixels are 3 bytes each, with red at the lowest address; if you read each pixel as a little-endian value, blue ends up in the most significant byte and red in the least significant. So I guess you'd call that BGR?

When I check the CGBitmapInfo, I can see that the kCGBitmapByteOrderMask bits are kCGBitmapByteOrderDefault. I can't find anything that explains what "default" actually is.

On the other hand, when I run it on my first gen iPhone, the pixel data looks like this:

(0,0,255,255),(0,255,0,255),(255,0,0,255),(255,255,255,255)

So 4 bytes per pixel (one byte per channel), with alpha as the most significant byte and blue as the least significant. So... that's called ARGB?

I've been looking at the CGBitmapInfo for clues on how to detect the layout. On the first gen iPhone, the kCGBitmapAlphaInfoMask bits are kCGImageAlphaNoneSkipFirst, which means the most significant bits are ignored, so that makes sense. The kCGBitmapByteOrderMask bits are kCGBitmapByteOrder32Little, but I don't know what that means or how to relate it back to how the R, G and B components are laid out in memory. Can anyone shed some light on this?
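
For reference, this is roughly how I'm pulling those masks out of the image (a minimal sketch):

CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(image.CGImage);
CGImageAlphaInfo alphaInfo = (CGImageAlphaInfo)(bitmapInfo & kCGBitmapAlphaInfoMask);
CGBitmapInfo byteOrder = bitmapInfo & kCGBitmapByteOrderMask;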

Thanks.

asked by Matt Comi


2 Answers

To ensure device independence, it may be better to use a CGBitmapContext to populate the data for you.

Something like this should work:

// Get the CGImageRef
CGImageRef imageRef = [theImage CGImage];

// Find width and height
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);

// Setup color space
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

// Allocate a buffer that the image data will be drawn into
unsigned char *rawData = malloc(height * width * 4);

// Create a CGBitmapContext to draw the image into.
// kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big requests a
// fixed RGBA byte order (R,G,B,A in memory) regardless of device.
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                             bitsPerComponent, bytesPerRow, colorSpace,
                                             kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);

// Draw the image which will populate rawData
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);


// Read out the pixel values. Note: because the context uses
// premultiplied alpha, each color component has already been
// multiplied by the alpha value.
for (NSUInteger y = 0; y < height; y++) {
    for (NSUInteger x = 0; x < width; x++) {
        NSUInteger byteIndex = (bytesPerRow * y) + x * bytesPerPixel;

        CGFloat red   = rawData[byteIndex];
        CGFloat green = rawData[byteIndex + 1];
        CGFloat blue  = rawData[byteIndex + 2];
        CGFloat alpha = rawData[byteIndex + 3];
    }
}
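
// Sanity check (a sketch, assuming the question's 4x1
// red/green/blue/white test image): with this RGBA layout the
// leftmost pixel should come back as opaque red.
NSCAssert(rawData[0] == 255 && rawData[1] == 0 &&
          rawData[2] == 0 && rawData[3] == 255,
          @"Expected the leftmost pixel to be red in RGBA order");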

free(rawData);
answered by David Fumberger


I'm sure in 5+ years you've found a solution, but this is still a murky area of Core Graphics, so I wanted to drop in my two cents.

Different devices and file formats may use different byte orders for various reasons, mostly because they can and for performance. There's plenty of information around on this, including the RGBA color space representation article on Wikipedia.

Core Graphics often reports kCGBitmapByteOrderDefault, which is rather useless, but it also defines host-endian bitmap formats, which you can use for cross-reference:

#ifdef __BIG_ENDIAN__
#define kCGBitmapByteOrder16Host kCGBitmapByteOrder16Big
#define kCGBitmapByteOrder32Host kCGBitmapByteOrder32Big
#else
#define kCGBitmapByteOrder16Host kCGBitmapByteOrder16Little
#define kCGBitmapByteOrder32Host kCGBitmapByteOrder32Little
#endif
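
For example, here's a minimal sketch of putting kCGBitmapByteOrder32Host to use (the sizes are illustrative, and passing NULL lets Core Graphics allocate the buffer itself): by requesting host byte order up front, the buffer layout becomes predictable on any device:

// With kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Host the
// buffer is laid out as B,G,R,A bytes on little-endian devices and
// A,R,G,B on big-endian ones, i.e. always native 32-bit ARGB values.
size_t width = 4, height = 1;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                             colorSpace,
                                             kCGImageAlphaPremultipliedFirst |
                                             kCGBitmapByteOrder32Host);
CGColorSpaceRelease(colorSpace);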

When used with Swift, this is also useless, because those #defines aren't imported as-is. One way to work around this is to create a bridging header and an equivalent implementation file that redefines those constants.

// Bridge.h

extern const int CGBitmapByteOrder16Host;
extern const int CGBitmapByteOrder32Host;

// Bridge.m

#import <CoreGraphics/CoreGraphics.h>
#import "Bridge.h"

const int CGBitmapByteOrder16Host = kCGBitmapByteOrder16Host;
const int CGBitmapByteOrder32Host = kCGBitmapByteOrder32Host;

Now the CGBitmapByteOrder16Host and CGBitmapByteOrder32Host constants should be available from Swift, e.g. for comparing against the byte-order bits of a CGImage's bitmap info.

answered by Ian Bytchek