 

iOS Performance Tuning: fastest way to get pixel color for large images

There are a number of questions/answers regarding how to get the pixel color of an image for a given point. However, all of these answers are really slow (100-500ms) for large images (even as small as 1000 x 1300, for example).

Most of the code samples out there draw to an image context. All of them take time when the actual draw takes place:

CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, (CGFloat)width, (CGFloat)height), cgImage);

Examining this in Instruments reveals that the draw is being done by copying the data from the source image:

[Instruments trace screenshot]

I have even tried a different means of getting at the data, hoping that reaching the bytes directly would prove more efficient.

NSInteger pointX = trunc(point.x);
NSInteger pointY = trunc(point.y);
CGImageRef cgImage = CGImageCreateWithImageInRect(self.CGImage, 
                           CGRectMake(pointX * self.scale, 
                                      pointY * self.scale, 
                                      1.0f, 
                                      1.0f));

CGDataProviderRef provider = CGImageGetDataProvider(cgImage);
CFDataRef data = CGDataProviderCopyData(provider);

CGImageRelease(cgImage);

UInt8* buffer = (UInt8*)CFDataGetBytePtr(data);

CGFloat red   = (float)buffer[0] / 255.0f;
CGFloat green = (float)buffer[1] / 255.0f;
CGFloat blue  = (float)buffer[2] / 255.0f;
CGFloat alpha = (float)buffer[3] / 255.0f;

CFRelease(data);

UIColor *pixelColor = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];

return pixelColor;

This method spends its time on the data copy:

CFDataRef data = CGDataProviderCopyData(provider);

It appears that this, too, reads the data from disk, rather than from the CGImage instance I am creating:

[Instruments trace screenshot]

Now, this method does perform better in informal testing, but it is still not as fast as I want it to be. Does anyone know of an even faster way of getting the underlying pixel data?

asked May 01 '12 by Wayne Hartman


3 Answers

If it's possible for you to draw this image to the screen via OpenGL ES, you can get extremely fast random access to the underlying pixels in iOS 5.0 via the texture caches introduced in that version. They allow for direct memory access to the underlying BGRA pixel data stored in an OpenGL ES texture (where your image would be residing), and you could pick out any pixel from that texture almost instantaneously.

I use this to read back the raw pixel data of even large (2048x2048) images, and the read times are at worst in the range of 10-20 ms to pull down all of those pixels. Again, random access to a single pixel there takes almost no time, because you're just reading from a location in a byte array.

Of course, this means that you'll have to parse and upload your particular image to OpenGL ES, which will involve the same reading from disk and interactions with Core Graphics (if going through a UIImage) that you'd see if you tried to read pixel data from a random PNG on disk, but it sounds like you just need to render once and sample from it multiple times. If so, OpenGL ES and the texture caches on iOS 5.0 would be the absolute fastest way to read back this pixel data for something also displayed onscreen.

I encapsulate these processes in the GPUImagePicture (image upload) and GPUImageRawData (fast raw data access) classes within my open source GPUImage framework, if you want to see how something like that might work.

answered Nov 01 '22 by Brad Larson


I have yet to find a way to get access to the drawn (in frame buffer) pixels. The fastest method I've measured is:

  1. Indicate you want the image to be cached by specifying kCGImageSourceShouldCache when creating it.
  2. (optional) Precache the image by forcing it to render.
  3. Draw the image into a 1x1 bitmap context.

The cost of this method is the cached bitmap, which may have a lifetime as long as the CGImage it is associated with. The code ends up looking something like this:

  1. Create image w/ ShouldCache flag

    NSDictionary *options = @{ (id)kCGImageSourceShouldCache: @(YES) };
    CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    CGImageRef cgimage = CGImageSourceCreateImageAtIndex(imageSource, 0, (__bridge CFDictionaryRef)options);
    UIImage *image = [UIImage imageWithCGImage:cgimage];
    CGImageRelease(cgimage);
    CFRelease(imageSource);
    
  2. Precache image

    UIGraphicsBeginImageContext(CGSizeMake(1, 1));
    [image drawAtPoint:CGPointZero];
    UIGraphicsEndImageContext();
    
  3. Draw image to a 1x1 bitmap context

    unsigned char pixelData[] = { 0, 0, 0, 0 };
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixelData, 1, 1, 8, 4, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGImageRef cgimage = image.CGImage;
    size_t imageWidth = CGImageGetWidth(cgimage);
    size_t imageHeight = CGImageGetHeight(cgimage);
    // Offset the draw rect so the pixel at testPoint (top-left UIKit coordinates)
    // lands on the context's single pixel; Core Graphics uses a bottom-left origin.
    CGContextDrawImage(context, CGRectMake(-testPoint.x, testPoint.y - imageHeight, imageWidth, imageHeight), cgimage);
    CGColorSpaceRelease(colorSpace);
    CGContextRelease(context);
    

pixelData now holds the (alpha-premultiplied) R, G, B, and A values of the pixel at testPoint.

answered Nov 01 '22 by darrinm


A CGImage may be lazily backed: it can contain no decoded pixel data until you read the first pixel or draw it, so trying to speed up getting pixels from the image might not get you anywhere. There's nothing to read yet.

Are you trying to read pixels from a PNG file? You could try going directly after the file, mmap'ing it, and decoding the PNG format yourself. It will still take a while to pull the data from storage.

answered Nov 01 '22 by hotpaw2