
Converting a UIImage to black and white (not grayscale) for use with Tesseract

I'm using Tesseract in my iPhone application.

I've tried several filters to convert my image to grayscale, but what I actually want is to apply a threshold so that every pixel in the resulting image is either pure black or pure white.

I succeeded using Apple's grayscale filter, which gives an appropriate result. However, it's still a 16-bit image (correct me if I'm wrong). The filtering I'm using at the moment is as follows:

- (UIImage *)grayishImage:(UIImage *)i {

    // Create a graphics context the same size as the input image.
    UIGraphicsBeginImageContextWithOptions(i.size, YES, 1.0);
    CGRect imageRect = CGRectMake(0, 0, i.size.width, i.size.height);

    // Draw the image with the luminosity blend mode.
    [i drawInRect:imageRect blendMode:kCGBlendModeLuminosity alpha:1.0];

    // Get the resulting image and clean up the context.
    UIImage *filteredImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return filteredImage;
}

Can anyone supply me with a filter that produces pure black-and-white pixels rather than grayscale?

Asked Apr 03 '12 by BarryK88

1 Answer

Probably the fastest way to do this would be to use OpenGL ES 2.0 shaders to apply the threshold to your image. My GPUImage framework encapsulates this so that you don't need to worry about the more technical aspects behind the scenes.

Using GPUImage, you could obtain a thresholded version of your UIImage using a GPUImageLuminanceThresholdFilter and code like the following:

// Wrap the input UIImage in a GPUImage picture source.
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];

// Create the luminance threshold filter and set the cutoff (0.0-1.0).
GPUImageLuminanceThresholdFilter *stillImageFilter = [[GPUImageLuminanceThresholdFilter alloc] init];
stillImageFilter.threshold = 0.5;

// Wire the source to the filter and render the thresholded result.
[stillImageSource addTarget:stillImageFilter];
[stillImageFilter useNextFrameForImageCapture];
[stillImageSource processImage];

UIImage *imageWithAppliedThreshold = [stillImageFilter imageFromCurrentFramebuffer];

You can just pass your color image into this, because the filter automatically extracts the luminance from each pixel and applies the threshold to that. Any pixel above the threshold goes to white, and any pixel below it goes to black. You can adjust the threshold to meet your particular conditions.

However, an even better choice for something you're going to pass into Tesseract would be my GPUImageAdaptiveThresholdFilter, which can be used in the same way as the GPUImageLuminanceThresholdFilter, only without a threshold value. The adaptive thresholding performs its thresholding based on a 9-pixel region around the current pixel, adjusting for local lighting conditions. This is specifically designed to help with OCR applications, so it might be the way to go here.
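As a rough sketch, the adaptive version can be wired up the same way; this assumes the same GPUImagePicture pipeline and the inputImage placeholder from the example above:

// Hedged sketch: adaptive thresholding with GPUImage. The pipeline mirrors
// the luminance-threshold example above; no threshold value is needed.
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageAdaptiveThresholdFilter *adaptiveFilter = [[GPUImageAdaptiveThresholdFilter alloc] init];

[stillImageSource addTarget:adaptiveFilter];
[adaptiveFilter useNextFrameForImageCapture];
[stillImageSource processImage];

UIImage *thresholdedForOCR = [adaptiveFilter imageFromCurrentFramebuffer];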

Example images from both types of filters can be found in this answer.

Note that the round trip through UIImage is slower than handling raw data, so these filters are much faster when acting on direct video or movie sources, and can run in real time for those inputs. I also have a raw pixel data output, which might be faster for use with Tesseract.
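If you want to skip the UIImage round trip entirely, a sketch along these lines might work. It assumes GPUImage's GPUImageRawDataOutput class, and the exact method names (initWithImageSize:resultsInBGRAFormat:, rawBytesForImage, bytesPerRowInOutput) should be verified against the GPUImage version you're using:

// Hedged sketch: pulling raw BGRA bytes from the filter chain instead of
// building a UIImage. stillImageSource / stillImageFilter are the objects
// from the threshold example above; check these APIs against your
// GPUImage version before relying on them.
GPUImageRawDataOutput *rawOutput =
    [[GPUImageRawDataOutput alloc] initWithImageSize:inputImage.size
                                 resultsInBGRAFormat:YES];
[stillImageFilter addTarget:rawOutput];
[stillImageSource processImage];

GLubyte *rawBytes = [rawOutput rawBytesForImage];
NSUInteger bytesPerRow = [rawOutput bytesPerRowInOutput];
// rawBytes and bytesPerRow can then be handed to Tesseract's raw-image
// input instead of going through a UIImage.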

Answered Oct 05 '22 by Brad Larson