Detection of sharpness of a photo

I'm looking for a framework that helps detect the sharpness of a photo. I have read this post, which points to a methodology for doing so, but I'd rather work with a library than get my hands dirty.

In the Core Image documentation, Apple says:

Core Image can analyze the quality of an image and provide a set of filters with optimal settings for adjusting such things as hue, contrast, and tone color, and for correcting for flash artifacts such as red eye. It does all this with one method call on your part.

How can I do the 'analyze image quality' part? I'd love to see some example code.

asked Aug 05 '15 by brainray

3 Answers

Perhaps the best way to do this is the Polar Edge Coherence metric:

Baroncini, V., et al. "The polar edge coherence: a quasi blind metric for video quality assessment." EUSIPCO 2009, Glasgow (2009): 564-568.

It works just as well for images as for video, and it directly measures the sharpness of edges. If you apply a sharpening filter, you can compare the before and after values; if you overdo the sharpening, the values will start dropping again. It requires a couple of convolutions with complex-valued kernels, as described in the paper.
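Reproducing the paper's complex-valued kernels here wouldn't be practical, but to illustrate the general shape of convolution-based sharpness scoring, here is a much simpler, well-known alternative: the variance of the Laplacian response, sketched over a plain grayscale float buffer (laplacianVariance is just an illustrative helper name, not from the paper):

#import <Foundation/Foundation.h>

// Variance-of-Laplacian sharpness score (NOT the PEC metric; see the
// paper for its complex-valued kernels). A stronger edge response
// gives a higher variance, i.e. a sharper image.
static double laplacianVariance(const float *gray, int width, int height) {
    double sum = 0.0, sumSq = 0.0;
    long n = 0;
    for (int y = 1; y < height - 1; y++) {
        for (int x = 1; x < width - 1; x++) {
            // 3x3 Laplacian: 4 * center minus the four cross neighbours.
            double v = 4.0 * gray[y * width + x]
                     - gray[y * width + x - 1]
                     - gray[y * width + x + 1]
                     - gray[(y - 1) * width + x]
                     - gray[(y + 1) * width + x];
            sum += v;
            sumSq += v * v;
            n++;
        }
    }
    if (n == 0) return 0.0;
    double mean = sum / n;
    return sumSq / n - mean * mean; // Var[v] = E[v^2] - (E[v])^2
}

As with PEC, the absolute score depends on image content, so it is only meaningful as a comparison: before/after a filter, or across shots of the same subject.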

answered Sep 30 '22 by Alex I

We did it with the GPUImage framework like this, calculating brightness and sharpness. Here are some snippets that might help you:

- (BOOL)calculateBrightness:(UIImage *)image {
    float result = 0;
    int i = 0;
    for (int y = 0; y < image.size.height; y++) {
        for (int x = 0; x < image.size.width; x++) {
            // -colorAt:atX:andY: is a helper (not shown) that reads
            // the pixel at (x, y) as a UIColor.
            UIColor *color = [self colorAt:image
                                       atX:x
                                      andY:y];
            // Assumes an RGB color space, so components are {r, g, b, a}.
            const CGFloat *colors = CGColorGetComponents(color.CGColor);
            float r = colors[0];
            float g = colors[1];
            float b = colors[2];
            // Rec. 601 luma weights.
            result += 0.299 * r + 0.587 * g + 0.114 * b;
            i++;
        }
    }
    float brightness = result / (float)i;
    NSLog(@"Image Brightness : %f", brightness);
    // Empirical cut-offs: reject images that are too bright or too dark.
    if (brightness > 0.8 || brightness < 0.3) {
        return NO;
    }
    return YES;
}

- (BOOL)calculateSharpness:(UIImage *)image {
    // Canny edge detection from GPUImage, then a distance transform on the
    // binary edge image. BinaryImageDistanceTransform and
    // -getBinaryImageAsArray: are our own helpers (not shown here).
    GPUImageCannyEdgeDetectionFilter *filter = [[GPUImageCannyEdgeDetectionFilter alloc] init];
    BinaryImageDistanceTransform *binImagTrans = [[BinaryImageDistanceTransform alloc] init];
    NSArray *resultArray = [binImagTrans twoDimDistanceTransform:[self getBinaryImageAsArray:[filter imageByFilteringImage:image]]];

    if (resultArray == nil) {
        return NO;
    }

    // Sum the maximum distance value of each column.
    int sum = 0;
    for (int x = 0; x < resultArray.count; x++) {
        NSMutableArray *col = resultArray[x];
        sum += (int)[col valueForKeyPath:@"@max.intValue"];
    }

    NSLog(@"Image Sharp : %i", sum);
    if (sum < 26250000) { // tested - bad sharpness is under ca. 26250000
        return NO;
    }
    return YES;
}

But it is very slow: it takes ca. 40 seconds for one image from the iPad camera.
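At least the brightness half of that can be moved onto the GPU, avoiding a UIColor allocation per pixel. A minimal sketch, assuming GPUImage's GPUImageLuminosity reduction filter and its luminosityProcessingFinishedBlock callback:

#import <GPUImage/GPUImage.h>

// Average brightness computed on the GPU instead of a per-pixel CPU loop.
- (void)averageLuminosityOfImage:(UIImage *)image
                      completion:(void (^)(CGFloat luminosity))completion {
    GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:image];
    GPUImageLuminosity *luminosity = [[GPUImageLuminosity alloc] init];

    luminosity.luminosityProcessingFinishedBlock = ^(CGFloat lum, CMTime frameTime) {
        completion(lum); // 0.0 (black) .. 1.0 (white)
    };

    [source addTarget:luminosity];
    // Keep a strong reference to `source` if processing outlives this scope.
    [source processImage];
}

The same 0.3–0.8 acceptance window from the snippet above can then be applied to the reported luminosity.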

answered Sep 30 '22 by mechtaj

I don't think Core Image will help you. You could use the auto-enhancement feature to get an array of proposed filters and values, but there's no sharpness (edge contrast) filter among them, just overall image contrast. See the full list here.
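For reference, querying that list only takes a couple of lines; a sketch, assuming you already have a file URL (imageURL below is yours to supply; the returned CIFilter objects come with their input parameter values already set):

#import <CoreImage/CoreImage.h>

// Ask Core Image which enhancement filters (and settings) it proposes.
CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL]; // imageURL: your file URL
NSArray *adjustments = [inputImage autoAdjustmentFiltersWithOptions:nil];

CIImage *result = inputImage;
for (CIFilter *filter in adjustments) {
    NSLog(@"Proposed filter: %@", filter.name); // e.g. CIVibrance, CIToneCurve
    [filter setValue:result forKey:kCIInputImageKey];
    result = filter.outputImage;
}
// You'll see tone/colour corrections in this list, but nothing that
// measures or adjusts edge sharpness.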

There's an Apple vDSP API which can do Fast Fourier Transform:

The vDSP API provides mathematical functions for applications such as speech, sound, audio, and video processing, diagnostic medical imaging, radar signal processing, seismic analysis, and scientific data processing.

You should be able to use it to analyze your image.
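A minimal sketch of that idea, assuming the image is already a grayscale float buffer with power-of-two dimensions: take a 2D FFT with vDSP and compare the energy in the high-frequency bins against the total, since blurry images concentrate their energy near DC. The w/8 cut-off is an arbitrary illustration, not anything from the docs:

#import <Accelerate/Accelerate.h>
#import <Foundation/Foundation.h>
#import <math.h>

// Fraction of spectral energy in high-frequency bins; sharper images
// score higher. `gray` is w*h floats, w and h powers of two.
static float highFrequencyRatio(const float *gray, int w, int h) {
    vDSP_Length log2w = (vDSP_Length)log2(w);
    vDSP_Length log2h = (vDSP_Length)log2(h);
    FFTSetup setup = vDSP_create_fftsetup(MAX(log2w, log2h), kFFTRadix2);

    // Complex input with a zero imaginary part keeps the packing simple.
    DSPSplitComplex split;
    split.realp = calloc((size_t)w * h, sizeof(float));
    split.imagp = calloc((size_t)w * h, sizeof(float));
    memcpy(split.realp, gray, (size_t)w * h * sizeof(float));

    // In-place 2D complex FFT (row stride 0 = contiguous rows).
    vDSP_fft2d_zip(setup, &split, 1, 0, log2w, log2h, kFFTDirection_Forward);

    double total = 0.0, high = 0.0;
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            double re = split.realp[y * w + x];
            double im = split.imagp[y * w + x];
            double e = re * re + im * im;
            // Frequency distance from DC (the spectrum wraps around).
            int fx = MIN(x, w - x);
            int fy = MIN(y, h - y);
            total += e;
            if (fx > w / 8 || fy > h / 8) high += e; // arbitrary cut-off
        }
    }

    vDSP_destroy_fftsetup(setup);
    free(split.realp);
    free(split.imagp);
    return total > 0.0 ? (float)(high / total) : 0.0f;
}

As with the other metrics in this thread, the ratio is best used comparatively, e.g. against a threshold you calibrate on known-sharp and known-blurry shots from the same camera.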

For a conceptual overview, see Using Fourier Transforms, and search for tutorials on vDSP. There are also Q&As here on Stack Overflow.

answered Sep 30 '22 by aergistal