Is there a robust metric of image sharpness or blurriness? I have various sets of images with different saturation parameters, captured from different optical systems, and I need to show the user something like the "quality" of focusing. To find the most focused image I use a metric obtained with the Sobel-Tenengrad operator (sum of high-contrast pixels), but the problem is that different objects produce quite different ranges of this metric (it depends on unknown parameters of image intensity and of the optical system). I need a metric that can say an image is badly focused without comparing it against a reference image, i.e. classify an image as "badly" or "well" focused.
Image sharpness can be measured by the "rise distance" of an edge within the image. With this technique, sharpness is determined by the distance (in pixels) over which the pixel level rises from 10% to 90% of its final value (also called the 10-90% rise distance; see Figure 3).
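A minimal sketch of that 10-90% rise-distance measurement on a one-dimensional profile taken across an edge (NumPy only; the function name and the sample profiles are illustrative, not from the answer):

```python
import numpy as np

def rise_distance(profile, lo=0.10, hi=0.90):
    """10-90% rise distance of a monotonic edge profile, in pixels.

    `profile` is a 1-D array of pixel levels across an edge
    (dark -> bright); thresholds are fractions of the full swing.
    """
    p = np.asarray(profile, dtype=float)
    p = (p - p.min()) / (p.max() - p.min())  # normalise to 0..1
    # first index where the level crosses each threshold
    i_lo = np.argmax(p >= lo)
    i_hi = np.argmax(p >= hi)
    return i_hi - i_lo

# A soft (blurred) edge rises over more pixels than a sharp one:
sharp = np.array([0, 0, 0, 255, 255, 255])
soft = np.array([0, 32, 96, 160, 224, 255])
print(rise_distance(sharp), rise_distance(soft))  # → 0 4
```

Smaller rise distances mean sharper edges; a real implementation would average the profile over many rows of a slanted edge to get sub-pixel accuracy.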
In photography, the term "acutance" describes a subjective perception of sharpness that is related to the edge contrast of an image. Acutance is related to the amplitude of the derivative of brightness with respect to space.
The Tenengrad function is a gradient-based function that extracts the gradient values in the horizontal and vertical directions through the Sobel operator.
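A NumPy-only sketch of the Tenengrad measure described above (the `thresh` parameter and the slicing-based convolution are implementation choices of this sketch, not part of the answer):

```python
import numpy as np

def tenengrad(img, thresh=0.0):
    """Tenengrad focus measure: sum of squared Sobel gradient
    magnitudes above a threshold. Higher = sharper."""
    img = np.asarray(img, dtype=float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T

    def conv3(a, k):
        # "valid" 3x3 correlation via shifted slices; the sign flip
        # of true convolution is irrelevant here since we square.
        out = np.zeros((a.shape[0] - 2, a.shape[1] - 2))
        for i in range(3):
            for j in range(3):
                out += k[i, j] * a[i:i + out.shape[0], j:j + out.shape[1]]
        return out

    gx, gy = conv3(img, kx), conv3(img, ky)
    g2 = gx**2 + gy**2
    return float(np.sum(g2[g2 > thresh**2]))

# Blurring lowers the score: compare a noisy image with a smoothed copy.
rng = np.random.default_rng(0)
sharp = (rng.random((32, 32)) > 0.5) * 255.0
blurred = sum(np.roll(sharp, s, ax) for s in (-1, 1) for ax in (0, 1))
blurred = (blurred + sharp) / 5.0
print(tenengrad(sharp) > tenengrad(blurred))  # → True
```

Note that, as the question points out, the absolute value of this score depends on scene content and exposure, so on its own it only ranks images of the *same* scene.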
The MTF10 value describes the finest detail that can be found in an image in practice. These details are shown with low contrast but are still visible. Because this limiting resolution represents the maximum performance of the system, it does not correlate well with the subjective impression of sharpness.
You can estimate the acutance of an image by taking the mean of the gradient magnitude over the whole image.
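A sketch of that mean-gradient acutance, with one extra step aimed at the asker's problem: dividing by the mean intensity makes the score unitless and insensitive to overall brightness (this normalisation is an assumption of the sketch, not something the answer prescribes; central differences via `np.gradient` stand in for a full Sobel filter):

```python
import numpy as np

def acutance(img):
    """Mean gradient magnitude normalised by mean intensity.

    The normalisation makes the score invariant to a global
    brightness scale, so images with different exposures become
    somewhat more comparable; it remains scene-dependent.
    """
    img = np.asarray(img, dtype=float)
    gy, gx = np.gradient(img)          # central-difference gradients
    mag = np.hypot(gx, gy)             # per-pixel gradient magnitude
    return float(mag.mean() / max(img.mean(), 1e-12))

# Doubling the brightness leaves the score (nearly) unchanged:
rng = np.random.default_rng(1)
img = (rng.random((32, 32)) > 0.5) * 255.0
print(np.isclose(acutance(img), acutance(img * 2)))  # → True
```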
Reference this StackOverflow answer to a similar question.