Gaussian Blur - standard deviation, radius and kernel size

I've implemented a Gaussian blur fragment shader in GLSL. I understand the main concepts behind it: convolution, separating the x and y passes thanks to the separability of the kernel, multiple passes to increase the radius...

I still have a few questions though:

What's the relationship between sigma and radius?

I've read that sigma is equivalent to radius, but I don't see how sigma is expressed in pixels. Or is "radius" just a name for sigma, not related to pixels?

How do I choose sigma?

Considering I use multiple passes to increase sigma, how do I choose a good per-pass sigma to obtain the overall sigma I want? If the resulting sigma is equal to the square root of the sum of the squares of the per-pass sigmas, and sigma is equivalent to radius, what's an easy way to get any desired radius?
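For example (using my own notation, σ_pass for the per-pass value and σ_total for the result), three identical passes should give

σ_total = sqrt(σ_pass² + σ_pass² + σ_pass²) = σ_pass · √3

so if I want σ_total = 5 over three passes, does that mean each pass should use σ_pass = 5 / √3 ≈ 2.89?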

What's the good size for a kernel, and how does it relate to sigma?

I've seen most implementations use a 5x5 kernel. This is probably a good choice for a fast implementation with decent quality, but are there reasons to choose a different kernel size? How does sigma relate to the kernel size? Should I find the best sigma so that the coefficients outside my kernel are negligible, and just normalize?

asked Jul 24 '13 by LodeRunner
People also ask

How does kernel size affect Gaussian blur?

Larger kernels spread the blur around a wider region, as each pixel is modified by more of its surrounding pixels.

What is standard deviation in Gaussian blur?

The standard deviation of the Gaussian function controls the amount of blurring. A large standard deviation (e.g., > 2) blurs significantly, while a small standard deviation (e.g., 0.5) blurs less. If the objective is noise reduction, a rank filter such as the median might be more useful in some circumstances.

What happens as the size of the blurring kernel increases?

Hence, this brings us to an important rule: as the size of the kernel increases, so does the amount by which the image is blurred. Simply put: the larger your smoothing kernel is, the more blurred your image will look.

How does sigma affect Gaussian blur?

The role of sigma in the Gaussian filter is to control the variation around its mean value: the larger sigma becomes, the more variance is allowed around the mean, and the smaller sigma becomes, the less. Filtering in the spatial domain is done through convolution.


1 Answer

What's the relationship between sigma and radius?

I think the terms here are interchangeable, depending on your implementation. Most GLSL implementations of Gaussian blur use the sigma value to define the amount of blur; in the Gaussian blur definition, the radius can be considered the 'blur radius'. Both terms are expressed in pixel space.

How do I choose sigma?

This will define how much blur you want, which corresponds to the size of the kernel to be used in the convolution. Bigger values will result in more blurring.

The NVidia implementation uses a kernel size of int(sigma*3).

For performance reasons you may experiment with a smaller kernel size combined with higher values of sigma. These are free parameters to experiment with: the kernel size defines how many pixels are sampled, and sigma defines how much each sampled pixel contributes to the result.
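To make that concrete (my own numbers, not from any particular implementation): with sigma = 2.0, the int(sigma*3) rule gives a radius of 6 taps on each side of the centre, i.e. a 13-tap 1D kernel. The unnormalised weight of the outermost tap is exp(-6² / (2·2²)) = exp(-4.5) ≈ 0.011 versus 1.0 for the centre tap, so everything beyond roughly 3 sigma contributes very little and can be dropped before normalising.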

What's the good size for a kernel, and how does it relate to sigma?

Based on the sigma value you will want to choose a corresponding kernel size. The kernel size will determine how many pixels to sample during the convolution and the sigma will define how much to modulate them by.

You may want to post some code for a more detailed explanation. NVidia has a pretty good chapter on how to build a Gaussian Kernel. Look at Example 40-1.
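For reference, here is a minimal sketch of what one pass of such a shader can look like, with the weights computed from sigma on the fly and normalised at the end. All of the names (image, texelSize, sigma, uv) are my own assumptions, not taken from your code or from the NVidia chapter:

```glsl
#version 330 core

// One horizontal pass of a separable Gaussian blur (sketch, assumed names).
uniform sampler2D image;       // source texture
uniform vec2      texelSize;   // 1.0 / textureSize(image, 0)
uniform float     sigma;       // blur amount, in pixels

in  vec2 uv;
out vec4 fragColor;

void main()
{
    int   radius = int(sigma * 3.0);        // taps beyond ~3 sigma are negligible
    float wSum   = 1.0;                     // centre tap weight, exp(0) = 1
    vec4  result = texture(image, uv);      // centre tap

    for (int i = 1; i <= radius; ++i) {
        float w      = exp(-float(i * i) / (2.0 * sigma * sigma));
        vec2  offset = vec2(float(i), 0.0) * texelSize;   // horizontal step
        result += (texture(image, uv + offset) + texture(image, uv - offset)) * w;
        wSum   += 2.0 * w;                  // each symmetric pair counts twice
    }

    fragColor = result / wSum;              // normalise so the weights sum to 1
}
```

A second pass stepping along vec2(0.0, texelSize.y) instead of the horizontal offset completes the separable 2D blur.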

answered Oct 27 '22 by Aaron Hagan