
Creating blur filter with a shader - access adjacent pixels from fragment shader?

I want to create a blur effect using a fragment shader in OpenGL ES 2.0. The algorithm I'm interested in is a simple averaging (box) blur: add the eight adjacent pixels to the current one and divide by 9 to normalize.
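
In shader terms, what I'm after is roughly this (a sketch of the intent only; the offset fetch marked below is exactly the part I don't know how to do):

    // Rough sketch of the intent (OpenGL ES 2.0 / GLSL ES 1.00).
    precision mediump float;

    uniform sampler2D u_source;  // the image I want to blur, bound as texture 0
    varying vec2 v_texCoord;

    void main() {
        vec4 sum = vec4(0.0);
        for (int x = -1; x <= 1; x++) {
            for (int y = -1; y <= 1; y++) {
                // ??? how do I fetch the texel (x, y) pixels away from here,
                // and what happens at the edge of the screen?
                sum += texture2D(u_source, v_texCoord /* + pixel offset? */);
            }
        }
        gl_FragColor = sum / 9.0;  // average of the 9 samples
    }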

However, I have two issues:

1) Does this require me to first render to a framebuffer and then switch rendering targets, or is there an easier way?

2) Assume I bind the "source" image I want to blur as texture 0 and output the blurred result. How do I access the pixels other than the one I'm currently dealing with? The fragment shader is invoked for pixel i, but I need to read the pixels around it. How? And how do I know whether I'm an edge case (literally at the edge of the screen)?

(3: Is there a more suitable algorithm to get a fuzzy, frosted-glass-looking blur?)

asked Jul 27 '11 18:07 by Nektarios




1 Answer

Elaborating a bit more on what Matias said:

  1. Yes. You render the image into a texture (best done using an FBO), and in the second (blur) pass you bind this texture and read from it. You cannot perform the render and blur passes in one step, because you cannot read from the framebuffer you are currently rendering into. That would introduce data dependencies: your neighbours need not have their final color yet, or worse, their color might depend on yours.

  2. You get the current pixel's coordinates in the special fragment shader variable gl_FragCoord and use them as texture coordinates into the texture containing the previously rendered image, likewise gl_FragCoord.x ± 1 and gl_FragCoord.y ± 1 for the neighbours. But as Matias said, you need to divide these values by the width and height of the image respectively, as texture coordinates live in [0,1]. By using GL_CLAMP_TO_EDGE as the wrapping mode for the texture, the edge cases are handled automatically by the texturing hardware. So at an edge you still get 9 values, but only 6 distinct ones (the other 3, the ones actually outside the image, are just duplicates of their inside neighbours). A sketch putting both steps together follows below.
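
A minimal sketch of such a blur pass, assuming a uniform u_resolution that holds the texture's width and height in pixels and a sampler u_source bound to the first pass's texture (both names are mine, not from the answer):

    precision mediump float;

    uniform sampler2D u_source;   // texture the scene was rendered into (first pass, via FBO)
    uniform vec2 u_resolution;    // width and height of that texture in pixels

    void main() {
        vec4 sum = vec4(0.0);
        // Sample the 3x3 neighbourhood around the current fragment.
        // gl_FragCoord is in pixels, so divide by the resolution to get
        // texture coordinates in [0,1]. With GL_CLAMP_TO_EDGE set on the
        // texture, samples that fall outside the image are clamped for us.
        for (int x = -1; x <= 1; x++) {
            for (int y = -1; y <= 1; y++) {
                vec2 texCoord = (gl_FragCoord.xy + vec2(x, y)) / u_resolution;
                sum += texture2D(u_source, texCoord);
            }
        }
        gl_FragColor = sum / 9.0;  // average of the 9 samples
    }

The wrap mode is set once on the CPU side when creating the texture, with glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE) and the same for GL_TEXTURE_WRAP_T.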

answered Oct 03 '22 07:10 by Christian Rau