
GLSL vertex shader cancel render

Can the rendering for a pixel be terminated in a vertex shader? For example, if a vertex does not meet a certain requirement, can the rendering of that vertex be cancelled?

asked Aug 13 '13 by user346443

3 Answers

I am elaborating on Andon M. Coleman's answer, which IMHO deserves to be marked as the right one.

Even though the OpenGL specification is adamant that you cannot skip the fragment shader step (unless you actually remove the whole primitive in a geometry shader, as Nicol Bolas correctly pointed out, which is a bit overkill IMHO), in practice you can get the same effect by letting OpenGL cull the whole geometry: modern GPUs have early fragment rejection optimizations that will likely produce the same result.

And, for the record, making the whole geometry get discarded is really easy: just write the vertex position outside the (-1, -1, -1), (1, 1, 1) cube:

gl_Position = vec4(2.0, 2.0, 2.0, 1.0);

...and off you go!
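
For completeness, a rough sketch of how that trick might look as a full conditional vertex shader; the a_visible attribute and u_mvp matrix are made-up names, not part of the original answer:

#version 330 core

layout(location = 0) in vec4 a_position;  // regular vertex position
layout(location = 1) in float a_visible;  // hypothetical per-vertex flag: 0.0 means "reject"

uniform mat4 u_mvp;                       // hypothetical model-view-projection matrix

void main()
{
    if (a_visible < 0.5) {
        // Push the vertex far outside the clip volume; if every vertex of a
        // primitive ends up out here, the whole primitive is culled.
        gl_Position = vec4(2.0, 2.0, 2.0, 1.0);
    } else {
        gl_Position = u_mvp * a_position;
    }
}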

Hope this helps

answered Oct 18 '22 by Rick77


I'll assume you meant "can the rendering for a vertex be terminated". And no, you can't; OpenGL is very strict about the 1:1 ratio of input vertices to outputs for a VS. Also, it wouldn't really mean what you want it to, since vertices don't get rendered. Primitives do, and a primitive can be composed of more than one vertex. What would it mean to discard a vertex in the middle of a triangle strip, for example?

This is why Geometry Shaders have the ability to "cull" primitives; they deal specifically with a primitive, not merely a single vertex. This is done by simply not emitting any vertices; a GS must explicitly emit every vertex it wants to output.
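
A minimal sketch of that idea, assuming point primitives and a hypothetical v_keep flag forwarded from the vertex shader:

#version 330 core

layout(points) in;
layout(points, max_vertices = 1) out;

in float v_keep[];   // hypothetical flag forwarded from the vertex shader

void main()
{
    // Emitting nothing means this primitive never reaches rasterization.
    if (v_keep[0] > 0.5) {
        gl_Position = gl_in[0].gl_Position;
        EmitVertex();
        EndPrimitive();
    }
}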


Vertex shaders now have the ability to cull primitives. This is done using the "cull distance" feature of OpenGL 4.5 (gl_CullDistance). It works like gl_ClipDistance, only instead of clipping against the distance, the entire primitive is culled if the cull distance is negative for all of its vertices.
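
A possible vertex shader using that feature (the threshold test and uniform names here are just illustrative assumptions):

#version 450 core

layout(location = 0) in vec4 a_position;

uniform mat4 u_mvp;          // hypothetical model-view-projection matrix
uniform float u_threshold;   // hypothetical application-defined threshold

void main()
{
    gl_Position = u_mvp * a_position;

    // The primitive is culled when this value is negative for all of its vertices.
    gl_CullDistance[0] = a_position.y - u_threshold;
}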

answered Oct 26 '22 by Nicol Bolas


In theory, you can use a vertex shader to produce a degenerate (zero-area) primitive. A primitive with zero area should not result in anything rasterized, and thus no fragment will be rendered. It is not particularly intuitive, however, especially if you are using primitives that share vertices.

But no, canceling a vertex is almost meaningless. It is the fundamental unit upon which primitives are constructed. If you simply remove a single vertex, then you will alter the rasterized output in undefined ways.

Put simply, vertices are not what create pixels on screen. It is the connectivity between vertices, which creates primitives, that ultimately leads to pixels. Geometry Shaders operate on a primitive-by-primitive basis, so they are generally where you would cancel rasterization and fragment shading in a programmatic fashion.


UPDATE:

It has come to my attention that you are using GL_POINTS as your primitive type. In this special case, all you have to do to prevent your vertex from going further down the pipeline is set its position somewhere outside of your camera's viewing volume. The vertex will be clipped and no rasterization or fragment shading will occur.

This is a much more efficient solution than testing for some condition in a fragment shader and then discarding, because you skip rasterization and do not have to execute a fragment shader at all. Not to mention, discard usually winds up working as a post-shader execution flag that tells the GPU to discard the result: the GPU is often forced to execute the entire shader no matter where in the shader you issue the discard instruction. Thus discard rarely gives a performance benefit, and in many cases it can disable other potentially more useful hardware optimizations. This is the nature of the way GPUs schedule their shader workload, unfortunately.
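
For comparison, this is roughly what the discard-based approach being argued against looks like (the texture lookup and alpha test are hypothetical):

#version 330 core

in vec2 v_texcoord;            // hypothetical interpolated texture coordinate
uniform sampler2D u_texture;   // hypothetical texture

out vec4 fragColor;

void main()
{
    vec4 color = texture(u_texture, v_texcoord);

    // The shader still runs for every covered fragment; discard only
    // throws the result away afterwards.
    if (color.a < 0.1)
        discard;

    fragColor = color;
}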

The cheapest fragment is the one you never have to process :)

answered Oct 26 '22 by Andon M. Coleman