Transparent textures behaviour in WebGL

Environment: WebGL, Chrome. I see the following behavior when using transparent PNGs as textures for models:

  1. Image A: the tree hides the building behind it, and I see the world box texture instead. It also hides itself (the back branches are not visible).
  2. Image B, taken at the same time, works properly: the window is transparent and I can see what's behind it.

Screenshot A: tree over house. Screenshot B: window transparency.

Both screenshots were made of the same scene at the same time, from different camera positions. The textures are produced by the same algorithm.

I can't understand the difference between the window's and the branches' transparency. My main question is: how do I fix the branches so they don't hide the objects behind them? The fragment shader code is:

gl_FragColor = vec4(textureColor.rgb * vLightWeighting, textureColor.a);

I have experimented with enabling/disabling blending and the depth test, sometimes getting the desired result, but I'm not sure that's the proper way to do things.
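The blending/depth state I've been toggling is essentially the standard alpha-blend setup (a sketch, assuming gl is the scene's WebGLRenderingContext, not my exact code):

gl.enable(gl.DEPTH_TEST);                            // depth testing stays on
gl.enable(gl.BLEND);                                 // enable alpha blending
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);  // standard "over" blending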

asked Jan 06 '12 by Vecnas


2 Answers

You're running into depth buffer issues; it has nothing to do with your shader or blend modes.

What's happening is that the order in which you render your transparent geometry affects whether anything can be drawn behind it. The depth buffer has no concept of transparent versus opaque: even though they don't contribute visually to the scene, those transparent pixels still write their depth into the depth buffer, and any pixels you try to draw behind them afterwards are discarded because they are "not visible". If you draw the geometry behind the transparent object first, however, it shows up correctly, because it reaches the framebuffer before the transparent object's depth values are in place to reject it.
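One way to see this in WebGL: you can leave depth testing on for transparent draws but turn depth writes off, so those pixels never block anything drawn later. A rough sketch (drawTransparentObjects is a placeholder for your own draw calls):

gl.enable(gl.DEPTH_TEST);     // transparent pixels are still hidden by opaque ones in front of them
gl.depthMask(false);          // ...but they no longer write their own depth
drawTransparentObjects();     // placeholder for your transparent draw calls
gl.depthMask(true);           // restore depth writes afterwards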

This is something that even large commercial game engines still struggle with to some degree, so don't feel bad about it causing some confusion. :)

There's no "perfect solution" to this problem, but what it really boils down to is trying to structure your scene like so:

  1. Render any opaque geometry sorted by state (shader/texture/etc)
  2. Render any transparent geometry next. If possible, sort it by depth so that you draw the objects furthest from the camera first.

Simply flagging the geometry that is transparent and rendering it after everything else will solve 90% of this problem, but the issue can still remain for overlapping transparent objects. That may not be a problem for you, depending on your scene, but if it's still causing artifacts you'll need to sort the transparent objects by depth before you draw them, as in the sketch below.
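A rough sketch of that two-pass structure (the object lists, position fields, and draw() calls are placeholders for however your scene code is actually organized):

function drawScene(gl, opaqueObjects, transparentObjects, cameraPosition) {
  // Pass 1: opaque geometry, with normal depth testing and depth writes.
  gl.enable(gl.DEPTH_TEST);
  gl.depthMask(true);
  gl.disable(gl.BLEND);
  opaqueObjects.forEach(function (obj) { obj.draw(); });

  // Pass 2: transparent geometry, blended and sorted back-to-front so the
  // furthest objects reach the framebuffer first.
  gl.enable(gl.BLEND);
  gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
  gl.depthMask(false); // transparent pixels shouldn't occlude later draws
  transparentObjects
    .slice()
    .sort(function (a, b) {
      return distanceTo(b.position, cameraPosition) - distanceTo(a.position, cameraPosition);
    })
    .forEach(function (obj) { obj.draw(); });
  gl.depthMask(true);
}

// Simple Euclidean distance between two [x, y, z] points.
function distanceTo(a, b) {
  var dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}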

answered Nov 05 '22 by Toji


Discarding fragments whose alpha is below a threshold (for example 0.5) might help; of course, it has a side effect: you get hard, cut-out edges instead of smooth blending.

// place this test after gl_FragColor has been assigned (or test textureColor.a directly)
if (gl_FragColor.a < 0.5) discard;

See also: AlphaFunctions in WebGL?

answered Nov 05 '22 by Punyapat