Quick background of where I'm at (to make sure we're on the same page, and sanity check if I'm missing/assuming something stupid):
Here's what I'm currently doing:
In the fragment shader of my final rendering pass (the one that actually calculates the final fragment values), I have the view/projection matrices from the light's-point-of-view pass, the depth texture from that pass (aka the "shadow map"), and the position/normal/color textures from my geometry buffer.
From what I understand, I need to find what UV of the shadow map the position of the current fragment corresponds to. I do that by the following:
//Bring the position value at this fragment (world space) to screen space from the light's POV
vec2 UVinShadowMap = (lightProjMat * lightViewMat * vec4(texture(pos_tex, UV).xyz, 1.0)).xy;
//Convert from screen space to 'texture space' (from [-1,1] to [0,1])
UVinShadowMap = (UVinShadowMap + 1.0) / 2.0;
Now that I have this UV, I can get the perceived 'depth' from the light's POV with
float depFromLightPOV = texture2D(shadowMap, UVinShadowMap).r;
and compare that against the distance between the position at the current fragment and the light:
float actualDistance = distance(texture2D(pos_tex, UV).xyz, lightPos);
The problem is that the 'depth' is stored as a value in 0-1, while the actual distance is in world coordinates. I've tried to do that conversion manually, but couldn't get it to work. And from searching online, it looks like the way I SHOULD be doing this is with a sampler2DShadow...
So here's my question(s):
What changes do I need to make to instead use shadow2D? What does shadow2D even do? Is it just more-or-less an auto-conversion-from-depth-to-world texture? Can I use the same depth texture? Or do I need to render the depth texture a different way? What do I pass in to shadow2D? The world-space position of the fragment I want to check? Or the same UV as before?
If all these questions can be answered by a simple documentation page, I'd love it if someone could just post that. But I swear I've been searching for hours and can't find anything that simply says what the heck is going on with shadow2D!
Thanks!
Beginning with GLSL 1.30, there is no special texture lookup function (name anyway) for use with sampler2DShadow. GLSL 1.30+ uses a bunch of overloads of texture (...) that are selected based on the type of sampler passed and the dimensions of the coordinates.
To use a sampler2DShadow you need to do two things differently:

1. Texture comparison must be enabled or you will get undefined results: set GL_TEXTURE_COMPARE_MODE to GL_COMPARE_REF_TO_TEXTURE (a setup sketch follows this list).

2. The coordinates you pass to texture (...) are 3D instead of 2D. The new 3rd coordinate is the depth value that you are going to compare.
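For reference, a minimal sketch of the application-side setup for (1), assuming shadowMapTex is the texture object holding the depth texture from your light pass (the name is just illustrative):

glBindTexture(GL_TEXTURE_2D, shadowMapTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_REF_TO_TEXTURE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL); // pass if ref depth <= stored depth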
One more thing worth understanding is what texture (...) returns when using sampler2DShadow: if the comparison passes, texture (...) will return 1.0; if it fails, it will return 0.0. If you use a GL_LINEAR texture filter on your depth texture, then texture (...) will perform 4 depth comparisons using the 4 closest depth values in your depth texture and return a value somewhere in between 1.0 and 0.0 to give an idea of the number of samples that passed/failed.
That is the proper way to do hardware anti-aliasing of shadow maps. If you tried to use a regular sampler2D with GL_LINEAR and implement the depth test yourself, you would get a single averaged depth back and a boolean pass/fail result instead of the behavior described above for sampler2DShadow.
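If you want that hardware 2x2 comparison filtering, the filter is set on the same depth texture (again assuming the illustrative shadowMapTex name from above):

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// With GL_TEXTURE_COMPARE_MODE enabled, LINEAR filtering averages the results of
// 4 depth comparisons rather than averaging the 4 stored depth values themselves.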
As for getting a depth value to test from a world-space position, you were on the right track (though you forgot perspective division). The steps are:

1. Multiply your world-space position by the light's view and projection matrices.

2. Divide the resulting clip-space coordinate by its W component (perspective division).

3. Scale and bias the result (which is in the range [-1,1]) into the range [0,1].

The final step assumes you are using the default depth range... if you have not called glDepthRange (...), then this will work.
The end result of step 3 serves as both a depth value (R) and texture coordinates (ST) for a lookup into your depth map. This makes it possible to pass this value directly to texture (...). Recall that the first 2 components of the texture coordinates are the same as always, and that the 3rd is the depth value to test.
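Putting the steps together, here is a sketch of what the lookup could look like in your final-pass fragment shader. It reuses the names from your question (shadowMap is now declared as a sampler2DShadow); the output name fragColor is just an example:

#version 130

uniform sampler2DShadow shadowMap;    // light-pass depth texture, with compare mode enabled
uniform sampler2D       pos_tex;      // world-space positions from the G-buffer
uniform mat4            lightProjMat;
uniform mat4            lightViewMat;

in  vec2 UV;
out vec4 fragColor;

void main()
{
    // Step 1: world-space position -> light clip space
    vec4 lightClip = lightProjMat * lightViewMat * vec4(texture(pos_tex, UV).xyz, 1.0);

    // Step 2: perspective division -> NDC in [-1,1]
    vec3 ndc = lightClip.xyz / lightClip.w;

    // Step 3: scale and bias into [0,1]; .st are the shadow-map coords, .p is the depth to compare
    vec3 shadowCoord = ndc * 0.5 + 0.5;

    // 1.0 = lit, 0.0 = shadowed (in-between values possible with GL_LINEAR filtering)
    float lit = texture(shadowMap, shadowCoord);

    fragColor = vec4(vec3(lit), 1.0);
}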