This is long but I promise it's interesting. :)
I'm trying to mimic the appearance of another application's texturing using jMonkeyEngine. I have a list of vertices and faces (triangles) making up a "landscape mesh", which should be textured with roughly 7-15 different textures (depending on the terrain of the "landscape"). Each triangle has a texture code associated with it, signifying which texture that particular triangle should mostly consist of. And of course, the textures should blend smoothly between faces.
So I'm trying to develop a strategy that allows this (one that does NOT use pre-made alpha map PNG files; the texture alphas need to be computed at run time). Right now I figure that if I calculate the "strength" of each texture at each vertex (in the vertex shader), by factoring in the terrain types of all its neighboring faces (unsure how to do this yet), I should be able to set alpha values based on how far a pixel is from a vertex. The generated 'alpha map' would then be used by the fragment shader to blend each texture per pixel.
Is this even feasible, or should I be looking at a totally different strategy? I have the shader code for the application I'm trying to mimic (but it's HLSL and I'm using GLSL), and it seems like they're doing this blending step elsewhere:
sampler MeshTextureSampler = sampler_state
{
    Texture   = diffuse_texture;
    AddressU  = WRAP;
    AddressV  = WRAP;
    MinFilter = LINEAR;
    MagFilter = LINEAR;
};
I'm not sure what this HLSL "MeshTextureSampler" is, but it seems like this application may have pre-blended all the textures as needed and created a single texture for the entire mesh, based on the face/terrain code data. In the pixel/fragment shader, all they really seem to do is this:
float4 tex_col = tex2D(MeshTextureSampler, In.Tex0);
After that it's just shadows, lighting, etc. -- no texture blending at all as far as I can tell, which leads me to believe this blending work is being done on the CPU beforehand. Any suggestions welcome.
If I understand you correctly, here is what my first shot would be:
Your problem is, more or less, how to distribute your per-face values over the vertices. This is actually similar to normal generation on a mesh: first you generate a normal for each triangle, and then average them per vertex. Google "normal generation" and you'll get there, but here's the gist: for each triangle adjacent to a vertex, find a weighting factor (often the angle of the corner at that vertex, or the surface area of the triangle, or a combination of both), then sum the values (be they normals or your "strengths") multiplied by the weighting factor into a total. Normalize and you're done.
So then you have your per-vertex texture "strengths" that you can send to your vertex shader. The modern solution would be to pack them into bytes and sample a texture array in the fragment shader, after you've fudged the blend values a bit to give you nicer transitions.
So, if I get your problem correctly:
Preprocess:
foreach vertex in mesh
    vertexvalue   = 0
    normalization = 0
    foreach triangle adjacent to vertex
        angle          = calculateAngleBetween3Vertices(vertex, triangle.someOtherVertex, triangle.theOtherOtherVertex)
        vertexvalue   += triangle.value * angle
        normalization += angle
    vertexvalue /= normalization
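Since you're in jMonkeyEngine, here is one way that preprocessing pass might look in plain Java. This is a sketch under assumptions: it uses raw arrays instead of jME mesh buffers, the class and method names (`VertexStrengths`, `computeStrengths`, `angleAt`) are made up for illustration, and the weighting factor is the corner angle. Instead of one scalar per vertex, it accumulates one "strength" per texture code, so shared vertices end up with a normalized blend over all adjacent terrain types:

```java
public class VertexStrengths {

    // Angle (radians) at corner a of triangle (a, b, c).
    static double angleAt(double[] a, double[] b, double[] c) {
        double[] u = {b[0] - a[0], b[1] - a[1], b[2] - a[2]};
        double[] v = {c[0] - a[0], c[1] - a[1], c[2] - a[2]};
        double dot = u[0] * v[0] + u[1] * v[1] + u[2] * v[2];
        double lu = Math.sqrt(u[0] * u[0] + u[1] * u[1] + u[2] * u[2]);
        double lv = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        // Clamp to [-1, 1] to guard against floating-point drift before acos.
        return Math.acos(Math.max(-1.0, Math.min(1.0, dot / (lu * lv))));
    }

    // positions: one xyz per vertex; triangles: vertex-index triples;
    // texCode[t]: texture code of triangle t; numTextures: total texture count.
    // Returns strengths[vertex][texture], each row summing to 1.
    static double[][] computeStrengths(double[][] positions, int[][] triangles,
                                       int[] texCode, int numTextures) {
        double[][] strengths = new double[positions.length][numTextures];
        double[] norm = new double[positions.length];
        // Iterating over triangles visits every (vertex, adjacent triangle)
        // pair, so no explicit adjacency structure is needed.
        for (int t = 0; t < triangles.length; t++) {
            int[] tri = triangles[t];
            for (int i = 0; i < 3; i++) {
                int vi = tri[i];
                double w = angleAt(positions[vi],
                                   positions[tri[(i + 1) % 3]],
                                   positions[tri[(i + 2) % 3]]);
                strengths[vi][texCode[t]] += w; // vertexvalue += value * weight
                norm[vi] += w;                  // normalization += weight
            }
        }
        for (int v = 0; v < positions.length; v++)
            for (int k = 0; k < numTextures; k++)
                strengths[v][k] /= norm[v];
        return strengths;
    }
}
```

For example, with two right triangles sharing an edge, one coded texture 0 and one texture 1, the two shared vertices come out at 0.5/0.5 while the unshared corners stay at 1.0 for their own texture.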
Rendering time:
pipe the value(s) of each vertex to the fragment shader, and do this in the fragment shader:
basecolour = vec4(0.0)
foreach value
    basecolour = mix(basecolour, texture2D(textureSamplerForThisValue, uv), value)
//this is simple, but we could do better once we have this working
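To see what that loop actually computes, here is the same mix() chain written CPU-side for a single colour channel. This is purely illustrative, not shader code; `blendChain` and its arguments are made-up names, and `mix(a, b, t)` is expanded to `a*(1-t) + b*t` as GLSL defines it:

```java
public class BlendChain {
    // Sequential blend: result = mix(...mix(mix(0, c0, v0), c1, v1)..., cn, vn).
    // Each step scales everything accumulated so far by (1 - value), so
    // textures blended later get proportionally more influence.
    static double blendChain(double[] texelColours, double[] values) {
        double result = 0.0; // basecolour = 0
        for (int i = 0; i < texelColours.length; i++) {
            // basecolour = mix(basecolour, texture2D(sampler_i, uv), value_i)
            result = result * (1.0 - values[i]) + texelColours[i] * values[i];
        }
        return result;
    }
}
```

One consequence worth noticing: texture i ends up weighted by value_i times the product of (1 - value_j) for every later j, so the order of the chain matters and the weights are not your normalized strengths directly. That is part of what "we could do better" means -- e.g. multiplying each sample by its strength and summing instead.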
Or, alternatively, take a good look at your geometry. If you have a mix of big triangles and tiny ones, you will have an unequal spread of data: since your data is per vertex, you get more texturing detail wherever there is more geometry. In that case, you will probably want to do what everyone else is doing and decouple your texturing from your geometry by using blend maps. These can be low-resolution and shouldn't increase your memory consumption or shader execution time that much.