I'm working on procedurally generating patches of dirt using randomized fractals for a video game. I've already generated a height map using the midpoint displacement algorithm and saved it to a texture. I have some ideas for how to turn that into a texture of normals, but some feedback would be much appreciated.
My height texture is currently a 257 x 257 gray-scale image (height values are scaled for visibility purposes):
My thinking is that each pixel of the image represents a lattice coordinate in a 256 x 256 grid (hence, why there are 257 x 257 heights). That would mean that the normal at coordinate (i, j) is determined by the heights at (i, j), (i, j + 1), (i + 1, j), and (i + 1, j + 1) (call those A, B, C, and D, respectively).
So given the 3D coordinates of A, B, C, and D, would it make sense to:
...or is there a much easier method that I'm missing?
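For concreteness, here is a minimal GLSL-style sketch of one way to read the per-cell idea above (not necessarily the exact steps the question had in mind): fetch the four corner heights, take the up-facing normals of the two triangles the cell splits into, and average them. The names height_map and heightAt, and the grid spacing of 1 unit, are assumptions made for illustration.

uniform sampler2D height_map;   // assumed name for the 257 x 257 height texture

// read one height value from the lattice
float heightAt(int i, int j)
{
    return texelFetch(height_map, ivec2(i, j), 0).x;
}

// hypothetical per-cell normal: one normal for each of the 256 x 256 cells,
// built from its four corner heights A, B, C, D (grid spacing assumed to be 1)
vec3 cellNormal(int i, int j)
{
    float a = heightAt(i,     j    );   // A
    float b = heightAt(i,     j + 1);   // B
    float c = heightAt(i + 1, j    );   // C
    float d = heightAt(i + 1, j + 1);   // D

    // up-facing normals of the two triangles A-B-D and A-D-C
    vec3 n1 = normalize(vec3(b - d, a - b, 1.0));
    vec3 n2 = normalize(vec3(a - c, c - d, 1.0));

    // average them to get a single normal for the whole cell
    return normalize(n1 + n2);
}

Averaging the two triangle normals is only one of several reasonable choices here; the answer further down argues for a per-vertex, central-difference normal instead.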
Example GLSL code from my water surface rendering shader:
#version 130
uniform sampler2D unit_wave;
noperspective in vec2 tex_coord;

// size.x is the distance between the two samples of each central difference
// (2 texels); size.y is 0 so that size.xy / size.yx swizzle to (2,0) / (0,2)
const vec2 size = vec2(2.0, 0.0);
// offset components, swizzled into (-1,0), (+1,0), (0,-1) and (0,+1) below
const ivec3 off = ivec3(-1, 0, 1);

void main()
{
    vec4 wave = texture(unit_wave, tex_coord);
    float s11 = wave.x;                                          // centre height
    float s01 = textureOffset(unit_wave, tex_coord, off.xy).x;   // left  (-1,  0)
    float s21 = textureOffset(unit_wave, tex_coord, off.zy).x;   // right (+1,  0)
    float s10 = textureOffset(unit_wave, tex_coord, off.yx).x;   // down  ( 0, -1)
    float s12 = textureOffset(unit_wave, tex_coord, off.yz).x;   // up    ( 0, +1)
    vec3 va = normalize(vec3(size.xy, s21 - s01));               // tangent along x
    vec3 vb = normalize(vec3(size.yx, s12 - s10));               // tangent along y
    vec4 bump = vec4(cross(va, vb), s11);
    // ... the rest of the water shader (not shown) uses `bump`
}
The result is a bump vector: xyz=normal, a=height
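A small simplification worth noting: because the cross product scales linearly with each argument, normalizing va and vb first only changes the length of the result, not its direction. Using the same s01/s21/s10/s12 samples as in the snippet above, the bump vector can therefore be sketched more directly:

// equivalent direction, without the two normalize() calls on the tangents:
// cross((2,0, s21-s01), (0,2, s12-s10)) = (-2*(s21-s01), -2*(s12-s10), 4),
// which is proportional to the vector below
vec3 n = normalize(vec3(s01 - s21, s10 - s12, 2.0));
vec4 bump = vec4(n, s11);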
My thinking is that each pixel of the image represents a lattice coordinate in a 256 x 256 grid (hence, why there are 257 x 257 heights). That would mean that the normal at coordinate (i, j) is determined by the heights at (i, j), (i, j + 1), (i + 1, j), and (i + 1, j + 1) (call those A, B, C, and D, respectively).
No. Each pixel of the image represents a vertex of the grid, so, intuitively and by symmetry, its normal is determined by the heights of its neighboring pixels (i-1, j), (i+1, j), (i, j-1), and (i, j+1).
Given a function f : ℝ² → ℝ that describes a surface in ℝ³, a unit normal at (x, y) is given by
v = (−∂f/∂x, −∂f/∂y, 1) and n = v/|v|.
It can be proven that the best approximation to ∂f/∂x using two samples is achieved by:
∂f/∂x(x,y) = (f(x+ε,y) − f(x−ε,y))/(2ε)
To get a better approximation you need at least four points, so adding a third point (i.e. (x, y) itself) doesn't improve the result.
Your height map is a sampling of some function f on a regular grid. Taking ε = 1, you get:
2v = (f(x−1,y) − f(x+1,y), f(x,y−1) − f(x,y+1), 2)
Putting it into code would look like:
// sample the height map:
float fx0 = f(x-1, y), fx1 = f(x+1, y);
float fy0 = f(x, y-1), fy1 = f(x, y+1);

// the spacing of the grid in same units as the height map
float eps = ... ;

// plug into the formulae above:
vec3 n = normalize(vec3((fx0 - fx1) / (2*eps), (fy0 - fy1) / (2*eps), 1));
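For the 257 x 257 texture in the question, the same central-difference formula can also be written as a complete fragment shader using textureOffset, much like the water-shader snippet earlier. This is only a sketch: the uniform name height_map, the output name frag_normal, and the choice of eps = 1 texel are assumptions.

#version 130
uniform sampler2D height_map;      // assumed name for the 257 x 257 height texture
noperspective in vec2 tex_coord;   // UV of the current texel
out vec4 frag_normal;

const ivec3 off = ivec3(-1, 0, 1);

void main()
{
    // central differences with eps = 1 texel
    float fx0 = textureOffset(height_map, tex_coord, off.xy).x;   // f(x-1, y)
    float fx1 = textureOffset(height_map, tex_coord, off.zy).x;   // f(x+1, y)
    float fy0 = textureOffset(height_map, tex_coord, off.yx).x;   // f(x, y-1)
    float fy1 = textureOffset(height_map, tex_coord, off.yz).x;   // f(x, y+1)

    vec3 n = normalize(vec3((fx0 - fx1) / 2.0, (fy0 - fy1) / 2.0, 1.0));

    // remap from [-1, 1] to [0, 1] so the normal can be stored as a colour
    frag_normal = vec4(n * 0.5 + 0.5, 1.0);
}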