I'm learning OpenGL 3 using LWJGL. I have tried to implement an equivalent of gluLookAt(), and although it works, I'm somewhat confused as to why. I confess to just copying this code from various sources on the web, but after much study I think I understand the maths behind it, and I understand what LWJGL is doing. However, the 'correct' gluLookAt code behaved incorrectly in my application: the camera seemed to be turning the wrong way. I only managed to get my code working by transposing the orthonormal vectors forward, side, and up (I hope I'm using the correct terminology!), which I'm pretty sure is wrong...
// Scratch vectors, reused between calls.
private static final Vector3f forward = new Vector3f();
private static final Vector3f side = new Vector3f();
private static final Vector3f up = new Vector3f();
private static final Vector3f eye = new Vector3f();

public static Matrix4f lookAt(float eyeX, float eyeY, float eyeZ,
                              float centerX, float centerY, float centerZ,
                              float upX, float upY, float upZ) {
    // Build the orthonormal camera basis from eye, center and up.
    forward.set(centerX - eyeX, centerY - eyeY, centerZ - eyeZ);
    forward.normalise();
    up.set(upX, upY, upZ);
    Vector3f.cross(forward, up, side);
    side.normalise();
    Vector3f.cross(side, forward, up);
    up.normalise();

    Matrix4f matrix = new Matrix4f();
    matrix.m00 = side.x;
    matrix.m01 = side.y;
    matrix.m02 = side.z;
    matrix.m10 = up.x;
    matrix.m11 = up.y;
    matrix.m12 = up.z;
    matrix.m20 = -forward.x;
    matrix.m21 = -forward.y;
    matrix.m22 = -forward.z;
    matrix.transpose(); // <------ My dumb hack

    eye.set(-eyeX, -eyeY, -eyeZ);
    matrix.translate(eye);
    return matrix;
}
I don't think I should be doing the transpose, but it doesn't work without it. I only called transpose() because I couldn't be bothered to retype all the matrix cell positions, by the way!
My understanding is that the form of the lookAt matrix should be as follows:
[ side.x  up.x  fwd.x  0 ]   [ 1  0  0  -eye.x ]
[ side.y  up.y  fwd.y  0 ] * [ 0  1  0  -eye.y ]
[ side.z  up.z  fwd.z  0 ]   [ 0  0  1  -eye.z ]
[ 0       0     0      1 ]   [ 0  0  0   1     ]
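To sanity-check what that product works out to, here is a minimal numeric sketch, assuming LWJGL 2's org.lwjgl.util.vector.Matrix4f; the rotation and eye values are made up purely for illustration. The point is that the translation column of the combined matrix is the rotation applied to -eye, not -eye itself:

import org.lwjgl.util.vector.Matrix4f;

public class LookAtProductCheck {
    public static void main(String[] args) {
        // Rotation part: a 90-degree rotation about Y, written straight
        // into the m<col><row> fields (rows: (0,0,1), (0,1,0), (-1,0,0)).
        Matrix4f rot = new Matrix4f(); // new Matrix4f() is the identity
        rot.m00 = 0f; rot.m20 = 1f;  // row 0
        rot.m02 = -1f; rot.m22 = 0f; // row 2

        // Translation part: column 3 holds -eye, here for eye = (1, 2, 3).
        Matrix4f trans = new Matrix4f();
        trans.m30 = -1f;
        trans.m31 = -2f;
        trans.m32 = -3f;

        // Combined matrix: its translation column comes out as
        // rot * (-eye) = (-3, -2, 1), not simply (-1, -2, -3).
        Matrix4f view = Matrix4f.mul(rot, trans, null);
        System.out.println(view);
    }
}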
And I think that the LWJGL Matrix4f class represents matrix cells as m<col><row>. The translate(Vector3f) method does the following:
public static Matrix4f translate(Vector3f vec, Matrix4f src, Matrix4f dest) {
    ...
    dest.m30 += src.m00 * vec.x + src.m10 * vec.y + src.m20 * vec.z;
    dest.m31 += src.m01 * vec.x + src.m11 * vec.y + src.m21 * vec.z;
    dest.m32 += src.m02 * vec.x + src.m12 * vec.y + src.m22 * vec.z;
    dest.m33 += src.m03 * vec.x + src.m13 * vec.y + src.m23 * vec.z;
    ...
}
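If I read that right, translate() is a post-multiplication by a translation matrix, i.e. dest = src * T(vec). Here is a small sketch, again assuming LWJGL 2's org.lwjgl.util.vector classes and using made-up values, that checks both routes produce the same matrix:

import org.lwjgl.util.vector.Matrix4f;
import org.lwjgl.util.vector.Vector3f;

public class TranslateCheck {
    public static void main(String[] args) {
        // An arbitrary non-identity starting matrix.
        Matrix4f src = new Matrix4f();
        src.m00 = 2f;

        // Route 1: LWJGL's in-place translate() on a copy of src.
        Matrix4f viaTranslate = new Matrix4f(src);
        viaTranslate.translate(new Vector3f(1f, 2f, 3f));

        // Route 2: explicit post-multiplication src * T(1, 2, 3).
        Matrix4f t = new Matrix4f();
        t.m30 = 1f; // column 3 of the identity gets the translation
        t.m31 = 2f;
        t.m32 = 3f;
        Matrix4f viaMul = Matrix4f.mul(src, t, null);

        // Both should print the same matrix.
        System.out.println(viaTranslate);
        System.out.println(viaMul);
    }
}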
So I am left hugely confused as to what part of this I have screwed up. Is it my understanding of the lookAt matrix, the column/row major-ness (is that a word?!) of Matrix4f, or something else? Is the rest of my code just broken? Is it actually correct and I'm just worrying too much? Am I just an idiot?
Thanks.
You shouldn't transpose anything. You should negate the "eye" vector of the lookAt() matrix and the looking direction: "eye" corresponds to the camera position, which should always be inverted. All of that is done inside lookAt().
Here is the lookAt() method from a Java port of the famous GLM maths library:
public static Mat4 lookAt(Vec3 eye, Vec3 center, Vec3 up) {
    // Build the orthonormal camera basis.
    Vec3 f = normalize(Vec3.sub(center, eye));
    Vec3 u = normalize(up);
    Vec3 s = normalize(cross(f, u));
    u = cross(s, f);

    // s, u and -f become the rows of the rotation part,
    // just as in C++ GLM (column-major storage).
    Mat4 result = new Mat4(1.0f);
    result.set(0, 0, s.x);
    result.set(1, 0, s.y);
    result.set(2, 0, s.z);
    result.set(0, 1, u.x);
    result.set(1, 1, u.y);
    result.set(2, 1, u.z);
    result.set(0, 2, -f.x);
    result.set(1, 2, -f.y);
    result.set(2, 2, -f.z);

    // Post-multiply by the translation to -eye.
    return translate(result, new Vec3(-eye.x, -eye.y, -eye.z));
}
I use it with my LWJGL-based OpenGL 4 renderer, and it works like a charm :)
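Applied to the code in the question, the same structure would look something like the untested sketch below (assuming LWJGL 2's org.lwjgl.util.vector classes): write side, up and -forward into the matrix rows through the m<col><row> fields, and the transpose() hack disappears:

import org.lwjgl.util.vector.Matrix4f;
import org.lwjgl.util.vector.Vector3f;

public final class CameraUtil {
    public static Matrix4f lookAt(float eyeX, float eyeY, float eyeZ,
                                  float centerX, float centerY, float centerZ,
                                  float upX, float upY, float upZ) {
        Vector3f forward = new Vector3f(centerX - eyeX, centerY - eyeY, centerZ - eyeZ);
        forward.normalise();
        Vector3f up = new Vector3f(upX, upY, upZ);
        Vector3f side = Vector3f.cross(forward, up, null);
        side.normalise();
        // side and forward are unit and orthogonal, so this up is already unit.
        Vector3f.cross(side, forward, up);

        Matrix4f matrix = new Matrix4f(); // identity
        matrix.m00 = side.x;     matrix.m10 = side.y;     matrix.m20 = side.z;     // row 0 = side
        matrix.m01 = up.x;       matrix.m11 = up.y;       matrix.m21 = up.z;       // row 1 = up
        matrix.m02 = -forward.x; matrix.m12 = -forward.y; matrix.m22 = -forward.z; // row 2 = -forward

        // Post-multiply by the translation to -eye, as before.
        matrix.translate(new Vector3f(-eyeX, -eyeY, -eyeZ));
        return matrix;
    }
}

For example, a camera at (0, 0, 5) looking at the origin with +Y up would be lookAt(0, 0, 5, 0, 0, 0, 0, 1, 0).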