
OpenGL - Why was the matrix stack removed and what are people using now?

I am reading through the OpenGL Superbible Fifth Edition and they discuss using stacks via their own class. That's all great but they mention that matrix stacks were deprecated. Why were they deprecated and what do people use instead of them?

asked Jul 17 '12 by Satchmo Brown

2 Answers

The reason(s) are political, not technical, and date back to the late 2000s.

OpenGL 3 was the first ever version willing to break backwards compatibility. The designers wanted to create an API for the expert users, the game programmers and high end visualization coders who knew all about shaders and wrote their own matrix code. The intent was that the OpenGL 3 API should match the actual hardware quite closely. (Even in OpenGL 1/2, the matrix stack was usually implemented on the CPU side, not the GPU.)
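To make that parenthetical concrete: the "matrix stack" was never special hardware, just driver-side bookkeeping. A minimal sketch of what the driver was doing under the hood (and of what a replacement class like the Superbible's `GLMatrixStack` does) might look like this; the `MatrixStack` class and its method names are my own illustration, not OpenGL API:

```cpp
#include <array>
#include <stack>

// Column-major 4x4 matrix, the layout classic OpenGL used.
using Mat4 = std::array<float, 16>;

Mat4 identity() {
    Mat4 m{};
    m[0] = m[5] = m[10] = m[15] = 1.0f;
    return m;
}

// r = a * b in column-major storage (b is applied first, as with glMultMatrixf).
Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                r[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
    return r;
}

// A CPU-side stand-in for the fixed-function modelview stack.
struct MatrixStack {
    std::stack<Mat4> s;
    MatrixStack() { s.push(identity()); }
    void push() { s.push(s.top()); }            // like glPushMatrix
    void pop()  { s.pop(); }                    // like glPopMatrix
    void translate(float x, float y, float z) { // like glTranslatef
        Mat4 t = identity();
        t[12] = x; t[13] = y; t[14] = z;
        s.top() = multiply(s.top(), t);
    }
    const Mat4& top() const { return s.top(); }
};
```

Since all of this runs on the CPU anyway, dropping it from the driver cost the "expert" users nothing: they were already doing it themselves.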

From a game engine programmer point of view, this was better. And hey, if you have to develop a new game engine every couple of years anyway, what's the big deal about throwing away the old code?

The result of this design process is the OpenGL 3/4 core profile.

Once the "new generation" OpenGL was announced, all the not-so-expert coders in universities and companies realized they would be screwed. These are the people (like me) who teach 3D graphics or write utility programs for research or design. We don't need any more advanced lighting than plain ambient-diffuse-specular. We often have to mix code from different sources together, and that is only easy if everyone is using exactly the same matrix, lighting, and texturing conventions - like those supplied by OpenGL 2.

Also, I've heard but cannot verify, the big CAD/CAM companies realized that they'd be screwed as well. Throwing away two million lines of code from ten years of development is not an option when you've got paying (and well-paying: compare prices for Quadro vs GeForce, or FireGL vs Radeon) customers.

So both NVIDIA and ATI announced they'd support the old API for as long as they could.

The result of this pressure is the compatibility profiles. And the OpenGL ARB now seems to have realized that while they'd like everyone to switch to the core profile, it just isn't going to happen: the extension spec for tessellation shaders in OpenGL 4 even mentions that GL_PATCHES will work with glBegin.

answered Nov 15 '22 by Hugh


The matrix stack (and the rest of the matrix functions) was deprecated only in the core profile. In the compatibility profile you can still use it.

From my point of view it was removed because most engines and frameworks have their own math code anyway and send matrices to shaders through uniforms.

For simple programs and tutorials, though, it is inconvenient to lose it and have to hunt for a replacement.

I suggest using:

  • glm (http://glm.g-truc.net/)
  • very simple math lib (vsml)
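The uniform-based workflow those libraries support can be sketched roughly as follows: build the matrices on the CPU (here by hand; glm's `mat4` does the same job) and hand the result to the shader. The GL calls need a live context, so they appear as comments, and the names `uModelView` and `program` are hypothetical:

```cpp
#include <array>

// Column-major 4x4, the layout glUniformMatrix4fv expects with transpose = GL_FALSE.
using Mat4 = std::array<float, 16>;

Mat4 makeTranslation(float x, float y, float z) {
    Mat4 m{};
    m[0] = m[5] = m[10] = m[15] = 1.0f;
    m[12] = x; m[13] = y; m[14] = z;
    return m;
}

// r = a * b in column-major storage (b is applied first).
Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                r[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
    return r;
}

// In the render loop, instead of glMatrixMode/glTranslatef:
//
//   Mat4 view  = makeTranslation(0.0f, 0.0f, -5.0f);  // camera back 5 units
//   Mat4 model = makeTranslation(1.0f, 0.0f,  0.0f);
//   Mat4 modelView = multiply(view, model);
//   GLint loc = glGetUniformLocation(program, "uModelView");
//   glUniformMatrix4fv(loc, 1, GL_FALSE, modelView.data());
//
// and in the vertex shader:
//
//   uniform mat4 uModelView;
//   gl_Position = uProjection * uModelView * vec4(position, 1.0);
```

With glm the matrix-building lines collapse to calls like `glm::translate` and `glm::perspective`, which is why it is the usual recommendation.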
answered Nov 15 '22 by fen