
Why are OpenGL texture params GLint and not GLenum?

Tags:

opengl

Some of the OpenGL texturing functions accept GLints where I would expect a GLenum. For example, glTexImage2D has the following parameter:

GLint internalformat

The docs describe that param as follows:

internalformat

Specifies the internal format of the texture. Must be one of the following symbolic constants: GL_ALPHA, GL_LUMINANCE, GL_LUMINANCE_ALPHA, GL_RGB, GL_RGBA.
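For reference, here is the C prototype of glTexImage2D as given in the reference pages, which shows which parameters are typed GLenum and which are GLint:

    void glTexImage2D(GLenum target, GLint level, GLint internalformat,
                      GLsizei width, GLsizei height, GLint border,
                      GLenum format, GLenum type, const void *data);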

Most of the time, the API uses a GLenum when the value must be one of several symbolic constants, and that makes sense. But this one, like some other texture-related params, is a GLint. Why?

Of course, they're all integers at heart, and in C the distinction hardly matters. But it's not a purely academic question. In other, more strongly-typed languages' OpenGL bindings, the distinction between GLint and GLenum is important, because one is signed and the other isn't. E.g. in the Haskell OpenGLRaw package, all symbolic constants are GLenums, which means you must explicitly convert with fromIntegral every time you call glTexImage2D and similar functions.
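To illustrate the signedness issue, here is a sketch of the typical definitions from a GL header. The exact typedefs vary by platform, but GLenum is unsigned and GLint is signed, which is what forces the explicit conversion in a strongly typed binding:

    typedef unsigned int GLenum;   /* 32-bit unsigned */
    typedef int          GLint;    /* 32-bit signed   */

    #define GL_RGB 0x1907          /* symbolic constants are GLenum values */

    /* In C the mismatch is silent: GL_RGB converts implicitly to GLint.
       A strongly typed binding has to spell out that conversion itself. */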

asked Dec 26 '22 by rlkw1024

1 Answer

Once upon a time... If you go back to the documentation for older OpenGL versions, where the functionality that is now deprecated in the Core Profile is still documented, this will make sense. It used to be legal to pass the values 1, 2, 3, or 4 for internalFormat, denoting the number of color components in the texture.

For example, this is the man page for glTexImage2D in OpenGL 2.1: http://www.opengl.org/sdk/docs/man2/xhtml/glTexImage2D.xml. Under internalFormat, it says:

Specifies the number of color components in the texture. Must be 1, 2, 3, or 4, or one of the following symbolic constants: ...
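A minimal sketch of what that looked like in practice, assuming a valid GL context; width, height, and pixels are placeholders:

    /* Legacy-style call: internalFormat = 3 simply means "3 color components". */
    glTexImage2D(GL_TEXTURE_2D, 0, 3, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);

    /* Equivalent call using the symbolic constant instead of a count. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);

Because a plain count like 3 is a legitimate value for internalFormat, the parameter had to be a GLint rather than a GLenum, and the signature was kept for backward compatibility.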

answered Jan 14 '23 by Reto Koradi