I'm trying to create a texture that I want to interpret as either Alpha, Luminance or Intensity, using OpenGL 4.0. Specifying it as GL_RED works with no problems, e.g.:
glTexImage2D( GL_TEXTURE_2D,
              i,
              GL_RED,
              mipSizeX,
              mipSizeY,
              0,
              GL_RED,
              GL_UNSIGNED_BYTE,
              nullptr );
However, whenever I try to specify it as GL_ALPHA, GL_LUMINANCE or GL_INTENSITY, I get error 1280 (GL_INVALID_ENUM). Are those formats deprecated in GL 4.0, or am I doing something wrong? E.g. this fails:
glTexImage2D( GL_TEXTURE_2D,
              i,
              GL_ALPHA8,
              mipSizeX,
              mipSizeY,
              0,
              GL_ALPHA,
              GL_UNSIGNED_BYTE,
              nullptr );
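(For reference, the 1280 presumably comes from a glGetError() check after the failing call; 1280 is 0x0500, which OpenGL defines as GL_INVALID_ENUM. A minimal check, assuming the same variables as above, might look like:)

glTexImage2D( GL_TEXTURE_2D, i, GL_ALPHA8, mipSizeX, mipSizeY, 0,
              GL_ALPHA, GL_UNSIGNED_BYTE, nullptr );
GLenum err = glGetError();   // returns 0x0500 (1280) == GL_INVALID_ENUM here
if ( err != GL_NO_ERROR )
    fprintf( stderr, "glTexImage2D error: 0x%04X\n", err );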
EDIT: OK, since these formats are deprecated, this page contains tables showing what the valid formats actually are.
Are those formats deprecated with GL 4.0
Yes, they are.
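A common way to get the old behaviour back in a core context is to keep uploading the data as a single red channel and let the texture swizzle (core since OpenGL 3.3) route it into the channels the legacy formats used to fill. A rough sketch, reusing i, mipSizeX and mipSizeY from the question and assuming a texture object is already bound to GL_TEXTURE_2D:

// Store the data in the red channel only.
glTexImage2D( GL_TEXTURE_2D, i, GL_R8, mipSizeX, mipSizeY, 0,
              GL_RED, GL_UNSIGNED_BYTE, nullptr );

// GL_ALPHA-like sampling: (0, 0, 0, R)
GLint alphaSwizzle[4] = { GL_ZERO, GL_ZERO, GL_ZERO, GL_RED };
glTexParameteriv( GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, alphaSwizzle );

// GL_LUMINANCE-like sampling: (R, R, R, 1)
// GLint lumSwizzle[4] = { GL_RED, GL_RED, GL_RED, GL_ONE };

// GL_INTENSITY-like sampling: (R, R, R, R)
// GLint intSwizzle[4] = { GL_RED, GL_RED, GL_RED, GL_RED };

GL_LUMINANCE_ALPHA can be covered the same way with a two-channel GL_RG8 texture and a swizzle of (R, R, R, G).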