OpenGL Setup:
glEnable( GL_BLEND );                                  // enable alpha blending
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );   // standard source-over blending
glEnable( GL_DEPTH_TEST );
glEnable( GL_TEXTURE_2D );
glEnable( GL_CULL_FACE );
glCullFace( GL_BACK );
glClearColor( 0.0, 1.0, 1.0, 1.0 );                    // cyan background
Initializing the texture:
// 16x16 X pattern
uint8_t buffer[ 16*16 ] =
{
255, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255,
0, 255, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 0,
0, 0, 255, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 0, 0,
0, 0, 0, 255, 0, 0, 0, 0, 0, 0, 0, 0, 255, 0, 0, 0,
0, 0, 0, 0, 255, 0, 0, 0, 0, 0, 0, 255, 0, 0, 0, 0,
0, 0, 0, 0, 0, 255, 0, 0, 0, 0, 255, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 255, 0, 0, 255, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 255, 255, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 255, 255, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 255, 0, 0, 255, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 255, 0, 0, 0, 0, 255, 0, 0, 0, 0, 0,
0, 0, 0, 0, 255, 0, 0, 0, 0, 0, 0, 255, 0, 0, 0, 0,
0, 0, 0, 255, 0, 0, 0, 0, 0, 0, 0, 0, 255, 0, 0, 0,
0, 0, 255, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 0, 0,
0, 255, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 0,
255, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255,
};
GLuint texture_id;
glGenTextures( 1, &texture_id );
glBindTexture( GL_TEXTURE_2D, texture_id );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8, 16, 16, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, buffer );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
The textures are drawn as quads using glBegin/glEnd. The color for each vertex is white with full alpha: {r=255,g=255,b=255,a=255}.
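For reference, each quad is drawn roughly like this (the coordinates here are placeholders, not my actual scene):

glColor4ub( 255, 255, 255, 255 );   // white with full alpha
glBegin( GL_QUADS );
    glTexCoord2f( 0.0f, 0.0f ); glVertex2f( -1.0f, -1.0f );
    glTexCoord2f( 1.0f, 0.0f ); glVertex2f(  1.0f, -1.0f );
    glTexCoord2f( 1.0f, 1.0f ); glVertex2f(  1.0f,  1.0f );
    glTexCoord2f( 0.0f, 1.0f ); glVertex2f( -1.0f,  1.0f );
glEnd();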
Here's an example scene. The photo and cheese are both loaded from PNG images. The cheese has transparent holes, which show the photo and background behind it. I expected the X pattern to be transparent too:
Why is the quad black instead of transparent, and how can I fix my code to draw what I was expecting?
This question may be similar, but so far I have been unable to apply its brief answer to my problem.
Update: I think I solved it, thanks to the answers below. I changed...
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8, 16, 16, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, buffer );
to
glTexImage2D( GL_TEXTURE_2D, 0, GL_ALPHA, 16, 16, 0, GL_ALPHA, GL_UNSIGNED_BYTE, buffer );
... which gives the desired result.
@Dietrich Epp's answer was almost correct, except that the format you're looking for is GL_ALPHA:
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA8, 16, 16, 0, GL_ALPHA, GL_UNSIGNED_BYTE, buffer);
Then set the blend function to glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) or glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA), depending on whether your alpha values are premultiplied. Last but not least, set the texture environment to GL_MODULATE:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
Now you can set the color of the 'X' with glColor.
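For example, a small sketch (the green tint is my own arbitrary choice):

glColor4f( 0.0f, 1.0f, 0.0f, 1.0f );   // RGB tints the 'X'; final alpha = vertex alpha * texture alpha
// ...then draw the textured quad exactly as before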
Another approach is to use alpha testing instead of blending.
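A minimal sketch of that route (the 0.5 threshold is my assumption; pick whatever cutoff suits the texture):

glDisable( GL_BLEND );
glEnable( GL_ALPHA_TEST );
glAlphaFunc( GL_GREATER, 0.5f );   // discard fragments whose alpha is <= 0.5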
Because luminance is color, and transparency (alpha) is not color.
If you want transparency, use a different format - GL_LUMINANCE_ALPHA or something similar - and store the transparency in the alpha channel.
Also, this is already explained in the documentation:
GL_LUMINANCE
Each element is a single luminance value. The GL converts it to floating point, then assembles it into an RGBA element by replicating the luminance value three times for red, green, and blue and attaching 1 for alpha. Each component is then multiplied by the signed scale factor GL_c_SCALE, added to the signed bias GL_c_BIAS, and clamped to the range [0,1] (see glPixelTransfer).
--edit--
Any way to keep the 8 bits per pixel and achieve the same effect?
I think you could "tell" OpenGL that the initial image uses indexed color and then set up a proper RGBA palette (where for every element R == G == B == A == index). See glPixelTransfer and glPixelMap for details.
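A minimal sketch of that idea (my own, untested; it reuses the 16x16 buffer from the question and the legacy pixel-transfer path):

// Identity ramp: entry i maps to i/255 in every channel,
// so R == G == B == A == index for each palette element.
GLfloat ramp[ 256 ];
for ( int i = 0; i < 256; ++i )
    ramp[ i ] = i / 255.0f;

glPixelMapfv( GL_PIXEL_MAP_I_TO_R, 256, ramp );
glPixelMapfv( GL_PIXEL_MAP_I_TO_G, 256, ramp );
glPixelMapfv( GL_PIXEL_MAP_I_TO_B, 256, ramp );
glPixelMapfv( GL_PIXEL_MAP_I_TO_A, 256, ramp );

// Upload the 8-bit buffer as color-index data; the GL expands each
// index through the palette during the transfer.
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8, 16, 16, 0,
              GL_COLOR_INDEX, GL_UNSIGNED_BYTE, buffer );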