I subclass QGLWidget and have my painting code in paintEvent instead of paintGL, because I want to paint a 2D overlay with QPainter over the 3D scene I draw with OpenGL.
My depth buffering works fine when I don't have an overlay. If the overlay is painted, my depth buffer goes AWOL: I can see stuff that should be hidden by objects in front.
initializeGL looks like this:
qglClearColor(Qt::black);
glShadeModel(GL_FLAT);
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
The structure of my paintEvent code is as follows:
makeCurrent();
...openGLStuff...
if (I need my overlay)
{
glPushAttrib(GL_ALL_ATTRIB_BITS);
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
QPainter painter(this);
... do QPainter stuff ...
painter.end(); // make sure QPainter has flushed before touching GL state again
// restore the state saved above, in reverse order
glMatrixMode(GL_MODELVIEW);
glPopMatrix();
glMatrixMode(GL_PROJECTION);
glPopMatrix();
glPopAttrib();
}
swapBuffers();
Depending on the if, the same scene looks alright (overlay off) or wrong (overlay on). Apart from the weird depth buffer problem, it works perfectly well.
My (wild) guess is that construction of the QPainter disables depth buffering. Any hint would be greatly appreciated. I suppose a fallback solution would be to render my overlay into a texture and have OpenGL blend it in.
Why don't you just enable and disable depth testing as needed? You don't "initialize" OpenGL; it's a state machine. Those "initializing" statements belong in your drawing code, in the context where they are needed.
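For example, something along these lines (a minimal sketch; MyGLWidget, overlayEnabled, drawScene() and drawOverlay() are placeholder names, and exactly which state QPainter's GL paint engine touches varies between Qt versions):

void MyGLWidget::paintEvent(QPaintEvent *)
{
    makeCurrent();

    // Re-assert every frame the state the 3D pass depends on;
    // QPainter's GL engine may have changed it last frame.
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);
    glShadeModel(GL_FLAT);

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene();                  // the 3D OpenGL drawing

    if (overlayEnabled) {
        glDisable(GL_DEPTH_TEST); // the 2D overlay ignores depth
        QPainter painter(this);
        drawOverlay(&painter);    // the QPainter drawing
    }                             // painter flushed on destruction

    swapBuffers();
}

With the state set per frame, it no longer matters what QPainter leaves behind.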