I've got a program that uses OpenGL for rendering. It depends on having a core profile, but even when no core profile is available I still want to create an OpenGL context, if only to be able to query the driver vendor and version when reporting the error that no suitable profile exists. For that reason I always attempt initialization with whatever profile is available, and during initialization I check that the profile meets my prerequisites: the OpenGL version must be at least 3.0, and GL_ARB_compatibility must be absent.
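Roughly, the prerequisite check amounts to something like this (simplified sketch; the helper name and the plain glGetString(GL_EXTENSIONS) lookup are only illustrative, the real code is more involved):

    /* Simplified sketch of the prerequisite check, assuming a context is
     * already current. The helper name is illustrative only. */
    #include <GL/gl.h>
    #include <stdio.h>
    #include <string.h>

    static int profile_is_suitable(void)
    {
        const char *version    = (const char *)glGetString(GL_VERSION);
        const char *extensions = (const char *)glGetString(GL_EXTENSIONS);
        int major = 0, minor = 0;

        if (!version || sscanf(version, "%d.%d", &major, &minor) != 2)
            return 0;                     /* cannot even parse the version */

        if (major < 3)
            return 0;                     /* need at least OpenGL 3.0 */

        /* GL_ARB_compatibility must be absent. Note: on a 3.1+ core profile
         * glGetString(GL_EXTENSIONS) is itself invalid and glGetStringi
         * should be used instead; the substring test is kept for brevity. */
        if (extensions && strstr(extensions, "GL_ARB_compatibility"))
            return 0;

        return 1;
    }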
However, I'm getting error reports from users whose drivers appear to violate those assumptions. Here's an example:
    ATI Technologies Inc.
    4.2.11411 Core Profile Context
    AMD Radeon HD 7610M
    GL_ARB_compatibility [...]

As can be seen, glGetString(GL_VERSION) says it is a "Core Profile Context", but GL_EXTENSIONS contains GL_ARB_compatibility.
How should I interpret this? Is this a valid configuration and, if so, what does it mean for functionality that differs between core and compatibility profiles? Or should it be considered a driver bug? Is there a better way to check whether a given OpenGL context is core or compatibility?
You get a core profile by asking for it when you create the context. If you ask for a core profile, either you get a core profile or context creation fails.
There's nothing to interpret: if you asked for a core profile, and you created a context, you got a core profile. You don't need to ask if a profile is core, because you should already know, since you asked to make one.
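For example, with GLFW (just one way to create a context; the equivalent WGL/GLX/EGL attributes exist if you create it by hand), explicitly asking for a core profile looks roughly like this:

    /* Illustration only: requesting a core profile with GLFW. If the driver
     * cannot provide one, creation fails, and you can fall back to a default
     * context just to read vendor/version for the error report. */
    #include <GLFW/glfw3.h>
    #include <stdio.h>

    int main(void)
    {
        if (!glfwInit())
            return 1;

        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

        GLFWwindow *window = glfwCreateWindow(640, 480, "core", NULL, NULL);
        if (!window) {
            /* No core profile available: create whatever the driver offers,
             * solely to query strings for the error report. */
            glfwDefaultWindowHints();
            window = glfwCreateWindow(640, 480, "fallback", NULL, NULL);
        }
        if (!window) {
            glfwTerminate();
            return 1;
        }

        glfwMakeContextCurrent(window);
        printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));

        glfwDestroyWindow(window);
        glfwTerminate();
        return 0;
    }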
If you asked for a core profile, then you shouldn't care whether "GL_ARB_compatibility" is available or not; you shouldn't even be looking at it, and you definitely should not be querying functions based on it.
But if you really want to check whether a context is core or compatibility, you can query glGetIntegerv(GL_CONTEXT_PROFILE_MASK, ...). It returns a bitmask that contains either GL_CONTEXT_CORE_PROFILE_BIT or GL_CONTEXT_COMPATIBILITY_PROFILE_BIT.
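A minimal sketch of that query, assuming a 3.2+ context is current (the GLEW header here is only one way to get the 3.2 enum definitions; any loader header works):

    #include <GL/glew.h>   /* assumption: supplies the GL 3.2 enums */
    #include <stdio.h>

    /* Report which profile the current context implements. */
    static void print_profile(void)
    {
        GLint mask = 0;
        glGetIntegerv(GL_CONTEXT_PROFILE_MASK, &mask);

        if (mask & GL_CONTEXT_CORE_PROFILE_BIT)
            printf("core profile context\n");
        else if (mask & GL_CONTEXT_COMPATIBILITY_PROFILE_BIT)
            printf("compatibility profile context\n");
        else
            printf("profile mask not reported (pre-3.2 context?)\n");
    }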
As can be seen, glGetString(GL_VERSION) says it is a "Core Profile Context", but GL_EXTENSIONS contains GL_ARB_compatibility. How should I interpret this?
Let me quote Appendix E.1 of the OpenGL 3.2 core profile specification (emphasis mine):
OpenGL 3.2 is the first version of OpenGL to define multiple profiles. The core profile builds on OpenGL 3.1 by adding features described in section H.1. The compatibility profile builds on the combination of OpenGL 3.1 with the special GL_ARB_compatibility extension defined together with OpenGL 3.1, adding the same new features and in some cases extending their definition to interact with existing features of OpenGL 3.1 only found in GL_ARB_compatibility. It is not possible to implement both core and compatibility profiles in a single GL context, since the core profile mandates functional restrictions not present in the compatibility profile. [...]
So, compatibility profile is explicitly defined in terms of relying on GL_ARB_compatibility, and I'd argue that an implementation advertising that extension in a core profile would be in violation of that section of the spec.