How can I ensure that GLSL shaders are compatible with most modern cards?
I've got a piece of software where I use GLSL code from here. Even though I've added #version 120 to the beginning of my final shader and made sure it compiles, some users get shader compilation errors on their machines (even though their hardware supports OpenGL 3.2).
Is there any tool I can use to "validate" my shaders, or to try compiling them with different vendors' shader compilers?
The OpenGL Shading Language (GLSL) is the principal shading language for OpenGL.
GLSL is a high-level shading language whose syntax is based on C, so you can write shaders in a familiar C/C++ style. This makes shader development much easier!
GLSL also allows user-defined varying variables. These must be declared identically in both the vertex and fragment shaders, for instance: varying float intensity;. A varying variable must be written in the vertex shader, where we compute its value for each vertex.
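As a concrete illustration, here is a minimal vertex/fragment pair in the #version 120 style; the specific intensity computation is just a made-up example:

```glsl
// Vertex shader (#version 120): writes the varying
#version 120
varying float intensity;   // must be declared identically in both shaders

void main()
{
    // hypothetical per-vertex computation: use the normal's z component
    intensity = gl_Normal.z;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```

```glsl
// Fragment shader (#version 120): reads the interpolated varying
#version 120
varying float intensity;

void main()
{
    gl_FragColor = vec4(vec3(intensity), 1.0);
}
```

The varying's value is interpolated across each primitive before it reaches the fragment shader.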
Ins and outs: GLSL defines the in and out keywords specifically for this purpose. Each shader stage specifies its inputs and outputs using those keywords, and wherever an output variable matches an input variable of the next shader stage, the value is passed along. The vertex and fragment shaders differ a bit, though.
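In newer GLSL versions (#version 130 and up), the same pairing looks like this; the variable names and the intensity computation are illustrative:

```glsl
// Vertex shader (#version 130): 'out' replaces 'varying'
#version 130
in vec3 position;      // vertex attribute
out float intensity;   // matched by name with the fragment shader's 'in'

void main()
{
    intensity = clamp(position.z, 0.0, 1.0);  // hypothetical computation
    gl_Position = vec4(position, 1.0);
}
```

```glsl
// Fragment shader (#version 130): 'in' replaces 'varying'
#version 130
in float intensity;    // receives the interpolated value
out vec4 fragColor;    // user-defined output replaces gl_FragColor

void main()
{
    fragColor = vec4(vec3(intensity), 1.0);
}
```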
There is no tool for validating a shader. Even if there was, it wouldn't be useful to you, because what good is a shader that is "valid" if it doesn't run on the hardware you want? You can be right all you want, but if your hardware rejects it even though you're technically right, your shader still won't run.
If a shader of a particular version compiles on one target (call it A) and not on another (call it B), this could be due to one of the following problems:

1. The compiler for A is too permissive: it accepts code that the GLSL specification actually forbids.
2. The compiler for B has a bug: it rejects code that the specification allows.
3. A and B are being given different kinds of OpenGL context (core vs. compatibility), so different language rules apply.
#1 is more likely to happen if you develop solely on NVIDIA hardware. NVIDIA plays a bit fast and loose with the OpenGL specification: they take a few liberties here and there, smoothing out some of the specification's unpleasant requirements. It makes for a smoother developer experience, but it also keeps developers tied to NVIDIA hardware if their shaders don't run on competitors' cards ;)
#3 is pretty much non-existent, with the noted exception. You linked to a Photoshop shader, so I gather that you are not in control of the creation and management of the OpenGL context. Even so, I highly doubt Photoshop uses a core context; it has too many shaders that need backwards compatibility.
The best way to deal with this is to test on both AMD and NVIDIA hardware (and Intel if you need to support it). You don't need to test every possible combination of systems; pick a Radeon HD card and a GeForce 200-series or better. They don't even have to be high-end.