 

Ensure GLSL compatibility

How can I ensure that my GLSL shaders are compatible with most modern cards?

I've got an application in which I use GLSL code from here. But even though I've added #version 120 to the beginning of my final shader and made sure it compiles, some users get shader compilation errors (even though their hardware supports OpenGL 3.2).
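For reference, the version directive is prepended to the composed source roughly like this (a minimal sketch with illustrative names; the real composition code isn't shown here):

    // Minimal sketch of how the final shader source is assembled (illustrative names only).
    // The #version directive must appear before anything other than comments and whitespace.
    #include <string>

    std::string buildShaderSource(const std::string& shaderBody)
    {
        // shaderBody stands in for the GLSL code pulled in from the external source.
        return "#version 120\n" + shaderBody;
    }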

Is there any tool I can use to "validate" my shaders, or to try compiling them with different "shader compilers"?

ronag, asked Aug 11 '11 at 06:08




1 Answer

There is no tool for validating a shader. Even if there were, it wouldn't be much use to you: what good is a shader that is "valid" if it doesn't run on the hardware you want to target? You can be right all you want, but if the driver rejects your shader even though you're technically correct, it still won't run.
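In practice, the closest thing you have to a validator is each driver's own compiler, and you can at least capture its error log when a user's machine rejects a shader. A minimal sketch, assuming an OpenGL context is already current (the helper name and GLEW-based setup are illustrative, not from the original post):

    // Minimal sketch: compile a shader and read back the driver's own error log.
    // Assumes a current OpenGL context; on Windows these GL 2.0+ entry points
    // typically come from a loader such as GLEW rather than <GL/gl.h> alone.
    #include <GL/glew.h>
    #include <string>
    #include <vector>

    // Returns the driver's info log; an empty string means the driver accepted the shader.
    std::string compileShader(GLenum type, const char* source, GLuint* outShader)
    {
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &source, nullptr);
        glCompileShader(shader);

        GLint status = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
        if (status != GL_TRUE)
        {
            GLint logLength = 0;
            glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &logLength);
            std::vector<char> log(logLength > 0 ? logLength : 1, '\0');
            glGetShaderInfoLog(shader, static_cast<GLsizei>(log.size()), nullptr, log.data());
            glDeleteShader(shader);
            *outShader = 0;
            return std::string(log.data()); // wording is vendor-specific, but it tells you why
        }

        *outShader = shader;
        return std::string();
    }

Collecting that log from users' machines won't make a shader portable by itself, but it tells you which of the following cases you are dealing with.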

If a shader of a particular version compiles on one target (call it A) and not on another (call it B), this could be due to one of the following problems:

  1. Target A does not properly implement GLSL. It allowed you to compile something that the spec does not allow. The more compliant target B (or at least, differently non-compliant) rejects your shader, because your shader does not follow the specification.
  2. Target B is non-compliant with the specification. You are feeding it a legitimate shader and it is rejecting it.
  3. Target B does not support the version of GLSL your shader uses (this is unlikely), except when:
  4. Target B is using the OpenGL core specification, version 3.2 or greater. GLSL 1.20 shaders cannot be run on core 3.2 OpenGL implementations.

#1 is more likely to happen if you develop solely on NVIDIA hardware. NVIDIA plays a bit fast and loose with the OpenGL specification. They take a few liberties here and there, smoothing out some of the unpleasant things the specification says. It makes for a smoother developer experience, but it also helps keep developers on NVIDIA hardware when their shaders don't run on competitors' cards ;)

#3 is pretty much non-existent, with the noted exception. You linked to a Photoshop shader, so I gather that you are not in control of the creation and management of the OpenGL context. Even so, I highly doubt Photoshop would use a core context; they have too many shaders that need backwards compatibility.
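If you do want your software to report why a particular machine fails, you can at least log what the context it ends up with actually exposes; a core profile or an old GLSL version shows up immediately. A minimal sketch, assuming a context is current and that your headers or loader define these enums (the function name is illustrative):

    // Minimal sketch: log what the current OpenGL context actually exposes.
    // GL_SHADING_LANGUAGE_VERSION and GL_CONTEXT_PROFILE_MASK need post-1.1
    // headers or a loader such as GLEW.
    #include <GL/glew.h>
    #include <cstdio>

    void logContextInfo()
    {
        std::printf("GL_RENDERER: %s\n", reinterpret_cast<const char*>(glGetString(GL_RENDERER)));
        std::printf("GL_VERSION:  %s\n", reinterpret_cast<const char*>(glGetString(GL_VERSION)));
        std::printf("GLSL:        %s\n", reinterpret_cast<const char*>(glGetString(GL_SHADING_LANGUAGE_VERSION)));

        // Only meaningful on 3.2+ contexts; a core profile will reject #version 120 shaders.
        GLint mask = 0;
        glGetIntegerv(GL_CONTEXT_PROFILE_MASK, &mask);
        if (mask & GL_CONTEXT_CORE_PROFILE_BIT)
            std::printf("Core profile context: GLSL 1.20 shaders will not compile here.\n");
    }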

The best way to deal with this is to test on both AMD and NVIDIA hardware (and Intel, if you need to run there). You don't need to test every possible combination of systems; just pick a Radeon HD card and a GeForce 200-series or better. They don't even have to be high-end.

Nicol Bolas, answered Oct 04 '22 at 19:10