 

OpenGL 3.+ glsl compatibility mess?

Tags:

opengl

glsl

So, I have googled a lot of OpenGL 3.+ tutorials, all of which use shaders (GLSL 330 core). However, I do not have a graphics card that supports these newer GLSL versions; I may have to update my driver, but even then I am not sure whether my card is intrinsically able to support them.

Currently my OpenGL version is 3.1, and on Windows with C++ I created a modern context with backwards compatibility. My GLSL version is 1.30 via the NVIDIA Cg compiler (full definition), and GLSL 1.30 corresponds to #version 130.

The problem is: #version 130 still exposes the legacy OpenGL pipeline, because it contains built-in uniforms like gl_ModelViewMatrix, gl_ProjectionMatrix, etc. So how am I supposed to use those when I am using core functions in my client app (OpenGL 3+)?

This is really confusing; could you give me concrete examples?

Furthermore, I want my app to run on most OpenGL implementations, so could you tell me where the border between legacy GLSL and modern GLSL lies? Is GLSL 330 the modern GLSL, and is OpenGL 3.+ compatible with older GLSL versions?

asked Nov 23 '13 by user209347

People also ask

Is OpenGL the same as GLSL?

The short version is: OpenGL is an API for rendering graphics, while GLSL (which stands for GL shading language) is a language that gives programmers the ability to modify pipeline shaders. To put it another way, GLSL is a (small) part of the overall OpenGL framework.

Does OpenGL support HLSL?

Not directly: OpenGL uses GLSL, while HLSL targets Direct3D. (The GLSL referred to here is compatible with OpenGL ES 2.0; the HLSL is compatible with Direct3D 11.)

Does OpenGL use GPU?

OpenGL (Open Graphics Library) is a cross-language, cross-platform application programming interface (API) for rendering 2D and 3D vector graphics. The API is typically used to interact with a graphics processing unit (GPU), to achieve hardware-accelerated rendering.

What is GLSL computer graphics?

Shaders are written in GLSL (OpenGL Shading Language), a dedicated shading language with syntax similar to C. GLSL is executed directly by the graphics pipeline. There are several kinds of shaders, but two are commonly used to create graphics on the web: vertex shaders and fragment (pixel) shaders.


1 Answer

I would say OpenGL 3.1 is modern OpenGL.

Any hardware that supports OpenGL 3.1 is capable of supporting OpenGL 3.3; whether the driver actually exposes it is another matter. Updating your graphics driver will probably bump you up to OpenGL 3.3.
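As a quick sanity check you can ask the driver at runtime what it actually gives you. A minimal sketch (it assumes a current OpenGL 3.0+ context and that the GL function pointers are already loaded, e.g. via GLEW or glad):

#include <cstdio>

void printGLVersion() {
    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);  // these queries exist since OpenGL 3.0
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    std::printf("OpenGL %d.%d, GLSL %s\n", major, minor,
                reinterpret_cast<const char*>(glGetString(GL_SHADING_LANGUAGE_VERSION)));
}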

Just to clear this up: OpenGL 3.1 is not legacy OpenGL.

Legacy OpenGL would be:

glMatrixMode(GL_MODELVIEW);       // select the fixed-function model-view matrix stack
glLoadIdentity();                 // reset it to the identity
glRotatef(90.0, 0.0, 1.0, 0.0);   // rotate 90 degrees around the y axis
glTranslatef(0.0, 0.0, -5.0);     // then move 5 units back

OpenGL 3.1 with a compatibility context still supports this, but that doesn't mean it should be used. If you are developing for OpenGL 3 capable hardware, you should most definitely not be using it. You can disable the legacy functionality entirely by requesting a core context.
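For example, here is a minimal sketch of requesting a core context (this assumes GLFW 3; other windowing libraries have equivalent flags):

#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return -1;

    // Ask for a 3.3 core profile: the legacy fixed-function entry
    // points are simply not exposed in a core context.
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow* window = glfwCreateWindow(640, 480, "Core context", nullptr, nullptr);
    if (!window) { glfwTerminate(); return -1; }  // fails if 3.3 core is unavailable

    glfwMakeContextCurrent(window);
    // ... render loop ...
    glfwTerminate();
    return 0;
}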

If you are using shaders, then you have already moved away from the legacy fixed-function pipeline. So GLSL 130 is not legacy :P.
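To address the built-in matrix concern from the question: in #version 130 the legacy built-ins (gl_ModelViewMatrix and friends) are still visible in a compatibility context, but nothing forces you to use them. Declare your own uniforms and upload them with glUniformMatrix4fv instead. A minimal vertex shader sketch (the names MVP and vertexPosition are just illustrative):

#version 130

uniform mat4 MVP;        // your own matrix, set by the application each frame
in vec3 vertexPosition;  // bound with glBindAttribLocation before linking

void main() {
    gl_Position = MVP * vec4(vertexPosition, 1.0);
}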

Working on my Linux laptop with Intel integrated graphics, where the latest stable drivers only reach OpenGL 3.1 (yes, the OpenGL 3.3 commits are in place, but I'm waiting for Mesa 10 ;) ), I have been able to get the OpenGL 3.3 tutorials running on my machine without much effort and without touching legacy OpenGL.

One of the wonderful things about OpenGL is that you can extend its functionality with OpenGL extensions. Even if your hardware isn't capable of handling OpenGL 4.4, with updated drivers you can still use those extensions that don't require OpenGL 4 hardware!

See https://developer.nvidia.com/opengl-driver and http://developer.amd.com/resources/documentation-articles/opengl-zone/ for info on which features are added to older hardware; if you are uncertain, all you have to do is test it on your hardware.
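Testing for an extension at runtime is straightforward. A minimal sketch using the indexed query that exists since OpenGL 3.0 (again assuming a current context and loaded function pointers):

#include <cstring>

bool hasExtension(const char* name) {
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const char* ext = reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && std::strcmp(ext, name) == 0) return true;
    }
    return false;
}

// e.g. if (hasExtension("GL_ARB_instanced_arrays")) { ... }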

And I'll finish off by saying that legacy OpenGL also has its place.

In my opinion, legacy OpenGL might be easier to learn than modern OpenGL, since you don't need knowledge of shaders and OpenGL buffers to draw your first triangle, but I don't think you should be using it in a modern production application.

If you need to support old hardware, you might need to use an older OpenGL version. But even modern CPUs' integrated graphics support OpenGL 3, so I would not worry about this too much.

Converting from OpenGL 3.3 to OpenGL 3.0

I tested this on the tutorials from http://www.opengl-tutorial.org/. I cannot post the converted code, as most of it is taken as-is from the tutorials and I don't have permission to reproduce it here.

The author talked about OpenGL 3.1, but since he is capped at GLSL 130 (OpenGL 3.0), I am converting to 3.0.

  1. First of all, change the context version to OpenGL 3.0 (just change the minor version to 0 if you're working from the tutorials). Also, don't set it to use a core context if you're using OpenGL 3.0, since as far as I know ARB_compatibility is only available from OpenGL 3.1.

  2. Change the shader version to

    #version 130
    
  3. Remove all layout bindings in the shaders, changing

     layout(location = #) in vec2 #myVarName;
    

    to

     in vec2 #myVarName;
    
  4. Use glBindAttribLocation to bind the in variables to the locations the layout qualifiers specified (see 3). It must be called before linking the program; a combined sketch follows this list.

    e.g.

    glBindAttribLocation(#myProgramName, #, "#myVarName");
    
  5. Use glBindFragDataLocation to bind the out variables to the locations the layout qualifiers specified (see 3). This too must happen before linking.

    e.g.

    glBindFragDataLocation(#myProgramName, #, "#myVarName");
    
  6. glFramebufferTexture doesn't work in OpenGL 3.0 (it is used for shadow mapping, deferred rendering, etc.). Instead you need to use glFramebufferTexture2D, which has an extra parameter; the documentation is sufficient (a one-line example follows this list).
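Here is a minimal sketch of steps 4 and 5 together (the names program, vertexShader, fragmentShader, vertexPosition and fragColor are illustrative, not from the tutorials):

GLuint program = glCreateProgram();
glAttachShader(program, vertexShader);
glAttachShader(program, fragmentShader);

// These replace the layout(location = N) qualifiers. They take effect
// at link time, so they must come before glLinkProgram.
glBindAttribLocation(program, 0, "vertexPosition");   // step 4: vertex inputs
glBindFragDataLocation(program, 0, "fragColor");      // step 5: fragment outputs

glLinkProgram(program);

And for step 6, the OpenGL 3.0 compatible call just adds the texture-target parameter:

// glFramebufferTexture(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, tex, 0) becomes:
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, tex, 0);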

Here is a screenshot of tutorial16 (I thought this one covered the most areas, and I used it as a test to see whether the above is all that's needed):

tutorial16 with OpenGL 3.0

There is a mistake in the source of tutorial16 (at the time of writing). The FBO is set to have no color output, but the fragment shader still outputs a color value, causing a segfault (trying to write to nothing usually does that). Simply changing the depth fragment shader to output nothing fixes it. (It doesn't produce a segfault on more tolerant drivers, but that's not something you should bargain on.)
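A sketch of what the fixed depth-only fragment shader can look like (assuming the pass only needs the depth buffer):

#version 130

// No out variable and nothing written: the depth value is produced by
// the rasterizer and written to the depth attachment automatically.
void main() {
}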

answered Oct 20 '22 by Xonar