
OpenGL extensions, how to use them correctly in C and glsl

Tags: c, opengl, glsl, glew

I am working on a game engine and it has evolved greatly. Because the engine also needs to work on Mac, I am still using OpenGL 3.2 and GLSL 1.2 :-(. I use GLEW, which I assumed would solve the extension issues for me.

EDIT: Meanwhile, part of the question has been answered - read on at the second EDIT below.

I have always been able to make shaders work on both Windows and Mac; sometimes I had to add a line to the GLSL code like #extension GL_EXT_gpu_shader4 : enable to make it work on Mac. The Windows version then gives a warning when compiling the same shader, but it still works.

But ever since I started using geometry shaders, my nightmare has begun. For some reason, the Mac expects me to use #extension GL_EXT_gpu_shader4 : enable while Windows expects #extension GL_EXT_geometry_shader4 : enable. This makes it harder to write a platform-independent shader.
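One way around that difference (a sketch, not from the original question) relies on the GLSL rule that every supported extension also defines a same-named preprocessor macro set to 1, so each #extension directive can be guarded:

#version 120
// Enable whichever geometry-shader extension this driver advertises.
#ifdef GL_EXT_geometry_shader4
#extension GL_EXT_geometry_shader4 : enable
#endif
#ifdef GL_EXT_gpu_shader4
#extension GL_EXT_gpu_shader4 : enable
#endif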

Also, and this is even more annoying: the Windows version wants me to use glTransformFeedbackVaryings, and I would have thought that GLEW would make it available on the Mac too, but there I explicitly need to use glTransformFeedbackVaryingsEXT, which in turn does not work on the Windows version. So I need an #ifdef for Apple to pick whichever is needed. Same problem with glBeginTransformFeedback and glBeginTransformFeedbackEXT.
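For illustration, a minimal sketch of such an #ifdef shim (my own aliases, assuming the EXT entry points come from GL_EXT_transform_feedback) could look like this, so the rest of the engine can call the core names unconditionally:

/* Hypothetical compatibility shim: on Apple, alias the core names to
 * the EXT entry points from GL_EXT_transform_feedback. */
#ifdef __APPLE__
#define glTransformFeedbackVaryings glTransformFeedbackVaryingsEXT
#define glBeginTransformFeedback    glBeginTransformFeedbackEXT
#define glEndTransformFeedback      glEndTransformFeedbackEXT
#endif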

But both accept glProgramParameteriEXT, so there I don't need the distinction...

I understand that it is only with transform feedback that I am having the problem, but... what is this all about?

I thought I understood how OpenGL extensions worked, but I am starting to lose that understanding.

It's getting to the point where I think that when I run the code on another Windows, Mac or Linux system, I will run into new problems because of different graphics adapters or something.

Can anyone help me understand this? I have read the OpenGL manual about extensions, but apparently I am missing something obvious. How can I make sure that my code will always work?

EDIT: By removing GLEW entirely from the Mac version and fully using the Mac's own OpenGL 3.2 implementation, all the names are the normal core names and I can remove #extension GL_EXT_gpu_shader4 : enable from my shaders entirely. The only thing that worries me now is that for geometry shaders I need the function glProgramParameteri, which doesn't seem to exist, while glProgramParameteriEXT does. For some reason, that function also works on Windows. Why? Can I be certain that it will work on all systems/hardware?
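One way to gain that certainty (a sketch, not from the original question; the helper has_extension is my own naming, while the GL calls are the standard ones for a 3.x context, where glGetString(GL_EXTENSIONS) is no longer available) is to check at run time whether an extension is actually advertised before relying on its entry points:

#include <string.h>
/* plus your platform's OpenGL 3.x headers or loader */

static int has_extension(const char *name) {
    GLint i, n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (i = 0; i < n; ++i) {
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
        if (ext != NULL && strcmp(ext, name) == 0)
            return 1;
    }
    return 0;
}

/* Usage: only take the EXT path when the driver really offers it. */
if (!has_extension("GL_EXT_geometry_shader4")) {
    /* fall back to the core GLSL 150 path */
}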

asked Feb 27 '13 by scippie


2 Answers

EXT means that you are using the extension version of the geometry shader functionality. That should work on Windows as well. Geometry shaders are core functionality in OpenGL 3.2, but right now you are using the extension instead (basically, you are using functionality from older OpenGL versions, not actual 3.2). To use the core geometry shader you have to switch to GLSL 1.50.

I am not that well versed in OpenGL, but my assumption is that glProgramParameteri() is simply no longer necessary once you use the core functionality, which is why it is missing from the library headers.
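For reference, a minimal core-profile pass-through geometry shader might look like this (a sketch; with GLSL 150 the primitive types and vertex count live in the shader itself, so no #extension directive and no program parameter call are needed):

#version 150
// Pass-through geometry shader: re-emit each incoming triangle as-is.
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

void main() {
    for (int i = 0; i < gl_in.length(); ++i) {
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}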

answered by littleimp


Your life will be easier if you stick to matching OpenGL/GLSL versions.

http://en.wikipedia.org/wiki/OpenGL_Shading_Language

Pick from:

  • OpenGL 2.1, with GLSL 120 + GL_EXT_gpu_shader4 + GL_EXT_geometry_shader4
  • OpenGL 3.2, with GLSL 150
  • OpenGL 3.3, with GLSL 330

If you are already happy using OpenGL 3.2, then step up to GLSL 150, as it's closer to modern GLSL 330+. If you want the most compatibility, use OpenGL 2.1 and GLSL 120 plus the geometry shader extensions (though I'm not sure how much more compatibility that actually buys you). Whichever pair you pick, request exactly that context at startup, as sketched below.
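A sketch of requesting such a matching context, assuming GLFW 3 (not part of the original answer; any windowing API with equivalent hints works the same way). On the Mac, anything newer than 2.1 must be a core, forward-compatible context:

#include <GLFW/glfw3.h>

/* Call after glfwInit(): ask for exactly the version/profile pair
 * that the shaders are written for. */
GLFWwindow *create_gl32_window(void) {
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); /* required on Mac */
    return glfwCreateWindow(800, 600, "engine", NULL, NULL);
}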

ONLY when using GLSL 120 plus the extensions do you need glProgramParameteriEXT() to specify the geometry shader's input/output primitive types and maximum vertex count. For example:

/* With GL_EXT_geometry_shader4, these parameters only take effect at
 * link time, so set them before linking the program object. */
glProgramParameteriEXT(prog, GL_GEOMETRY_INPUT_TYPE_EXT, GL_TRIANGLES);
glProgramParameteriEXT(prog, GL_GEOMETRY_OUTPUT_TYPE_EXT, GL_TRIANGLE_STRIP);
glProgramParameteriEXT(prog, GL_GEOMETRY_VERTICES_OUT_EXT, max_output_vertices);
glLinkProgram(prog);

https://wiki.engr.illinois.edu/display/graphics/Geometry+Shader+Hello+World
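The matching shader side for that path might look like this (a sketch; gl_VerticesIn and gl_PositionIn[] are the built-ins added by GL_EXT_geometry_shader4):

#version 120
#extension GL_EXT_geometry_shader4 : enable
// Pass-through geometry shader for the GLSL 120 + extension path;
// the in/out primitive types come from glProgramParameteriEXT,
// not from the shader text.
void main() {
    for (int i = 0; i < gl_VerticesIn; ++i) {
        gl_Position = gl_PositionIn[i];
        EmitVertex();
    }
    EndPrimitive();
}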

In OpenGL 3.2 with GLSL 150+, you specify this information in the shader itself, using declarations like:

#version 150
layout (triangles) in;
layout (triangle_strip, max_vertices = 3) out;

http://www.lighthouse3d.com/tutorials/glsl-core-tutorial/geometry-shader/

answered by David Jeske