
How to use unsigned short in an OpenGL shader?

I'm trying to upload a texture of unsigned shorts for use in a shader, but it's not working.

I have tried the following:

glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, vbt[1]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 640, 480, 0, GL_RED, GL_UNSIGNED_SHORT, kinect_depth);
glUniform1i(ptexture1, 1);
GLenum ErrorCheckValue = glGetError();

I know I'm binding the texture correctly because I get some results by using

glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, vbt[1]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 640, 480, 0,
    GL_RG, GL_UNSIGNED_BYTE, kinect_depth);
glUniform1i(ptexture1, 1);
GLenum ErrorCheckValue = glGetError();

In particular, I get part of my values in the red channel. I would like to upload the texture as an unsigned byte or as a float, but I can't get the glTexImage2D call right. Also, is it possible to do something similar using a depth texture? I would like to do some operations on the depth information I get from a Kinect and display it.

asked May 01 '14 by eaponte


1 Answer

Your arguments to glTexImage2D are inconsistent. The 3rd argument (GL_RGB) suggests that you want a three-component texture, while the 7th (GL_RED) suggests a one-component texture. Your second attempt then uses GL_RG, which suggests two components.

You need to use an internal texture format that stores unsigned shorts, like GL_RGB16UI.

If you want one component, your call would look like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, 640, 480, 0, GL_RED_INTEGER, GL_UNSIGNED_SHORT, kinect_depth);

If you want three components:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16UI, 640, 480, 0, GL_RGB_INTEGER, GL_UNSIGNED_SHORT, kinect_depth);
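
For completeness, here is a sketch of the surrounding setup for the one-component case, reusing the variable names from the question (vbt, ptexture1, kinect_depth). Note that integer texture formats do not support linear filtering, so the filter modes must be set to GL_NEAREST or the texture will be incomplete when sampled:

glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, vbt[1]);
// integer formats cannot be filtered linearly; GL_NEAREST is required
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, 640, 480, 0,
    GL_RED_INTEGER, GL_UNSIGNED_SHORT, kinect_depth);
glUniform1i(ptexture1, 1);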

You also need to make sure that the types used in your shader for sampling the texture match the type of the data stored in the texture. In this example, since you use a 2D texture containing unsigned integer values, the sampler type should be usampler2D, and you want to store the result of the sampling operation (the return value of the texture() call in the shader) in a variable of type uvec4. (Paragraph added based on a suggestion by Andon.)
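
As an illustration, a minimal GLSL fragment shader sampling such a texture could look like the following sketch (the uniform name u_depth, the input v_texcoord, and the scaling by 65535 for display are assumptions, not taken from the question):

#version 330 core

uniform usampler2D u_depth;   // integer sampler, matching GL_R16UI
in vec2 v_texcoord;
out vec4 frag_color;

void main()
{
    // texture() on a usampler2D returns a uvec4; the value is in .r
    uint depth = texture(u_depth, v_texcoord).r;

    // map the raw 16-bit value to [0, 1] for display (assumed range)
    float gray = float(depth) / 65535.0;
    frag_color = vec4(gray, gray, gray, 1.0);
}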

Some more background on the format/type arguments of glTexImage2D, since this is a source of fairly frequent misunderstandings:

The 3rd argument (internalFormat) is the format in which your OpenGL implementation stores the data in the texture (or the closest supported format if the hardware does not support the exact one requested), and it is the format used when you sample from the texture.

The last 3 arguments (format, type, data) belong together. format and type describe what is in data, i.e. they describe the data you pass into the glTexImage2D call.

It is usually a good idea to keep the two formats matched, as in this case: the data you pass in is GL_UNSIGNED_SHORT, and the internal format GL_R16UI holds unsigned short values. In OpenGL ES, the internal format is required to match format/type. Full OpenGL performs a conversion if necessary, which is undesirable for performance reasons, and also frequently not what you want because the precision of the data in the texture won't be the same as the precision of your original data.
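
If, on the other hand, you prefer to read the depth values as normalized floats in the shader (with an ordinary sampler2D), a matching alternative in desktop OpenGL is the normalized 16-bit internal format GL_R16. This is a sketch of that alternative, not part of the original answer:

// normalized 16-bit red channel; texture() returns floats in [0, 1],
// so the shader can use a regular sampler2D and vec4
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16, 640, 480, 0,
    GL_RED, GL_UNSIGNED_SHORT, kinect_depth);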

answered Oct 12 '22 by Reto Koradi