OpenGL texture format, create image/texture data for OpenGL

OK, so I need to create my own texture/image data and then display it on a quad in OpenGL. I have the quad working, and I can display a TGA file on it with my own texture loader; it maps to the quad perfectly.

But how do I create my own "homemade" image that is 1000x1000 with 3 channels (RGB values) per pixel? What is the format of the texture array, and how do I, for example, set pixel (100,100) to black?

This is how I would imagine it for a completely white image/texture:

#define SCREEN_WIDTH 1000
#define SCREEN_HEIGHT 1000

unsigned int* texdata = new unsigned int[SCREEN_HEIGHT * SCREEN_WIDTH * 3];
for(int i=0; i<SCREEN_HEIGHT * SCREEN_WIDTH * 3; i++)
        texdata[i] = 255;

GLuint t = 0;
glEnable(GL_TEXTURE_2D);
glGenTextures( 1, &t );
glBindTexture(GL_TEXTURE_2D, t);

// Set parameters to determine how the texture is resized
glTexParameteri ( GL_TEXTURE_2D , GL_TEXTURE_MIN_FILTER , GL_LINEAR_MIPMAP_LINEAR );
glTexParameteri ( GL_TEXTURE_2D , GL_TEXTURE_MAG_FILTER , GL_LINEAR );
// Set parameters to determine how the texture wraps at edges
glTexParameteri ( GL_TEXTURE_2D , GL_TEXTURE_WRAP_S , GL_REPEAT );
glTexParameteri ( GL_TEXTURE_2D , GL_TEXTURE_WRAP_T , GL_REPEAT );
// Upload the texture data to the GPU
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, SCREEN_WIDTH, SCREEN_HEIGHT, 0,
             GL_RGB, GL_UNSIGNED_BYTE, texdata);
glGenerateMipmap(GL_TEXTURE_2D);

EDIT: The answers below are correct, but I also found that this does not work with the plain ints I used; it works fine with uint8_t. I assume that's because I upload with GL_RGB and GL_UNSIGNED_BYTE, which expects each channel to be a single 8-bit byte, while a plain int is wider than 8 bits.
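A minimal sketch of that allocation with one byte per channel (same SCREEN_WIDTH/SCREEN_HEIGHT and texdata names as the snippet above):

#include <cstdint>

// One byte per channel, three channels per pixel, matching GL_UNSIGNED_BYTE
uint8_t* texdata = new uint8_t[SCREEN_HEIGHT * SCREEN_WIDTH * 3];
for(int i=0; i<SCREEN_HEIGHT * SCREEN_WIDTH * 3; i++)
        texdata[i] = 255; // every channel at full intensity -> a white image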

asked Dec 11 '22 by Jackbob

2 Answers

But how do I create my own "homemade" image that is 1000x1000 with 3 channels (RGB values) per pixel?

std::vector< unsigned char > image( 1000 * 1000 * 3 /* bytes per pixel */ );

What is the format of the texture array

Red byte, then green byte, then blue byte. Repeat.
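For illustration, a hypothetical 2x1 RGB image (one red pixel followed by one blue pixel) would be laid out like this:

// pixel (0,0) is red, pixel (1,0) is blue; 3 bytes per pixel, row by row
unsigned char tiny[ 2 * 1 * 3 ] = {
    255, 0, 0,   // pixel (0,0): R, G, B
    0,   0, 255  // pixel (1,0): R, G, B
};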

how do I, for example, set pixel (100,100) to black?

unsigned int width = 1000;
unsigned int x = 100;
unsigned int y = 100;
unsigned int location = ( x + ( y * width ) ) * 3;
image[ location + 0 ] = 0; // R
image[ location + 1 ] = 0; // G
image[ location + 2 ] = 0; // B

Upload via:

// the rows in the image array don't have any padding
// so set GL_UNPACK_ALIGNMENT to 1 (instead of the default of 4)
// https://www.khronos.org/opengl/wiki/Pixel_Transfer#Pixel_layout
glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );
glTexImage2D
    (
    GL_TEXTURE_2D, 0,
    GL_RGB, 1000, 1000, 0,
    GL_RGB, GL_UNSIGNED_BYTE, &image[0]
    );
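Putting these pieces together, a minimal sketch might look like the following (assuming a valid OpenGL context and that the GL headers are already included; the filter and wrap parameters are reduced to the bare minimum):

#include <vector>

const unsigned int width  = 1000;
const unsigned int height = 1000;

// 3 tightly packed bytes per pixel, all channels 255 -> a white image
std::vector< unsigned char > image( width * height * 3, 255 );

// set pixel (100,100) to black
const unsigned int location = ( 100 + ( 100 * width ) ) * 3;
image[ location + 0 ] = 0; // R
image[ location + 1 ] = 0; // G
image[ location + 2 ] = 0; // B

GLuint t = 0;
glGenTextures( 1, &t );
glBindTexture( GL_TEXTURE_2D, t );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

// tightly packed rows -> unpack alignment of 1
glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
              GL_RGB, GL_UNSIGNED_BYTE, &image[0] );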
answered Jan 18 '23 by genpfault

By default, OpenGL assumes that the start of each row of a texture is aligned to 4 bytes. This is an RGB texture, which needs 3 bytes (24 bits) per texel, and its rows are tightly packed. That means the 4-byte alignment at the start of each row is not respected (unless 3 times the width of the texture happens to be divisible by 4 without a remainder).

To deal with that, the alignment has to be changed to 1. This means the GL_UNPACK_ALIGNMENT parameter has to be set before uploading a tightly packed texture to the GPU (glTexImage2D):

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

Otherwise each row is read with an offset of 0-3 extra bytes, which accumulates from row to row and makes the texture appear continuously skewed or tilted when it is looked up.
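As a sketch of the arithmetic behind this (an illustration, not part of the original answer), the number of bytes OpenGL reads per row for a given GL_UNPACK_ALIGNMENT can be computed like this:

// Bytes read per texture row, rounded up to the given unpack alignment.
// With alignment 4 and 3 bytes per pixel, the row is padded up to the next
// multiple of 4; with alignment 1 it is exactly width * 3 bytes.
unsigned int rowSizeInBytes( unsigned int width, unsigned int bytesPerPixel,
                             unsigned int alignment )
{
    unsigned int row = width * bytesPerPixel;
    return ( ( row + alignment - 1 ) / alignment ) * alignment;
}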

Since you use the source format GL_RGB with the type GL_UNSIGNED_BYTE, each pixel consists of 3 color channels (red, green and blue), and each color channel is stored in one byte in the range [0, 255].

If you want to set the pixel at (x, y) to the color R, G, B, then this is done like this:

texdata[(y*WIDTH+x)*3+0] = R;
texdata[(y*WIDTH+x)*3+1] = G;
texdata[(y*WIDTH+x)*3+2] = B;
answered Jan 18 '23 by Rabbid76