
OpenGL glGetError 1281 bad value

Tags:

c++

opengl

shader

I am using OpenGL with vertices and shaders. Nothing got displayed on my screen, so I used glGetError to debug: I got error 1281 (bad value) on one of my buffers, called color_array_buffer. Here is the section I am talking about:

GLenum error = glGetError();
if(error) {
    cout << error << endl;
    return;
} else {
    cout << "no error yet" << endl;
}
//no error


// Get a handle for our "myTextureSampler" uniform
GLuint TextureID = glGetUniformLocation(shaderProgram, "myTextureSampler");
if(!TextureID)
    cout << "TextureID not found ..." << endl;

// Bind our texture in Texture Unit 0
glActiveTexture(GL_TEXTURE0);
sf::Texture::bind(texture);
// Set our "myTextureSampler" sampler to use Texture Unit 0
glUniform1i(TextureID, 0);
// 2nd attribute buffer : UVs
GLuint vertexUVID = glGetAttribLocation(shaderProgram, "color");
if(!vertexUVID)
    cout << "vertexUVID not found ..." << endl;
glEnableVertexAttribArray(vertexUVID);
glBindBuffer(GL_ARRAY_BUFFER, color_array_buffer);
glVertexAttribPointer(vertexUVID, 2, GL_FLOAT, GL_FALSE, 0, 0);

error = glGetError();
if(error) {
    cout << error << endl;
    return;
}
//error 1281
//error 1281

And here is the code where I link my buffer to the array:

if (textured) {
    texture = new sf::Texture();
    if(!texture->loadFromFile("textures/simple.jpeg"/*,sf::IntRect(0, 0, 128, 128)*/))
        std::cout << "Error loading texture !!" << std::endl;
    glGenBuffers(1, &color_array_buffer);
    glBindBuffer(GL_ARRAY_BUFFER, color_array_buffer);
    glBufferData(GL_ARRAY_BUFFER, uvs.size() * sizeof(glm::vec3), &uvs[0], GL_STATIC_DRAW);
}

And the values of my uvs:

    uvs[0]  : 0.748573-0.750412
    uvs[1]  : 0.749279-0.501284
    uvs[2]  : 0.99911-0.501077
    uvs[3]  : 0.999455-0.75038
    uvs[4]  : 0.250471-0.500702
    uvs[5]  : 0.249682-0.749677
    uvs[6]  : 0.001085-0.75038
    uvs[7]  : 0.001517-0.499994
    uvs[8]  : 0.499422-0.500239
    uvs[9]  : 0.500149-0.750166
    uvs[10] : 0.748355-0.99823
    uvs[11] : 0.500193-0.998728
    uvs[12] : 0.498993-0.250415
    uvs[13] : 0.748953-0.25092

Am I doing something wrong? If someone could help me, that would be great.

asked Jul 01 '14 by aze

1 Answer

Your check for glGetAttribLocation() failing to find the attribute is incorrect:

GLuint vertexUVID = glGetAttribLocation(shaderProgram, "color");
if(!vertexUVID)
    cout << "vertexUVID not found ..." << endl;

glGetAttribLocation() returns a GLint (not a GLuint), and the result is -1 if no attribute with the given name is found in the program. Since you assign the value to an unsigned variable, it ends up as the largest possible unsigned value, which is then an invalid argument when you pass it to glEnableVertexAttribArray() afterwards.

Your code should look like this instead:

GLint vertexUVID = glGetAttribLocation(shaderProgram, "color");
if(vertexUVID < 0)
    cout << "vertexUVID not found ..." << endl;

Note that 0 is a perfectly valid attribute location.

answered Oct 25 '22 by Reto Koradi