Hard time understanding indices with glDrawElements

I'm trying to draw a terrain with GL_TRIANGLE_STRIP and glDrawElements, but I'm having a really hard time understanding how the indices for glDrawElements work...

Here's what I have so far:

void Terrain::GenerateVertexBufferObjects(float ox, float oy, float oz) {
    float startWidth, startLength, *vArray;
    int vCount, vIndex = -1;

    // width = length = 256

    startWidth = (width / 2.0f) - width;
    startLength = (length / 2.0f) - length;

    vCount = 3 * width * length;
    vArray = new float[vCount];

    for(int z = 0; z < length; z++) {
        // each row advances vIndex by width * 3 floats (256 * 3 = 768)
        for(int x = 0; x < width; x++) {
            vArray[++vIndex] = ox + startWidth + (x * stepWidth);
            vArray[++vIndex] = oy + heights[z][x];
            vArray[++vIndex] = oz + startLength + (z * stepLength);
        }
    }

    glGenBuffers(1, &vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(float) * vCount, vArray, GL_STATIC_DRAW);
    glBindBuffer(GL_ARRAY_BUFFER, 0);

    delete[] vArray; // the data now lives in the VBO; free the client-side copy
}

void Terrain::DrawVBO(unsigned int texID, float ox, float oy, float oz) {
    float terrainLight[] = { 1.0f, 1.0f, 1.0f, 1.0f };

    if(!generatedVBOs) {
        GenerateVertexBufferObjects(ox, oy, oz);
        generatedVBOs = true;
    }

    unsigned int indices[] = { 0, 768, 3, 771 };

    // Note: this creates a new index buffer on every draw call; ideally it
    // would be generated once, like the vertex buffer above.
    glGenBuffers(1, &indexBuffer);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(unsigned int) * 4, indices, GL_STATIC_DRAW);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

    glEnableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glVertexPointer(3, GL_FLOAT, 0, 0);

    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);

    glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, terrainLight);
    glDrawElements(GL_TRIANGLE_STRIP, 4, GL_UNSIGNED_INT, 0);

    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);

    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}

I believe my vArray is correct; I use the same values when drawing with glBegin(GL_TRIANGLE_STRIP)/glEnd(), which works just fine.

My guess was to use the index of the x coordinate of each vertex, but I have no idea whether that's the right way to use indices with glDrawElements:

  • 0: index of the x coordinate of the first vertex of the triangle. Location: (-128, -128).
  • 768: index of the x coordinate of the second vertex. Location: (-128, -127).
  • 3: index of the x coordinate of the third vertex. Location: (-127, -128).
  • 771: index of the x coordinate of the fourth vertex, which draws a second triangle. Location: (-127, -127).

I think everything is making sense so far?

What's not working is that the locations above (which I double-checked against vArray, and they are correct) are not the ones glDrawElements actually uses. Two triangles are drawn, but they are a lot bigger than they should be: the strip starts correctly at (-128, -128) but extends to something like (-125, -125) instead of (-127, -127).

I can't understand what I'm doing wrong here...

asked Apr 24 '11 by rfgamaral

1 Answer

Using something like the following solves my problem:

unsigned int indices[] = { 0, 256, 1, 257 };

I think it's safe to assume that each index refers to a whole vertex, not to a position in the float array: OpenGL takes the index, multiplies by the 3 components per vertex specified in glVertexPointer, and finds the x, y and z itself. We shouldn't do the * 3 ourselves.

And now that I think about it, glDrawElements has the word element in it; in this case an element is a vertex with 3 coordinates, as specified in glVertexPointer, and we need to pass indices to elements, not to individual coordinates.
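
To make that concrete, here are my worked numbers (using width = 256 and glVertexPointer(3, GL_FLOAT, 0, 0) from the question):

// With glVertexPointer(3, GL_FLOAT, 0, 0), index i selects vertex i,
// i.e. the floats vArray[3*i], vArray[3*i + 1] and vArray[3*i + 2]:
//
// index 0   -> vertex 0   -> row 0, column 0  (what I wanted)
// index 768 -> vertex 768 -> row 3, column 0  (three rows down, not one!)
// index 256 -> vertex 256 -> row 1, column 0  (what I actually wanted)
//
// Which is exactly why my original triangles came out three times too big.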

I feel so dumb now...
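
In case it helps anyone else, here's a rough sketch of how the whole width x length grid could be indexed, one triangle strip per row. This is untested and not the code from my project, just the idea (it assumes indexBuffer is a member, like in the code above):

#include <vector>

// Build one triangle strip per row: each strip alternates a vertex on
// row z with the vertex directly below it on row z + 1.
std::vector<unsigned int> indices;
indices.reserve((length - 1) * width * 2);

for(int z = 0; z < length - 1; z++) {
    for(int x = 0; x < width; x++) {
        indices.push_back(z * width + x);        // vertex on row z
        indices.push_back((z + 1) * width + x);  // vertex on row z + 1
    }
}

glGenBuffers(1, &indexBuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(unsigned int) * indices.size(), indices.data(), GL_STATIC_DRAW);

// Draw each row strip separately; the last argument is a byte offset
// into the bound GL_ELEMENT_ARRAY_BUFFER.
for(int z = 0; z < length - 1; z++) {
    glDrawElements(GL_TRIANGLE_STRIP, width * 2, GL_UNSIGNED_INT, (void*)(z * width * 2 * sizeof(unsigned int)));
}

The strips could also be joined into a single draw call with degenerate triangles (or primitive restart), but separate calls per row are the easiest to get right.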

answered Oct 02 '22 by rfgamaral