I tried to follow this tutorial in Go: http://www.opengl-tutorial.org/beginners-tutorials/tutorial-2-the-first-triangle/

The Go version opens the window and makes the background blue, but it doesn't show the triangle. The C version does show it. This is the code in Go:
err := glfw.Init()
if err != nil {
    log.Fatal("Failed to init GLFW: " + err.Error())
}
err = glfw.OpenWindow(1024, 768, 0, 0, 0, 0, 32, 0, glfw.Windowed)
if err != nil {
    log.Fatal("Failed to open GLFW window: " + err.Error())
}
if gl.Init() != 0 {
    log.Fatal("Failed to init GL")
}

gl.ClearColor(0.0, 0.0, 0.3, 0.0)

// create vertex buffer
gVertexBufferData := []float32{-1.0, -1.0, 0.0, 1.0, -1.0, 0.0, 0.0, 1.0, 0.0}
vertexBuffer := gl.GenBuffer()
vertexBuffer.Bind(gl.ARRAY_BUFFER)
gl.BufferData(gl.ARRAY_BUFFER, len(gVertexBufferData), gVertexBufferData, gl.STATIC_DRAW)

for {
    // clear screen
    gl.Clear(gl.COLOR_BUFFER_BIT)

    // first attribute buffer: vertices
    var vertexAttrib gl.AttribLocation = 0
    vertexAttrib.EnableArray()
    vertexBuffer.Bind(gl.ARRAY_BUFFER)
    var f float32 = 0.0
    vertexAttrib.AttribPointer(
        3,     // size
        false, // normalized?
        0,     // stride
        &f)    // array buffer offset

    // draw the triangle
    gl.DrawArrays(gl.TRIANGLES, 0, 3)
    vertexAttrib.DisableArray()

    glfw.SwapBuffers()
}
And this is the code in C, which works:
if(!glfwInit())
    return -1;
if(!glfwOpenWindow(1024, 768, 0, 0, 0, 0, 32, 0, GLFW_WINDOW))
    return -1;
if(glewInit() != GLEW_OK)
    return -1;

glClearColor(0.0f, 0.0f, 0.3f, 0.0f);

GLuint VertexArrayID;
glGenVertexArrays(1, &VertexArrayID);
glBindVertexArray(VertexArrayID);

static const GLfloat g_vertex_buffer_data[] = {
    -1.0f, -1.0f, 0.0f,
     1.0f, -1.0f, 0.0f,
     0.0f,  1.0f, 0.0f,
};

GLuint vertexbuffer;
glGenBuffers(1, &vertexbuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);

while(1) {
    glClear(GL_COLOR_BUFFER_BIT);

    // 1st attribute buffer: vertices
    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
    glVertexAttribPointer(
        0,        // attribute index
        3,        // size
        GL_FLOAT, // type
        GL_FALSE, // normalized?
        0,        // stride
        (void*)0  // array buffer offset
    );

    // Draw the triangle!
    glDrawArrays(GL_TRIANGLES, 0, 3); // From index 0 to 3 -> 1 triangle
    glDisableVertexAttribArray(0);

    // Swap buffers
    glfwSwapBuffers();
}
Maybe I give vertexAttrib.AttribPointer() the wrong arguments, because I'm not sure what to give it instead of (void*)0. I tried nil, but that caused the application to crash. &gVertexBufferData[0] doesn't work either.
I'm using github.com/banthar/gl as glew-wrapper, go 1.0.2 and ubuntu 12.04 amd64.
EDIT (update): glGetError doesn't report any errors.
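(For reference, the check was done along these lines; a minimal sketch, assuming the banthar bindings expose glGetError as gl.GetError:)

if e := gl.GetError(); e != gl.NO_ERROR {
    // report the raw error code in hex
    log.Printf("GL error: 0x%x", e)
}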
I had the same problem and I managed to fix it after looking at your post, so first of all thanks a lot.
I managed to display a triangle by using the work branch of the banthar bindings, with this call to AttribPointer:
vertexAttrib.AttribPointer(
    3,        // size
    gl.FLOAT, // type
    false,    // normalized?
    0,        // stride
    nil)      // array buffer offset
and by passing the size in bytes to BufferData.
[...]
data := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
[...]
gl.BufferData(gl.ARRAY_BUFFER, len(data)*4, data, gl.STATIC_DRAW)
[...]
There is probably a better way to pass the right length.
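For example, one way to avoid the hard-coded 4 is to derive the element size with unsafe.Sizeof; a minimal sketch, assuming the same bindings (it needs an import of "unsafe", and sizeInBytes is just an illustrative name):

data := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
// size in bytes = number of elements * size of one element (4 for float32)
sizeInBytes := len(data) * int(unsafe.Sizeof(data[0]))
gl.BufferData(gl.ARRAY_BUFFER, sizeInBytes, data, gl.STATIC_DRAW)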
I recently ran into a similar issue with the Go OpenGL bindings, and this question was one of the only references to it I could find. However, none of the existing answers solved my problem, as the bindings look slightly different now in 2015 than they did in 2012.
The part of my solution that hasn't already been covered by the existing answers involves the gl.BufferData() call made when creating a VBO.
A problem-producing example of the code in question would look like this:
[...]
vertices := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
[...]
gl.BufferData(
    gl.ARRAY_BUFFER,
    len(vertices)*4,
    unsafe.Pointer(&vertices), // this is the address of the slice header, not of the vertex data
    gl.STATIC_DRAW)
[...]
One of the answers already provided recommends changing this code to something like this:
[...]
vertices := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
[...]
gl.BufferData(
    gl.ARRAY_BUFFER,
    len(vertices)*4,
    vertices,
    gl.STATIC_DRAW)
[...]
However, the bindings I used have a different function signature from the one used there, and that version failed to compile with:
cannot use vertices (type []float32) as type unsafe.Pointer in argument to gl.BufferData
The solution I ended up finding, and wanted to put here so nobody else has to go through the headache it took to figure out, looks like this:
[...]
vertices := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
[...]
gl.BufferData(
    gl.ARRAY_BUFFER,
    len(vertices)*4, // or: len(vertices)*int(reflect.TypeOf(vertices).Elem().Size())
    gl.Ptr(vertices),
    gl.STATIC_DRAW)
[...]
I also included a commented-out alternative to len(vertices)*4, which produces exactly the same result but derives the '4' from the element type of the slice (float32 in this case).
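Putting the pieces together, here is a minimal sketch of the whole buffer and attribute setup as I understand it with these bindings (the function name setupTriangle and the variable names are illustrative; it assumes a core-profile context is already current):

// Creates a VAO and VBO for the triangle data and configures attribute 0.
// A VAO must be bound in a core profile before setting attribute pointers.
func setupTriangle(vertices []float32) (vao, vbo uint32) {
    gl.GenVertexArrays(1, &vao)
    gl.BindVertexArray(vao)

    gl.GenBuffers(1, &vbo)
    gl.BindBuffer(gl.ARRAY_BUFFER, vbo)
    // Size is in bytes; gl.Ptr converts the slice into the unsafe.Pointer the binding expects.
    gl.BufferData(gl.ARRAY_BUFFER, len(vertices)*4, gl.Ptr(vertices), gl.STATIC_DRAW)

    // Attribute 0: three float32 components per vertex, tightly packed, starting at offset 0.
    gl.EnableVertexAttribArray(0)
    gl.VertexAttribPointer(0, 3, gl.FLOAT, false, 0, gl.PtrOffset(0))
    return vao, vbo
}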
Footnotes
The bindings I used:
github.com/go-gl/gl/all-core/gl
github.com/go-gl/glfw/v3.1/glfw
My OpenGL context was created with these hints:

primaryMonitor := glfw.GetPrimaryMonitor()
vidMode := primaryMonitor.GetVideoMode()
glfw.WindowHint(glfw.ContextVersionMajor, 3)
glfw.WindowHint(glfw.ContextVersionMinor, 3)
glfw.WindowHint(glfw.OpenGLProfile, glfw.OpenGLCoreProfile)
glfw.WindowHint(glfw.OpenGLForwardCompatible, glfw.True)
glfw.WindowHint(glfw.RedBits, vidMode.RedBits)
glfw.WindowHint(glfw.GreenBits, vidMode.GreenBits)
glfw.WindowHint(glfw.BlueBits, vidMode.BlueBits)
glfw.WindowHint(glfw.RefreshRate, vidMode.RefreshRate)
glfw.WindowHint(glfw.Visible, glfw.False)
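The hints alone don't create a context; a minimal sketch of the window and context creation that could follow them, assuming the same go-gl/glfw v3.1 bindings (the window title and size are illustrative):

window, err := glfw.CreateWindow(1024, 768, "triangle", nil, nil)
if err != nil {
    log.Fatal(err)
}
window.MakeContextCurrent()
// gl.Init() must be called after the context is current and before any other gl.* calls.
if err := gl.Init(); err != nil {
    log.Fatal(err)
}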