I have a vertex attribute that's being chewed up in a very weird way by my shaders. It's uploaded to the VBO as a (uint8)1, but when the fragment shader sees it, it's interpreted as 1065353216, or 0x3F800000, which some of you might recognize as the bit pattern for 1.0f in floating point.
I have no idea why. I can confirm that it is uploaded to the VBO as a 1 (0x00000001), though.
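(For reference, the bit-pattern claim is easy to verify on the CPU side; this is a standalone sketch, not part of the project's code:)
#include <cstdint>
#include <cstdio>
#include <cstring>

int main() {
    // Reinterpret the bit pattern 0x3F800000 as an IEEE-754 float.
    uint32_t bits = 0x3F800000u;
    float f;
    std::memcpy(&f, &bits, sizeof f); // well-defined type punning
    std::printf("%f\n", f);           // prints 1.000000
}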
The vertex attribute is defined as:
struct Vertex{
...
glm::u8vec4 c2; // attribute with problems
};
// not-normalized
glVertexAttribPointer(aColor2, 4, GL_UNSIGNED_BYTE, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, c2));
While the shader has that attribute bound with
glBindAttribLocation(programID, aColor2, "c2");
The vertex shader passes along the attribute pretty uneventfully:
#version 330
in lowp uvec4 c2; // <-- this value is uploaded to the VBO as 0x00, 0x00, 0x00, 0x01;
flat out lowp uvec4 indices;
void main(){
indices = c2;
}
And finally the fragment shader gets ahold of it:
flat in lowp uvec4 indices; // <-- this value is now 0, 0, 0, 0x3F800000
out lowp vec4 fragColor;
void main(){
fragColor = vec4(indices) / 256.0;
}
The indices varying leaves the vertex shader as 0x3F800000 for indices.w according to my shader inspector, so something odd is happening there. What could be causing this?
If the type of a vertex attribute is integral, then you have to use glVertexAttribIPointer rather than glVertexAttribPointer (note the I). See the documentation for glVertexAttribPointer.
The type specified in glVertexAttribPointer is the type of the data in the source buffer; it does not specify the type of the target attribute in the shader. If you use glVertexAttribPointer, the attribute in the shader program is assumed to be floating point, and the integral data are converted. That is exactly what you are seeing: the integer 1 is converted to 1.0f, whose bit pattern is 0x3F800000. If you use glVertexAttribIPointer, the values are left as integers.
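Applied to the setup code from the question, a minimal fix would look like this (note that glVertexAttribIPointer takes no normalized parameter, since no conversion takes place):
// Integer attribute: the data stay integral all the way to the shader
glVertexAttribIPointer(aColor2, 4, GL_UNSIGNED_BYTE, sizeof(Vertex), (void*)offsetof(Vertex, c2));
With that change, in uvec4 c2; in the vertex shader receives (0, 0, 0, 1) as expected; the shader-side declarations (flat out/in uvec4) can stay exactly as they are.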