I have code that just uploads some contrived data to a texture:
glActiveTexture(GL_TEXTURE0+gl_sim_texture_active_n);
glBindTexture(GL_TEXTURE_2D, gl_sim_texture_buff_id);
for(int i = 0; i < w*h; i++) buff[i] = 0xAB;
glTexImage2D(GL_TEXTURE_2D,0,GL_ALPHA,w,h,0,GL_ALPHA,GL_UNSIGNED_BYTE,buff);
and code that just samples from that texture in my shaders:
uniform vec2 viewport;
uniform sampler2D sim_texture;
void main()
{
    vec2 tex_uv = vec2(gl_FragCoord.x/(viewport.x-1.),gl_FragCoord.y/(viewport.y-1.));
    gl_FragColor = texture2D(sim_texture,tex_uv).argb; //swizzle is to just put 'a' in a visibly renderable position as "redness"
}
On OSX and Android, this texture is readable in my shader (via a sampler2D, nothing fancy) and it works. On iOS, any sample from that sampler2D returns vec4(0.,0.,0.,1.), regardless of the data put in.
(Note that when I change GL_ALPHA to GL_RGBA, attach the texture to a framebuffer, then call glReadPixels after glTexImage2D, I do get back the exact data I put in, regardless of platform, and the behavior (or lack thereof on iOS) stays the same. The switch to GL_RGBA was only necessary to attach the texture to a framebuffer, which is necessary for glReadPixels, which I only care about for debugging purposes. tl;dr: I'm reasonably confident that the data is being uploaded to the texture correctly on all platforms.)
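Roughly, that debug path looks like this (a sketch, not my exact code; fbo and readback are throwaway names here, and it assumes <stdlib.h> plus the same w, h, buff, and gl_sim_texture_buff_id as above):
// Attach the (GL_RGBA) texture to a framebuffer and read it back to verify the upload.
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, gl_sim_texture_buff_id, 0);
if(glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE)
{
    unsigned char *readback = (unsigned char *)malloc(w*h*4); //GL_RGBA => 4 bytes per texel
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, readback);
    //compare readback[] against buff[] here...
    free(readback);
}
glBindFramebuffer(GL_FRAMEBUFFER, 0); //restore default framebuffer
glDeleteFramebuffers(1, &fbo);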
Other info:
gl_sim_texture_active_n is 6, and gl_sim_texture_buff_id is 14 (both obtained legitimately and without error). Calling glGetError() or glCheckFramebufferStatus(GL_FRAMEBUFFER) before or after comes back clean in both cases. glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS) returns 8 (the same as on my test Android device).
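(For completeness, those checks look roughly like this; a sketch, the printf reporting is not from my actual code and assumes <stdio.h>:)
GLenum err = glGetError();
if(err != GL_NO_ERROR)
    printf("GL error: 0x%04X\n", err);

GLenum fb_status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if(fb_status != GL_FRAMEBUFFER_COMPLETE)
    printf("Framebuffer incomplete: 0x%04X\n", fb_status);

GLint max_units = 0;
glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &max_units); //returns 8 here
printf("GL_MAX_TEXTURE_IMAGE_UNITS = %d\n", max_units);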
I'm just at a total loss as to why this works on OSX/Android but not iOS. Any direction on where to go from here would be helpful!
Gah, I found it!
GL_REPEAT doesn't support non-POT textures! (GLSL texture2D() always returns (0, 0, 0, 1)) Fixed!
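For anyone else hitting this: on OpenGL ES 2.0 a non-power-of-two texture is only complete with GL_CLAMP_TO_EDGE wrapping and a non-mipmapped min filter, and sampling an incomplete texture returns (0, 0, 0, 1). A sketch of the kind of fix needed, applied right after the glBindTexture call above (the GL_LINEAR filters here are just an example choice, not my exact settings):
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); //GL_REPEAT breaks NPOT on GLES2
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); //no mipmaps for NPOT
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);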