I'm currently developing an iOS application (iPad and iPhone) that uses OpenGL ES 1.0 to render some basic textures. I use atlases to store and present my textures.
My main atlas is relatively large (2000x2000), but my loading code pads the texture to 2048x2048, since OpenGL ES 1.x only accepts power-of-two texture dimensions. I'm able to draw the tiles; everything's fine on that side.
I'm facing a serious memory leak every time I load and unload (destroy) the texture. This load/unload loop won't happen in the final version, but I needed to make sure my loading and unloading were correct. In memory the texture occupies 2048 x 2048 x 4 (RGBA) bytes = 16 MB. This is a huge amount of memory, so you understand that the problem is pretty annoying (iOS kills the application after a few minutes...).
When I load a texture, Instruments shows the application's overall memory increasing by 16MB, which is correct (I use the "Real Memory" column). The problem occurs when I destroy the texture to free its memory: the 16MB is never released, and since I load and unload in a loop, memory usage keeps growing and is never freed.
Here's how I load the textures:
// Allocate an RGBA buffer (4 bytes per pixel) for the atlas
GLubyte *outData = malloc((_dataWidth * _dataHeight * 4) * sizeof(*outData));

GLuint _texture; // declared as an instance variable, so its address never changes
glGenTextures(1, &_texture);
glBindTexture(GL_TEXTURE_2D, _texture);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// glTexImage2D copies the pixel data, so the local buffer can be freed afterwards
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, _dataWidth, _dataHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, outData);
free(outData);
Here's how I unload the texture (and this code is called; I checked):
glDeleteTextures(1, &_texture);
I used glGetError() everywhere to check whether an error occurs, but it always returns GL_NO_ERROR (0), even after the glDeleteTextures call.
Does anybody have an idea? Thank you!
Make sure that you destroy the texture on the same thread where you created your context. If you issue GL calls without a current context, they silently do nothing, and no errors are reported.
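On iOS that means making the same EAGLContext current on the deleting thread before calling glDeleteTextures. A hedged sketch, assuming `self.context` is the EAGLContext the texture was created with (not from the original post):

// Ensure the creating context is current before deleting
if ([EAGLContext currentContext] != self.context) {
    [EAGLContext setCurrentContext:self.context];
}
glDeleteTextures(1, &_texture);
_texture = 0; // avoid reusing or double-deleting the name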
Have you tried the GL_APPLE_client_storage extension? With it, instead of OpenGL keeping its own copy of the texture data, you promise to keep the data block passed to glTexImage2D alive until glDeleteTextures has been called.