 

Freeglut, OpenGL and memory

I've started practicing OpenGL with GLEW and FreeGLUT.

I have a question about my application and was wondering whether anyone has run into the same problem (if it is one).

When I initially execute my application, it uses around 22,000 KB of memory. After minimizing the window and maximizing it again, it only takes 2,900-3,300 KB, and it stays in that range even after minimizing and maximizing the window again, and after performing mouse and keyboard input while the window has focus.

Why is this so? I don't know much about FreeGLUT, and I'm wondering whether anyone else has noticed this behavior when minimizing/maximizing a FreeGLUT window. Or maybe this is an OS-specific thing?

Sorry for not mentioning it before, but I'm using Windows XP SP3, and I'm setting up the OpenGL context with the following code:

#include <GL/glew.h>
#include <GL/freeglut.h>

// Request an OpenGL 3.3 context from FreeGLUT
glutInit(&argc, argv);
glutInitContextVersion(3, 3);
glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
glutInitWindowPosition(30, 30);
glutInitWindowSize(1000, 562);
glutCreateWindow("Testing");

// GLEW must be initialized after the context/window exists
glewExperimental = GL_TRUE;
glewInit();
asked Aug 30 '13 by SecretStoven

1 Answer

This is highly OS dependent and also depends on how you measure memory usage, but I can give you a little bit of insight into why this might be happening on Microsoft Windows. Microsoft has the following to say about user-mode memory usage in an application that uses a WDDM-based driver:

Existing games and other graphics applications frequently allocate virtual memory for a copy of the video memory resources that the application uses. The application uses this copy to restore the display quickly if the contents of video memory are lost. For example, the application uses this copy if the user presses ALT+TAB or if the user puts the computer in standby. Typically, the DirectX run time manages the copy on behalf of the application when the application creates a managed resource. However, an application can also manage the copy itself. The virtual memory that the copy uses is directly proportional to the video memory resources that the application allocates.

Now, while Microsoft describes this as a DirectX problem, it actually applies to OpenGL too. Behind the scenes, OpenGL usually deals with "device lost" events completely transparently (this is not a part of OpenGL itself, but of the window system -- WGL in this case). Although you can actually receive these events in your software in OpenGL 4.x using one of the robustness extensions, you usually want to pretend they do not exist. In any case, I suspect that this is what is to blame.
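For illustration, here is a minimal sketch of polling for a context reset through GL_ARB_robustness with GLEW. It assumes the driver exposes the extension and that the context was created with a reset notification strategy (something plain FreeGLUT does not let you request), so treat it as a sketch of the mechanism rather than something you would normally need:

// Poll the context-reset status (GL_ARB_robustness, via GLEW)
if (GLEW_ARB_robustness) {
    switch (glGetGraphicsResetStatusARB()) {
    case GL_NO_ERROR:
        break; // no reset has occurred
    case GL_GUILTY_CONTEXT_RESET_ARB:
        // a reset was caused by this context's own commands
        break;
    case GL_INNOCENT_CONTEXT_RESET_ARB:
        // a reset was caused by something else (another app, the OS)
        break;
    case GL_UNKNOWN_CONTEXT_RESET_ARB:
        // a reset occurred, but the cause is unknown
        break;
    }
}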

Your numbers will also vary wildly depending on how you measure your application's memory consumption: whether it is simply a count of the virtual memory pages allocated (virtual size) or of the pages actually resident (working/resident set). On Windows, the figure shown in Task Manager is usually the working set. The working set is a very vague concept; it usually refers to the set of pages that were "recently" referenced, and it can grow or shrink depending on when it is sampled, irrespective of how much memory is actually resident at that exact instant. I would chalk this up to normal driver behavior and realize that this memory is probably not going to affect anything seriously.
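If you want to see the difference yourself, here is a minimal Windows-only sketch (link with psapi.lib; printMemoryUsage is just an illustrative helper name) that prints both the working set and the commit (pagefile) usage of the current process. Calling it before and after minimizing the window should show the working set shrinking while the committed memory stays roughly the same:

#include <windows.h>
#include <psapi.h>
#include <stdio.h>

// Print two different measures of this process's memory usage
void printMemoryUsage(void)
{
    PROCESS_MEMORY_COUNTERS pmc;
    if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc))) {
        printf("Working set:    %lu KB\n", (unsigned long)(pmc.WorkingSetSize / 1024));
        printf("Pagefile usage: %lu KB\n", (unsigned long)(pmc.PagefileUsage / 1024));
    }
}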

If you're running on a modern desktop 64-bit platform (x86-64), you actually have a 48-bit hardware address space (256 TiB), and the OS gives each user-mode process a fraction of this (8 TiB on Windows), so 22,000 KiB is completely insignificant (if you actually compiled your software as 64-bit).
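As a quick sanity check (illustrative only), you can have the program report its own pointer width; a 32-bit build on Windows gets only a 2 GiB user-mode address space by default (4 GiB with /LARGEADDRESSAWARE on a 64-bit OS):

#include <stdio.h>

int main(void)
{
    // 8-byte pointers -> 64-bit build, 4-byte pointers -> 32-bit build
    printf("This build uses %u-bit pointers\n", (unsigned)(sizeof(void *) * 8));
    return 0;
}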

answered Oct 17 '22 by Andon M. Coleman