 

How to keep the CPU usage down while running an SDL program?

I've made a very basic window with SDL, and I want to keep it running until I press the X on the window.

#include "SDL.h"
const int SCREEN_WIDTH = 640;
const int SCREEN_HEIGHT = 480;

int main(int argc, char **argv)
{
    SDL_Init( SDL_INIT_VIDEO );
    SDL_Surface* screen = SDL_SetVideoMode( SCREEN_WIDTH, SCREEN_HEIGHT, 0, 
                                            SDL_HWSURFACE | SDL_DOUBLEBUF );
    SDL_WM_SetCaption( "SDL Test", 0 ); 
    SDL_Event event;
    bool quit = false;
    while (quit == false)
    {
        if (SDL_PollEvent(&event)) {
            if (event.type == SDL_QUIT) {
                quit = true;
            }
        }
        SDL_Delay(80); // sleep 80 ms each iteration so the loop doesn't spin at 100% CPU
    }
    SDL_Quit();
    return 0;
}

I tried adding SDL_Delay() at the end of the while loop, and it worked quite well.

However, 80 ms seemed to be the highest value I could use while keeping the program running smoothly, and even then the CPU usage is about 15-20%.

Is this the best way to do this, or do I just have to live with the fact that it already uses this much CPU at this point?

Asked Oct 07 '12 by BudwiseЯ

3 Answers

I know this is an older post, but I just came across this issue with SDL when starting a little demo project. As user 'TheBuzzSaw' noted, the best solution is to use SDL_WaitEvent to reduce the CPU usage of your event loop.

Here's how it would look in your example, for anyone looking for a quick solution in the future. Hope it helps!

#include "SDL.h"
const int SCREEN_WIDTH = 640;
const int SCREEN_HEIGHT = 480;

int main(int argc, char **argv)
{
    SDL_Init( SDL_INIT_VIDEO );
    SDL_Surface* screen = SDL_SetVideoMode( SCREEN_WIDTH, SCREEN_HEIGHT, 0, 
                                            SDL_HWSURFACE | SDL_DOUBLEBUF );
    SDL_WM_SetCaption( "SDL Test", 0 ); 
    SDL_Event event;
    bool quit = false;
    while (quit == false)
    {
        // SDL_WaitEvent blocks until an event arrives, so this loop uses no CPU while idle
        if (SDL_WaitEvent(&event) != 0) {
            switch (event.type) {
            case SDL_QUIT:
                quit = true;
                break;
            }
        }
    }
    SDL_Quit();
    return 0;
}
Answered Nov 15 '22 by Gordon Robert Speirs


I would definitely experiment with fully blocking functions (such as SDL_WaitEvent). I have an OpenGL application in Qt, and I noticed the CPU usage hovers between 0% and 1%. It spikes to maybe 4% during "usage" (moving the camera and/or causing animations).

I am working on my own windowing toolkit. I have noticed I can achieve similar CPU usage when I use blocking event loops. This will complicate any timers you may depend on, but it is not terribly difficult to implement timers with this new approach.
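
To illustrate that last point, here is one way timers are commonly kept alive alongside a fully blocking loop in SDL 1.2: SDL_AddTimer() runs a callback that pushes a user event, which wakes up SDL_WaitEvent(). This is my own sketch of the idea, not code from the answer; the 100 ms interval and the push_tick name are arbitrary choices.

#include "SDL.h"

// Timer callback: runs on SDL's timer thread and pushes a user event,
// which wakes up the blocking SDL_WaitEvent() in the main loop.
static Uint32 push_tick(Uint32 interval, void *param)
{
    SDL_Event ev;
    ev.type = SDL_USEREVENT;
    ev.user.code = 0;
    ev.user.data1 = 0;
    ev.user.data2 = 0;
    SDL_PushEvent(&ev);
    return interval; // keep the timer firing at the same interval
}

int main(int argc, char **argv)
{
    SDL_Init( SDL_INIT_VIDEO | SDL_INIT_TIMER );
    SDL_SetVideoMode( 640, 480, 0, SDL_HWSURFACE | SDL_DOUBLEBUF );

    SDL_TimerID timer = SDL_AddTimer(100, push_tick, NULL); // tick every 100 ms

    SDL_Event event;
    bool quit = false;
    while (quit == false)
    {
        if (SDL_WaitEvent(&event) != 0) { // blocks, so the CPU stays idle between events
            switch (event.type) {
            case SDL_USEREVENT:
                // periodic work (animation, game logic) goes here
                break;
            case SDL_QUIT:
                quit = true;
                break;
            }
        }
    }
    SDL_RemoveTimer(timer);
    SDL_Quit();
    return 0;
}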

Answered Nov 15 '22 by TheBuzzSaw


I just figured out how to reduce the CPU usage in my game from 50% down to less than 10%. Your program is much simpler, so simply using SDL_Delay() should be enough.

The first thing I did was use SDL_DisplayFormat() when loading images, so that blitting would be faster. That brought the CPU usage down to about 30%.

I then found out that blitting the game's background (a big one-piece .png file) was eating most of my CPU. I searched the Internet for a solution, but all I found was the same answer: just use SDL_Delay(). Finally, I discovered that the problem was embarrassingly simple: SDL_DisplayFormat() was converting my 24-bit images to 32-bit. So I set my display BPP to 24, which brought the CPU usage down to ~20%. Dropping it to 16-bit solved the problem for me, and the CPU usage is under 10% now.

Of course this means some loss of color detail, but since my game is a simple 2D game without very detailed graphics, that was OK.
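
As a rough sketch of that idea (my own illustration, not Erik's actual code): load the image once, convert it with SDL_DisplayFormat() so it matches the screen's pixel format, and ask SDL_SetVideoMode() for a matching bit depth. The 16-bit depth and "background.bmp" file name are placeholders; Erik's background was a PNG, which would need the separate SDL_image library instead of SDL_LoadBMP.

#include "SDL.h"

int main(int argc, char **argv)
{
    SDL_Init( SDL_INIT_VIDEO );

    // Ask for a 16-bit display so blits don't have to convert pixel formats
    // on every frame (the depth that works best depends on your assets).
    SDL_Surface* screen = SDL_SetVideoMode( 640, 480, 16,
                                            SDL_HWSURFACE | SDL_DOUBLEBUF );

    // Load the background once...
    SDL_Surface* loaded = SDL_LoadBMP("background.bmp"); // placeholder file
    if (loaded == NULL) {
        SDL_Quit();
        return 1;
    }

    // ...and convert it to the display format up front, so SDL_BlitSurface
    // is a plain copy instead of a per-frame conversion.
    SDL_Surface* background = SDL_DisplayFormat(loaded);
    SDL_FreeSurface(loaded);

    SDL_BlitSurface(background, NULL, screen, NULL);
    SDL_Flip(screen);
    SDL_Delay(2000); // just long enough to see the result

    SDL_FreeSurface(background);
    SDL_Quit();
    return 0;
}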

Answered Nov 15 '22 by Erik