Tiny program, 100% CPU usage, and SDL_Delay

hi,

here is a very simple program that does nearly nothing:

  • we initialize SDL and we use SDL_SetVideoMode(1024, 768, 0,
    SDL_SWSURFACE)

  • and we enter this loop:

        while (!end)
        {
            modifs = SDL_GetModState();
            {
                SDL_Event event;

                while (SDL_PollEvent(&event))
                {
                    if (event.type == SDL_QUIT)
                    {
                        end = 1;
                    }
                }
            }
        }

Note that this loop does not update the screen.

  • we exit when we receive the SDL_QUIT event.

Well, my problem is that this program (even “program” is a big word for
it) uses 100% of my CPU (at least, that’s what top says).
So I had a look at aliens 1.0.2 and saw its WaitFrame() function:

#define FRAMES_PER_SEC 50

void WaitFrame(void)
{
    static Uint32 next_tick = 0;
    Uint32 this_tick;

    /* Wait for the next frame */
    this_tick = SDL_GetTicks();
    if (this_tick < next_tick) {
        SDL_Delay(next_tick - this_tick);
    }
    next_tick = this_tick + (1000 / FRAMES_PER_SEC);
}

Now if I add a call to this function within the loop:

while (!end)
{
    WaitFrame();
}

the CPU usage falls to 1% or 2%.

Can you explain this behavior to me? It would help me a lot.
Thanks in advance,
Clément Bourdarias (phneutre).

In article <20010712203856.4efdaa30.cbour at noos.fr>, “Clément Bourdarias”
wrote:

Can you explain this behavior to me? It would help me a lot. Thanks in
advance, Clément Bourdarias (phneutre).

Yeah, that’s nothing SDL-specific. In the example you showed, you’re
busy-waiting (i.e. spinning constantly in a loop until a certain
condition is met). In the other example (with WaitFrame) the condition
is checked and then the process is put to sleep for a while (by
SDL_Delay) before repeating. If you don’t sleep the process and just
spin in the tight loop, it will use 100% of the CPU.

Hope this helps!

-Neill.
--
http://www.thecodefactory.org/neillm

here is a very simple program that does nearly nothing:

while (!end)
{
    modifs = SDL_GetModState();
    {
        SDL_Event event;

        while (SDL_PollEvent(&event))
        {
            if (event.type == SDL_QUIT)
            {
                end = 1;
            }
        }
    }
}

I’m no SDL expert, but by the look of it, your program simply goes round
and round in a pair of while loops permanently. Unless a program tells
the operating system that it is prepared to relinquish its time slice,
it can technically take 100% of the CPU time (or as much as the OS
wishes to give it). Nothing I can see in your program relinquishes
control to the OS, so it takes as much CPU time as it can get.

Now, the SDL_Delay() function is presumably implemented by telling the
OS to suspend this program and resume it when the specified time is up.
This frees up CPU time that other tasks can potentially use, which
means your program uses less than 100% CPU.

Note that this doesn’t mean your program gets magically more efficient
just by adding this line. It just means that your program spends less
time looping round and round waiting for events.

Depending on the program you’re making, you might see a similar result
if you use SDL_WaitEvent() instead of SDL_PollEvent(), but a real SDL
expert would have to comment on that.
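Roughly, an SDL_WaitEvent version of your program might look like this
(just an untested sketch against the SDL 1.2 API — SDL_WaitEvent blocks
until an event arrives, so the process sleeps instead of spinning, but
it is unsuitable for loops that must also animate every frame, since it
can block indefinitely):

```c
/* Sketch only: blocks in SDL_WaitEvent() until the next event,
 * so the process consumes essentially no CPU while idle. */
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Event event;
    int end = 0;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;
    SDL_SetVideoMode(1024, 768, 0, SDL_SWSURFACE);

    /* SDL_WaitEvent returns 1 on success, 0 on error */
    while (!end && SDL_WaitEvent(&event))
    {
        if (event.type == SDL_QUIT)
            end = 1;
        /* handle other events here */
    }

    SDL_Quit();
    return 0;
}
```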


B. Sizer

----- Original Message -----
From: cbour@noos.fr (Clement Bourdarias)
Sent: Thursday, July 12, 2001 7:38 PM
Subject: [SDL] tiny program, 100% cpu usage, and SDL_Delay

it’s OK, Mikko Rantalainen sent me the explanation.

Sorry for sending this off-topic question,
phneutre.