SDL_Delay

Hi all,

Is there a reason SDL_Delay would be slow?
I'm writing a little game, and to normalize the game speed
I add an SDL_Delay() at the end of the game loop.

On Mac OS X everything runs fine, but on my Intel dual core on Linux
it is slow (nearly unplayable). After removing the SDL_Delay, the game loop runs
way too fast.

Why such a difference?

Regards.

Hello!


SDL_Delay(15) may wait 15 msecs, but it may also wait 20 msecs.
This depends on the OS scheduler's time slices. It is better to construct a
loop that checks whether the remaining time is more than the scheduler slice:
if so, do an SDL_Delay, and if not, just wait in the loop until the actual time
reaches the wanted end time of the frame.
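
A minimal sketch of that idea (the 15 ms frame time and the 10 ms
scheduler-slice guess below are assumptions, not measured values):

```c
#include "SDL.h"

#define FRAME_MS    15   /* desired frame length in ms (assumed) */
#define GRANULARITY 10   /* guessed scheduler slice in ms (assumed) */

/* Wait until frame_start + FRAME_MS: sleep while we are clearly early,
   then spin for the last few milliseconds.
   Usage: Uint32 t0 = SDL_GetTicks(); ...do the frame...; wait_frame_end(t0); */
static void wait_frame_end(Uint32 frame_start)
{
    Uint32 target = frame_start + FRAME_MS;

    /* Coarse wait: let the OS sleep while the remaining time is larger
       than the assumed scheduler slice. */
    while ((Sint32)(target - SDL_GetTicks()) > (Sint32)GRANULARITY)
        SDL_Delay(1);

    /* Fine wait: busy-loop until the wanted end time of the frame. */
    while ((Sint32)(target - SDL_GetTicks()) > 0)
        ; /* spin */
}
```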

CU


I don't think using a delay works well for timing applications; I prefer
using a timer event.

matt

speaking of my arse.

SDL_Delay will call an operating system / OS-level sleep() function.
Those are inherently known to have a very wide array of granularities, and
AFAIK your mileage will vary even when specifying a delay greater
than the minimum granularity offered by the underlying system.

A while(SDL_GetTicks()) loop is more or less busy waiting.

Is there a way to get accurate timings without busy waiting?


You can use SDL_GetTicks to determine how long you want to sleep, but don't
expect it to return from sleep exactly when you want.

I have heard some people use the sound callback.

OpenGL can use vsync.

Currently I have a timer that pushes a user event. I haven't tested to see
how accurate it is; perhaps I'll do that.
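
For what it's worth, a minimal sketch of that kind of timer (the 33 ms
interval and the callback name are placeholders, and SDL_INIT_TIMER must
have been passed to SDL_Init):

```c
#include "SDL.h"

#define TICK_INTERVAL 33  /* roughly 30 ticks per second (placeholder) */

/* Runs in SDL's timer thread: only push an event here and do the real
   work in the main loop when the event arrives. */
static Uint32 push_tick(Uint32 interval, void *param)
{
    SDL_Event ev;
    ev.type = SDL_USEREVENT;
    ev.user.code = 0;
    ev.user.data1 = NULL;
    ev.user.data2 = NULL;
    SDL_PushEvent(&ev);
    return interval;      /* re-arm the timer with the same interval */
}

/* After SDL_Init(SDL_INIT_VIDEO | SDL_INIT_TIMER):
   SDL_TimerID my_timer_id = SDL_AddTimer(TICK_INTERVAL, push_tick, NULL); */
```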

matt

I'd use system API calls directly. If you want to write cross-platform code, create your own name for a delay routine, declare it somewhere, and use IFDEFs to have it be a macro for whatever the system sleep API is on each system. (Which is more or less what SDL_Delay() is trying to accomplish, but if you can make it work at a better resolution, you might as well. A 10 ms resolution just isn't good enough for precision control at anything above about 30 FPS.)
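
Something along these lines, for example (my_delay_ms is a made-up name,
usleep() takes microseconds, and neither call is guaranteed to be finer
than the OS timer tick):

```c
#ifdef _WIN32
  #include <windows.h>
  #define my_delay_ms(ms) Sleep(ms)
#else
  #include <unistd.h>
  #define my_delay_ms(ms) usleep((ms) * 1000)  /* milliseconds -> microseconds */
#endif
```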


Hey,

You should wrap SDL_Delay to ensure good resolution. I’ve just put up an article (and code) about this since I’ve talked about it before on the mailing list. Store the time it takes to SDL_Delay(1), then delay that much less than you want, then burn the rest in an empty loop. It works very well for me, though I haven’t applied any strict testing to it.

http://pubpages.unh.edu/~jmb97/tutorials/delay.html
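
Roughly the idea, as a sketch (made-up names, not the code from the
article):

```c
#include "SDL.h"

static Uint32 delay_overhead = 10;   /* refreshed at startup; 10 ms is a guess */

/* Measure how long SDL_Delay(1) really takes on this system. */
static void calibrate_delay(void)
{
    Uint32 start = SDL_GetTicks();
    SDL_Delay(1);
    delay_overhead = SDL_GetTicks() - start;
    if (delay_overhead == 0)
        delay_overhead = 1;
}

/* Sleep for the part the OS can be trusted with, then burn the rest in
   an empty loop. */
static void nice_delay(Uint32 ms)
{
    Uint32 end = SDL_GetTicks() + ms;

    if (ms > delay_overhead)
        SDL_Delay(ms - delay_overhead);

    while ((Sint32)(end - SDL_GetTicks()) > 0)
        ; /* spin for the remainder */
}
```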

Jonny D

I'm a bit leery of that solution. Yeah, it'll give you accurate timing, but when your desired delay times get short enough that precision and SDL_Delay resolution become an issue, your "remainder" is likely to be a significant fraction of the total delay time, which means you'll be red-lining your CPU a lot if you do this frequently. Yes, it would definitely work, and could be used as a last resort, but if system API calls would work better, I'd prefer to use them and suffer through a few IFDEFs.



Out of curiosity – if you're doing a game, does red-lining the
CPU with a busy wait really matter? Maybe I'm a bit too old school in
that way of thinking though… :)

-Will

Hello!


It does matter. On DOS it was okay to just busy-wait once everything
was done, but on Windows, Linux, and Mac OS X you should give
the OS time to process the events, handle the drawing stuff, and
so on.

CU

What kind of API calls are you referring to…?



Continuing – most modern systems will simply task-switch away
from your game process as needed. If your programming layer requires
something to process events and similar (see the evils of DoEvents in
VB), I can understand wanting to sleep() for a moment then. But on
most platforms this shouldn't be necessary, no?

But I'd also venture a guess that while playing a game, you should
assume the player wants the game experience to be top notch at the
expense of their BitTorrent download in the background… :)

-Will

Hello!


It is necessary. The OS will still task-switch away
from your game, but the OS scheduler
cannot guess when your game is only waiting,
so the result will be suboptimal.

CU


The problem is that if a game hogs the CPU, then the operating system
will just take it away forcibly to let the (hopefully!) well-behaved
BitTorrent client run, possibly at an inopportune time.

The idea is that if you're getting a sufficiently high frame rate, you
sleep a little bit, and when you really need the CPU, you have it.
But spinning and drawing the screen at 150 fps (when most LCD displays
refresh at no more than 60 fps anyway!) can actually make things more
"jittery", with the operating system wresting away the CPU in the
middle of rendering a frame, things like that.


http://pphaneuf.livejournal.com/

This is why I try to enable vsync in OpenGL. If anyone saw my other
email earlier, I had this annoying issue where the waiting for vsync somehow
manifested itself in the SDL event queue… I currently have my loops
arranged in a way such that this doesn't happen, and I have to say that
vsync is 100% smooth.
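
In SDL 1.2 terms the setup looks roughly like this (assuming SDL 1.2.10
or newer; whether the driver actually honours the request is up to it):

```c
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    /* Ask for vsync; this must be set before SDL_SetVideoMode. */
    SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 1);

    SDL_SetVideoMode(640, 480, 0, SDL_OPENGL);

    /* ... OpenGL rendering ... */

    SDL_GL_SwapBuffers();   /* with vsync on, this blocks until the retrace */

    SDL_Quit();
    return 0;
}
```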

However, from my experience, if you are NOT enabling vsync, no matter what
timing method you use, it will completely fail at syncing up with the
monitor, and you might as well put a couple of Sleep calls in, since busy waiting
doesn't make it any smoother and may also contribute more lag, as has been
mentioned.

Of course this is with OpenGL rendering and may or may not pertain to
regular SDL rendering.


I can't think of a case when you would program your delay time to be less than 10 ms for a prolonged period. It's usually pointless to go faster than your monitor can go, and scientific computing uses arbitrary time steps that don't depend on the actual run time. Red-lining now and then would probably be okay, anyway.

As for system API calls, even Sleep() on Windows has a resolution of 10-15 ms, because the clock ticks are less precise than milliseconds. If you want to control your program beyond that limit, I suppose you'll have to look into specialized libs. That sounds like a mess unless you really, REALLY need it (especially considering that my code is already written). I'd be fine with some #ifdefs in my delay function too, but I don't think it'd be so easy to overcome this.
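
(For what it's worth, the usual non-exotic tweak on Windows is the winmm
timer-period call; only a sketch, and it is a request rather than a
guarantee:)

```c
/* Raise the Windows timer resolution with the standard winmm calls
   (link against winmm.lib). 1 ms is a request, not a guarantee, and it
   should be undone when the program exits. */
#ifdef _WIN32
#include <windows.h>
#include <mmsystem.h>

static void begin_fine_timing(void) { timeBeginPeriod(1); }
static void end_fine_timing(void)   { timeEndPeriod(1); }
#endif
```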

Jonny D


What kind of “specialized libs”? If they don’t use system calls, then
we could do it ourselves, or even better, integrate the logic in SDL.
If they do use system calls, we could just use them directly, or again
better, abstract them in SDL.

Either way, if there’s a better way to implement SDL_Delay, then we
should do that, and make something like “SDL_Delay(3);” work as well
as we can. This would make every user of SDL work better, which is a
good thing, and is why we use SDL in the first place.


http://pphaneuf.livejournal.com/

Yes, look at the discussion from last week; you may find it archived somewhere, I guess.
It was about the use of SDL_Delay, SDL_GetTicks, and good ways to use SDL_PollEvent().

cheers

----- Original Message ----
From: Mason Wheeler
To: A list for developers using the SDL library. (includes SDL-announce)
Sent: Tuesday, July 8, 2008, 00:48:42
Subject: Re: [SDL] my_timer_id = SDL_AddTimer(delay, my_callbackfunc, my_callback_param);

Yes, but keep in mind the problems with SDL_Delay’s resolution, which can become significant at this level of precision. There was a thread discussing ways to deal with it on here last week.

----- Original Message ----
From: Jesse P.
Subject: Re: [SDL] my_timer_id = SDL_AddTimer(delay, my_callbackfunc, my_callback_param);

Hi,

So is the following a good method to make a game run at (roughly) 30 frames
per second?
