SDL_Delay() Accuracy Question

So I’ve been having a problem with SDL_Delay(). I’m trying to build frame-independent movement and I keep getting seemingly random slowdowns. By that I mean that the object’s movement slows down because the game thinks the game loop has sped up. After some testing, I noticed that the loop sometimes thinks 0 milliseconds have passed since the last iteration, even though I clearly have SDL_Delay(1) between the SDL_GetTicks() calls at the beginning and end of my loop. I thought I’d isolate the problem, so I wrote this test.

Code:

#include <iostream>
#include "SDL2/SDL.h"

int main(int argc, char* args[])
{
    int startTicks = 0;
    int endTicks = 0;
    int checkTicks = 5;

    // Start SDL
    if (SDL_Init(SDL_INIT_EVERYTHING) < 0)
    {
        std::cout << "Couldn't initialize SDL";
        return 1;
    }

    while (checkTicks >= 5)
    {
        startTicks = SDL_GetTicks();
        SDL_Delay(5);
        endTicks = SDL_GetTicks();
        checkTicks = endTicks - startTicks;
        std::cout << checkTicks << std::endl;
    }

    SDL_Quit();
    return 0;
}

After a quick Google search, I found that SDL_Delay() isn’t incredibly accurate, but I thought it would always delay for AT LEAST the given number of milliseconds. The program above should run forever. In testing, however, it only lasts a couple of seconds on average before I get a 4; often I get a 4 in my output immediately and the program ends.

I could always put SDL_Delay(2) in my program, but every once in a while I get random groups of 1’s in my output for a couple of seconds and the object on screen visibly slows down. The visible slowdown leads me to believe that a significant amount of time is passing that SDL_GetTicks() isn’t reporting.

Anyhow, either SDL_GetTicks() or SDL_Delay() is doing something different from what I expected. I’m fairly new to SDL, so I was wondering if anyone could explain this.

Hi,

You should not use SDL_Delay in order to get framerate-independent
movement.
What I am assuming is that you want a constant DeltaTime.
Here is an article that explains this really well:
http://gafferongames.com/game-physics/fix-your-timestep/
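
In case it helps, here is a minimal sketch of the fixed-timestep idea from that article using SDL2. The update/render hooks, the 16 ms step, and the event handling are just illustrative, not a drop-in implementation:

Code:

#include "SDL2/SDL.h"

// Hypothetical hooks; replace with your own simulation and drawing code.
static void update(double dtSeconds) { (void)dtSeconds; /* advance the simulation by one fixed step */ }
static void render() { /* draw the current state */ }

int main(int argc, char* args[])
{
    if (SDL_Init(SDL_INIT_EVERYTHING) < 0)
        return 1;

    const Uint32 STEP_MS = 16;           // fixed simulation step (~60 Hz)
    Uint32 previous = SDL_GetTicks();
    Uint32 accumulator = 0;
    bool running = true;

    while (running)
    {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT)
                running = false;

        Uint32 now = SDL_GetTicks();
        accumulator += now - previous;   // measure real elapsed time, however long it was
        previous = now;

        while (accumulator >= STEP_MS)   // run as many fixed steps as that time covers
        {
            update(STEP_MS / 1000.0);
            accumulator -= STEP_MS;
        }

        render();
        SDL_Delay(1);                    // give up the CPU; its accuracy doesn't matter here
    }

    SDL_Quit();
    return 0;
}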

Alex.

What platform are we talking about, and on what device? I can’t explain this behavior by ordinary means just yet.

You’re about to get 17 links to that “fix your timestep” article mailed
to you, but here’s the basic gist:

SDL_Delay() does not promise accuracy.

  • SDL_Delay() never works at a finer resolution than what the OS’s
    scheduler offers. For example, in the 90’s, Linux’s scheduler generally
    had a 10 millisecond resolution, which meant SDL_Delay(1) would sleep
    for at least ten times longer than requested (most Linux systems have a
    1ms resolution now, I think).
  • If a system is heavily loaded, you might sleep for hundreds of
    milliseconds.
  • On some platforms, if you delay for a short enough time, the OS will
    just put your app in a spinloop instead of giving up CPU cycles (Mac OS
    X has been known to do this in the past, maybe it still does), which has
    the end result of SDL_Delay() causing your program to use more CPU time.
  • Other platforms might turn SDL_Delay(1) into a no-op and return
    immediately (“close enough!”). In the view of the OS, it’s not meant to
    be a way to pass X milliseconds; it’s a way to give up unneeded CPU time.

Likely SDL_Delay(1) is pretty useless on most OSes. 10 is probably the
minimum safe bet, but the safest bet is to not require sleep accuracy.
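
For what it’s worth, one way to avoid requiring sleep accuracy in a frame limiter is to sleep coarsely and let SDL_GetTicks() make the final call. A minimal sketch, where the helper name and the 3 ms margin are just illustrative:

Code:

#include "SDL2/SDL.h"

// Cap a frame at targetFrameMs without trusting SDL_Delay()'s accuracy:
// sleep for most of the remaining time, then poll the clock for the rest.
// (Helper name and margin are illustrative, not part of SDL.)
static void waitForFrameEnd(Uint32 frameStart, Uint32 targetFrameMs)
{
    Uint32 elapsed = SDL_GetTicks() - frameStart;

    // Coarse sleep, leaving a small margin in case SDL_Delay() oversleeps.
    if (elapsed + 3 < targetFrameMs)
        SDL_Delay(targetFrameMs - elapsed - 3);

    // Spend the last little bit polling the clock; this never assumes
    // the delay was exact.
    while (SDL_GetTicks() - frameStart < targetFrameMs)
    {
        /* busy-wait */
    }
}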

(We should probably put this in the wiki, if it isn’t already there.)

–ryan.

To confuse things further, I’ve been using SDL_Delay(1) on Ubuntu 12.04, SteamOS, Mac OS X 10.9, and Windows 7 and 8 with the expected results. I’ve also found that, at least on Windows 7, it can sleep for less time than you asked for.

Once upon a time (like 10 years ago), somebody published their own
benchmarks on the web about usleep behavior on Unix. On Linux, there
was an odd behavior where, if you didn’t request round multiples of
10 ms (e.g. 10, 20, 30, but never 35), usleep was very inaccurate and
late; but if you did use multiples of 10, it was spot on. I confirmed
this myself at the time and always use multiples of 10 now. I presume
it is fixed by now, but old habits die hard.

I also discovered that on some systems, if you specify too small a
time (excluding 0), you suck up all the CPU. I speculate this is
because, if the time is too small, the OS doesn’t want to yield the
thread: it knows the context-switching overhead means it couldn’t
possibly get back to you within the time you requested, so it just
busy spin-waits.
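
If you want to follow the same habit with SDL_Delay(), rounding a request up to the next multiple of 10 ms is a one-liner. A small sketch (the helper name is made up, and modern kernels may not need this at all):

Code:

#include "SDL2/SDL.h"

// Round a requested delay up to the next multiple of 10 ms before sleeping,
// matching the old "multiples of 10" habit described above.
static void delayRounded(Uint32 ms)
{
    if (ms % 10 != 0)
        ms += 10 - (ms % 10);   // e.g. 35 -> 40, 16 -> 20
    SDL_Delay(ms);
}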

-Eric

Beginning iPhone Games Development
http://playcontrol.net/iphonegamebook/