Question about SDL_GetTicks()

Hello,

I had a question about how SDL_GetTicks() works. FYI I’m programming in
C++ in Visual Studio 2013 with SDL 2.0 (not the most recent, but just after
2.0 came out). I’ve set it up so that my compiled programs will run on XP,
and I test my game on a crappy old XP machine to be sure it runs
smoothly. I have my game running great on my home PC (Win 8.1 w/ a nice
video card), but it runs slowly in some places at work. I tried to figure
out where it was slowing down, but I wanted to confirm my understanding of
SDL_GetTicks() first.

If you need code, let me know. Here is what you need to know, at least:

  1. Initialize SDL & other stuff.
  2. Start game loop.

My Question:
Now, if I call SDL_GetTicks() at ANY TIME throughout my game, it should
always give me an accurate number of milliseconds since the start of the
game, right?

My Test:
I have the console running behind my game which I alt-tab to. I set up my
game loop to cout << SDL_GetTicks() each frame just to test this theory.
When in the main menu of my game I’ll alt-tab and watch the numbers fly
by, and I can see it’s basically 1000 per second. So 5 seconds in I’m
seeing 5000, 5001, etc. (obviously I can’t read 1000 lines per second, but
I can see 5xxx and then 6xxx a second later).

Now the game loads level 1 and starts processing a bit more logic &
graphics. I alt-tab back to my console, and all of a sudden the time isn’t
ticking by as fast. It’s more like 2-3 seconds for it to go up by 1000.
This can’t be right, can it? Is there something wrong with my XP machine?
I’m going to try to update to the most recent SDL tonight.

As far as I can tell this has NOTHING to do with my game loop and how I
process time, as I’m just trying to do the most basic task of seeing how
much time has passed since launch, and the timer is all handled behind the
scenes in SDL, correct? If so, … what is going on?! This is driving me
nuts.

I will also note that when I disable rendering for level 1, the time keeps
up, so I’m trying to see if there is an issue in there. But even if it
takes 2 minutes to render one frame, SDL_GetTicks() shouldn’t be held up
because of it, right?
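
For reference, the test boils down to something like this (simplified; event and quit handling omitted):

Code:

// print elapsed ms once per frame; the count is tracked internally by SDL since init
while (!quit)
{
    std::cout << SDL_GetTicks() << std::endl;
    // ...events, logic, render...
}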

Sorry for rambling. :slight_smile:

Thanks,
Dave

I have absolutely no experience with this but:

Could you try timing your program with the regular C++ functions instead?
I think the header file is <time.h>; then just do:

std::cout << clock() << std::endl;

And see if the same thing happens.

SDL_GetTicks() gives you something standard across all platforms.


Bananskrue wrote:

I have absolutely no experience with this but:

Could you try timing your program with regular c++ functions instead?
I think the header file is <time.h>, then just do:

std::cout << clock() << std::endl;

And see if the same thing happens.

Good call! I have no idea why I didn’t think of that… perhaps it’s the fact that I’m exhausted with a 1-month-old baby at home. It wasn’t bad the first time, but now having a new baby with a 2-year-old running around … crazy.

Anyways, here’s the new code:
std::cout << SDL_GetTicks() << ", " << clock() << std::endl;

On the good Windows 8.1 machine both numbers match; the game keeps running at the correct speed.
On the crappy XP machine, SDL_GetTicks() falls behind as it has been, but clock() kept the correct time!

So now my question is: what could cause SDL_GetTicks() to fall behind like that? And how do I fix it?

Thanks,
Dave

How far behind, and is it steadily the same amount behind, or does it
spike up and down? If it spikes, what are the top and bottom end times?


R Manard wrote:

How far behind, and is it steadily the same amount behind, or does it spike up and down? If it spikes, what are the top and bottom end times?

I rewrote it so it displays SDL_GetTicks() and clock() every time SDL_GetTicks() goes up by 1000 (so it’s easier to see the changes over time).
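
The logging amounts to something like this (simplified; next_report is just a placeholder name):

Code:

static Uint32 next_report = 1000;
if (SDL_GetTicks() >= next_report)   // fires once per elapsed SDL second
{
    std::cout << SDL_GetTicks() << ", " << clock() << std::endl;
    next_report += 1000;
}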

I stayed in the menu for 10 seconds (10000 milliseconds) and it was synced throughout that time:
SDL, clock:
1006, 1093
2003, 2078
3002, 3078
4002, 4078
5002, 5078
6003, 6078
7002, 7078
8002, 8078
9010, 9093
10010, 10093

Then I pressed a key to load level 1 (all graphics are loaded at game launch, so it isn’t spending any time loading - it goes instantly into level 1), where it slows down to what SEEMS like a consistent slow speed:
SDL, clock:
11002, 12296
12002, 15515
13002, 18703
14007, 21906
15003, 25078
16011, 28109
17011, 31156
18002, 34093
19004, 37109

So when it is slow, 1 second of SDL_GetTicks() is just over 3 seconds of clock() time. Clock time seems to be REAL time: 37.109 seconds in, SDL thinks I’m 19.004 seconds in.

Does that answer your question or do you want me to try something else? Thanks for the help!

So at any given time the difference between the two calls is at most 87
thousandths of a second. So we have to ask ourselves first: is a tick the
same as whatever you are counting with the other call? One of them is
ticks, which I don’t think is exactly 1/1000 of a second, and what is the
other? Is it an operating system time call?


R Manard wrote:

So at any given time the difference between the two calls is at most 87 thousandths of a second. So we have to ask ourselves first: is a tick the same as whatever you are counting with the other call? One of them is ticks, which I don’t think is exactly 1/1000 of a second, and what is the other? Is it an operating system time call?

19004, 37109

37,109 - 19,004 = 18,105 milliseconds, i.e. about an 18.1-second difference. The slight millisecond difference over the first 10 seconds of running is likely because it took SDL 87 milliseconds or so to initialize… then for the next ten seconds they’re in sync because I’m in my game’s menu. Then it starts to slow down, falling a bit more than two further seconds behind real time for each second it reports.

I’m using clock() from time.h as suggested above. I have no experience with it, as I used to use the high-resolution Windows timers, not clock().

While I can’t address the core of this matter, I will note that QueryPerformanceCounter and timeGetTime can differ quite wildly on certain motherboards, mostly from the Windows XP era.

In general you can’t trust QueryPerformanceCounter because it is the CPU clock, which can also vary on some CPUs from that era (due to power profiles, and because multicore CPUs of that era contain independent per-core clocks while thread allocation across cores can fluctuate wildly); constant monitoring of QueryPerformanceFrequency only partly addresses the issues with this method.

SDL_GetTicks(), however, uses timeGetTime(), which can deviate substantially from wall time on the aforementioned motherboards.

A more reliable clock source for these offending motherboards would be greatly appreciated in my game engine as well (I have both SDL and native implementations of the time functions, primarily for profiling, where QueryPerformanceCounter is the superior choice but is not the default for timekeeping reasons).

I’ve never dug deeply into this issue; I have only made the observation that timeGetTime is not trustworthy on certain motherboards.
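
A quick way to observe the drift on a suspect machine is to poll both clocks side by side; a minimal sketch (plain Win32 calls, not taken from any particular engine):

Code:

#include <windows.h>
#include <mmsystem.h>   // timeGetTime(); link against winmm.lib
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, start, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&start);
    DWORD tgt_start = timeGetTime();

    for (;;)
    {
        Sleep(1000);
        QueryPerformanceCounter(&now);
        double qpc_ms = 1000.0 * (now.QuadPart - start.QuadPart) / freq.QuadPart;
        DWORD  tgt_ms = timeGetTime() - tgt_start;
        // healthy hardware keeps these within a few ms of each other;
        // the problem chipsets let them drift apart over time
        std::printf("QPC: %.0f ms   timeGetTime: %lu ms\n", qpc_ms, (unsigned long)tgt_ms);
    }
}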



LordHavoc
Author of DarkPlaces Quake1 engine - http://icculus.org/twilight/darkplaces
Co-designer of Nexuiz - http://alientrap.org/nexuiz
"War does not prove who is right, it proves who is left." - Unknown
"Any sufficiently advanced technology is indistinguishable from a rigged demo." - James Klass
"A game is a series of interesting choices." - Sid Meier

I guess I would have to know how you are getting your math; that’s the
first thing. Nothing can lose two seconds each second, because that
would mean time is running backwards. SDL is working when you make a time
call, whereas the other time call probably comes from operating system
information that was sent to the program when it started the cycle. That’s
an educated guess. I know that if it’s taking two seconds to show a frame
of SDL, either your driver is messed up or your program code is messed up.

R Manard wrote:

I guess I would have to know how you are getting your math; that’s the first thing. Nothing can lose two seconds each second, because that would mean time is running backwards. SDL is working when you make a time call, whereas the other time call probably comes from operating system information that was sent to the program when it started the cycle. That’s an educated guess. I know that if it’s taking two seconds to show a frame of SDL, either your driver is messed up or your program code is messed up.

Sorry for any typos…on phone…

The offending computer is my office computer. I’ll have to wait until Monday, but I think a quick video would help demonstrate my issue. I also don’t mean to say it loses time, just that it proceeds forward in time at about 1/3 speed. The game feels about 1/3 speed too, but is still immediately responsive, so it’s not like it’s just trying to catch up.

Also, to clear things up, here is how my loop runs:

Code:

Uint32 time = 0, old_time = 0;
Uint32 gameLogicTimer = 0;

while (!quit)
{
    old_time = time;
    time = SDL_GetTicks();

    // ...output debug SDL_GetTicks() and clock()...

    gameLogicTimer += time - old_time;  // elapsed ms since the last loop
    while (gameLogicTimer >= 100)       // one logic step per 100 ms elapsed
    {
        // ...run logic here...
        gameLogicTimer -= 100;
    }

    // ...render...
}

This is to show that I don’t run logic based on delta time per frame. I do it by accumulating elapsed time and running one logic step per 100 ms that has gone by, effectively giving me a logic update 10x per second. The slowdown of my game is due to the timer, not the increased workload, as far as I can tell.

As for QueryPerformanceCounter and such: I am using SDL_GetTicks() because of the issues mentioned in the other reply. I’m hoping I don’t have to worry about those offending motherboards by using SDL. I was just saying I had no experience with clock(). :slight_smile:

Oh, and when the kids and wife are asleep I’ll post actual code and try to better explain my math. Bubble bath time for the 2yo. :wink:

Thanks again everyone for the help!

I’m all for having code posted, but I would suggest maybe running one of
the standard example programs and putting your SDL_GetTicks() output and
whatnot in there. One of my machines is XP, so I could test your program
on my computer. Plus I like to get my code-monkey paws on every example I
can, he he he.

LordHavoc wrote:

While I can’t address the core of this matter, I will note that QueryPerformanceCounter and timeGetTime can differ quite wildly on certain motherboards, mostly from the Windows XP era.

In general you can’t trust QueryPerformanceCounter because it is the CPU clock, which can also vary on some CPUs from that era (due to power profiles, and because multicore CPUs of that era contain independent per-core clocks while thread allocation across cores can fluctuate wildly); constant monitoring of QueryPerformanceFrequency only partly addresses the issues with this method.

SDL_GetTicks(), however, uses timeGetTime(), which can deviate substantially from wall time on the aforementioned motherboards.

A more reliable clock source for these offending motherboards would be greatly appreciated in my game engine as well (I have both SDL and native implementations of the time functions, primarily for profiling, where QueryPerformanceCounter is the superior choice but is not the default for timekeeping reasons).

I’ve never dug deeply into this issue; I have only made the observation that timeGetTime is not trustworthy on certain motherboards.

I initially didn’t think much of this. When googling, I read that SDL_GetTicks was using gettimeofday, but isn’t that Linux-only? So I downloaded the full source code of SDL and had a look myself (I guess I’m an idiot for not doing this :D). It turns out it’ll use QueryPerformanceCounter & QueryPerformanceFrequency if available - I didn’t realize that. This leads me to believe that the XP machine I’m using has an outdated BIOS, based on this article:
http://support.microsoft.com/kb/895980
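
The selection amounts to something like this - my paraphrase from reading the source, not SDL’s actual code:

Code:

#include <windows.h>
#include <mmsystem.h>   // timeGetTime(); link against winmm.lib

static LARGE_INTEGER hires_freq, hires_start;
static DWORD start_ms;
static BOOL hires_available;

void TimerInit()
{
    // prefer the high-resolution counter when the hardware/OS supports it
    hires_available = QueryPerformanceFrequency(&hires_freq);
    if (hires_available)
        QueryPerformanceCounter(&hires_start);
    else
        start_ms = timeGetTime();   // fallback when no usable QPC exists
}

DWORD MyGetTicks()   // milliseconds since TimerInit()
{
    if (hires_available)
    {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return (DWORD)((now.QuadPart - hires_start.QuadPart) * 1000 / hires_freq.QuadPart);
    }
    return timeGetTime() - start_ms;
}

So if the BIOS makes QueryPerformanceCounter misbehave, it’s the first branch that drifts.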

I’ll see what I can do about updating the BIOS on the XP machine and get back to you guys, but that won’t be until Monday. If that IS in fact the issue, I really don’t want to tell someone “hey, thanks for trying my game, but please update your BIOS to play it.” …Not that I’ll have many (any?) XP players. It’s just a scenario I need to be aware of, I guess … again, assuming this is the issue.

This was also interesting to read, but seems limited to Intel-based CPUs:
http://software.intel.com/en-us/articles/best-timing-function-for-measuring-ipp-api-timing

As usual, thanks everyone for helping me work through this - sorry for the delay until Monday. If the BIOS update doesn’t solve it, then I’ll get back to posting code.


dvereb wrote:

When googling, I read that SDL_GetTicks was using gettimeofday, but isn’t that Linux-only? … It turns out it’ll use QueryPerformanceCounter & QueryPerformanceFrequency if available. This leads me to believe that the XP machine I’m using has an outdated BIOS, based on this article:
http://support.microsoft.com/kb/895980

In SDL 1.2 the Linux/Mac code was using gettimeofday/timeGetTime; however, in 2.0 it no longer does this (I made the changes myself). Instead it uses a monotonic clock on each platform: QueryPerformanceCounter on Win32, the mach_time routines on Mac, and clock_gettime(CLOCK_MONOTONIC) on Linux. If those high-res monotonic timers aren’t available, it falls back to gettimeofday/timeGetTime. Using a monotonic clock is very important, as gettimeofday can go BACKWARDS in time, which would be rather bad for game development and other applications that expect time to always move forward.

It does seem that there is something odd about your system that is causing time to not progress evenly as it should, which could potentially mean your system time is getting off as well. This would NOT be the behavior on most systems; on most systems SDL_GetTicks() would progress at a normal rate. Nearly everyone uses QueryPerformanceCounter on Win32 when writing games (which gets replaced with the SDL_GetPerformanceCounter/SDL_GetPerformanceFrequency functions as I port those games to Mac and Linux).

Also note that the C library function clock() has other issues as a timing mechanism: it counts processor time consumed by the program (thus if your program isn’t doing much it will progress slowly). http://www.cplusplus.com/reference/ctime/clock/
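
You can see clock()’s behavior with a tiny test (a sketch; note that as far as I know the MSVC runtime’s clock() actually tracks wall time rather than CPU time, which would explain why it kept “real time” in your logs):

Code:

#include <ctime>
#include <chrono>
#include <iostream>
#include <thread>

int main()
{
    std::clock_t c0 = std::clock();
    std::this_thread::sleep_for(std::chrono::seconds(2));  // idle: consumes almost no CPU
    double secs = double(std::clock() - c0) / CLOCKS_PER_SEC;
    // POSIX: prints ~0, since clock() counts CPU time consumed.
    // MSVC:  prints ~2, since its clock() tracks elapsed wall time instead.
    std::cout << "clock() advanced by " << secs << " s\n";
}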


What evidence can you cite that “most games use QueryPerformanceCounter”? All the engine code I’ve ever worked with, as far as I can recall, uses timeGetTime() for the main clock (sometimes with
"stepped backwards" checks that process a 0 delta instead of a negative delta in the main loop), and uses QueryPerformanceCounter only for built-in profiling tools, if at all.

In my personal experience, QueryPerformanceCounter requires at least the use of thread affinity masks to keep it on the same CPU core (on multi-CPU systems the CPUs are guaranteed to have out-of-sync
clocks, and on some early multi-core processors from AMD they are even out of sync between cores in the same package), and you can expect QueryPerformanceFrequency to vary wildly from one call to the
next due to dynamic clocking.
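
In practice that means something along these lines around every read (a sketch; real engines usually pin the whole timing thread once instead of per call):

Code:

#include <windows.h>

// Pin the calling thread to CPU 0 just for the duration of the QPC read,
// so consecutive reads always come from the same core's counter.
LONGLONG ReadQPCPinned()
{
    DWORD_PTR old_mask = SetThreadAffinityMask(GetCurrentThread(), 1);
    LARGE_INTEGER li;
    QueryPerformanceCounter(&li);
    SetThreadAffinityMask(GetCurrentThread(), old_mask);
    return li.QuadPart;
}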

The one exception is a single modern Intel CPU, where they use cycle skipping instead of dynamic clocking (so Frequency does not vary and Counter increases linearly) and all cores share the
same Counter. This is not the case on AMD, and I doubt it is the case on a multi-CPU Intel system, but I have not personally tested the behavior on such a system.

A particularly notorious problem was on Windows 98, where QueryPerformanceCounter would accelerate over time, causing the QuakeWorld engine (one of the last I know of that used QueryPerformanceCounter) to
have players move faster the longer the system had been running, to the point that they would get instantly kicked by servers for speed cheating (a check that was only expected to catch
real cheaters was triggering on every ordinary Windows 98 player after sufficient system uptime). I do not know the basis of this Windows 98 issue; perhaps QueryPerformanceFrequency decreased over
time despite the CPU keeping the same real frequency?

Yes, timeGetTime can step backwards, but I’ve never seen it misbehave as frequently as QueryPerformanceCounter in the wild.

With this in mind, I tend to automatically assume that SDL_GetTicks() can step backwards.


LordHavoc wrote:

What evidence can you cite that “most games use QueryPerformanceCounter”? All the engine code I’ve ever worked with, as far as I can recall, uses timeGetTime() for the main clock (sometimes with "stepped backwards" checks that process a 0 delta instead of a negative delta in the main loop), and uses QueryPerformanceCounter only for built-in profiling tools, if at all.

The 30+ games I’ve ported over the past 8 years are the evidence. Nearly every one of them used QueryPerformanceCounter on Windows… The one I recall that didn’t was Democracy 3, and there it wasn’t for game time but for seeding random numbers and some perf timing of certain routines.

LordHavoc wrote:

In my personal experience, QueryPerformanceCounter requires at least the use of thread affinity masks to keep it on the same CPU core (on multi-CPU systems the CPUs are guaranteed to have out-of-sync clocks, and on some early multi-core processors from AMD they are even out of sync between cores in the same package), and you can expect QueryPerformanceFrequency to vary wildly from one call to the next due to dynamic clocking.

OUCH!! That’s kinda nasty. I’m not sure how mach_absolute_time and clock_gettime on Linux/other Unixes are affected by this, as that is what is used in the Mac/Unix support for SDL2 (essentially derived from this code: https://github.com/ThomasHabets/monotonic_clock).

LordHavoc wrote:

The one exception is a single modern Intel CPU, where they use cycle skipping instead of dynamic clocking (so Frequency does not vary and Counter increases linearly) and all cores share the same Counter. This is not the case on AMD, and I doubt it is the case on a multi-CPU Intel system, but I have not personally tested the behavior on such a system.

LordHavoc wrote:

A particularly notorious problem was on Windows 98, where QueryPerformanceCounter would accelerate over time, causing the QuakeWorld engine (one of the last I know of that used QueryPerformanceCounter) to have players move faster the longer the system had been running, to the point that they would get instantly kicked by servers for speed cheating.

But who supports Windows 98 anymore?

LordHavoc wrote:

Yes, timeGetTime can step backwards, but I’ve never seen it misbehave as frequently as QueryPerformanceCounter in the wild.

Glad I don’t do ports to Windows… If you can suggest a clean alternative monotonic source for Windows, we’ll code it into SDL2.

LordHavoc wrote:

With this in mind, I tend to automatically assume that SDL_GetTicks() can step backwards.

In SDL 1.2 that was the case, but with 2.0 the code tries to use a monotonic source so as not to exhibit that behavior.


It’s really reassuring to hear that 30+ Windows games depend on QueryPerformanceCounter; I just wonder if they use thread affinity masks. I also wonder if they ever yield the CPU, because dynamic
clocking does tend to go away when you slam the CPU with full activity 100% of the time.

My own games just sanitize the output of SDL_GetTicks (or whatever other clock source I coded in - often several that are user-selectable), ignoring deltas that are not sane. My go-to clock source
on Windows is timeGetTime, because only a handful of users have ever reported trouble with it (usually on PCs with a chipset so bad that timeGetTime() produces multiple negative deltas per second,
so that by ignoring them the game runs too fast).
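
The sanitizing amounts to something like this (a minimal sketch using SDL’s Uint32/Sint32 types, not my engine’s actual code; the 100 ms cap is an arbitrary example value):

Code:

// Feed this SDL_GetTicks() (or timeGetTime()) once per frame.
Uint32 SanitizedDelta(Uint32 now, Uint32 *last)
{
    Sint32 delta = (Sint32)(now - *last);  // signed, so a backwards step goes negative
    *last = now;
    if (delta < 0)   return 0;    // clock stepped backwards: ignore the step
    if (delta > 100) return 100;  // implausibly large jump: clamp it
    return (Uint32)delta;
}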

As for my usage of clock(), I mainly used it to prove to myself that I wasn’t going insane and that at least one clock on the system was counting in (almost) real-time.

I do have a solution to my problem, though!
It was the BIOS. There was an update for my system released in June 2010; my computer was manufactured in March 2010. That means a system that is just four years old could easily run into timing issues with my game relying on SDL_GetTicks(). I’d assume most people wouldn’t know how to do a BIOS update, nor would they be comfortable doing so. Once the two of you are done discussing QueryPerformanceCounter vs. timeGetTime, I’ll revisit the situation and see how best to implement my timers. :smiley:

I just want to thank everyone in the thread once more. I usually shy away from posting on message boards because I feel like I haven’t done enough research and am asking for help before I should (e.g. not looking at the actual SDL source code to notice it was using QueryPerformanceCounter, and just trusting some random website). Thanks!