What evidence can you cite that “most games use QueryPerformanceCounter”? All the engine code I can recall working with uses timeGetTime() for the main clock (sometimes with
"stepped backwards" checks that process a 0 delta instead of a negative delta in the main loop), and uses QueryPerformanceCounter only for built-in profiling tools, if at all.
In my personal experience, QueryPerformanceCounter requires at least the use of thread affinity masks to keep readings on the same CPU core (on multi-CPU systems the CPUs are guaranteed to have out-of-sync
clocks, and on some early multi-core AMD processors the clocks are even out of sync between cores in the same package), and you can expect QueryPerformanceFrequency to vary wildly from one call to the
next due to dynamic clocking.
The one exception is when using a single modern Intel CPU where they use cycle skipping instead of dynamic clocking (so Frequency does not vary and Counter increases linearly), and all cores share the
same Counter. This is not the case on AMD, and I doubt it is the case on a multi-CPU Intel system but have not personally tested the behavior on such a system.
A particularly notorious problem occurred on Windows 98, where QueryPerformanceCounter would accelerate over time, causing the QuakeWorld engine (one of the last I know of that used QueryPerformanceCounter) to
make players move faster the longer the system had been running, to the point that they would begin getting instantly kicked by the servers for speed cheating (a check that was expected to catch
only real cheaters was triggering on every ordinary Windows 98 player after sufficient system uptime). I do not know the root cause of this Windows 98 issue; perhaps QueryPerformanceFrequency decreased over
time even though the CPU kept the same real frequency?
Yes timeGetTime can step backwards, but I’ve never seen it misbehave as frequently as QueryPerformanceCounter in the wild.
With this in mind, I tend to automatically assume that SDL_GetTicks() can step backwards.

On 03/01/2014 02:19 PM, Edward Rudd wrote:
On Mar 1, 2014, at 12:54 PM, dvereb <dvereb at gmail.com> wrote:
While I can’t address the core of this matter, I will note that QueryPerformanceCounter and timeGetTime can differ quite wildly on certain motherboards, mostly from the Windows XP era.
In general you can’t trust QueryPerformanceCounter because it is driven by the CPU clock, which can vary on some CPUs from that era as well (due to power profiles, and because multi-core CPUs can contain
independent per-core clocks while thread allocation across cores fluctuates wildly); constantly monitoring QueryPerformanceFrequency only partly addresses the issues with this method.
However SDL_GetTicks() uses timeGetTime() which can deviate substantially from wall time on the aforementioned motherboards.
A more reliable clock source for these offending motherboards would be greatly appreciated in my game engine as well (I have both SDL and native implementations of the time functions, primarily for
profiling, where QueryPerformanceCounter is the superior choice, but it is not the default for timekeeping).
I’ve never dug deeply into this issue, I have only made the observation that timeGetTime is not trustworthy on certain motherboards.
I initially didn’t think much of this. When googling I read that SDL_GetTicks was using gettimeofday, but isn’t that Linux-only? So I downloaded the full source code of SDL and had a look myself (I
guess I’m an idiot for not doing this sooner :D). It turns out it will use QueryPerformanceCounter/QueryPerformanceFrequency if available; I didn’t realize that. This leads me to believe that the XP machine I’m using has
an outdated BIOS, based on this article:
In SDL 1.2 the Linux/Mac code was using gettimeofday/timeGetTime; however, in 2.0 it no longer does this (I made the changes myself). Instead it’s using a monotonic clock on each platform (
QueryPerformanceCounter on Win32, the mach_time routines on Mac, and clock_gettime(CLOCK_MONOTONIC) on Linux). If those “high-res monotonic timers” aren’t available, it falls back to gettimeofday/timeGetTime…
Using a monotonic clock is very important as gettimeofday can go BACKWARDS in time which would be rather bad for game development and other applications that expect time to always move forward.
It does seem that there is something “odd” about your system that is causing time to not progress evenly as it should… which could potentially mean your system time is getting off as
well. This would NOT be the behavior on most systems, however… on most systems SDL_GetTicks() would progress at a normal rate, as nearly everyone uses QueryPerformanceCounter on Win32 when writing
games etc… (which gets replaced with the SDL_GetPerformanceCounter/SDL_GetPerformanceFrequency functions as I port those games to Mac+Linux).
Also note that the C library function clock() has other issues as a timing mechanism: it accounts for processor time consumed by the program… (thus if your program isn’t doing much, it will progress more slowly than wall-clock time).
I’ll see what I can do about updating the BIOS on the XP machine and get back to you guys, but that won’t be until Monday. If that IS in fact the issue, I really don’t want to tell someone, “Hey, thanks
for trying my game, but please update your BIOS to play it.” …Not that I’ll have many (any?) XP players. It’s just a scenario I need to be aware of, I guess… again, assuming this is the issue.
This was also interesting to read, but seems limited to Intel-based CPUs:
As usual, thanks everyone for helping me work through this - sorry for the delay until Monday. If the BIOS update doesn’t solve it, then I’ll get back to posting code.
SDL mailing list
SDL at lists.libsdl.org
Author of DarkPlaces Quake1 engine - http://icculus.org/twilight/darkplaces
Co-designer of Nexuiz - http://alientrap.org/nexuiz
"War does not prove who is right, it proves who is left." - Unknown
"Any sufficiently advanced technology is indistinguishable from a rigged demo." - James Klass
"A game is a series of interesting choices." - Sid Meier