Re(2): SDL Timing in Windows

mat at writes:

“slouken” == Sam Lantinga writes:

slouken> Both of these are true. :) GetTickCount() provides
slouken> millisecond precision timing, and QPC isn’t on all systems
slouken> (or you have to do some tweaking, I forget which.) If you
slouken> need sub-millisecond precision, by all means, use QPC.

Although GetTickCount looks like it returns millisecond precision,
it’s actually much less accurate than QueryPerformanceCounter,
returning values accurate only to 55ms on Win95/98 and 10ms on WinNT.
QueryPerformanceCounter’s precision is much better, varying between
1024Hz and the clock speed of your CPU depending on your OS and
hardware.

Whether or not the time returned by QueryPerformanceCounter slowly
drifts relative to realtime varies between different versions of the
OS. The ones that use rdtsc drift over time, as the CPU clock isn’t
synced with the CMOS clock. Those versions of Windows that base their
QueryPerformanceCounter on the interrupt driving the realtime clock
stay synced to it forever (as you’d expect) barring clock adjustments.

It is a big win to use QueryPerformanceCounter when it’s available,
especially for timing short intervals.

We determined most of the above information experimentally.


Some more food for thought ;)

When I saw the 55ms figure it jarred a memory of a discussion on the old
Angelic Coders DX list regarding timing. I have copied the messages below
(Raymond Chen spoke for Microsoft on that list):

From: (Raymond Chen)
To: DirectXDev at
Subject: Re: QueryPerformanceCounter
Date: Tue, 14 Jul 1998 05:12:09 GMT
Organization: Microsoft Corporation

… or better yet, use GetTickCount() which returns the same time
[as timeGetTime] (millisecond) but seems to have a lot less call
overhead.

GetTickCount has the same resolution but lower accuracy. timeGetTime() is
accurate to 1ms, whereas GetTickCount can be as many as 55ms off. You pays
your money and you takes your choice.

From: (Raymond Chen)
To: DirectXDev at
Subject: Re: QueryPerformanceCounter
Date: Thu, 09 Jul 1998 20:08:39 GMT
Organization: Microsoft Corporation

QPC is available on both 9x and NT. However, it does depend on a
high-performance hardware counter being available

QPC will always be available on 9x and NT (but not CE). On Win9x it uses
the hi-res timer, which is present on all motherboards.

Subject: Re: timeGetTime() vs QueryPerformanceCounter
Date: Tue, 03 Feb 1998 18:21:29 GMT
From: (Raymond Chen)
Reply-To: DirectXDev at
Organization: Microsoft Corporation

I added system time to the mix: timeGetTime() tracks system time, but QPC
drifts relative to it.

I find this very odd, since timeGetTime and QPC use the same timer.
At every timer tick, a global variable is updated.

timeGetTime reads that global variable and returns immediately.
QPC reads that global variable, and then adds in the fraction of a
tick that has elapsed since the previous tick.

From: (Peter Dimov)
Subject: Re: Best way of timing game - What about GetTickCount?

As for GetTickCount() vs timeGetTime(), there is nothing in the docs, but
from what Raymond Chen said I gather that GetTickCount() returns, well, the
tick count (the system timer ticks 18.2 times per second), so it has a 55ms
accuracy. timeGetTime() is supposedly accurate to 1-3 ms.

From: (Vinnie Falco)
Subject: Re: Best way of timing game - SOLVED

It sounds like QueryPerformanceCounter() uses RDTSC in its implementation.


Well, if QueryPerformanceCounter() uses RDTSC then it makes all this talk
of switching to RDTSC for timing rather moot, doesn’t it? So then the issue
of timing the game falls into three categories:

  1. using QueryPerformanceCounter()…Equivalent to RDTSC and very accurate.
    Only available on Pentium processors and up (pretty reasonable to me).

  2. using GetTickCount()…Should never actually happen in practice but hey,
    if QueryPerformanceCounter() should fail us, we can always fall back on
    it.
  3. using timeGetTime()…Kind of funny how this function works, you have to
    call timeBeginPeriod() to change the frequency. Besides, considering the
    smallest value allowed in timeBeginPeriod() it sure does seem like
    QueryPerformanceCounter() beats it hands down in accuracy every time.

So I would tend towards eliminating 3), leaving 1) as the most accurate
method, but falling back on 2) if QueryPerformanceCounter() is unavailable.

Which is exactly the implementation of my replacement timer:

// msTimer.CPP
// Copyright © 1997, 1998 by Vinnie Falco, Mickey Portilla, and Alex
// Permission is granted to use this file for commercial or non-commercial use
// Tested with Developer Studio Visual C++ 5.0 with SP3
// Change History
// 7/15/98 Created

#include <windows.h>

static int gFirstTime = 1;
static int gQPCAvail;
static LARGE_INTEGER gQPCFreq;

// returns elapsed milliseconds since system startup
DWORD GetElapsedMilliseconds( void )
{
    double ms;
    LARGE_INTEGER qpcTicks;

    // check for the high-resolution counter once, on the first call
    if( gFirstTime ) {
        gQPCAvail = QueryPerformanceFrequency( &gQPCFreq );
        gFirstTime = 0;
    }

    if( gQPCAvail ) {
        QueryPerformanceCounter( &qpcTicks );
        ms = 1000.0 * ((double)(qpcTicks.QuadPart)) /
             ((double)(gQPCFreq.QuadPart));
    } else {
        // fall back on the coarse system tick count
        ms = (double)GetTickCount();
    }

    return (DWORD)ms;
}

Thanks for the code. I have a question. In

    QueryPerformanceCounter( &qpcTicks );
    ms=1000.0 * ((double)(qpcTicks.QuadPart)) /
       ((double)(gQPCFreq.QuadPart));

why do you cast qpcTicks and gQPCFreq to double? Why not use integer
arithmetic:

    ms= (1000 * qpcTicks.QuadPart) / gQPCFreq.QuadPart;

Well, two reasons. First of all, (1000*qpcTicks.QuadPart) can overflow a
LARGE_INTEGER, unnecessarily reducing the dynamic range by 10 bits.

Secondly, I thought it was more important to keep the formula for seconds,
((double)(qpcTicks.QuadPart)) / ((double)(gQPCFreq.QuadPart)), visible as a
unit. This is purely aesthetic and doesn’t affect the execution; speed is
not an issue here!

Vinnie Falco

From: (Raymond Chen)
Subject: Re: Best way of timing game - SOLVED

Doesn’t timeGetTime() default to 1ms precision under Windows 95?


Or is the precision indeterminate unless you call timeBeginPeriod()?

timeBeginPeriod sets up a minimum precision. Win95’s default precision is
1ms, but NT’s default precision is something like 5ms or 10ms.

Note also that the nominal precision of GetSystemTime is different between
Win95, Win98 and NT. Win95’s nominal precision of GST is 55ms, NT’s
precision is 10ms (I think) and Win98’s nominal precision is also 10ms (I
think).

In reality, GST often does better than the nominal precision because device
drivers may request system time information at any time, and doing so causes
GST to give slightly improved results for a brief period of time.