Strange problems with SDL_GetTicks() & Windows2000

Hi,

I’ve noticed the following problem with SDL 1.2.1 and Windows 2000:

SDL_GetTicks() seems to have a granularity of 10 or 15 ms there! :-(

The problem showed up in my private “delay” function (used in the
games “Rocks’n’Diamonds” and “Mirror Magic”): to wait as precisely
as possible for x milliseconds, I split x into x = x1 * 10 ms + x2
(with x2 < 10 ms), then I “SDL_Delay(x1 * 10)”, and finally I wait
for the remaining x2 milliseconds in a busy-loop with SDL_GetTicks(),
as in the sketch below.
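
(For illustration, a minimal sketch of that scheme; the function name
is made up:)

#include "SDL.h"

/* Sketch of the scheme described above: SDL_Delay() covers the
   coarse x1 * 10 ms part, a busy-loop on SDL_GetTicks() covers
   the x2 remainder. */
void precise_delay(Uint32 x)
{
  Uint32 start = SDL_GetTicks();
  Uint32 coarse = (x / 10) * 10;       /* x1 * 10 ms */

  if (coarse > 0)
    SDL_Delay(coarse);                 /* scheduler-grained sleep */

  while (SDL_GetTicks() - start < x)   /* busy-wait the rest (x2) */
    ;
}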

This works quite fine under Linux (2.2.x and 2.4.x kernels tested) and
with Windows95 and Windows98, but not with Windows2000.

While SDL_GetTicks() normally has a granularity of 1 ms, it has a
granularity of 10 ms on my first Windows2000 installation and a
granularity of 15 ms on my second Windows2000 installation (both of
which were installed identically from the same original Windows2000 CD).

I’ve tracked down the problem with the following piece of test code:

#include <stdio.h>
#include "SDL.h"

void delay20ms(void)
{
  unsigned long count1 = SDL_GetTicks(), count2, current_ms, test;

  current_ms = SDL_GetTicks();
  test = (unsigned long) -1;           /* sentinel: force first printout */
  while (current_ms < count1 + 20)     /* busy-wait 20 milliseconds */
  {
    current_ms = SDL_GetTicks();

    if (test != current_ms)
    {
      fprintf(stderr, "current_ms == %lu\n", current_ms);
      test = current_ms;
    }
  }

  count2 = SDL_GetTicks();
  fprintf(stderr, "delay == %lu\n", count2 - count1);
}

The function prints each change in the result of SDL_GetTicks()
and finally prints the length of the interval that actually elapsed.

(For testing, the function was called multiple times with the
printed output being redirected to a file.)

Under Linux and Windows95/98, I get what I expected:
[…]
rocksndiamonds: current_ms == 2320
rocksndiamonds: current_ms == 2321
rocksndiamonds: current_ms == 2322
rocksndiamonds: current_ms == 2323
rocksndiamonds: delay == 20
rocksndiamonds: current_ms == 2323
rocksndiamonds: current_ms == 2324
rocksndiamonds: current_ms == 2325
rocksndiamonds: current_ms == 2326
[…]
rocksndiamonds: current_ms == 2341
rocksndiamonds: current_ms == 2342
rocksndiamonds: current_ms == 2343
rocksndiamonds: delay == 20
rocksndiamonds: current_ms == 2343
rocksndiamonds: current_ms == 2344
rocksndiamonds: current_ms == 2345
[…]

Under my first Windows2000 installation (PII/700MHz), I get:
[…]
rocksndiamonds.exe: current_ms == 3214
rocksndiamonds.exe: current_ms == 3224
rocksndiamonds.exe: current_ms == 3234
rocksndiamonds.exe: delay == 20
rocksndiamonds.exe: current_ms == 3234
rocksndiamonds.exe: current_ms == 3244
rocksndiamonds.exe: current_ms == 3254
rocksndiamonds.exe: delay == 20
rocksndiamonds.exe: current_ms == 3254
rocksndiamonds.exe: current_ms == 3264
[…]
Very strange, although this still matches the well-known 10 ms time slice.

Under my second Windows2000 installation (PIII/800MHz), I get:
[…]
rocksndiamonds.exe: current_ms == 2000
rocksndiamonds.exe: current_ms == 2015
rocksndiamonds.exe: current_ms == 2031
rocksndiamonds.exe: delay == 31
rocksndiamonds.exe: current_ms == 2031
rocksndiamonds.exe: current_ms == 2047
rocksndiamonds.exe: current_ms == 2062
rocksndiamonds.exe: delay == 31
rocksndiamonds.exe: current_ms == 2062
rocksndiamonds.exe: current_ms == 2078
[…]
Even stranger: a time slice of 15/16 ms!

The two tested Linux installations run on the same hardware
as the two tested Windows installations, without this problem.

Has anyone made similar observations with SDL?
What the heck is going on here?
What can I do to get usable results from SDL_GetTicks()?

Completely confused… :-(

Best regards,
Holger
--
holger.schemel at mediaways.net

This works quite fine under Linux (2.2.x and 2.4.x kernels tested) and
with Windows95 and Windows98, but not with Windows2000.

Not knowing Windows at all, could it be an effect of your fprintf calls?
Try something like

#include "SDL.h"

void test_resolution(int n, unsigned *out)
{
  int i;
  unsigned now, t;

  now = SDL_GetTicks();
  for (i = 0; i < n; i++) {
    do {
      t = SDL_GetTicks();
    } while (t == now);
    out[i] = now = t;
  }
}

and then dump the resulting vector
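
(For instance, a possible driver that dumps the step sizes; the sample
count is arbitrary:)

/* Possible driver for the test above: each printed step is the
   distance between two successive changes of SDL_GetTicks(),
   i.e. the effective timer granularity. */
unsigned samples[200];
int i;

test_resolution(200, samples);
for (i = 1; i < 200; i++)
  fprintf(stderr, "step %d: %u ms\n", i, samples[i] - samples[i - 1]);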

SDL_GetTicks() seems to have a granularity of 10 or 15 ms there! :-(
(…)
This works quite fine under Linux (2.2.x and 2.4.x kernels tested) and
with Windows95 and Windows98, but not with Windows2000.

Have you tried turning off optimization? Are you using the same executable
on all Windows versions? What happens if you cross-compile from, e.g.,
Linux to Win2k?
--
Trick

Windows 2000 might have a different method of multitasking, especially
in a server installation, unless you give your process higher
priority/affinity or something. Remember that Windows 2000 is another
name for NT5, so see if an NT4 installation also has similar symptoms,
as Windows 9X uses a different kernel (as may Millennium Edition,
not too sure). As for Windows XP, all bets are off! :-)
--
Olivier A. Dagenais - Software Architect and Developer

“Holger Schemel” <holger.schemel at mediaways.net> wrote in message
news:3B3FA21E.45CF9DF0 at mediaways.net

[…]

SDL_GetTicks() seems to have a granularity of 10 or 15 ms there! :-(

This is normal, since timeGetTime() has a granularity of AT LEAST 5 ms
(10 ms on my PC).

From MSDN:

Windows NT: The default precision of the timeGetTime function can be five
milliseconds or more, depending on the machine. You can use the
timeBeginPeriod and timeEndPeriod functions to increase the precision of
timeGetTime. If you do so, the minimum difference between successive values
returned by timeGetTime can be as large as the minimum period value set
using timeBeginPeriod and timeEndPeriod. Use the QueryPerformanceCounter and
QueryPerformanceFrequency functions to measure short time intervals at a
high resolution.

Windows 95: The default precision of the timeGetTime function is 1
millisecond.
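
(A minimal sketch of what that advice amounts to in practice; assumes
winmm.lib is linked in:)

#include <windows.h>
#include <mmsystem.h>   /* timeGetTime(), timeBeginPeriod() */

/* Sketch: request 1 ms precision from the multimedia timer for the
   duration of the program, as the MSDN text above describes. */
int main(void)
{
  timeBeginPeriod(1);     /* ask for 1 ms resolution */

  /* ... timeGetTime() should now tick in ~1 ms steps on NT ... */

  timeEndPeriod(1);       /* restore the default before exit */
  return 0;
}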

Gautier - www.tlk.fr

Hi Trick,

SDL_GetTicks() seems to have a granularity of 10 or 15 ms there! :-(
(…)
This works quite fine under Linux (2.2.x and 2.4.x kernels tested) and
with Windows95 and Windows98, but not with Windows2000.

Have you tried turning off optimization? Are you using the same executable
on all Windows versions? What happens if you cross-compile from, e.g., Linux to Win2k?

It was the same executable on all Windows versions (95, 98 and the two W2K).

This binary was cross-compiled on Linux. (I’ve never compiled anything
on any native Windows, so far.)

It was compiled with the usual “-O3”, but I can try it without.

Best regards,
Holger
--
holger.schemel at mediaways.net … ++49 +5246 80 1438

Hi Mattias,

This works quite fine under Linux (2.2.x and 2.4.x kernels tested) and
with Windows95 and Windows98, but not with Windows2000.

Not knowing Windows at all, could it be an effect of your fprintf calls?

I doubt it, but I’ll try your test function.

The fprintf calls were introduced later, after measuring the wrong
delay of ~30 ms (instead of 20 ms). I use this delay to get a fixed
frame rate of 50 FPS, so the higher delay resulted in a noticeably
reduced frame rate. (In any case, I should drop frames instead of
slowing down the game speed, as in the sketch below…)
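
(A rough sketch of that frame-dropping idea; update_game_logic() and
render_frame() are hypothetical stubs:)

#include "SDL.h"

#define FRAME_MS 20   /* 50 FPS */

static void update_game_logic(void) { /* hypothetical stub */ }
static void render_frame(void)      { /* hypothetical stub */ }

/* Sketch: keep the game logic ticking at a fixed rate and skip
   rendering when a frame deadline has already passed, instead of
   letting the whole game slow down. */
void game_loop(void)
{
  Uint32 next = SDL_GetTicks() + FRAME_MS;

  for (;;) {
    update_game_logic();

    if (SDL_GetTicks() <= next)
      render_frame();              /* drop the frame when we are late */

    while (SDL_GetTicks() < next)
      SDL_Delay(1);                /* coarse wait until the next tick */
    next += FRAME_MS;
  }
}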

void test_resolution(int n, unsigned *out)
{
[…]
}

and then dump the resulting vector

I’ll check this, but I really expect the same results.

Best regards,
Holger
--
holger.schemel at mediaways.net … ++49 +5246 80 1438

Hi Olivier,

Windows 2000 might have a different method of multitasking, especially
in a server installation, unless you give your process higher
priority/affinity or something. Remember that Windows 2000 is another

Do you know how I can get my game a higher priority under W2k?
I can see its CPU usage in the Task Manager (something around 50%),
but I know nothing about raising its priority. (Some equivalent to
"nice" under Unix?)

Would be interesting for testing purposes…
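
(For what it’s worth, the Win32 call for this seems to be
SetPriorityClass(); a minimal, untested sketch:)

#include <windows.h>

/* Sketch: raise the priority class of the current process, roughly
   the Win32 counterpart of "nice" on Unix. Untested on W2K. */
int raise_priority(void)
{
  return SetPriorityClass(GetCurrentProcess(), HIGH_PRIORITY_CLASS);
}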

name for NT5, so see if an NT4 installation also has similar symptoms,

I’ll try to get my hands on an NT4 system to try this,
and I’ll post a summary here.

as Windows 9X uses a different kernel (as may Millennium Edition,
not too sure). As for Windows XP, all bets are off! :-)

;-)

Best regards,
Holger
--
holger.schemel at mediaways.net … ++49 +5246 80 1438

Hi Gautier,

SDL_GetTicks() seems to have a granularity of 10 or 15 ms there! :-(

This is normal, since timeGetTime() has a granularity of AT LEAST 5 ms
(10 ms on my PC)

Great! Someone who has some definitive facts on this topic! :-) :-)

from msdn:

Windows NT: The default precision of the timeGetTime function can be five
milliseconds or more, depending on the machine. You can use the
timeBeginPeriod and timeEndPeriod functions to increase the precision of
timeGetTime. If you do so, the minimum difference between successive values
returned by timeGetTime can be as large as the minimum period value set
using timeBeginPeriod and timeEndPeriod. Use the QueryPerformanceCounter and
QueryPerformanceFrequency functions to measure short time intervals at a
high resolution.

Cool! Can this be used in “timer/win32/SDL_systimer.c” to get a
higher timer resolution without performance disadvantages?

Windows 95: The default precision of the timeGetTime function is 1
millisecond.

Yep – that’s what I’ve observed on the Win9X platforms, too.

At least I’m very glad to have this clarified: that I cannot rely
on SDL_GetTicks() having a higher resolution than SDL_Delay(),
although I think this is a big disadvantage when it comes to
precise timing. :-(

So far, my SDL games rely on SDL_GetTicks() having a timer
resolution of roughly 1 ms for some more or less precise timing.
Especially with a granularity of only 15 ms, you’ll have serious
problems doing any timing reasonably accurately… :-/

A question to the SDL developers:
Looking at the SDL documentation, SDL_GetTicks() appears to promise
a precision of single milliseconds (whereas SDL_Delay() is defined
as semi-precise, bound to the system scheduler’s time slice of 10 ms
on most systems). For Windows NT/2000, this seems not to be true on
most installations and therefore might seriously break some
timing-dependent applications/games.

Gautier, do you have a link to this (and other/more) MSDN stuff?
(Or do I have to pay Microsoft lots of dollars for it? :-/ )

Then, the above sounds as if it is possible to write an
SDL_GetTicks() for WindowsNT/2000 that is as precise (1 ms) as the
implementations for all the other platforms.

If I had more information about this, I would like to hack it by
myself, but I haven’t done any system programming on Windows
platforms before… :-/

Best regards,
Holger
--
holger.schemel at mediaways.net … ++49 +5246 80 1438

You can use the
timeBeginPeriod and timeEndPeriod functions to increase the precision of
timeGetTime. If you do so, the minimum difference between successive values
returned by timeGetTime can be as large as the minimum period value set
using timeBeginPeriod and timeEndPeriod. Use the QueryPerformanceCounter and
QueryPerformanceFrequency functions to measure short time intervals at a
high resolution.

We would definitely appreciate a patch for this; there is no reason
SDL_GetTicks() should settle for worse than 1 ms accuracy on platforms
where it is at all possible. If possible, select the best method at
runtime, so that the same binary will work on any Windows variant.
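
(One possible shape for such a runtime-selected tick source; a sketch
only, with illustrative names, relying on the QueryPerformanceCounter/
QueryPerformanceFrequency behaviour quoted from MSDN above:)

#include <windows.h>
#include <mmsystem.h>   /* timeGetTime(); link with winmm.lib */

/* Sketch: use the performance counter where available, fall back to
   timeGetTime() otherwise. Illustrative names, not the actual
   timer/win32/SDL_systimer.c code. */
static LARGE_INTEGER hires_freq, hires_start;
static DWORD lores_start;
static int use_hires;

void ticks_init(void)
{
  /* QueryPerformanceFrequency() returns FALSE if the hardware
     has no high-resolution counter. */
  use_hires = (QueryPerformanceFrequency(&hires_freq) != 0);
  if (use_hires)
    QueryPerformanceCounter(&hires_start);
  else
    lores_start = timeGetTime();
}

DWORD ticks_get(void)    /* milliseconds since ticks_init() */
{
  if (use_hires) {
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    return (DWORD)((now.QuadPart - hires_start.QuadPart) * 1000
                   / hires_freq.QuadPart);
  }
  return timeGetTime() - lores_start;
}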

At least I’m very glad to have this clarified: That I cannot rely
on SDL_GetTicks() having a higher resolution than SDL_Delay(),
although I think this is a big disadvantage when it comes to
precise timing. :-(

You should be able to rely on SDL_GetTicks() giving the best resolution
possible on the platform (which usually means 1 ms), and on SDL_Delay()
having no worse granularity than the OS scheduler/timers allow (usually
10 ms).

Gautier, do you have a link to this (and other/more) MSDN stuff?
(Or do I have to pay Microsoft lots of dollars for it? :-/ )

It’s freely available:

http://msdn.microsoft.com/library/default.asp

Then, look in “Windows Development” in the frame on the left.

(Although there’s lots of other interesting stuff in there, too. OSDN
needs to spend less time on things like Slashdot and more on a Free
Software equivalent of the MSDN library.)

--ryan.

Well, it seems my idea of Windows NT doing something different was
confirmed by the MSDN documentation (which is available for free:
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/hh/winui/timers_4z76.asp?frame=true
), so messing with a process’s priority is probably not a good idea
anymore.

I wonder if the Windows CE implementation of SDL should also use
QueryPerformanceCounter, since it is available and will return zero if
the hardware does not support it…
--
Olivier A. Dagenais - Software Architect and Developer

“Holger Schemel” <holger.schemel at mediaways.net> wrote in message
news:3B40690A.A009AC39 at mediaways.net

[…]