SDL_GetTicks()

Hello,
Does anybody know this function? What is it for?

Does anybody have a piece of code to measure time?

thanks…
cesar
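
For the record: SDL_GetTicks() returns the number of milliseconds since SDL
was initialized, as a Uint32. A minimal sketch of timing something with it
(my illustration, not code from the original posts):

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_TIMER) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    Uint32 start = SDL_GetTicks();   /* milliseconds since SDL was initialized */
    SDL_Delay(100);                  /* something to measure */
    Uint32 elapsed = SDL_GetTicks() - start;

    printf("elapsed: %u ms\n", (unsigned)elapsed);
    SDL_Quit();
    return 0;
}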

Hi again,
about SDL_GetTicks():
OK, sorry, to be more specific… does anybody know a function to obtain time
more accurately? Not milliseconds, but microseconds or something like that.
thanks
cesar

For what operating system?

Chris


Chris Nystrom
http://www.newio.org/~ccn
AIM: nystromchris

I am using Windows…

thanks.
cesar

Hey Cesar,

Computers don't work in microseconds!

I believe the most accurate a modern computer/OS can be is accurate to
the nearest 10 milliseconds.



it is easy to get 1 millisec timing on Windows.


Using RDTSC it is easy to get even much better results, theoretically*, at a resolution of single cycles… though this assembly instruction is available only on x86 processors.

Koshmaar

*didn't work for me :-/

On Windows you can use QueryPerformanceCounter and QueryPerformanceFrequency:

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/windowing/timers/timerreference/timerfunctions/queryperformancecounter.asp

I can't remember what timing resolution you could get down to, but it was
considerably better than with the standard time() or GetTicks functions…

regards,
graf.

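A minimal sketch of the usual QueryPerformanceCounter pattern for
sub-millisecond timing on Windows (my illustration, not code from the
thread):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, end;

    /* Counts per second of the high-resolution counter */
    if (!QueryPerformanceFrequency(&freq)) {
        fprintf(stderr, "high-resolution counter not supported\n");
        return 1;
    }

    QueryPerformanceCounter(&start);
    Sleep(5);                                   /* something to measure */
    QueryPerformanceCounter(&end);

    double elapsed_us = (double)(end.QuadPart - start.QuadPart)
                        * 1000000.0 / (double)freq.QuadPart;
    printf("elapsed: %.1f microseconds\n", elapsed_us);
    return 0;
}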

Ah, looking at SDL_systimer.c I see that the performance timer stuff has
been implemented in SDL, but it’s been #if’d out:

#if 0 /* Apparently there are problems with QPC on Win2K */
if (QueryPerformanceFrequency(&hires_ticks_per_second) == TRUE)

Maybe the QPC problems on Win2K have been fixed now (I think I’ve used
it successfully on Win2K), but of course SDL_GetTicks() returns a
millisecond count, so you’re never gonna get better than 1ms resolution
with it…


hi,

I don't know if it's possible to get accurate microsecond timing in a
portable way. There are some methods for Windows and Linux. Unfortunately, I
don't know other OSes, so I can't submit it as a potential improvement for
SDL, but it would be cool to have:

void SDL_GetFineTicks(unsigned long *p_high, unsigned long *p_low);
/* 64-bit integer */
or
/* returns seconds, but as a double */
double SDL_GetFineTicks();

if someone knows/finds a way to do the same on all targets supported.

For Windows:
One can use QueryPerformanceCounter() and QueryPerformanceFrequency().
Around 1 microsecond accuracy. MSDN states that this may be unsupported by
some hardware. (Anyway, I never saw a PC without support for those functions.)

For Linux:
gettimeofday() returns a struct with two members, one in seconds,
the other in microseconds.
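
A minimal sketch of the gettimeofday() approach (my illustration, not code
from the thread):

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval start, end;

    gettimeofday(&start, NULL);
    /* ... something to measure ... */
    gettimeofday(&end, NULL);

    /* tv_sec holds seconds, tv_usec holds microseconds */
    long elapsed_us = (end.tv_sec - start.tv_sec) * 1000000L
                    + (end.tv_usec - start.tv_usec);
    printf("elapsed: %ld microseconds\n", elapsed_us);
    return 0;
}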

For profiling purposes, and only valid on x86 computers:
One can use the RDTSC assembly instruction.
This instruction tells you the number of CPU cycles elapsed
since the CPU was started, as a 64-bit integer
stored in the EDX:EAX register pair.
EDX = 32 most significant bits, EAX = 32 least significant bits.

If your CPU is 1 GHz, one CPU cycle is 1 nanosecond (1/1 GHz = 1
nanosecond), etc…

This will not work on the Cyrix 686 (and some other old x86 clones), but works
on CPUs from the Intel Pentium to the latest ones (or AMD equivalents).

This is not a measure of absolute time, because you can't easily detect the
CPU frequency at run-time. It is a measure of time which is useful for
comparing 2 pieces of code on 1 given CPU.
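
A minimal GCC-style sketch of reading the counter (my illustration; as noted
later in the thread, RDTSC misbehaves on SMP machines and variable-frequency
CPUs):

#include <stdint.h>
#include <stdio.h>

/* RDTSC leaves the 64-bit cycle count in the EDX:EAX register pair */
static inline uint64_t rdtsc(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__ ("rdtsc" : "=a" (lo), "=d" (hi));
    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    uint64_t start = rdtsc();
    /* ... code to profile ... */
    uint64_t end = rdtsc();
    printf("elapsed: %llu cycles\n", (unsigned long long)(end - start));
    return 0;
}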

Regards,
William.

William Petiot wrote:

For Windows:
One can use QueryPerformanceCounter() and QueryPerformanceFrequency().
Around 1 microsecond accuracy. MSDN states that this may be unsupported by
some hardware. (Anyway, I never saw a PC without support for those functions.)

An early 486, for example. QueryPerformanceCounter is just a wrapper
around RDTSC, which doesn’t exist on these CPUs.


For profiling purposes, and only valid on x86 computers:
One can use the RDTSC assembly instruction.
This instruction tells you the number of CPU cycles elapsed
since the CPU was started, as a 64-bit integer
stored in the EDX:EAX register pair.
EDX = 32 most significant bits, EAX = 32 least significant bits.

There's already some code using RDTSC in SDL. However, it's disabled
because it causes trouble on SMP machines (the timers might move
forwards/backwards when the program is scheduled on a different CPU).
We should read the APIC timers on these machines, but I have no idea how
to do that.

Stephane


It also causes problems on laptops where the CPU frequency dynamically changes
for power savings. :)

-Sam Lantinga, Software Engineer, Blizzard Entertainment

Using RDTSC it is easy to get even much better results, theoretically*,

Doesn’t work reliably on laptops and other variable-frequency processors
like AMD’s “Cool’n’Quiet” feature. Even desktop machines are using
Intel’s SpeedStep tech now. When these features exist, rdtsc doesn’t
increment reliably, so depending on it may make your game think time is
moving faster or slower than expected.

We ripped this out of Unreal and used gettimeofday() on Unix and
timeGetTime() on Windows (QueryPerformanceCounter() is just a wrapper
over rdtsc, as far as I know). gettimeofday() is actually a surprisingly
fast system call.

gettimeofday() gives you microsecond resolution…the comment about not
being able to work in units smaller than 10 milliseconds is bunk, and
has to do with the Linux 2.4 scheduler not letting you sleep less than
10ms…but even there, within your timeslice, you can get microsecond
timing. Linux 2.6 fixed the scheduler resolution, too.

–ryan.

Maybe the QPC problems on Win2K have been fixed now (I think I’ve used
it successfully on Win2K), but of course SDL_GetTicks() returns a
millisecond count, so you’re never gonna get better than 1ms resolution
with it…

Also, I have to question what a game needs sub-millisecond timing for.
Generally this demand is from the same people that think checking an
event loop is fatally inefficient, and would rather run a for-loop over
the keyboard state array instead.

–ryan.

[QueryPerformanceCounter doesn't work on all PC hardware]

An early 486, for example. QueryPerformanceCounter is just a wrapper
around RDTSC, which doesn’t exist on these CPUs.

I was hoping the QueryPerformanceXXX functions were wrappers around the timing
hardware which has been in all PCs since the early 8086…

I didn't know they were a wrapper around rdtsc… Now that I know, I agree with
Sam & Ryan & you, Stephane: QueryPerformanceCounter is also unusable for
measuring time.

As Ryan said, it seems that timeGetTime() is the only solution for Windows
boxes (with the use of timeBeginPeriod(), etc.).
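
A minimal sketch of the timeGetTime()/timeBeginPeriod() combination (my
illustration, not code from the thread; link with winmm.lib):

#include <windows.h>
#include <mmsystem.h>
#include <stdio.h>

int main(void)
{
    /* Request 1 ms timer resolution for this process */
    timeBeginPeriod(1);

    DWORD start = timeGetTime();
    Sleep(5);                        /* something to measure */
    DWORD elapsed = timeGetTime() - start;
    printf("elapsed: %lu ms\n", (unsigned long)elapsed);

    /* Every timeBeginPeriod() must be paired with a timeEndPeriod() */
    timeEndPeriod(1);
    return 0;
}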

This is a pity, since there IS a programmable hardware timer with around 1
microsecond increments in all PCs. This piece of hardware was used to provide
the system clock in old DOS. One could read the timer register at any time and
deduce the time at the precision of the timer frequency (around 1.19 MHz).
Windows uses this timer to provide the system tick, but doesn't expose the
in-between counter values to user space…

So, to achieve 1 ms accuracy (at least): for Linux we have gettimeofday(),
for Windows we have timeGetTime()… sounds like a good starting point :)

I checked the SDL CVS sources: for Windows, SDL is already using
timeGetTime(); for Linux, SDL is already using gettimeofday() (if USE_RDTSC
is not defined)…

To summarize: SDL is already OK, as usual. The more I use it, the more I
like it. :)

For profiling purpose and only valid for x86 computers :
One can use the RDTSC assembly instruction,
[…]
There’s already some code using RDTSC in SDL. However, it’s disabled
because it causes trouble on SMP machines (the timers might move
forwards/backwards when the program is scheduled on a different cpu).
We should read the apic timers on these machines, but I have no idea how
to do that.

I don't know either ;(… btw, you can use rdtsc on SMP to profile your
code: tell Windows to use only one CPU while you are profiling
(SetProcessAffinityMask()).
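
A minimal sketch of that affinity trick (my illustration, not code from the
thread):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Pin this process to CPU 0 so all rdtsc readings come from one core */
    if (!SetProcessAffinityMask(GetCurrentProcess(), 1)) {
        fprintf(stderr, "SetProcessAffinityMask failed\n");
        return 1;
    }

    /* ... rdtsc-based profiling as sketched earlier in the thread ... */
    return 0;
}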

Regards,
William

Also, I have to question what a game needs sub-millisecond timing for.

Theoretically* it can be used for profiling code; and for games… hmmm, it's
hard to think where it is absolutely necessary to have such timer resolution,
but generally it's better to have more timer resolution than less.

Generally this demand is from the same people that think checking an
event loop is fatally inefficient, and would rather run a for-loop over
the keyboard state array instead.

That was me some time ago ;) but I also had other, more important reasons to do so.

Koshmaar

* yes, another one; in practice it gave me varying results (though maybe threads were being switched? dunno)

Ryan C. Gordon wrote:

[…]

gettimeofday() gives you microsecond resolution…the comment about
not being able to work in units smaller than 10 milliseconds is bunk,
and has to do with the Linux 2.4 scheduler not letting you sleep less
than 10ms…but even there, within your timeslice, you can get
microsecond timing. Linux 2.6 fixed the scheduler resolution, too.

Well, gettimeofday (which AFAIK uses the APIC timers under Linux) or
timeGetTime surely gives you microsecond resolution, but SDL truncates
it to millisecond resolution.
Or wasn't this the issue?

Maybe we should add a 64-bit timing function; that would also remove
the 49-day-uptime limitation.
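
(A Uint32 millisecond counter wraps after 2^32 ms, roughly 49.7 days. For
what it's worth, unsigned subtraction still yields a correct delta across a
single wrap; a tiny illustration, not code from the thread:)

#include <stdio.h>

typedef unsigned int Uint32;  /* stand-in for SDL's Uint32 */

int main(void)
{
    /* Tick values just before and just after the 49.7-day wraparound */
    Uint32 before = 0xFFFFFFF0u;
    Uint32 after  = 0x00000010u;

    /* Unsigned subtraction wraps modulo 2^32, so the delta is still right */
    Uint32 delta = after - before;
    printf("delta: %u ms\n", delta);  /* prints 32 */
    return 0;
}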

Stephane


Someone said that you can’t get less than 10 milliseconds. I was just
noting that this isn’t the case, and the belief is a holdover from an
unrelated Linux 2.4 issue.

But yeah, even if SDL_GetTicks() isn’t sufficient for a purpose, there
are other means of getting it done.

–ryan.