Milliseconds on the Mac

I was trying to figure out why my engine got so jumpy when I tried to
time-synchronize it with SDL_GetTicks(), so I looked through the
source code for the timer. It seems that on MacOS it gets ticks in
1/60th-of-a-second units and then converts them to milliseconds. This
is obviously inadequate for timing the physics in a real-time game.
However, I scanned Apple's docs and can't figure out how to get time
at smaller intervals than 1/100th of a second, which is hardly
better, so I don't know how to fix it. Does anybody know how I can
coax proper millisecond time measurements out of MacOS?
--
Michael Powell
Aspiring Video Game Programmer

> I was trying to figure out why my engine got so jumpy when I tried to
> time-synchronize it with SDL_GetTicks(), so I looked through the
> source code for the timer. It seems that on MacOS it gets ticks in
> 1/60th-of-a-second units and then converts them to milliseconds. This
> is obviously inadequate for timing the physics in a real-time game.
> However, I scanned Apple's docs and can't figure out how to get time
> at smaller intervals than 1/100th of a second, which is hardly
> better, so I don't know how to fix it. Does anybody know how I can
> coax proper millisecond time measurements out of MacOS?

PPCs have a high-resolution timer which would be more adequate to the task.
I will toss you the source code once I boot up my Linux box.

Nicholas

----- Original Message -----
From: belar@earthling.net (Mike Powell)
To: sdl at lokigames.com
Date: Saturday, April 29, 2000 2:48 AM
Subject: [SDL] Milliseconds on the Mac

> PPCs have a high-resolution timer which would be more adequate to the task.
> I will toss you the source code once I boot up my Linux box.

Why do you need to boot up Linux to get at MacOS source? Or is this
source for PPC Linux? If so, it may not do me much good, unless you
can tell me how to call the same routines from MacOS.


#include <Timer.h>

UnsignedWide time;

Microseconds(&time);

The low word of the time variable is the number of microseconds (with
20-microsecond error) since the computer has booted up. I am not sure
what the high word is for, but I assume it is rarely used.

So to get the microseconds, you do:

UInt32 microseconds = time.lo;

Keep in mind that Microseconds() is several times slower (I don't know
exactly how much) than LMGetTicks(), which is what SDL is using right
now.

From: Mike Powell

Reply-To: sdl at lokigames.com
Date: Sat, 29 Apr 2000 09:17:57 -0700
To: sdl at lokigames.com
Subject: Re: [SDL] Milliseconds on the Mac

>> PPCs have a high-resolution timer which would be more adequate to the task.
>> I will toss you the source code once I boot up my Linux box.

> Why do you need to boot up Linux to get at MacOS source? Or is this
> source for PPC Linux? If so, it may not do me much good, unless you
> can tell me how to call the same routines from MacOS.


> Microseconds(&time);

Lazily browsing DejaNews I found some alternatives: UpTime, ISpUpTime,
OTGetTimeStamp. See

http://x24.deja.com/=dnc/[ST_rn=ps]/getdoc.xp?AN=488751472&CONTEXT=957116960.1266024459&hitnum=7

Check out my sample library for getting milliseconds and microseconds on the Mac.
It covers all kinds of edge cases where UpTime, ISpUpTime, and OTGetTimeStamp
aren’t available or use slow fallback logic.

http://www.AmbrosiaSW.com/~fprefect/FastTimes.sit.bin

Matt–
/* Matt Slot, Bitwise Operator * One box, two box, yellow box, blue box *

> #include <Timer.h>
>
> UnsignedWide time;
>
> Microseconds(&time);
>
> The low word of the time variable is the number of microseconds (with
> 20-microsecond error) since the computer has booted up. I am not sure
> what the high word is for, but I assume it is rarely used.
>
> So to get the microseconds, you do:
>
> UInt32 microseconds = time.lo;
>
> Keep in mind that Microseconds() is several times slower (I don't know
> exactly how much) than LMGetTicks(), which is what SDL is using right
> now.

Actually, the docs on Microseconds() on Apple's site say that it
merely returns the time in microseconds since some date in 1904. It's
really just a 64-bit int that it returns, but, according to the docs,
there was no way to make a 64-bit int, so they broke it into two
32-bit ints in a struct. The hi and lo elements are just the high and
low order 32 bits.


>> Microseconds(&time);

> Lazily browsing DejaNews I found some alternatives: UpTime, ISpUpTime,
> OTGetTimeStamp. See
>
> http://x24.deja.com/=dnc/[ST_rn=ps]/getdoc.xp?AN=488751472&CONTEXT=957116960.1266024459&hitnum=7

Ah, thanks. It looks like UpTime is exactly what I need. The docs
start by saying that it's not on all machines, but later say it's
included in MacOS 8.6 and only works on PPCs; considering that OS 9
is out now, and PPCs have been standard for like 5 years, I don't
think we really need to feel bad about placing a "MacOS 8.6 or better
and PPC processor required" sticker on it, now do we? Does SDL even
work on 68ks as it is?


> Actually, the docs on Microseconds() on Apple's site say that it
> merely returns the time in microseconds since some date in 1904.
> It's really just a 64-bit int that it returns, but, according to the
> docs, there was no way to make a 64-bit int, so they broke it into
> two 32-bit ints in a struct. The hi and lo elements are just the
> high and low order 32 bits.

You know, I was just reading further through Apple's docs, and found
that a microsecond is 1/1,000,000th of a second, not 1/100th as I had
thought. That's many times the resolution I need, though taking 300
microseconds just to read the time, as that article Mattias pointed
me to said, is a bit excessive when trying to write a real-time 3D
engine (300/1,000,000ths of a second may not seem like much, until
you consider how much the engine has to do in 33,333/1,000,000ths of
a second in order to render a complete frame 30 times every second;
just calling Microseconds() 111 times would eat up the entire time it
should take to render a frame).


> Ah, thanks. It looks like UpTime is exactly what I need.

Meanwhile, Matt Slot just posted a much better solution. I took a look at it
and it seems to be just what you wanted.