For the physics simulation - updating object positions and rotations -
I need to measure the exact time spent in each frame. On fast machines
it turned out that SDL's timer precision was not enough; the simulation
could run so fast that the time measured for one animation step was no
longer precise enough, and the result was jerky animation.
My question remains: for what applications is timekeeping finer than
1/1000 s needed, and how high does the resolution have to be?
Microseconds? Nanoseconds?
greater than 1000Hz: audio DSP; high-speed simulations; sampling from
some devices; networking transformations
less than 5ms resolution: (ms = millisecond; 1/1000 of a second)
    interrupt handling (clue: this is RT-OS stuff - not for libraries)
    I/O polling | monitoring (somewhat device-specific)
    sound
less than 16ms resolution:
    video refresh, if one is feeling silly (16ms == 60fps)
    (although displays may be faster, by and large one can't
    see the changes...)
    (if one is working with PAL (25fps/40ms) or NTSC (30fps/33ms) one
    should be comfortably within range)
I’m not saying it isn’t needed; I just want clear, quantified answers
so that I can design the interface better.
I’m not sure a high-response (emphasis on fast response) system is
needed beyond 5ms (== 5000 microseconds) for SDL… requirements beyond
that are generally beyond the capabilities of the OS anyway, unless it’s
an RT-OS (or close to one, such as linux). As a subtle hint: SDL is used
in graphics and gaming, yes - but it’s also fairly platform-neutral. If
one wishes for resolution beyond this range, one should probably code
specifically to an OS | system designed for the job and not for a
general-purpose system… although if one really wants it, I suggest
decoupling the display from the simulation and sampling the simulation’s
results into the display, rather than depending on the display framerate.
If you really want a comparison by OS: linux, at the last point I
checked, could respond to a timed event within 50 microseconds. Windows,
last I checked (486/100; Windows 95), could handle 10000 microseconds
(1/100 second). I suspect windows (and hardware) has improved since
then, but YMMV, eh?
(An RT-OS on a 486 -can- respond to a timed event in 45 microseconds -
that is the processor’s own response time on that hardware. This was
back when linux -first- clocked a -sustained- 50ms response time.)
I asked once before whether anyone wanted an object for tracking the
passage of time against actions -> my solution is to work out how far an
action -should- have progressed given the amount of time that has
passed. No matter what the FPS, unless it drops below a comfortable
level (15fps), the system looks pretty smooth. I haven’t tested the code
since last December, but at that point it was using SDL_GetTicks()
comparisons (and occasionally SDL_Delay(…) if things were going too
quickly), with -floating point- time values rather than integers for my
timed objects.
Maybe this description doesn’t make sense, and maybe it does. Hope it’s
of some use.
G’day, eh?
- Teunis, who’s learned a little about time but has a hard time
with resolutions below half an hour…

On Thu, 5 Apr 2001, Mattias Engdegard wrote:
–
What is courage now? Is it just to go until we’re done?
Men may call us heroes when they say we’ve won but if we should fail, how
then… What is courage now?
- Fellowship Going South by Leslie Fish sung by Julia Ecklar
Member in purple standing of the Mad Poet’s Society.
Trying to bring truth from beauty is Winterlion.
find at this winterlion’s page