- Accurate monitoring of current time
-> The sampling of this need be no more often than needed to
   maintain it within a given tolerance.
   (timed? from my own experiences, even gettimeofday() can
   gain inaccuracy… though I have to admit my CPU is
   overclocked!)
All I need is querying the accurate time. I am not sure what you mean by
sampling. When the CPU cycle counter is used, you can simply ask what
the value of the counter is. The value is updated automatically, and you
only need to read it when the user asks.
- Ability to deliver time-deltas between frames
(maybe there’s a better way to describe?)
This can be done by the user. Calculating the delta between two times
is not that difficult.
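For completeness, here is the kind of user-side delta calculation meant above, as a hedged sketch (the variable and function names are hypothetical):

```c
/* Remember the previous reading and subtract: that is the whole
   "time-delta between frames" feature, done in user code. */
static double previous_time;

static double frame_delta(double current_time)
{
    double delta = current_time - previous_time;
    previous_time = current_time;
    return delta;
}
```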
- 1 microsecond (or better) base measurement (ints ARE handier
I don’t like the idea of a fixed unit. I would like an interface where you
can ask how long one unit is, and the unit can be variable. Either that,
or use floating point with seconds as the unit. I use floats. I do not know
of any reason not to use floats.
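The variable-unit interface suggested above could look like the following sketch. The names and the 1 µs figure are assumptions for illustration, not part of any existing API:

```c
/* The timer reports integer ticks; a separate query tells how long
   one tick is. Converting to float seconds is one multiplication,
   so both camps (ints and floats) are served. */
typedef unsigned long long ticks_t;

static double seconds_per_tick = 1e-6; /* assumed: 1 microsecond ticks */

static double ticks_to_seconds(ticks_t t)
{
    return (double)t * seconds_per_tick;
}
```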
- the ability to detect when frames have been lost - and how many
All I need is the ability to get the current time. Frames are something the
library shouldn’t care about. I do use an SDL timer so that it tries to
run at fixed intervals. The simulation will advance the simulation time
by a fixed amount even if the simulation timer fails to run at that speed.
Effectively this means that the simulation slows down, but for me that
is acceptable. I could, if I wanted to, notice when the time delta between
frames (or several frames) indicates that frames were lost, and do
something about it. It is easy to detect lost frames just by
using the accurate time.
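As a sketch of that last point: if the measured delta spans several fixed intervals, the extra intervals are the lost frames. The function name and rounding choice are mine:

```c
/* Lost-frame detection from the accurate time alone: divide the
   measured delta by the fixed interval and round to the nearest
   whole number of intervals; anything beyond one is a lost frame. */
static int lost_frames(double delta, double interval)
{
    int total = (int)(delta / interval + 0.5);
    return total > 1 ? total - 1 : 0;
}
```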
- the ability to detect real latency in measurements (how?)
This is not possible; if we provide the most precise timer in the
system, we have no tools to measure its latency with any additional
precision. But there is very little latency when using CPU cycle
counters; the value is correct when you read it, the best possible
estimate of the time at that moment. If for some reason your process is
scheduled out right after you have read the value, it won’t hurt much,
because the delay will simply count toward the age of the next
simulation frame.
So I expect that the SDL timer does its best to run my timer function at
fixed intervals, but I accept it being slightly inaccurate. I can always
measure the real time between SDL timer callbacks. Though, I currently
use this only for the fps counter. The actual simulation updates the
simulated time at fixed intervals; in fact, discrete incremental
simulation does not work correctly unless you make it that way.
It is by design that the simulation timer thread is light enough to run
as often as required, and that it runs more often than screen updates.
If you have problems running the simulation at some speed, you should
measure the overall system performance and choose the correct simulation
interval, which is fixed once you have chosen it. I don’t skip frames if
I am behind schedule; the simulation just slows down.
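The fixed-timestep behaviour described above can be sketched as follows. Names and the 10 ms interval are illustrative assumptions, not the actual implementation:

```c
/* Simulated time always advances by the same fixed amount per tick.
   If the timer callback runs late, no frames are skipped; the
   simulation simply falls behind real time (i.e. slows down),
   which keeps a discrete incremental simulation deterministic. */
static double simulated_time = 0.0;
static const double SIM_INTERVAL = 0.01; /* assumed: chosen once, then fixed */

static void simulation_tick(void)
{
    /* ... advance the world state by SIM_INTERVAL here ... */
    simulated_time += SIM_INTERVAL;
}
```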
- the ability to detect the real coherency (samples/second)
-> perhaps using RDTSC under intel/pentium II+, how on
other architectures?
I guess you mean querying the timer resolution? Yes, that would be handy.
I thought otherwise), the idea seemed doable. There’s no reliable way
to read the number of rendered frames/second on a graphics card - and it
can float depending on system/bus load as well, so making sure of taking
I can count the number of frames I have drawn during one second. I can
measure the exact time spent doing this. And I would call my method
reliable, assuming I have the precise time available, of course.
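That counting method can be sketched like this. The names are hypothetical, and the time argument stands for any precise time source such as the one discussed earlier:

```c
/* Count frames drawn; once at least one second of real time has
   passed, divide the count by the exact elapsed time. Returns the
   new fps reading, or -1.0 while the current period is still open. */
static int    frames_drawn;
static double period_start;

static double update_fps(double now)
{
    double fps = -1.0;
    frames_drawn++;
    if (now - period_start >= 1.0) {
        fps = frames_drawn / (now - period_start);
        frames_drawn = 0;
        period_start = now;
    }
    return fps;
}
```

Dividing by the measured elapsed time, rather than assuming exactly one second passed, is what makes the method reliable under varying load.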
– Timo Suoranta – @Timo_K_Suoranta –