I’ve been trying to find a remedy to my current problem with
SDL_GetTicks under Linux and failing. The only thing I can figure is
that it uses gettimeofday under Linux and timeGetTime under Windows,
both of which return the time in milliseconds but lack a sufficient
level of accuracy for games. This would also explain the unfortunate
hiccups I can’t seem to get rid of in Linux programs using OpenGL.
Have you analyzed the timestamps you get from SDL_GetTicks() - or
rather, the deltas? Unless you’re running SCHED_FIFO or similar on a
realtime OS (which usually requires root/sysadmin privileges and
doesn’t mix well, if at all, with the video subsystem), you should
expect jitter of a few ms.
(Note that this 10 ms granularity does not apply to SDL_GetTicks();
only to SDL_Delay()! The granularity of SDL_GetTicks() should be 1
ms on all platforms, AFAIK.)
For perfectly smooth animation, you’ll need to apply some filtering
before you inject the timestamps into the game logic. If you know the
display refresh rate, you could essentially calculate a fixed
per-frame delta, and just keep track of missed frames. After all,
there is no such thing as “fractional frames” on current CRT and
TFT displays.
Are you using proper page flipping with retrace sync? If not, there
are two major issues that make smooth animation pretty much
impossible:
1) Since you never sleep and just pump out frames as fast as
   possible (most of which will never be seen), the OS will
   consider your program a CPU hog, and will gladly hand the CPU
   to any background process that has work to do. More seriously,
   you may not get the CPU back for a (relatively speaking) long
   time.

2) Animation without retrace sync invariably results in tearing
   and unstable, hard-to-track timing. You can get a reasonable
   approximation of smooth animation if you do things correctly,
   and if you get an insane frame rate (a few hundred fps or
   more), you'll even reduce the tearing quite a bit, but for
   all practical purposes, it can never be perfectly smooth,
   nor tearing-free.
Now, before starting the serious hair pulling, do any games run
perfectly smoothly on your system?
BTW, note that scrolling 2D games are generally much more sensitive
than first person 3D games. The difference in scale and perspective
generally makes tearing, low frame rates and dropped frames a bit
less obvious in 3D games.
What I will do for the time being is use my own code based on
QueryPerformanceCounter and clock_gettime, but are my assumptions
correct and, if so, why is this the case?
Well, you can try it, but unless there’s something wrong with
SDL_GetTicks() or the underlying API on your system, I don’t think
it’s going to help.
It’s not going to eliminate the scheduling latency jitter, and even
if there were no jitter, improving on a resolution that already gives
you 10+ ticks per display refresh wouldn’t make all that much of a
difference.
If it does help, I suspect SDL_GetTicks() is broken on your system.
BTW, QueryPerformanceCounter and similar APIs are usually based on
RDTSC and corresponding CPU instructions, and they tend to have
problems on SMP systems, due to the CPUs not booting at the same
time, and/or drifting out of sync over time. Oh, and then there’s
thermal throttling… Technically, these issues can be dealt with by
the OS - but certain versions of Windows and Linux (AFAIK) fail to do
so, rendering these benchmarking APIs pretty much useless for
production code.
//David Olofson - Programmer, Composer, Open Source Advocate
.------- http://olofson.net - Games, SDL examples -------.
| http://zeespace.net - 2.5D rendering engine |
| http://audiality.org - Music/audio engine |
| http://eel.olofson.net - Real time scripting |
’-- http://www.reologica.se - Rheology instrumentation --'

On Wednesday 17 January 2007 15:07, Paul Duffy wrote: