Timing issues with SDL on Linux (OpenGL app)

I’ve got an OpenGL engine I’ve been playing with for a few years; up
until tonight it used GLUT for keyboard input and window creation. I
wanted proper key-up/key-down events, so I ported it to SDL.

The GL part works perfectly: the window comes up, the key events work
properly and all is happy with the world, except that the motion is
very, very jerky.

I’ve narrowed it down to timer issues: objects in my engine move a
distance proportional to the most recent frame time so that motion is
consistent across varying frame rates. If the measured frame time is
wrong, things move the wrong distance. My source of time information
is ftime() for the first second, then rdtsc once the engine has
calibrated the TSC rate against it. It’s not an SMP system, so rdtsc
should be consistent; disabling rdtsc doesn’t help my problem anyway.
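
For reference, the timing scheme looks roughly like the sketch below;
the names and structure are illustrative, not my exact engine code:

#include <stdint.h>
#include <sys/timeb.h>

/* Read the CPU timestamp counter (x86). */
static inline uint64_t read_tsc(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__ ("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

/* Wall-clock seconds from ftime(), millisecond resolution. */
static double wall_seconds(void)
{
    struct timeb tb;
    ftime(&tb);
    return tb.time + tb.millitm / 1000.0;
}

/* Estimate the TSC rate by sampling both clocks about a second
   apart; after this, frame deltas come from rdtsc alone. */
double calibrate_ticks_per_second(void)
{
    double   w0 = wall_seconds();
    uint64_t t0 = read_tsc();
    double   w1;
    while ((w1 = wall_seconds()) - w0 < 1.0)
        ;   /* the real engine keeps rendering during this second */
    return (read_tsc() - t0) / (w1 - w0);
}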

When I use SDL to create my window, the time information returned by
both ftime and (this is the bit I REALLY don’t understand) rdtsc
occasionally jumps ahead by about 60ms, so the motion keeps jumping
ahead.

Typical framerate is 150fps, so the frame times returned are ~6.7ms.
Occasionally one comes back saying 65ms has elapsed, and this causes
the jerking. Note that things jerk ahead of their proper position
rather than lagging behind it, and that if I ignore the clock output
and force the time intervals to be constant for simulation purposes,
the rendering is perfectly smooth. The system isn’t actually pausing
or dropping frames; somehow the clock is jumping ahead.

If I go back to using GLUT instead of SDL to open the GL window,
timing goes back to normal. Turning off optimisation didn’t help at
all, and neither did using SDL_GetTicks() as my source of time.
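
For what it’s worth, the SDL-based measurement is essentially this (a
simplified sketch, not the exact code):

#include <SDL.h>

/* Per-frame delta from SDL_GetTicks(); the 0.062s outliers below
   show up directly in this value. */
static double sdl_frame_delta(void)
{
    static Uint32 last;
    Uint32 now = SDL_GetTicks();   /* ms since SDL was initialised */
    double dt  = (last != 0) ? (now - last) / 1000.0 : 0.0;
    last = now;
    return dt;
}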

If I put in a print to stdout and stdout is a terminal, the timing
becomes a little more consistent, but it’s slow because the terminal
is updating. Redirecting stdout to a file brings the timing back to
inconsistent. Here’s an example where I print out the measured frame
times derived from SDL, with the problem value in the middle of each
sequence:
fdt 0.007
fdt 0.007
fdt 0.062
fdt 0.005
fdt 0.006

Or derived from rdtsc:
fdt 0.00679732
fdt 0.00671102
fdt 0.06334
fdt 0.00524602
fdt 0.00523732

As a practical matter, I’ve solved the jerkiness by taking the sim
timestep to be the average measured frame time over the previous
100ms; the fact that this makes it smooth proves, I think, that it’s
a time-measurement issue, not merely lag and frame dropping.
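
The smoothing is essentially a windowed average; in sketch form (the
buffer handling here is illustrative):

/* Average the deltas covering roughly the last 100ms and use that
   as the simulation timestep. */
#define MAX_SAMPLES 64

static double samples[MAX_SAMPLES];   /* newest delta at index 0 */
static int    count;

double smoothed_delta(double dt)
{
    double sum = dt, span = dt;
    int    used = 1, i;

    /* Sum recent deltas until ~100ms of history is covered. */
    for (i = 0; i < count && span < 0.1; ++i) {
        sum  += samples[i];
        span += samples[i];
        ++used;
    }

    /* Push the new delta onto the front of the history. */
    if (count < MAX_SAMPLES)
        ++count;
    for (i = count - 1; i > 0; --i)
        samples[i] = samples[i - 1];
    samples[0] = dt;

    return sum / used;
}

At ~150fps the window holds around 15 samples, so a single outlier is
diluted across the window rather than applied wholesale to one frame.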

Some system info:
detritus:~/prog/src/bogl% sdl-config --version
1.2.11
detritus:~/prog/src/bogl% gcc --version
i686-pc-linux-gnu-gcc (GCC) 3.4.6 (Gentoo 3.4.6-r1, ssp-3.4.5-1.0,
pie-8.7.9)
Copyright © 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is
NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR
PURPOSE.

It’s an Athlon 1800+ and ntpd is running, though I can’t see ntpd
changing the output of rdtsc.

So: what nasty stuff could SDL be doing to make the timing
inconsistent?

--
William Brodie-Tyrrell

Carpe Diem - fish of the day.

http://www.brodie-tyrrell.org/