What functions does SDL_GetTicks use on each platform?

I've been trying to find a remedy to my current problem with
SDL_GetTicks under Linux and failing. The only thing I can figure is
that it uses gettimeofday under Linux and timeGetTime under Windows,
which both return the time in milliseconds but lack a sufficient
level of accuracy for games. This would also explain the unfortunate
hiccups I can't seem to get rid of in Linux programs using OpenGL.

What I will do for the time being is use my own code via
QueryPerformanceCounter and clock_gettime, but are my assumptions
correct and, if so, why is this the case?

“Duct tape is like the Force. It has a dark side, it has a
light side, and it holds the Universe together.”
-Carl Zwanzig

You know SDL is open source, right?

Doing this from the top level of SDL sources:

find ./ -name "*.c" -exec grep -H "Uint32 SDL_GetTicks" {} \;

tells you where the different implementations for the different OSs are.
You can use any “find in files” search you like btw.

Looking at the implementation in ./src/timer/unix/SDL_systimer.c you can
see that it is either using clock_gettime or gettimeofday based on the
HAVE_CLOCK_GETTIME define.
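
For reference, the selection looks roughly like this. This is a
paraphrase from memory rather than the verbatim SDL source; the name
my_getticks and the first-call initialisation are mine:

/* Sketch only: compile-time selection between the two clocks,
 * paraphrased from SDL 1.2's unix timer code. */
#include <time.h>      /* clock_gettime */
#include <sys/time.h>  /* gettimeofday */

typedef unsigned int Uint32;

Uint32 my_getticks(void)
{
#ifdef HAVE_CLOCK_GETTIME
    static struct timespec start = { 0, 0 };
    struct timespec now;
    clock_gettime(CLOCK_MONOTONIC, &now);  /* monotonic: immune to clock steps */
    if (start.tv_sec == 0 && start.tv_nsec == 0)
        start = now;
    return (Uint32)((now.tv_sec - start.tv_sec) * 1000
                    + (now.tv_nsec - start.tv_nsec) / 1000000);
#else
    static struct timeval start = { 0, 0 };
    struct timeval now;
    gettimeofday(&now, NULL);              /* wall clock: can jump if adjusted */
    if (start.tv_sec == 0 && start.tv_usec == 0)
        start = now;
    return (Uint32)((now.tv_sec - start.tv_sec) * 1000
                    + (now.tv_usec - start.tv_usec) / 1000);
#endif
}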

Hope that helps.

Cheers,
Kos

----- Original Message -----
From: sdl-bounces+kos=climaxgroup.com at libsdl.org On Behalf Of Paul Duffy
Sent: 17 January 2007 14:08
To: sdl at libsdl.org
Subject: [SDL] What functions does SDL_GetTicks use on each platform?


I’ve been trying to find a remedy to my current problem with
SDL_GetTicks under Linux and failing. The only thing I can figure is
that it uses gettimeofday under Linux and timegettime under Windows
which both return the amount in milliseconds but lack a sufficient
level of accuracy for games. This would also explain the unfortunate
hiccups I can’t seem to get rid of in Linux programs using OpenGL.

Have you analyzed the timestamps you get from SDL_GetTicks() - or
rather, the deltas? Unless you’re running SCHED_FIFO or similar on a
realtime OS (which usually requires root/sysadmin privileges and
doesn’t mix well, if at all, with the video subsystem), you should
expect jittering of a few ms.

(Note that the usual 10 ms scheduling granularity does not apply to
SDL_GetTicks(); only to SDL_Delay()! The granularity of SDL_GetTicks()
should be 1 ms on all platforms, AFAIK.)

For perfectly smooth animation, you'll need to apply some filtering
before you inject the timestamps into the game logic. If you know the
display refresh rate, you could essentially calculate a fixed
per-frame delta, and just keep track of missed frames. After all,
there are no such things as “fractional frames” on current CRT and
TFT displays.
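
As a sketch of that idea (with hypothetical render_frame() and
advance_game_logic() stand-ins, and an assumed 60 Hz refresh):

/* Sketch: feed the game logic a fixed per-frame delta, counting missed
 * frames, instead of raw timestamps. render_frame() and
 * advance_game_logic() are hypothetical stand-ins. */
#include "SDL.h"

extern void render_frame(void);             /* hypothetical: draw + swap */
extern void advance_game_logic(double ms);  /* hypothetical: fixed step(s) */

void main_loop(void)
{
    const double frame_ms = 1000.0 / 60.0;  /* assumed 60 Hz refresh */
    Uint32 last = SDL_GetTicks();

    for (;;)
    {
        render_frame();
        Uint32 now = SDL_GetTicks();

        /* Round the raw delta to a whole number of refresh periods... */
        int frames = (int)((now - last) / frame_ms + 0.5);
        if (frames < 1)
            frames = 1;                     /* ...but never less than one */

        advance_game_logic(frames * frame_ms);
        last = now;
    }
}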

Are you using proper page flipping with retrace sync? If not, there
are two major issues that make smooth animation pretty much
impossible:
1) Since you never sleep and just pump out frames as fast as possible
   (most of which will never be seen), the OS will consider your
   program a CPU hog, and will gladly hand the CPU to any background
   process that has work to do; more seriously, you may not get the
   CPU back for a (relatively speaking) long time.

2) Animation without retrace sync invariably results in tearing and
   unstable, hard-to-track timing. You can get a reasonable
   approximation of smooth animation if you do things correctly, and
   if you get an insane frame rate (a few hundred fps or more), you'll
   even reduce the tearing quite a bit, but for all practical
   matters, it can never be perfectly smooth, nor tearing free.
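
(For reference, requesting retrace-synced flips with SDL 1.2 + OpenGL
looks something like the sketch below. SDL_GL_SWAP_CONTROL only exists
in reasonably recent 1.2.x releases, around 1.2.10 IIRC, and the
driver may still override it.)

/* Sketch: asking SDL 1.2 for double buffering + retrace-synced swaps. */
#include "SDL.h"

int init_video(void)
{
    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_TIMER) < 0)
        return -1;

    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);  /* double buffering */
    SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 1);  /* sync swaps to retrace */

    if (!SDL_SetVideoMode(640, 480, 0, SDL_OPENGL))
        return -1;

    return 0;  /* now SDL_GL_SwapBuffers() should block until the retrace */
}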

Now, before starting the serious hair pulling, do any games run
perfectly smooth on your system?

BTW, note that scrolling 2D games are generally much more sensitive
than first person 3D games. The difference in scale and perspective
generally makes tearing, low frame rates and dropped frames a bit
less obvious in 3D games.

What I will do for the time being is use my own code via
QueryPerformanceCounter and clock_gettime, but are my assumptions
correct and, if so, why is this the case?

Well, you can try it, but unless there’s something wrong with
SDL_GetTicks() or the underlying API on your system, I don’t think
it’s going to help.

It’s not going to eliminate the scheduling latency jitter, and even if
there was no jitter, improving on a timing accuracy of 10+ units per
display refresh wouldn’t make all that much of a difference.

If it does help, I suspect SDL_GetTicks() is broken on your system.

BTW, QueryPerformanceCounter and similar APIs are usually based on
RDTSC and corresponding CPU instructions, and they tend to have
problems on SMP systems, due to the CPUs not booting at the same
time, and/or drifting out of sync over time. Oh, and then there’s
thermothrottling… Technically, these issues can be dealt with by
the OS - but certain versions of Windows and Linux (AFAIK) fail to do
so, rendering these benchmarking APIs pretty much useless for
production code.
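
(To be concrete about what these APIs boil down to on x86: something
like the sketch below, which reads the TSC directly via GCC inline
asm. The value is raw core-clock cycles, which is exactly why
frequency changes and per-CPU drift wreck it as a timebase.)

/* Sketch: reading the x86 time stamp counter directly with GCC.
 * The value is raw core-clock cycles since reset, per CPU. */
static inline unsigned long long read_tsc(void)
{
    unsigned int lo, hi;
    __asm__ __volatile__ ("rdtsc" : "=a" (lo), "=d" (hi));
    return ((unsigned long long)hi << 32) | lo;
}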

//David Olofson - Programmer, Composer, Open Source Advocate

.------- http://olofson.net - Games, SDL examples -------.
| http://zeespace.net - 2.5D rendering engine |
| http://audiality.org - Music/audio engine |
| http://eel.olofson.net - Real time scripting |
'-- http://www.reologica.se - Rheology instrumentation --'

On Wednesday 17 January 2007 15:07, Paul Duffy wrote:

Have you analyzed the timestamps you get from SDL_GetTicks() - or
rather, the deltas? Unless you’re running SCHED_FIFO or similar on a
realtime OS (which usually requires root/sysadmin privileges and
doesn’t mix well, if at all, with the video subsystem), you should
expect jittering of a few ms.

That, I expect and have compensated for. What I get is a visible pause
for a sizeable fraction of a second and, of course, a jump of all
animation as I’m basing all animation time on SDL_GetTicks.

The time deltas I print are usually around 3-7 ticks,
but every few seconds I’ll get one of about 56, 76 or even 136. While
this is happening, my CPU utilisation is between 0% and 9% so it’s
hardly a processor hog.

(Note that the usual 10 ms scheduling granularity does not apply to
SDL_GetTicks(); only to SDL_Delay()! The granularity of SDL_GetTicks()
should be 1 ms on all platforms, AFAIK.)

Right, I’m still a little fuzzy on the different behaviour on different
platforms for various functions.

Are you using proper page flipping with retrace sync?

Well, I’m using proper OpenGL double buffering but I’m not using vSync.

However, using something like SDL_Delay has no effect, as this is quite
the sizeable pause. What I was hoping was that the problem was
something to do with using a time function that was insufficiently
accurate. This does not, however, appear to be the case.

Now, before starting the serious hair pulling, do any games run
perfectly smooth on your system?

Oh yes. Neverball, which is programmed with SDL so far as I'm aware, runs
dandy, and it does a bit more than my effort to boot.

If it does help, I suspect SDL_GetTicks() is broken on your system.

Oh joy. I’ve been spending all this time thinking I was missing
something. I did post about this problem some time back, to no avail. I
thought this was a Linux problem as I have no such issue in Windows.

It seems the other reply may have the answer as to why I’m having this
problem.

I might see if swapping SuSE 10.0 for 10.2 makes any difference.

Thanks for your help.

On Wed, 2007-01-17 at 15:56 +0100, David Olofson wrote:



You know SDL is open source, right?

Oh yes, just didn’t think it’d be that easy to find the answer in all
that source code :o)

Looking at the implementation in ./src/timer/unix/SDL_systimer.c you can
see that it is either using clock_gettime or gettimeofday based on the
HAVE_CLOCK_GETTIME define.

Hope that helps.

Oh yes, having clobbered the source out of opensuse.org I have
discovered that my SDL-devel is ‘special’ and uses RDTSC in place of
clock_gettime, then comments it out because of its unreliability on
multi-CPU systems.

It seems that the minimal amount of advice I've found pointing to this
method being a tad unreliable might well be right, much like it was a
good year before I found anything authoritative on PHP's gaping
security holes.

Of course, what this does mean is that it’s only ever going to use
gettimeofday. This may bear further investigation.

*sigh* I'll have a poke at the SuSE 10.2 code and see if it's worth
upgrading, I probably should at some point.

Thanks for your help.

On Wed, 2007-01-17 at 14:15 +0000, Kostas Kostiadis wrote:



[…]

The time deltas I print are usually around 3-7
ticks, but every few seconds I’ll get one of about 56, 76 or even
136. While this is happening, my CPU utilisation is between 0% and
9% so it’s hardly a processor hog.

That is strange, indeed - especially if throwing in an SDL_Delay(10)
doesn’t help. That usually makes a bigger difference on Linux than it
does on Windows, probably because Linux counts actual CPU time,
whereas Windows apparently bases its "CPU usage" on the frequency of
certain system calls.

[…]

Now, before starting the serious hair pulling, do any games run
perfectly smooth on your system?

Oh yes. Neverball, which is programmed with SDL so far as I'm aware,
runs dandy, and it does a bit more than my effort to boot.

…and it runs fine on the very machine and operating system where
your code has problems? If so, maybe it’s worth a try recompiling
Neverball, making sure it’s really using the same SDL library that
you’re using, and see if it still runs smoothly.

If it does help, I suspect SDL_GetTicks() is broken on your
system.

Oh joy. I’ve been spending all this time thinking I was missing
something.

Well, it could be something seemingly unrelated, such as event
handling, and code that doesn't even use SDL. Have you located the
exact spot where time is “lost”? Could it be some unintentionally
blocking system call or something?

Have you tried commenting out stuff from the main loop until you have
virtually nothing but the timing code left, and/or until the problem
goes away?

I did post about this problem sometime back to no avail. I
thought this was a Linux problem as I have no such issue in Windows.

Any platform specific #ifdefs in your code…?

It seems the other reply may have the answer as to why I’m having
this problem.

I might see if swapping SuSE 10.0 for 10.2 makes any difference.

Thanks for your help.

No problem.

//David Olofson - Programmer, Composer, Open Source Advocate

.------- http://olofson.net - Games, SDL examples -------.
| http://zeespace.net - 2.5D rendering engine |
| http://audiality.org - Music/audio engine |
| http://eel.olofson.net - Real time scripting |
'-- http://www.reologica.se - Rheology instrumentation --'

On Wednesday 17 January 2007 18:34, Paul Duffy wrote:

Oh yes, just didn’t think it’d be that easy to find the answer in all
that source code :o)

I know exactly what you mean ;-) Been there, done that myself.
Eventually, “getting your hands dirty” is quite beneficial because you learn
a lot of stuff in the process. It may seem a bit slower, but it’s
definitely worth it.

Thanks for your help.

No probs…good luck. Let us know if you find out what the problem was.

Cheers,
K.

[…]

Oh yes, having clobbered the source out of opensuse.org I have
discovered that my SDL-devel is ‘special’ and uses RDTSC in place of
clock_gettime, then comments it out because of its unreliability on
multi-CPU systems.

It seems that the minimal amount of advice I've found pointing to
this method being a tad unreliable might well be right, much like it
was a good year before I found anything authoritative on PHP's
gaping security holes.

Of course, what this does mean is that it’s only ever going to use
gettimeofday. This may bear further investigation.

*sigh* I'll have a poke at the SuSE 10.2 code and see if it's worth
upgrading, I probably should at some point.

AFAIK, some implementations of gettimeofday() use RDTSC - and indeed,
some versions have had problems with thermothrottling and SMP
machines… So, SDL might not be the only place to look.

Maybe you could try some LiveCD, just to see if it makes a difference?

//David Olofson - Programmer, Composer, Open Source Advocate

.------- http://olofson.net - Games, SDL examples -------.
| http://zeespace.net - 2.5D rendering engine |
| http://audiality.org - Music/audio engine |
| http://eel.olofson.net - Real time scripting |
'-- http://www.reologica.se - Rheology instrumentation --'

On Wednesday 17 January 2007 18:44, Paul Duffy wrote:

That is strange, indeed - especially if throwing in an SDL_Delay(10)
doesn’t help.

Actually, I was using a delay of 1 but…

…and it runs fine on the very machine and operating system where
your code has problems?

Same login session even.

If so, maybe it’s worth a try recompiling
Neverball, making sure it’s really using the same SDL library that
you’re using, and see if it still runs smoothly.

I'll look into that, although I may have identified the problem, which
does not appear to apply to current releases; I'm on 1.2.8-8 SuSE right
now, and 1.2.11-22 (SuSE 10.2) appears to have properly dumped RDTSC in
favour of clock_gettime rather than just disabling it and reverting to
gettimeofday. I was right about one thing though: this version is almost
certainly using gettimeofday, and it really /isn't/ cutting the mustard.

Well, it could be something seemingly unrelated, such as event
handling, and code that doesn't even use SDL. Have you located the
exact spot where time is “lost”? Could it be some unintentionally
blocking system call or something?

Well, I ran the attached code, and managed to get a delay of 178 at one
point. Not sure how much more I could pare it down :oD

Any platform specific #ifdefs in your code…?

Not in this example.

Achh, seems this particular version is properly knackered. Thanks again.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: test2.cpp
Type: text/x-c++src
Size: 3301 bytes
URL: http://lists.libsdl.org/pipermail/sdl-libsdl.org/attachments/20070117/34906a6a/attachment.cpp

On Wed, 2007-01-17 at 18:49 +0100, David Olofson wrote:

AFAIK, some implementations of gettimeofday() use RDTSC - and indeed,
some versions have had problems with thermothrottling and SMP
machines… So, SDL might not be the only place to look.

Cack. Yes it does, according to the kernel source notes. I think this is
less of an SMP problem, seeing as I don't even have dual core, and more
of an ‘is the CPU busy?’ problem. If I can manage to get it to use
clock_gettime, even if this means bloody forcing it, I think it’ll be
OK.

Maybe you could try some LiveCD, just to see if it makes a difference?

I suppose that has its appeal :op

On Wed, 2007-01-17 at 18:59 +0100, David Olofson wrote:



[…]

Well, it could be something seemingly unrelated, such as event
handling, and code that doesn't even use SDL. Have you located the
exact spot where time is “lost”? Could it be some unintentionally
blocking system call or something?

Well, I ran the attached code, and managed to get a delay of 178 at
one point.

Ouch…

Right; I almost forgot: Have you tried removing the SDL_Delay()
altogether, and/or using a fixed time delta for the animation? That
is, is your application stalled for 100+ ms every now and then, or is
it "just" getting bogus timestamps?

Not sure how much more I could pare it down :oD

Well… ;-)

On Wednesday 17 January 2007 19:30, Paul Duffy wrote:

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    Uint32 next = SDL_GetTicks() + 10;
    while(1)
    {
        Uint32 tick = SDL_GetTicks();
        if(tick > next + 1)
            printf("Late by %u ms!\n", tick - next);
        next = tick + 10;
        SDL_Delay(10);
    }
}

Here (Gentoo Linux/AMD64, kernel 2.6.17) I got at most 24 ms when
moving windows around, messing with Bon Echo (Firefox relative) and
stuff. Normally, it stays silent for a few seconds, prints “Late by 2
ms!" 10-15 times in about one second, stays silent for a few seconds,
gets an 11 ms peak when the SLAY Radio page autorefreshes, goes
silent again etc.

(And BTW, SDL_Delay(1) indeed sleeps here. I get the same kind of
behavior, only slightly more of these “Late by 2 ms!”. Moving windows
and stuff doesn’t cause any more trouble than with the 10 ms “frame
rate”. Busy-waiting OTOH results in peaks of several hundred ms every
now and then - and the CPU fan makes the difference very audible as
well.)

//David Olofson - Programmer, Composer, Open Source Advocate

.------- http://olofson.net - Games, SDL examples -------.
| http://zeespace.net - 2.5D rendering engine |
| http://audiality.org - Music/audio engine |
| http://eel.olofson.net - Real time scripting |
’-- http://www.reologica.se - Rheology instrumentation --’

— David Olofson wrote:

Right; I almost forgot: Have you tried removing the SDL_Delay()
altogether, and/or using a fixed time delta for the animation? That
is, is your application stalled for 100+ ms every now and then, or is
it "just" getting bogus timestamps?

The latter. It’ll run fine for a few seconds then blip, fine for some
other random interval, blip. It’s not really a case of bogus
timestamps though as that’s how long it actually takes the function to
return a value. The majority time interval is 12 microseconds with
SDL_Delay(10).

This, as mentioned elsewhere, is probably because the SDL libs I'm
relying on are muntered and so end up only able to use gettimeofday
rather than clock_gettime and, through this, are relying on RDTSC,
which is based on the CPU's ability to answer such a query at any
given time. For some reason I don't know, this makes it inherently
unreliable, as has been said of QueryPerformanceCounter from the
Microsoft Multimedia SDK, which uses the same method.

Newer versions of the lib seem to handle this stuff properly so I’ll be
moving substantially from 1.2.8-8 to 1.2.11-22; been meaning to install
SuSE 10.2 on my proper box for a few weeks now.

Well… ;-)

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    Uint32 next = SDL_GetTicks() + 10;
    while(1)
    {
        Uint32 tick = SDL_GetTicks();
        if(tick > next + 1)
            printf("Late by %u ms!\n", tick - next);
        next = tick + 10;
        SDL_Delay(10);
    }
}

Ah, you mean SDL doesn’t choke if you don’t provide a surface or open a
window? Didn't know that.

[…]

Right; I almost forgot: Have you tried removing the SDL_Delay()
altogether, and/or using a fixed time delta for the animation?
That is, is your application stalled for 100+ ms every now and
then, or is it "just" getting bogus timestamps?

The latter. It’ll run fine for a few seconds then blip, fine for
some other random interval, blip. It’s not really a case of bogus
timestamps though as that’s how long it actually takes the function
to return a value.

So, the application is actually stalling, then?

Are you sure it’s SDL_GetTicks()? (I have a hard time seeing how it
could block or otherwise take a long time to return…)

I'd suggest removing all timing code to see if your rendering loop runs
full speed without it. That is, use the rendering to make it possible
to see what’s going on, as you obviously cannot trust the
timestamps.

Then you can insert the SDL_Delay(10) and see if that causes problems.
If not, try removing it and throwing in a dummy SDL_GetTicks().

If either call causes the problem to return, the problem is indeed
that one of these calls runs off to do something weird every
now and then.
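
One way to separate "the call stalls" from "the call lies" is to
bracket SDL_GetTicks() with an independent clock. A sketch, assuming
POSIX clock_gettime is available (link with -lrt on older glibc):

/* Sketch: timing SDL_GetTicks() itself with an independent clock, to
 * tell a stalling call apart from one returning bogus values. */
#include <stdio.h>
#include <time.h>
#include "SDL.h"

static double now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000.0 + ts.tv_nsec / 1e6;
}

int main(int argc, char *argv[])
{
    for (;;)
    {
        double before = now_ms();
        Uint32 ticks = SDL_GetTicks();  /* the call under suspicion */
        double cost = now_ms() - before;

        if (cost > 1.0)                 /* a plain read should be near-instant */
            printf("SDL_GetTicks() took %.1f ms (returned %u)\n",
                   cost, ticks);

        SDL_Delay(10);
    }
    return 0;
}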

The majority time interval is 12 microseconds
with SDL_Delay(10).

This, as mentioned elsewhere, is probably because the SDL libs I'm
relying on are muntered and so end up only able to use gettimeofday
rather than clock_gettime and, through this, are relying on RDTSC,
which is based on the CPU's ability to answer such a query at any
given time.

RDTSC is a very fast instruction that just reads the value of a 64 bit
counter that’s incremented by the core clock. AFAIK, there’s no way
it could possibly make the CPU stall, cause a context switch or
whatever - but if the motherboard is throttling the CPU core clock
(for heat protection and/or power saving), you may have a hard time
getting anything sensible out of the values…

For some reason I don't know, this makes it inherently
unreliable, as has been said of QueryPerformanceCounter from the
Microsoft Multimedia SDK, which uses the same method.

Like I said, the reason for these problems is that thermothrottling
causes the core clock (and thus, the TSC “speed”) to
change "randomly"; on top of that, SMP systems have issues with the core
clocks differing between CPUs.

Newer versions of the lib seem to handle this stuff properly so I’ll
be moving substantially from 1.2.8-8 to 1.2.11-22; been meaning to
install SuSE 10.2 on my proper box for a few weeks now.

Well, that’s probably a good idea either way. It definitely seems like
something’s broken in the version you’re using.

Well… ;-)

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    Uint32 next = SDL_GetTicks() + 10;
    while(1)
    {
        Uint32 tick = SDL_GetTicks();
        if(tick > next + 1)
            printf("Late by %u ms!\n", tick - next);
        next = tick + 10;
        SDL_Delay(10);
    }
}

Ah, you mean SDL doesn’t choke if you don’t provide a surface or
open a window? Didn’t know that.

Well, yes and no…

Generally, you’re not supposed to use any SDL calls before you’ve
initialized SDL. Some calls may not work or could even crash on some
platforms, unless you have a display up, for example.

However, many calls are pretty much wired directly to the underlying
OS, and don’t care whether or not SDL has been initialized.

Even so, I wouldn’t really recommend doing this sort of stuff in
production code, as it may well break with future versions of SDL.
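
The portable pattern would be something like this minimal sketch,
initialising at least the timer subsystem before touching any timing
calls:

/* Sketch: initialise SDL's timer subsystem before using SDL_GetTicks(). */
#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_TIMER) < 0)  /* timer only; no window required */
    {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    printf("Ticks since init: %u\n", SDL_GetTicks());
    SDL_Quit();
    return 0;
}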

//David Olofson - Programmer, Composer, Open Source Advocate

.------- http://olofson.net - Games, SDL examples -------.
| http://zeespace.net - 2.5D rendering engine |
| http://audiality.org - Music/audio engine |
| http://eel.olofson.net - Real time scripting |
'-- http://www.reologica.se - Rheology instrumentation --'

On Wednesday 17 January 2007 21:37, Paul Duffy wrote:

In this particular vein, you might very well be right. There are delays,
but they're substantially less noticeable without SDL_GetTicks involved,
so either these ones are occurring for a different reason or
SDL_GetTicks somehow amplifies the effect.

Unfortunately, I can’t pin it down to one function. I got a delay of 77
from:

void drawBackSpheres(void)
{
    fprintf(stderr, "Start of drawBackSpheres: %u\n", SDL_GetTicks());

    GLint i;

    // Draw the randomly located spheres
    glBindTexture(GL_TEXTURE_2D, textureObjects[SPHERE_TEXTURE]);

    for(i = 0; i < NUM_SPHERES; i++)
    {
        glPushMatrix();
            gltActorTransform(&spheres[i]);
            glCallList(dlBackSphere);
        glPopMatrix();
    }

    fprintf(stderr, "End of drawBackSpheres: %u\n", SDL_GetTicks());
}

gltActorTransform is taken (as is the tutorial I'm trying to adapt) from the OpenGL
SuperBible 3rd Edition. Some necessary files are available at
http://www.starstonesoftware.com/OpenGL/articles.htm under 'common.zip'.

The delays have no particular favourite place, occurring at the start, end, middle and
outside assorted functions. I might try to update the nVidia drivers (and by extension
the GL headers and libs) and see if that achieves anything as, aside from GetTicks,
that particular function is all GL and some basic maths.

On Wed, 2007-01-17 at 22:23 +0100, David Olofson wrote:

On Wednesday 17 January 2007 21:37, Paul Duffy wrote:
[…]

Right; I almost forgot: Have you tried removing the SDL_Delay()
altogether, and/or using a fixed time delta for the animation?
That is, is your application stalled for 100+ ms every now and
then, or is it "just" getting bogus timestamps?

The latter. It’ll run fine for a few seconds then blip, fine for
some other random interval, blip. It’s not really a case of bogus
timestamps though as that’s how long it actually takes the function
to return a value.

So, the application is actually stalling, then?

Are you sure it’s SDL_GetTicks()? (I have a hard time seeing how it
could block or otherwise take a long time to return…)

I'd suggest removing all timing code to see if your rendering loop runs
full speed without it. That is, use the rendering to make it possible
to see what’s going on, as you obviously cannot trust the
timestamps.



I might try to update the nVidia drivers (and by extension
the GL headers and libs) and see if that achieves anything as, aside from GetTicks,
that particular function is all GL and some basic maths.

Cobblers.

Well, that was interesting. Compiled my code on SuSE 10.2 and it runs
fine… OK, it runs as well as my code can be expected to run :oD Gotta
drop in a WaitForVSync (or equivalent) there to solve the tearing issue
for starters.

Average 8 ms, 12 ms tops, certainly not enough to be noticeable. Now to
ask why the SuSE repos don't have SDL_Net-devel, but that's a
question for another mailing list :o)

My new year's resolution is now: learn to debug properly, ya great
pillock.

On Thu, 2007-01-18 at 23:01 +0000, Paul Duffy wrote:

I might try to update the nVidia drivers (and by extension
the GL headers and libs) and see if that achieves anything as, aside from GetTicks,
that particular function is all GL and some basic maths.

Cobblers.


Inbox full of spam? Get leading spam protection and 1GB storage with All New Yahoo! Mail. http://uk.docs.yahoo.com/nowyoucan.html