Time stamped events, was Detecting Double Clicks

Mattias Engdegård wrote:

“Ryan C. Gordon” wrote:

do I even have to mention why this is an incredibly bad idea?

It’s good to know the sensitivity training is going well, Mattias. :)

I’m sorry if I sound a bit harsh at times, but if people can debate
endlessly about utterly trivial parts of SDL, problems which are
no problems, without ever thinking about it, how can they even
hope to understand the hard parts?

I see absolutely no reason not to use more or less exactly the
same SDL_GetTicks() in future SDL releases. It is simple, it is easy
to implement, it is easy to understand, it is easy to use, and you
don’t need 64-bit integers.

There are design mistakes in SDL (and I’m responsible for several of them),
and we know how to improve stuff if we get the chance.
SDL_GetTicks() is not a design mistake.

I’ve only been on this list a short time, but in that short time I’ve
seen you blow up like this and insult me and the other members of the
list several times. I’ve yet to see this happen over anything that was
worth insulting people over. Nowhere in the discussion about
SDL_GetTicks() did anyone say or imply that SDL_GetTicks() was in any way
a design flaw. You need to ask yourself why you think they did. And,
even if someone had said it was a design flaw, why do you react so
violently to such a suggestion?

I’d like to know what makes you think that I (we) haven’t thought about
the time problem in games and servers? You don’t know enough about me to
know that.

If you would like to start a discussion of things you think are
important, then please do so.

Bob Pendleton
--
+------------------------------------+
+ Bob Pendleton is seeking contract  +
+ and consulting work. Find out more +
+ at http://www.jump.net/~bobp       +
+------------------------------------+

Sam Lantinga wrote:

Does that imply that you are going to store events in a priority
queue and allow
future timestamped events to be placed in the queue?

No, but that’s an interesting idea. The main problem is that the
timestamp used by the windowing system (if any) may not have any
relationship to the value returned by SDL_GetTicks()

Yah, that is a problem. If you keep track of the timestamp on the
first event you get from the OS, and the SDL_GetTicks() value at the
time you got that event, you can use those to compute a bias. You
should then be able to sync new OS-generated timestamps with the
SDL_GetTicks() time by subtracting the bias from each timestamp, so
that your timestamps start at zero just like SDL_GetTicks().
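
Something like this, roughly (an untested sketch; the names are made up,
and it assumes the OS timestamps are in milliseconds and tick at the same
rate as SDL_GetTicks()):

#include "SDL.h"

static Uint32 bias;      /* os_timestamp - SDL_GetTicks() at first event */
static int have_bias = 0;

/* Map an OS event timestamp onto the SDL_GetTicks() timeline. */
Uint32 normalize_timestamp(Uint32 os_timestamp)
{
    if (!have_bias) {
        bias = os_timestamp - SDL_GetTicks();
        have_bias = 1;
    }
    return os_timestamp - bias;
}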

There may be problems with that on some platforms. What guarantees that
the timestamps and SDL_GetTicks() get their timing from the same hardware
timer/counter? (For example, if one reads from Win32 multimedia timers,
and the other reads system time or performance counters, there’s no
guarantee that there’s no drift.)

On the other hand, does anyone
write code that counts on the fact that SDL_GetTicks() restarts at zero
every time you start the program?

It does? ;)

I used to do discrete event simulations and those are based on the idea
of an event queue that is ordered by time. Several years ago I applied
the same ideas used in discrete event simulation to games and found
that they worked very well and they ensured that input events and game
generated events, such as “the torch burned out,” “the Ogre checked the
hall,” and so on took place in exact sequence with input actions. And
it leads to a very simple style of programming in which there are very
few timers, just events that invoke methods of objects. (Discrete event
simulation drove the development of object oriented programming while
GUIs drove its widespread adoption.)
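
The skeleton of that style is tiny. A rough sketch (hypothetical names,
with a sorted linked list standing in for a real priority queue):

#include "SDL.h"
#include <stddef.h>

typedef struct Event {
    Uint32 due;                       /* when it should fire, in ticks */
    void (*fire)(struct Event *self); /* the “method” invoked at that time */
    struct Event *next;               /* singly linked, sorted by due time */
} Event;

static Event *queue = NULL;

/* Insert keeping the list sorted by due time. */
void schedule(Event *e)
{
    Event **p = &queue;
    while (*p && (Sint32)((*p)->due - e->due) <= 0)
        p = &(*p)->next;
    e->next = *p;
    *p = e;
}

/* Call once per frame: fire everything whose time has come, in order. */
void run_due_events(void)
{
    Uint32 now = SDL_GetTicks();
    while (queue && (Sint32)(queue->due - now) <= 0) {
        Event *e = queue;
        queue = e->next;
        e->fire(e);
    }
}

(The signed-difference comparisons keep it correct across tick wraps.)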

Works great for soft synths and the like as well. Passing a single
ordered list of timestamped events to a plugin makes it easy to implement
sample accurate timing, without the overhead of breaking buffers up and
calling plugins multiple times.
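
For example, something like this (a sketch, not a real plugin API;
apply_event() is a stand-in for whatever state change an event encodes):

typedef struct {
    unsigned offset;    /* sample position within the current buffer */
    int      type;      /* note on/off, controller change, ... */
    float    value;
} PluginEvent;

static void apply_event(const PluginEvent *e)
{
    (void)e;            /* update synth state here */
}

void plugin_process(float *out, unsigned frames,
                    const PluginEvent *ev, unsigned nev)
{
    unsigned pos = 0, i = 0;
    while (pos < frames) {
        /* Render up to the next event, or to the end of the buffer. */
        unsigned end = (i < nev && ev[i].offset < frames)
                       ? ev[i].offset : frames;
        while (pos < end)
            out[pos++] = 0.0f;      /* generate audio with current state */
        /* Apply every event that lands exactly at this sample. */
        while (i < nev && ev[i].offset <= pos)
            apply_event(&ev[i++]);
    }
}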

Bob Pendleton

P.S.

If you wanted to use SDL to write a portable game server you would need
something like SDL_GetTicks() that didn’t roll over after 49 days.

Why? Just cast the tick values to Sint32, and calculate the delta time
every time you call SDL_GetTicks(). Then translate the delta time as
needed, and add it to a long double, Uint64, a pair of Uint32’s (like a
timeval struct) or something.
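
Something along these lines (untested; get_ticks64() is a name I just
made up, and Uint64 may be emulated on some platforms):

#include "SDL.h"

static Uint32 last_ticks;   /* previous SDL_GetTicks() reading */
static Uint64 total_ms;     /* or a pair of Uint32s, timeval-style */

/* Call at least once every 49 days so no wrap goes unnoticed. */
Uint64 get_ticks64(void)
{
    Uint32 now = SDL_GetTicks();
    total_ms += (Uint32)(now - last_ticks);   /* wrap-safe delta */
    last_ticks = now;
    return total_ms;
}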

Still, some tested, portable and ready-to-use solutions could be nice
to have around…

//David Olofson — Programmer, Reologica Instruments AB

.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture |
| A Free/Open Source Plugin API for Professional Multimedia |
`----------------------------> http://www.linuxdj.com/maia -'
.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-------------------------------------> http://olofson.net -'

On Thursday 17 January 2002 16:38, Bob Pendleton wrote:

Not much point, as you can calculate that yourself, as long as the result
is less than 2G ticks. (Never decoded an Amiga mouse directly from the
h/w regs? Those counters are only 8 bits… :)

static Sint32 last;                   /* previous reading; persists between calls */
Sint32 now = (Sint32)SDL_GetTicks();
Sint32 dt = now - last;               /* signed difference is wrap-safe */
last = now;

Besides, this method lets you run as many “delta timers” as you like,
whereas a specific delta time call would only work as long as you keep
track of every single call to it, or just use it in only one place in the
entire application.
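
That is, each “delta timer” is nothing but its own remembered tick value;
something like (sketch):

typedef struct { Sint32 last; } DeltaTimer;

Sint32 delta(DeltaTimer *t)
{
    Sint32 now = (Sint32)SDL_GetTicks();
    Sint32 dt = now - t->last;   /* correct across wraps, for dt < ~24.8 days */
    t->last = now;
    return dt;
}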

//David Olofson — Programmer, Reologica Instruments AB

.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture |
| A Free/Open Source Plugin API for Professional Multimedia |
`----------------------------> http://www.linuxdj.com/maia -'
.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-------------------------------------> http://olofson.net -'

On Thursday 17 January 2002 18:59, Trick wrote:

If you wanted to use SDL to write a portable game server you would
need something like SDL_GetTicks() that didn’t roll over after 49
days.

What about introducing a new SDL_GetTicks in SDL 1.3 which returns the
number of ticks since the last time SDL_GetTicks was called?

Ooooh… this may have been aimed at me. So let me rephrase myself.

I had responded saying I could make an SDL_GetTicks() that would be
accurate to the ns, so it could obviously be accurate to the ms (which
is what it would output).

For those who weren’t around at the time (sorry, I’ve been on this list
for a while and forget that sometimes), there was a good argument as to
why that shouldn’t be in there.

  1. Only under Windows
  2. Not guaranteed under Windows
  3. You are going to have the obvious complaints like “why isn’t this
    available on ____?”, “why can’t my SDL_Delay() be that accurate”, etc.

HEY! I’m reading Ryan C. Gordon’s response to “making 200Hz…” Did you
guys actually implement it? After all my arguing at the time, you used
it???

----- Original Message -----

From: f91-men@nada.kth.se (Mattias Engdegard)
To:
Sent: Thursday, January 17, 2002 2:31 PM
Subject: Re: time stamped events, was [SDL] Detecting Double Clicks

“Ryan C. Gordon” wrote:

do I even have to mention why this is an incredibly bad idea?

It’s good to know the sensitivity training is going well, Mattias. :)

I’m sorry if I sound a bit harsh at times, but if people can debate
endlessly about utterly trivial parts of SDL, problems which are
no problems, without ever thinking about it, how can they even
hope to understand the hard parts?

I see absolutely no reason not to use more or less exactly the
same SDL_GetTicks() in future SDL releases. It is simple, it is easy
to implement, it is easy to understand, it is easy to use, and you
don’t need 64-bit integers.

There are design mistakes in SDL (and I’m responsible for several of them),
and we know how to improve stuff if we get the chance.
SDL_GetTicks() is not a design mistake.



David Olofson wrote:

Sam Lantinga wrote:

Does that imply that you are going to store events in a priority
queue and allow
future timestamped events to be placed in the queue?

No, but that’s an interesting idea. The main problem is that the
timestamp used by the windowing system (if any) may not have any
relationship to the value returned by SDL_GetTicks()

Yah, that is a problem. If you keep track of the timestamp on the
first event you get from the OS, and the SDL_GetTicks() value at the
time you got that event, you can use those to compute a bias. You
should then be able to sync new OS-generated timestamps with the
SDL_GetTicks() time by subtracting the bias from each timestamp, so
that your timestamps start at zero just like SDL_GetTicks().

There may be problems with that on some platforms. What guarantees that
the timestamps and SDL_GetTicks() get their timing from the same hardware
timer/counter? (For example, if one reads from Win32 multimedia timers,
and the other reads system time or performance counters, there’s no
guarantee that there’s no drift.)

Very good point. On a PC there are only a few physical mechanisms for
tracking time. One is the hardware timer that has been in the PC since
the original and another is the TSC (Total Cycle Count?) register. So, I
suspect that any and all timers on the PC will stay in pretty good sync.
I admit I could be completely wrong on that and that it doesn’t apply to
any other kind of computer.

On the other hand, does anyone
write code that counts on the fact that SDL_GetTicks() restarts at zero
every time you start the program?

It does? ;)

According to the docs it returns the time in milliseconds since the SDL
library was initialized. Which is pretty close to what I said.

I used to do discrete event simulations and those are based on the idea
of an event queue that is ordered by time. Several years ago I applied
the same ideas used in discrete event simulation to games and found
that they worked very well and they ensured that input events and game
generated events, such as “the torch burned out,” “the Ogre checked the
hall,” and so on took place in exact sequence with input actions. And
it leads to a very simple style of programming in which there are very
few timers, just events that invoke methods of objects. (Discrete event
simulation drove the development of object oriented programming while
GUIs drove its widespread adoption.)

Works great for soft synths and the like as well. Passing a single
ordered list of timestamped events to a plugin makes it easy to implement
sample accurate timing, without the overhead of breaking buffers up and
calling plugins multiple times.

  Bob Pendleton

P.S.

If you wanted to use SDL to write a portable game server you would need
something like SDL_GetTicks() that didn’t roll over after 49 days.

Why? Just cast the tick values to Sint32, and calculate the delta time
every time you call SDL_GetTicks(). Then translate the delta time as
needed, and add it to a long double, Uint64, a pair of Uint32’s (like a
timeval struct) or something.

Yes, I know that trick. And, I can certainly wrap SDL_GetTicks() to use
it. I just hate the idea of doing that when every OS I know of already
has a way of getting better time information. If I didn’t care about
writing completely portable code I would just use the time functions of
a specific OS.

Bob P.

On Thursday 17 January 2002 16:38, Bob Pendleton wrote:

David Olofson wrote:
[…]

There may be problems with that on some platforms. What guarantees
that the timestamps and SDL_GetTicks() get their timing from the same
hardware timer/counter? (For example, if one reads from Win32
multimedia timers, and the other reads system time or performance
counters, there’s no guarantee that there’s no drift.)

Very good point. On a PC there are only a few physical mechanisms for
tracking time. One is the hardware timer that has been in the PC since
the original and another is the TSC (Total Cycle Count?) register.

Time Stamp Counter, AFAIK.

So,
I suspect that any and all timers on the PC will stay in pretty good
sync.

Actually, no - they’re usually driven from separate crystal oscillators,
so there’s nothing at all keeping them in sync. Not even the TSCs of the
CPUs in an SMP system are guaranteed to stay in sync!

I admit I could be completely wrong on that and that it doesn’t
apply to any other kind of computer.

It does apply to older machines like the C64 and Amiga, where basically
everything is driven from one or two crystal oscillators, but that’s
not how PCs and other more modern machines are designed.

On the other hand, does anyone
write code that counts on the fact that SDL_GetTicks() restarts at
zero every time you start the program?

It does? ;)

According to the docs it returns the time in milliseconds since the SDL
library was initialized. Which is pretty close to what I said.

Yes. I’m just not sure if it’s a good idea to rely on that. Turns out
that such dependencies cause trouble when you eventually get to that
“now, I’d like to be able to reinitialize the whole graphics subsystem,
and then return to the game”…

(BTW, I just added the ALT+ENTER fullscreen/windowed toggle shortcut to
Kobo Deluxe. Works even in the middle of a game. :)

[…]

Why? Just cast the tick values to Sint32, and calculate the delta
time every time you call SDL_GetTicks(). Then translate the delta
time as needed, and add it to a long double, Uint64, a pair of
Uint32’s (like a timeval struct) or something.

Yes, I know that trick. And, I can certainly wrap SDL_GetTicks() to use
it. I just hate the idea of doing that when every OS I know of already
has a way of getting better time information. If I didn’t care
about writing completely portable code I would just use the time
functions of a specific OS.

Yeah, I see what you mean - but at the same time, I kind of feel bad
about using Uint64 and the like on 32 bit machines, if it can be avoided.

Not that it really matters normally - it’s probably this audio hacking
and plugin API stuff that makes me think of situations where you easily
end up operating on several thousand timestamps per second. :)

//David Olofson — Programmer, Reologica Instruments AB

.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture |
| A Free/Open Source Plugin API for Professional Multimedia |
`----------------------------> http://www.linuxdj.com/maia -'
.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-------------------------------------> http://olofson.net -'

On Thursday 17 January 2002 22:53, Bob Pendleton wrote:

David Olofson wrote:

the original and another is the TSC (Total Cycle Count?) register.

Time Stamp Counter, AFAIK.

Yah, I had just looked it up and still got it wrong.

Thanks for the update on PC hardware; it’s been a few years since I took
a detailed look at it. I thought that most OS-based timers were still
based on the old PC hardware timer. I’m surprised that that has changed.

On the other hand, does anyone
write code that counts on the fact that SDL_GetTicks() restarts at
zero every time you start the program?

It does? ;)

According to the docs it returns the time in milliseconds since the SDL
library was initialized. Which is pretty close to what I said.

Yes. I’m just not sure if it’s a good idea to rely on that. Turns out
that such dependencies cause trouble when you eventually get to that
“now, I’d like to be able to reinitialize the whole graphics subsystem,
and then return to the game”…

Which is the best argument I’ve seen so far for adding a different way
of getting time to SDL. There really are times when you want to keep
track of the time without being tied to the state of the graphics
display.

Bob P.
--
+------------------------------------+
+ Bob Pendleton is seeking contract  +
+ and consulting work. Find out more +
+ at http://www.jump.net/~bobp       +
+------------------------------------+

David Olofson wrote:
[…]

Yes. I’m just not sure if it’s a good idea to rely on that. Turns out
that such dependencies cause trouble when you eventually get to that
“now, I’d like to be able to reinitialize the whole graphics
subsystem, and then return to the game”…

Which is the best argument I’ve seen so far for adding a different way
of getting time to SDL. There really are times when you want to keep
track of the time without being tied to the state of the graphics
display.

Well, my only real argument against an alternative to SDL_GetTicks() is
that SDL_GetTicks() is as simple as it gets; it’s effectively working in
the same way as most hardware timers/counters, like the TSC or the RTC.
Or; just like the rest of SDL, it wraps the underlying implementations in
the simplest way possible.

More bits? Well, the only time you really need that is if you’re
potentially going to sleep for more than 49 days, and then want to find
out how long you were asleep. :)

Anyway, as we’ve seen even on the list, a “Timing Toolkit” library could
avoid some bugs and save some time. I’m thinking about really useful
stuff to include in such a library, but as always with flexible APIs,
complexity tends to increase exponentially with level… heh

For example, the “interpolating frame rate emulator” of the “Spitfire
Engine” used in Kobo Deluxe is next to invisible as it’s implemented
inside the engine, but it’s probably pretty hard to use the code
correctly if you rip it out. It would be nice if it was possible to split
the engine up into small, generic libraries, but so far most things
rather indicate that closer integration of the parts is the way to go. Of
course, that might also be an indication that the engine needs to be
redesigned! Remains to be seen, when it’s time to rip it out of Kobo
Deluxe and wrap it up as a truly stand-alone package.

//David Olofson — Programmer, Reologica Instruments AB

.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture |
| A Free/Open Source Plugin API for Professional Multimedia |
`----------------------------> http://www.linuxdj.com/maia -'
.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-------------------------------------> http://olofson.net -'

On Thursday 17 January 2002 23:54, Bob Pendleton wrote:

David Olofson wrote:

David Olofson wrote:
[…]

Yes. I’m just not sure if it’s a good idea to rely on that. Turns out
that such dependencies cause trouble when you eventually get to that
“now, I’d like to be able to reinitialize the whole graphics
subsystem, and then return to the game”…

Which is the best argument I’ve seen so far for adding a different way
of getting time to SDL. There really are times when you want to keep
track of the time without being tied to the state of the graphics
display.

Well, my only real argument against an alternative to SDL_GetTicks() is
that SDL_GetTicks() is as simple as it gets; it’s effectively working in
the same way as most hardware timers/counters, like the TSC or the RTC.
Or; just like the rest of SDL, it wraps the underlying implementations in
the simplest way possible.

Absolutely!

More bits? Well, the only time you really need that is if you’re
potentially going to sleep for more than 49 days, and then want to find
out how long you were asleep. :)

In systems like MUDs and other longer term online games there are many
places where you need to keep track of time periods longer than 49 days.

Anyway, as we’ve seen even on the list, a “Timing Toolkit” library could
avoid some bugs and save some time. I’m thinking about really useful
stuff to include in such a library, but as always with flexible APIs,
complexity tends to increase exponentially with level… heh

Yeah, I know. Several times during the last 25 years, part of my various
jobs has been to track the progress of various standards. Often, the
hardest part is trying to keep “nice” features from cluttering up the
standard. Which is why I have sympathy for Mattias when he feels he is
defending SDL from feature creep.

Bob P.

On Thursday 17 January 2002 23:54, Bob Pendleton wrote:

time_passed = SDL_GetTicks() - first_time;

Some people apparently feel a need to have this functionality
encapsulated in a function. It would cut down the noise in this
newsgroup/mailing list. And it’s really small and trivial to
implement, so why not?

That’s strange, I haven’t noticed anyone saying anything like that.
--
Trick


Linux User #229006 * http://counter.li.org

“There is no magic” - Nakor, magic user

Well, yeah, I know what you mean. It turns out I replied a wee bit too
soon, because there’s really not a problem with the wrapping-around of
SDL_GetTicks in the first place… (10 - -5 = 15).
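
To spell it out (a stand-alone demo, nothing SDL-specific; unsigned
subtraction wraps modulo 2^32, so the delta survives the boundary):

#include <stdio.h>

int main(void)
{
    unsigned int before = 0xFFFFFFF6u;  /* “-10”: 10 ms before the wrap */
    unsigned int after  = 5u;           /* 5 ms after the wrap */
    printf("%d\n", (int)(after - before));  /* prints 15 */
    return 0;
}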

Still, you could be a bit more loving and caring ;)

On Thursday 17. January 2002 21:31, you wrote:

“Ryan C. Gordon” wrote:

do I even have to mention why this is an incredibly bad idea?

It’s good to know the sensitivity training is going well, Mattias. :)

I’m sorry if I sound a bit harsh at times, but if people can debate
endlessly about utterly trivial parts of SDL, problems which are
no problems, without ever thinking about it, how can they even
hope to understand the hard parts?


Trick


Linux User #229006 * http://counter.li.org

“There is no magic” - Nakor, magic user

What about introducing a new SDL_GetTicks in SDL 1.3 which returns the
number of ticks since the last time SDL_GetTicks was called?

Not much point, as you can calculate that yourself, as long as the result
is less than 2G ticks. (Never decoded an Amiga mouse directly from the
h/w regs? Those counters are only 8 bits… :)

Yep =)

I should start thinking before replying (as opposed to kobo ;)
--
Trick


Linux User #229006 * http://counter.li.org

“There is no magic” - Nakor, magic user

David Olofson wrote:
[…]

More bits? Well, the only time you really need that is if you’re
potentially going to sleep for more than 49 days, and then want to
find out how long you were asleep. :)

In systems like MUDs and other longer term online games there are many
places where you need to keep track of time periods longer than 49
days.

Yes, but that’s not a problem if you use SDL_GetTicks() to calculate
delta times, which you then translate into whatever format you may need -
as long as you do that at least every 49 days.

Of course, it’s simpler to just use Uint64, but… I haven’t even used
a true 64 bit system so far! ;)

[…feeping creatures…]

standard. Which is why I have sympathy for Mattias when he feels he is
defending SDL from feature creep.

…and IMHO, the answer is right there; there should be only one, simple
and portable call. Any higher level stuff should be handled by optional,
stand-alone libraries. We should discuss the latter, and not whether or
not SDL is missing some timer call. (IMHO, it isn’t - with the possible
exception of some way of figuring out what accuracy to expect.)

//David Olofson — Programmer, Reologica Instruments AB

.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture |
| A Free/Open Source Plugin API for Professional Multimedia |
`----------------------------> http://www.linuxdj.com/maia -'
.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-------------------------------------> http://olofson.net -'

On Friday 18 January 2002 01:00, Bob Pendleton wrote:

David Olofson wrote:

Of course, it’s simpler to just use Uint64, but… I haven’t even used
a true 64 bit system so far! ;)

This is getting pretty far off topic… so forgive me. I’ve written
packages for doing 32 and 64 bit integer, floating point, and even
decimal arithmetic on 8 and 16 bit computers. Whether or not the
computer is a “true” 64 bit system makes little difference. You just
have to understand that there is a little extra cost to using longer
variables. Anyway, I should have suggested using a double precision
floating point time value. Almost all computers support them, and they
have plenty of bits to support a time value with the precision that I
would prefer.
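
Something like this is what I have in mind (an untested sketch; the
name is made up, and it is fed by the same wrap-safe 32 bit deltas):

#include "SDL.h"

static Uint32 last;
static double seconds;   /* plenty of precision for ms over decades */

double get_time_seconds(void)
{
    Uint32 now = SDL_GetTicks();
    seconds += (Uint32)(now - last) * 0.001;  /* ms -> s */
    last = now;
    return seconds;
}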

Funny thing is, I LOOKED at the Linux and Windows time code, and now I
understand that the 32 bit limit is the result of portability
constraints. 32 bits is the least common denominator.

An off topic aside:

Two of the weirdest projects I ever worked on were writing the
arithmetic support for the runtime library for a COBOL compiler for
8080/z80 based machines and porting the X Server to a 64 bit SPARC about
4 years before anyone actually built a 64 bit SPARC. The 64 bit SPARC
emulator ran on a 32 bit Sun SPARC workstation. It took most of the
morning to boot UNIX and start the X server…

Bob P.


+------------------------------------+
+ Bob Pendleton is seeking contract  +
+ and consulting work. Find out more +
+ at http://www.jump.net/~bobp       +
+------------------------------------+

David Olofson wrote:

Of course, it’s simpler to just use Uint64, but… I haven’t even
used a true 64 bit system so far! ;)

This is getting pretty far off topic… so forgive me. I’ve written
packages for doing 32 and 64 bit integer, floating point, and even
decimal arithmetic on 8 and 16 bit computers. Whether or not the
computer is a “true” 64 bit system makes little difference. You just
have to understand that there is a little extra cost to using longer
variables.

Everything is possible, of course, but the performance hit, and more
importantly, the lack of native support for wide datatypes in some
compilers might be a problem. (Noticed that there are defines in SDL to
handle systems without int64…?)

Anyway, I should have suggested using a double precision
floating point time value. Almost all computers support them, and they
have plenty of bits to support a time value with the precision that I
would prefer.

Yeah, that would work - but unfortunately, integer<->FP conversions are
quite expensive on most normal workstations and PCs. (Another of those
“general purpose CPU” problems that make the lives of audio plugin
coders harder… :-/ )

Funny thing is, I LOOKED at the Linux and Windows time code, and now I
understand that the 32 bit limit is the result of portability
constraints. 32 bits is the least common denominator.

So, >32 bits would have to be “emulated” on some platforms… Seems like
a good reason to stay away from it.

//David Olofson — Programmer, Reologica Instruments AB

.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture |
| A Free/Open Source Plugin API for Professional Multimedia |
`----------------------------> http://www.linuxdj.com/maia -'
.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-------------------------------------> http://olofson.net -'

On Friday 18 January 2002 18:12, Bob Pendleton wrote: