Make SDL_GetTicks() use the performance counter?

Hi folks,

I’m using SDL2, and I understand that it introduces a high-precision
counter, accessible via SDL_GetPerformanceCounter(). I also understand that
SDL2 introduces timestamps on events, and that these timestamps use the
millisecond-precision counter that is otherwise accessible via
SDL_GetTicks().
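
For context, here’s a minimal sketch of how I’m reading the two clocks side
by side (SDL_GetPerformanceFrequency() gives the ticks-per-second needed to
convert the counter into time units):

    #include <SDL.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        SDL_Init(0);
        Uint64 pc   = SDL_GetPerformanceCounter();    /* high-resolution ticks (arbitrary epoch) */
        Uint64 freq = SDL_GetPerformanceFrequency();  /* counter ticks per second */
        Uint32 ms   = SDL_GetTicks();                 /* milliseconds since SDL_Init() */
        printf("perf counter: %.6f s, ticks: %u ms\n",
               (double)pc / (double)freq, (unsigned)ms);
        SDL_Quit();
        return 0;
    }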

Is there a way to get SDL2 to use the high precision counter for event
timestamps instead of the millisecond-precision counter?

I know it seems silly to want sub-millisecond precision in event
timestamps, but I’m using SDL2 for scientific research and would like to
get as precise as I can. (Higher precision == greater statistical power for
a given amount of measurement effort)

Any help would be greatly appreciated :O)

Mike

--
Mike Lawrence
Graduate Student
Department of Psychology & Neuroscience
Dalhousie University

~ Certainty is (possibly) folly ~

Hi Mike,
I think there have been similar questions here in the past (also from a
neuroscience student, I thought, but I could be mistaken). It all boils
down to events having to be polled: an event can really only be as
accurate as the moment it is acknowledged in the SDL_PollEvent call. I can’t
speak with absolute authority because I’m not an SDL dev, and maybe there
is some way to do what you want, but I’m pretty sure this was the
conclusion of the previous thread I mentioned.
I’m sure someone with more knowledge of the subject can chime in and maybe
give you a better answer, though.
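
To illustrate what I mean (just a rough sketch, not something from the SDL
docs): if your loop only pumps the event queue once per frame, a key pressed
anywhere within that frame gets stamped when the next SDL_PollEvent() runs,
so the timestamps can’t be finer-grained than your loop:

    #include <SDL.h>

    int main(int argc, char *argv[])
    {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("demo", 100, 100, 320, 240, 0);
        SDL_Event e;
        int running = 1;
        while (running) {
            /* SDL stamps events while pumping the OS queue, which
               SDL_PollEvent() does internally -- a key pressed during
               the SDL_Delay() below is stamped at the *next* poll */
            while (SDL_PollEvent(&e)) {
                if (e.type == SDL_KEYDOWN)
                    SDL_Log("key event stamped at %u ms", (unsigned)e.key.timestamp);
                if (e.type == SDL_QUIT)
                    running = 0;
            }
            SDL_Delay(16);  /* stand-in for one frame of rendering/work */
        }
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }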

Interesting, I was unaware that the frequency of calling SDL_PumpEvents()
affected the accuracy of the timestamps. To make sure I understand clearly,
is it the case that, for example, the timestamp of a keypress doesn’t
reflect the actual time of the keypress, but instead reflects the time of
the call to SDL_PumpEvents() that followed the keypress?

While this is good to know (hence my attempt at seeking clarification
above), it’s actually tangential to my original query insofar as the lag
between the input event and the timestamp is an issue of accuracy, whereas
my original query sought to improve precision. While it’s important to
ensure accurate measurements (and especially to avoid differential accuracy
across measurements that you’d later want to compare), it’s also helpful to
seek the highest precision you can achieve for the same reason you’d
probably find a meter stick with millimeter marks more useful than one with
merely centimeter marks when forced to record measurements to the nearest
marked unit of measurement.

So, my original question remains: is there a way to get SDL2 to use the
high precision counter for event timestamps instead of the
millisecond-precision counter?

Mike

It belatedly occurred to me just now to poke around in the source, and I
discovered that it’s pretty simple to reconfigure things to achieve what I
was looking for. In events/SDL_events.c, change the call to SDL_GetTicks()
on line 461 to SDL_GetPerformanceCounter(), then in include/SDL_events.h
change all instances of “Uint32 timestamp” to “Uint64 timestamp”.
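
For anyone wanting to do the same, the edits look roughly like this (a
sketch only -- the exact line number and surrounding context may differ
between SDL2 revisions):

    /* events/SDL_events.c, where the event timestamp is assigned
       (around line 461 in the revision I'm looking at): */
    -    event->common.timestamp = SDL_GetTicks();
    +    event->common.timestamp = SDL_GetPerformanceCounter();

    /* include/SDL_events.h, in each SDL_*Event struct: */
    -    Uint32 timestamp;
    +    Uint64 timestamp;

Note that after this change the timestamp field is in performance-counter
ticks rather than milliseconds, so anything consuming it needs to divide by
SDL_GetPerformanceFrequency() to get back to time units.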

This of course doesn’t help when I want to rely on the user’s pre-installed
version of SDL2, but so far as I can tell from looking at the source,
there’s no way to swap the timer used after the build.
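
The closest approximation I can see without patching is to take my own
SDL_GetPerformanceCounter() reading at the moment each event comes off the
queue -- still limited by how often the queue gets pumped, but at least the
reading itself is sub-millisecond. Something like this sketch (the helper
name is just mine, not an SDL API):

    #include <SDL.h>

    /* hypothetical helper: pairs each dequeued event with a
       high-resolution reading taken at dequeue time */
    static int poll_with_perf_time(SDL_Event *e, Uint64 *perf_ticks)
    {
        int got = SDL_PollEvent(e);
        if (got)
            *perf_ticks = SDL_GetPerformanceCounter();
        return got;
    }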

Mike

--
Mike Lawrence
Graduate Student
Department of Psychology & Neuroscience
Dalhousie University

~ Certainty is (possibly) folly ~

API reasons aside, is there a particularly bad reason not to use the
performance counter throughout? My understanding is that, to prevent
some desync, QueryPerformanceFrequency’s return value needs some
calibration against lower-resolution (but more accurate) functions like
timeGetTime.

Supposedly the issues with this are fixed on newer hardware and
Windows versions, but I don’t know for certain what the minimum hardware
and Windows version is for that to be true, and I’m still using
Windows XP myself, so? :)
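
If anyone wants to check their own machine, a rough (untested) Win32 sketch
that watches for drift between the performance counter and the multimedia
timer would be something like this -- link against winmm for timeGetTime():

    #include <windows.h>
    #include <mmsystem.h>
    #include <stdio.h>

    int main(void)
    {
        LARGE_INTEGER freq, start, now;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&start);
        DWORD mm_start = timeGetTime();

        for (;;) {
            Sleep(1000);
            QueryPerformanceCounter(&now);
            double qpc_ms = (double)(now.QuadPart - start.QuadPart)
                            * 1000.0 / (double)freq.QuadPart;
            DWORD  mm_ms  = timeGetTime() - mm_start;
            /* if the two clocks are drifting apart, this difference
               grows (or jumps) over time */
            printf("QPC: %.3f ms  timeGetTime: %lu ms  diff: %.3f ms\n",
                   qpc_ms, (unsigned long)mm_ms, qpc_ms - (double)mm_ms);
        }
        return 0;
    }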

Joseph

> Interesting, I was unaware that the frequency of calling SDL_PumpEvents()
> affected the accuracy of the timestamps. To make sure I understand clearly,
> is it the case that, for example, the timestamp of a keypress doesn’t
> reflect the actual time of the keypress, but instead reflects the time of
> the call to SDL_PumpEvents() that followed the keypress?
>
> While this is good to know (hence my attempt at seeking clarification
> above), it’s actually tangential to my original query insofar as the lag
> between the input event and the timestamp is an issue of accuracy, whereas
> my original query sought to improve precision. While it’s important to
> ensure accurate measurements (and especially to avoid differential accuracy
> across measurements that you’d later want to compare),

Guess what you’re going to run into? Differential accuracy across
measurements. If the details of key-press timing really matter, then you
should custom-build a serial-port keyboard that includes timestamps with
the key-press data.

You can improve things somewhat by using a stripped-down Linux (or
similar), but you’re still going to get smacked around by preemption,
maybe paging, task switching, etc. At least you aren’t quite like the
last guy, who needed to deal with monitor lag.

> it’s also helpful to
> seek the highest precision you can achieve for the same reason you’d
> probably find a meter stick with millimeter marks more useful than one with
> merely centimeter marks when forced to record measurements to the nearest
> marked unit of measurement.

This is much more relevant when you’re measuring from both sides of an
object than when you’re measuring between one side of an object and a
point that moves in unpredictable ways. You’ll be doing the latter, not
the former. You can attempt to estimate the average differential
between key-press and timestamp, but SDL (whether 1 or 2) wasn’t
designed for scientific research, so it doesn’t have any API to
provide quality-of-service info like that.

If you do go with SDL’s timestamps, be prepared for several different
groupings, and don’t be surprised if timestamp differentials are
evenly distributed over a relatively large range.
