Time of screen update?

Hi folks,

First, thanks to everyone who has helped with my previous queries;
this list has been very useful so far in my explorations of how I can
make my experiments more precise using SDL.

I think my last question is this: after SDL_Flip(), is it possible to
get a timestamp for when the flip will actually take place? As I
understand it, SDL_Flip() will simply set up the flip to occur at the
next vertical retrace and then return immediately, not waiting until
the actual retrace happens. It would be useful to me to know the time
of the retrace that presented the surface supplied to SDL_Flip().

Cheers,

Mike
Mike Lawrence
Graduate Student
Department of Psychology
Dalhousie University

Looking to arrange a meeting? Check my public calendar: http://goo.gl/BYH99

~ Certainty is folly… I think. ~

Modern drivers often buffer rendered frames for a considerable amount of time: three frames is the default in the NVIDIA and AMD driver control panels, and it is not changeable by the app, though I do
wonder if there is a way to bypass this buffering.

Users often report input lag in games because of this buffering.

I know for certain this buffering occurs in D3D apps, but I'm not sure about OpenGL apps.

If you need low latency, I highly recommend changing this setting, and disabling the Threaded Optimization option if present as well (this certainly has an impact on OpenGL). Sometimes this requires
third-party tools if the settings are not exposed in the control panel programs on Windows.

I am not sure of the corresponding settings on Linux or OS X.

Sorry to not have exact info on this buffering behavior, but that is what I know about it at present.

On 02/09/2012 08:17 AM, Mike Lawrence wrote:

SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org


LordHavoc
Author of DarkPlaces Quake1 engine - http://icculus.org/twilight/darkplaces
Co-designer of Nexuiz - http://alientrap.org/nexuiz
"War does not prove who is right, it proves who is left." - Unknown
"Any sufficiently advanced technology is indistinguishable from a rigged demo." - James Klass
"A game is a series of interesting choices." - Sid Meier

Ah yes, preventing lag between the time a flip is requested and the
time that it actually occurs is one way of solving the problem I
posed; if there is zero lag, you can use the time of the flip request
as the time of actual stimulus presentation. However, barring
reduction of the request-to-flip lag to zero (a solution that, as you
indicate, might be made difficult by the behaviour of modern graphics
cards), I'm wondering if there might be a way to at least figure out
when the flip will/did occur, so that my timing of human input in
response to the visual stimulus can use this actual flip time as the
zero point for recording human response times.

On Thu, Feb 9, 2012 at 12:39 PM, Forest Hale wrote:


Gotta dumb it down for me: you want the game data to change according
to the user pushing buttons, but you want what's on the screen to be more
in sync than the refresh rate of the monitor?

On Thu, Feb 9, 2012 at 11:06 AM, Mike Lawrence <Mike.Lawrence at dal.ca> wrote:


Actually, I'm not coding games per se; I'm coding experiments in
cognitive psychology, so my priorities are slightly different from
those of a game developer. My aim is to present visual stimuli to the
user and measure, as accurately as possible, the time between the
moment that the stimulus is actually displayed on the screen and the
time that the user responds (usually via keyboard or gamepad).

On Tue, Feb 14, 2012 at 7:06 AM, R Manard wrote:


Well, what I would do is use boost::cpu_timer after calling SDL_Flip()
to mark and track the time changes. It's the most accurate timer I
know of, although not actually SDL. It gets the job done.

Some info about it is at:
http://www.boost.org/doc/libs/1_48_0/libs/timer/doc/cpu_timers.html

Hope that helps…

On 2/14/2012 3:48 PM, Mike Lawrence wrote:


While a high-resolution timer is indeed useful for timing the interval
between the screen update and the time of response (by the way, there
is also a high-res timer in SDL 1.3:
http://hg.libsdl.org/SDL/rev/6bd701987ba9), most critical would be
knowing when the screen pixels actually changed. Simply starting a
timer right after a call to SDL_Flip() isn't sufficient because
SDL_Flip() doesn't wait until the pixels change; it returns
immediately, meaning that the timer can start anywhere up to a full
refresh too early. The consequent variability in how early the timer
starts (i.e. for some stimuli it will be, say, half a refresh too
early, for others a whole refresh too early, etc.) adds variance to
the recorded response times that undermines my ability to detect
differences between experimental conditions.

So, to reiterate my question (and by now I'm thinking that the answer
is "no"): Is there a way to know the time at which the screen was
actually changed following a call to SDL_Flip()?

On Wed, Feb 15, 2012 at 12:48 AM, The Novice Coder wrote:


Wouldn't SDL_Flip wait if you were using vsync?

On Wed, Feb 15, 2012 at 10:35 AM, Mike Lawrence <Mike.Lawrence at dal.ca> wrote:




I believe that the answer is CLOSE ENOUGH to "no". Particularly since
high-enough resolution could run afoul of the video card/display
anyway, I'd suggest that you seek a hardware solution:

Find a high-speed shutter device. Rig it up so that when it opens, a
high-accuracy timer starts, and when a button is pressed the timer is
paused. The shutter will respond to a signal, displaying the image (I
assume by way of a projector display) and starting the timer at the
same time; the user will press the button, pausing the timer, and your
computer (which controls both the shutter and the display that the
shutter hides/reveals) will read back the reading from the timer,
providing a much more accurate timing value than you could otherwise
obtain (admittedly, this might be overkill).

SDL would be useful for displaying the image, but for high-accuracy
things I’d normally suggest something specialty where you don’t have
to worry about other issues (consider, modern operating systems can
pause your program, potentially interfering with your research).

If you need something motion-based, I’d suggest some sort of
high-speed movie projector setup. Before everything moved to digital,
sound was commonly encoded with a black and white strip on the side of
the film, which could be used for the timing signals instead. I don’t
know where you’d find the equipment, but I’d expect it to be available
from somewhere.

Googling and searching this mailing list's history yielded a mixed
answer to whether SDL_Flip() waits for the vsync. Apparently it
doesn't work when using X11 (at least as of SDL 1.2; is that still the
case with 1.3?), but it might if you specify an OpenGL context? I
don't quite have a handle on the distinction between the X11 and
OpenGL modes…

But yes, in theory, if I were to somehow get SDL_Flip() to wait until
the vsync, then I could just start my timer when SDL_Flip() returns.

On Wed, Feb 15, 2012 at 12:24 PM, Alex Barry <alex.barry at gmail.com> wrote:


What you describe actually has a long history in cognitive science in
a device called a "tachistoscope", which really is still a
gold standard for accuracy in stimulus presentation, but it is rarely
used now because, while computers are less accurate, they are more
flexible display devices.

On Thu, Feb 16, 2012 at 3:38 AM, Jared Maddox wrote:

I’d suggest that you seek a hardware solution:
Find a high-speed shutter device. …

Okay, I get it. The hardware of the PC is simply not designed for
this; I can't believe there is a way to make it work other than taking
apart the hardware, inserting separate home-made sync hardware, and
making a modified BIOS for the OS to run on.

When I have seen scientists do such things as you describe, I notice
that most seem to use a high-speed digital camera with a high-speed
digital display in the shot as the time control: you know, like a
digital readout of the time that shows milliseconds (which cannot be
read with the unaided human eye anyway).

On Tue, Feb 14, 2012 at 4:48 PM, Mike Lawrence <Mike.Lawrence at dal.ca> wrote:


Hello!


With SDL 1.2, SDL_Flip() really waited for an actual flip only on an OS + API combination that supported it:

SDL 1.2 + Windows + DirectX: SDL_Flip() really waited for a flip to happen.
SDL 1.2 + Linux + X11: a HW flip was not possible; that was a limitation of X11.
SDL 1.2 + Linux + OpenGL: a HW flip is possible.

Even if the HW supports it, if the OS + GFX API does not support it,
you have no chance.

CU