XFree86 4.3.0, SDL 1.2.5 refresh rates

Has anyone figured out what’s going on with this combination yet?
Specifically, when mode switches are done in SDL apps, SDL seems to be
using the VESA standard refresh rates, and not the ones specified in the
XF86Config file.

It’s really annoying having a monitor that can do 100 Hz refreshes, and
most games are going to 72, or even 60! It’s especially annoying since
the problem wasn’t there in X 4.2.1.

Based on quite a bit of experimentation, I’ve determined that SDL is the
cause. Games that don’t use SDL, and other apps (VMware, MPlayer, etc.)
work fine.

Steve

This is actually because X has changed some of its code, and SDL hasn’t
yet been updated to support that. I think somebody on the list has
patched SDL, but I don’t remember whether it was Winslow or X, because
XP had the same problem.

On Monday 07 April 2003 02:26, Stephen Anthony wrote:

Has anyone figured out what’s going on with this combination yet?
Specifically, when mode switches are done in SDL apps, SDL seems to
be using the VESA standard refresh rates, and not the ones specified
in the XF86Config file.

It’s really annoying having a monitor that can do 100 Hz refreshes,
and most games are going to 72, or even 60! It’s especially annoying
since the problem wasn’t there in X 4.2.1.

Based on quite a bit of experimentation, I’ve determined that SDL is
the cause. Games that don’t use SDL, and other apps (VMware,
MPlayer, etc.) work fine.

Thanks for the info. Is there a place to look for this patch?

Steve

On April 7, 2003 10:58 am, Sami Näätänen wrote:

On Monday 07 April 2003 02:26, Stephen Anthony wrote:

Has anyone figured out what’s going on with this combination yet?
Specifically, when mode switches are done in SDL apps, SDL seems to
be using the VESA standard refresh rates, and not the ones specified
in the XF86Config file.

It’s really annoying having a monitor that can do 100 Hz refreshes,
and most games are going to 72, or even 60! It’s especially annoying
since the problem wasn’t there in X 4.2.1.

Based on quite a bit of experimentation, I’ve determined that SDL is
the cause. Games that don’t use SDL, and other apps (VMware,
MPlayer, etc.) work fine.

This is actually because X has changed some of its code, and SDL hasn’t
yet been updated to support that. I think somebody on the list has
patched SDL, but I don’t remember whether it was Winslow or X, because
XP had the same problem.

Just wondering, though: if it was because of a change in X, why was the
Windows XP version affected?

Steve

On April 7, 2003 10:58 am, Sami Näätänen wrote:

If you have the bandwidth, try searching the ML archive (93 MB).

On Monday 07 April 2003 16:52, Stephen Anthony wrote:

On April 7, 2003 10:58 am, Sami Näätänen wrote:

On Monday 07 April 2003 02:26, Stephen Anthony wrote:

Has anyone figured out what’s going on with this combination yet?
Specifically, when mode switches are done in SDL apps, SDL seems
to be using the VESA standard refresh rates, and not the ones
specified in the XF86Config file.

It’s really annoying having a monitor that can do 100 Hz
refreshes, and most games are going to 72, or even 60! It’s
especially annoying since the problem wasn’t there in X 4.2.1.

Based on quite a bit of experimentation, I’ve determined that SDL
is the cause. Games that don’t use SDL, and other apps (VMware,
MPlayer, etc.) work fine.

This is actually because X has changed some of its code, and SDL hasn’t
yet been updated to support that. I think somebody on the list has
patched SDL, but I don’t remember whether it was Winslow or X, because
XP had the same problem.

Thanks for the info. Is there a place to look for this patch?

Same reason. ;)
XP doesn’t do things like older Winslows.

On Monday 07 April 2003 16:54, Stephen Anthony wrote:

On April 7, 2003 10:58 am, Sami Näätänen wrote:

This is actually because X has changed some of its code, and SDL hasn’t
yet been updated to support that. I think somebody on the list has
patched SDL, but I don’t remember whether it was Winslow or X, because
XP had the same problem.

Just wondering, though: if it was because of a change in X, why was
the Windows XP version affected?

Does anybody have any more information on this?

See ya,
-Sam Lantinga, Software Engineer, Blizzard Entertainment

On Monday 07 April 2003 02:26, Stephen Anthony wrote:

Has anyone figured out what’s going on with this combination yet?
Specifically, when mode switches are done in SDL apps, SDL seems to
be using the VESA standard refresh rates, and not the ones specified
in the XF86Config file.

It’s really annoying having a monitor that can do 100 Hz refreshes,
and most games are going to 72, or even 60! It’s especially annoying
since the problem wasn’t there in X 4.2.1.

Based on quite a bit of experimentation, I’ve determined that SDL is
the cause. Games that don’t use SDL, and other apps (VMware,
MPlayer, etc.) work fine.

This is actually because X has changed some of its code, and SDL hasn’t
yet been updated to support that. I think somebody on the list has
patched SDL

After scouring the mailing list, I found someone who said they had
hacked SDL to use a refresh rate that they specified. I could do that,
but I was wondering what a more general and correct solution would be.

As per my previous email, I think the change in X is that it now sees many
more modes than it did before, probably because it gets them from the
monitor directly and not through the XF86Config file (though modelines
listed there are used as well).
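
For anyone who wants to see exactly what the server is reporting, a
quick standalone test along these lines (assuming the XF86VidMode
headers are installed; link with -lXxf86vm -lX11) will list every
modeline the server knows about:

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    XF86VidModeModeInfo **modes;
    int nmodes, i;

    if (dpy == NULL)
        return 1;

    /* Ask the server for every modeline it knows for the default screen. */
    if (XF86VidModeGetAllModeLines(dpy, DefaultScreen(dpy),
                                   &nmodes, &modes)) {
        for (i = 0; i < nmodes; i++) {
            XF86VidModeModeInfo *m = modes[i];
            /* dotclock is in kHz; htotal x vtotal is the full raster.
             * This naive figure ignores doublescan/interlace flags. */
            double rate = (m->dotclock * 1000.0) / (m->htotal * m->vtotal);
            printf("%dx%d @ %.1f Hz\n", m->hdisplay, m->vdisplay, rate);
        }
        XFree(modes);
    }
    XCloseDisplay(dpy);
    return 0;
}
```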

Steve

On April 8, 2003 02:09 am, Sam Lantinga wrote:

On Monday 07 April 2003 02:26, Stephen Anthony wrote:

Has anyone figured out what’s going on with this combination yet?
Specifically, when mode switches are done in SDL apps, SDL seems to
be using the VESA standard refresh rates, and not the ones
specified in the XF86Config file.

It’s really annoying having a monitor that can do 100 Hz refreshes,
and most games are going to 72, or even 60! It’s especially
annoying since the problem wasn’t there in X 4.2.1.

Based on quite a bit of experimentation, I’ve determined that SDL
is the cause. Games that don’t use SDL, and other apps (VMware,
MPlayer, etc.) work fine.

This is actually because X has changed some of its code, and SDL hasn’t
yet been updated to support that. I think somebody on the list has
patched SDL

Does anybody have any more information on this?

See ya,
-Sam Lantinga, Software Engineer, Blizzard Entertainment

Sam Lantinga wrote:

Stephen Anthony wrote:

This is actually because X has changed some of its code, and SDL hasn’t
yet been updated to support that. I think somebody on the list has
patched SDL

Does anybody have any more information on this?

See my other recent post. The server now returns multiple matching
modes for a given resolution at different refresh rates. SDL should
pick one appropriately (generally the fastest available, maybe clamped
to something like 90 Hz for sanity).
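
In rough C, that policy might look like this: a sketch against the
XF86VidMode mode array, with an arbitrary 90 Hz cap, and mode_rate()
being a naive helper that ignores the doublescan/interlace flags:

```c
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

#define MAX_SANE_RATE 90.0   /* arbitrary sanity cap, in Hz */

static double mode_rate(const XF86VidModeModeInfo *m)
{
    /* dotclock is in kHz; htotal x vtotal is the full raster size. */
    return (m->dotclock * 1000.0) / (m->htotal * m->vtotal);
}

/* Return the index of the fastest mode matching the requested
 * resolution that stays under the cap, or -1 if none matches. */
static int pick_mode(XF86VidModeModeInfo **modes, int nmodes,
                     int width, int height)
{
    int i, best = -1;
    double best_rate = 0.0;

    for (i = 0; i < nmodes; i++) {
        double r;
        if (modes[i]->hdisplay != width || modes[i]->vdisplay != height)
            continue;
        r = mode_rate(modes[i]);
        if (r <= MAX_SANE_RATE && r > best_rate) {
            best_rate = r;
            best = i;
        }
    }
    return best;
}
```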

Andy

--
Andrew J. Ross     Beyond the Ordinary      Plausibility Productions
Sole Proprietor    Beneath the Infinite     Hillsboro, OR
                   Experience… the Plausible?

I’m working on a patch to SDL that will do this. It will be enabled if
you set the environment variable ‘SDL_VIDEO_X11_USE_OPTIMAL_RATE’ to
something non-zero. If it’s set to 0 (or not set at all), the default
is to do nothing (i.e., work as it does currently).
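
The opt-in check itself would be trivial; a minimal sketch (the actual
patch may differ in detail):

```c
#include <stdlib.h>

/* Prune the mode list only when the user explicitly asks for it. */
static int use_optimal_rate(void)
{
    const char *env = getenv("SDL_VIDEO_X11_USE_OPTIMAL_RATE");
    return (env != NULL) && (atoi(env) != 0);
}
```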

I don’t think there’s a need to clamp the refresh rate, since X itself
takes care of deleting incorrect modelines. If it doesn’t, then that is
a problem for X.

As for the XRandR extension, it may turn out to be the optimal
solution. But for now, the best/quickest thing to do (for SDL 1.2.6) is
what you’ve suggested (and what I’m working on).

Steve

On April 8, 2003 02:24 pm, Andy Ross wrote:

Sam Lantinga wrote:

Stephen Anthony wrote:

This is actually because X has changed some of its code, and SDL hasn’t
yet been updated to support that. I think somebody on the list has
patched SDL

Does anybody have any more information on this?

See my other recent post. The server now returns multiple matching
modes for a given resolution at different refresh rates. SDL should
pick one appropriately (generally the fastest available, maybe clamped
to something like 90 Hz for sanity).

I’m working on a patch to SDL that will do this. It will be enabled if
you set the environment variable ‘SDL_VIDEO_X11_USE_OPTIMAL_RATE’ to
something non-zero. If it’s set to 0 (or not set at all), the default
is to do nothing (i.e., work as it does currently).

Is there a sane reason why somebody would want to run a refresh rate
lower than the maximum reported? Basically everybody I know wants to
use the maximum refresh rate all the time, because everything else just
unnecessarily hurts the eyes.

IMHO, the meaning of that environment variable should be reversed, i.e.
use the highest refresh rate by default, and revert to the old
behaviour when the environment variable is set. Users shouldn’t have to
bother with environment variables just to get the expected behaviour.
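
Concretely, the reversal would just flip the gate; a sketch, with an
invented variable name:

```c
#include <stdlib.h>

/* Use the highest refresh rate unless the user opts out
 * (the variable name here is hypothetical). */
static int use_old_rate_behaviour(void)
{
    const char *env = getenv("SDL_VIDEO_X11_OLD_RATE_BEHAVIOUR");
    return (env != NULL) && (atoi(env) != 0);
}
```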

I don’t think there’s a need to clamp the refresh rate, since X itself
takes care of deleting incorrect modelines. If it doesn’t, then that is
a problem for X.

I agree.

cu,
Nicolai

On Tuesday 08 April 2003 19:41, Stephen Anthony wrote:

On April 8, 2003 02:24 pm, Andy Ross wrote:

As for the XRandR extension, it may turn out to be the optimal
solution. But for now, the best/quickest thing to do (for SDL 1.2.6) is
what you’ve suggested (and what I’m working on).

Steve

Nicolai Haehnle wrote:

Is there a sane reason why somebody would want to run a refresh rate
lower than the maximum reported? Basically everybody I know wants to
use the maximum refresh rate all the time, because everything else
just unnecessarily hurts the eyes.

Yes. I’ve seen Windows games select ridiculous modes like 1024x768 @
150 Hz. Beyond being an absurd choice (anything over about 80 Hz is
undetectable to the human nervous system), it confuses monitors. The
image ends up squashed or stretched or very off-center. Monitors have
built-in mode calibrations for typical choices, but they don’t
understand 150 Hz, even if they can display it.

That said, it seems the nice folks at XFree have already dealt with
this for us. My monitor exports a maximum vertical refresh of 180Hz,
yet the XFree exported modes top out at a pleasant 85 Hz (even at the
very low resolutions where the card could do 180 if it wanted to).

So it’s apparently not a problem. XFree does the Right Thing, and SDL
should be able to select the fastest mode at the given resolution.

Andy

I’ve done this for Windows. The problem in Windows is that the
underlying APIs, on many systems, list refresh rates that the system
can’t really do, and if you try to activate them it’ll just desync the
monitor. So the only safe thing to do is to stick to 60 Hz (the
default) and let the user change it. (I’m not sure what causes these
invalid modes; all I know is that when I had SDL automatically setting
the highest known refresh rate for the resolution, people complained
that the game desynced their monitor when it started.)

On Tue, Apr 08, 2003 at 08:22:37AM -0230, Stephen Anthony wrote:

After scouring the mailing list, I found someone who said they had
hacked SDL to use a refresh rate that they specified. I could do that,
but I was wondering what a more general and correct solution would be.


Glenn Maynard

Yes, this is what I see. I notice that some modes like 640x480 work out
to have a rate of 170 Hz, but I think that’s because they’re scan-doubled.
I’ll have to look at the flags and work those out.

But otherwise, all modes I get are a max of 85 Hz. So my patch will pick
the largest rate of any given resolution.
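
For the record, here is roughly how the flags would factor into the
rate calculation (a sketch; the V_DBLSCAN/V_INTERLACE values are
assumed from XFree86’s mode-flag definitions and should be checked
against the headers):

```c
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

/* Mode-flag bits (assumed values; verify against the XFree86 headers). */
#define V_INTERLACE 0x010
#define V_DBLSCAN   0x020

/* Effective vertical refresh of a modeline.  A doublescanned mode
 * sends every scanline twice, so the naive dotclock/(htotal*vtotal)
 * figure comes out at twice the real frame rate, which is how a
 * doublescanned 640x480 mode can "work out to" 170 Hz while the
 * monitor is really refreshing at 85 Hz. */
static double effective_rate(const XF86VidModeModeInfo *m)
{
    double vtotal = m->vtotal;

    if (m->flags & V_DBLSCAN)
        vtotal *= 2.0;     /* every line is scanned twice */
    if (m->flags & V_INTERLACE)
        vtotal /= 2.0;     /* only half the lines per vertical sweep */

    return (m->dotclock * 1000.0) / (m->htotal * vtotal);
}
```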

Steve

On April 8, 2003 04:13 pm, Andy Ross wrote:

Nicolai Haehnle wrote:

Is there a sane reason why somebody would want to run a refresh rate
lower than the maximum reported? Basically everybody I know wants to
use the maximum refresh rate all the time, because everything else
just unnecessarily hurts the eyes.

Yes. I’ve seen Windows games select ridiculous modes like 1024x768 @
150 Hz. Beyond being an absurd choice (anything over about 80 Hz is
undetectable to the human nervous system), it confuses monitors. The
image ends up squashed or stretched or very off-center. Monitors have
built-in mode calibrations for typical choices, but they don’t
understand 150 Hz, even if they can display it.

That said, it seems the nice folks at XFree have already dealt with
this for us. My monitor exports a maximum vertical refresh of 180Hz,
yet the XFree exported modes top out at a pleasant 85 Hz (even at the
very low resolutions where the card could do 180 if it wanted to).

So it’s apparently not a problem. XFree does the Right Thing, and SDL
should be able to select the fastest mode at the given resolution.

I was thinking the same thing, but I didn’t want to change the behaviour
of SDL. But I guess we could consider the behaviour already changed, and
a patch could restore it :) I think I’ll rename the variable so that it
turns off the new X 4.3 functionality; otherwise it will work as normal
(like X 4.2).

Steve

On April 8, 2003 03:50 pm, Nicolai Haehnle wrote:

On Tuesday 08 April 2003 19:41, Stephen Anthony wrote:

On April 8, 2003 02:24 pm, Andy Ross wrote:

Sam Lantinga wrote:

Stephen Anthony wrote:

This is actually because X has changed some of its code, and SDL hasn’t
yet been updated to support that. I think somebody on the list has
patched SDL

Does anybody have any more information on this?

See my other recent post. The server now returns multiple matching
modes for a given resolution at different refresh rates. SDL
should pick one appropriately (generally the fastest available,
maybe clamped to something like 90 Hz for sanity).

I’m working on a patch to SDL that will do this. It will be enabled if
you set the environment variable ‘SDL_VIDEO_X11_USE_OPTIMAL_RATE’ to
something non-zero. If it’s set to 0 (or not set at all), the default
is to do nothing (i.e., work as it does currently).

Is there a sane reason why somebody would want to run a refresh rate
lower than the maximum reported? Basically everybody I know wants to
use the maximum refresh rate all the time, because everything else just
unnecessarily hurts the eyes.

IMHO, the meaning of that environment variable should be reversed, i.e.
use the highest refresh rate by default, and revert to the old
behaviour when the environment variable is set. Users shouldn’t have to
bother with environment variables just to get the expected behaviour.

Then I guess you were the person I found in the ML :) But as someone
else mentioned, X does a nice job for us of clamping the modes to 85 Hz,
and deleting all those that don’t work.

My patch (for the X11 version) will just delete some more of these modes,
so, for example, if there is a mode of 800x600 with 85 and 72 Hz, SDL
will never see the 72 Hz one.

Steve

On April 8, 2003 04:24 pm, Glenn Maynard wrote:

On Tue, Apr 08, 2003 at 08:22:37AM -0230, Stephen Anthony wrote:

After scouring the mailing list, I found someone who said they had
hacked SDL to use a refresh rate that they specified. I could do that,
but I was wondering what a more general and correct solution would be.

I’ve done this for Windows. The problem in Windows is that the
underlying APIs, on many systems, list refresh rates that the system
can’t really do, and if you try to activate them it’ll just desync the
monitor. So the only safe thing to do is to stick to 60 Hz (the
default) and let the user change it. (I’m not sure what causes these
invalid modes; all I know is that when I had SDL automatically setting
the highest known refresh rate for the resolution, people complained
that the game desynced their monitor when it started.)

My patch (for the X11 version) will just delete some more of these modes,
so, for example, if there is a mode of 800x600 with 85 and 72 Hz, SDL
will never see the 72 Hz one.

As someone working in psychophysics, where we need to be able to control
everything about a specific visual stimulus, it would be VERY helpful to
have control of the refresh rate exposed.

SGI’s IrisGL lets us do this. Linux and XFree let us open a display at a
specific frequency. The ability to control this on Windows (or Linux, if
that’s possible) via SDL would be VERY useful!

-Josh

On Tue, 8 Apr 2003, Stephen Anthony wrote:

Well, from what I can see, this is trivial to do. In fact, the whole
algorithm is very simple (a tribute to how SDL is written).

Basically, we get an array, and we walk that array and delete certain
modes. My solution was to delete modes that have duplicate resolutions
but lower refresh rates. But that could be easily modified to search out
only those resolutions with a specific refresh rate (a standard
search-for-value algorithm). I’ll leave that part to someone else :)
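
In sketch form, assuming the effective_rate() helper from the earlier
post, it could look like this; the specific-rate variant would just
compare against a requested rate instead of keeping the maximum:

```c
/* Compact the mode array in place, keeping only the fastest mode for
 * each resolution.  Returns the new number of modes.  O(n^2), but the
 * mode list is tiny. */
static int prune_slower_duplicates(XF86VidModeModeInfo **modes, int nmodes)
{
    int i, j, kept = 0;

    for (i = 0; i < nmodes; i++) {
        int keep = 1;
        for (j = 0; j < nmodes; j++) {
            if (j == i ||
                modes[j]->hdisplay != modes[i]->hdisplay ||
                modes[j]->vdisplay != modes[i]->vdisplay)
                continue;
            /* Drop mode i if a strictly faster duplicate exists, or if
             * an equally fast one appears earlier in the list. */
            if (effective_rate(modes[j]) > effective_rate(modes[i]) ||
                (effective_rate(modes[j]) == effective_rate(modes[i]) &&
                 j < i)) {
                keep = 0;
                break;
            }
        }
        if (keep)
            modes[kept++] = modes[i];
    }
    return kept;
}
```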

But I can only do it in Linux. I don’t know much about Windows stuff, and
someone already mentioned problems in Windows when attempting this.
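
For reference, the Windows side Glenn describes comes down to the
DEVMODE/ChangeDisplaySettings API; a hedged, untested sketch
(set_mode() is just an illustrative name, and as noted the driver may
accept rates the monitor can’t actually sync to):

```c
#include <windows.h>

/* Try to switch to width x height x bpp at the given refresh rate.
 * Returns 0 on success, -1 on failure.  A robust version would first
 * enumerate rates with EnumDisplaySettings() and default to 60 Hz. */
static int set_mode(int width, int height, int bpp, int hz)
{
    DEVMODE dm;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth = width;
    dm.dmPelsHeight = height;
    dm.dmBitsPerPel = bpp;
    dm.dmDisplayFrequency = hz;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL |
                  DM_DISPLAYFREQUENCY;

    /* CDS_TEST asks the driver to validate without actually switching. */
    if (ChangeDisplaySettings(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL)
        return -1;
    return (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) ==
            DISP_CHANGE_SUCCESSFUL) ? 0 : -1;
}
```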

Steve

On April 8, 2003 05:36 pm, Josh Fishman wrote:

On Tue, 8 Apr 2003, Stephen Anthony wrote:

My patch (for the X11 version) will just delete some more of these
modes, so, for example, if there is a mode of 800x600 with 85 and 72
Hz, SDL will never see the 72 Hz one.

As someone working in psychophysics, where we need to be able to
control everything about a specific visual stimulus, it would be VERY
helpful to have control of the refresh rate exposed.

SGI’s IrisGL lets us do this. Linux and XFree let us open a display at
a specific frequency. The ability to control this on Windows (or Linux,
if that’s possible) via SDL would be VERY useful!

Nope. I can tell the difference between 85Hz and 150Hz (it’s much more
"solid", and FPSs that can maintain that framerate are smoother), and my
monitor can do it just fine in lower resolutions. I’d be very irritated
if software decided for me that I didn’t want to take full advantage of
my hardware because someone thought it was “absurd” or “undetectable”.

If X is doing this, X is buggy.

If the image is wrong in that refresh on your hardware, and it can’t
be compensated for with the monitor’s controls, then your hardware is
advertising a mode it can’t really do. That’s your monitor’s fault;
don’t assume all monitors are faulty.

(The existence of these buggy monitors is the reason I needed to make
refresh selection an option, instead of choosing the highest available.
Some day I’ll submit a patch; other things take priority right now …)

On Tue, Apr 08, 2003 at 11:43:22AM -0700, Andy Ross wrote:

Yes. I’ve seen Windows games select ridiculous modes like 1024x768 @
150 Hz. Beyond being an absurd choice (anything over about 80 Hz is
undetectable to the human nervous system), it confuses monitors. The
image ends up squashed or stretched or very off-center. Monitors have
built-in mode calibrations for typical choices, but they don’t
understand 150 Hz, even if they can display it.


Glenn Maynard

Glenn Maynard wrote:

Nope. I can tell the difference between 85Hz and 150Hz (it’s much
more “solid”, and FPSs that can maintain that framerate are
smoother)

Sigh. I knew when I wrote that that someone would call me out on it.

No, you can’t. You probably won’t believe me that you can’t (just
like audiophiles who swear they can hear the 16-bit quantization
errors in CD audio), but you can’t. You just can’t. Really. People
aren’t built that way. Even the phosphors in most CRTs have a decay
rate less than that.

Get someone to do a blind test on you, or write a script that selects
a mode at random. Play a game of Quake for 10 minutes in each and
rate them for “solidity” and “smoothness”. Repeat a few times.
You’ll be surprised.

Andy