Mac OS X fullscreen and fades

If I were to make a patch that lets you set an environment variable to
disable the fades when changing to/from fullscreen modes on OS X, would it
be feasible to include in CVS?

I realize those fades are there to hide any flickering that might occur,
but they’re painfully long, especially when trying to change resolutions
mid-game.

Gregory

Hello!

If I were to make a patch that lets you set an environment variable to
disable the fades when changing to/from fullscreen modes on OS X, would it
be feasible to include in CVS?

I realize those fades are there to hide any flickering that might occur,
but they’re painfully long, especially when trying to change resolutions
mid-game.

Yep. It would be really good to have something like this in SDL.

CU

Gregory Smith wrote:

If I were to make a patch that lets you set an environment variable to
disable the fades when changing to/from fullscreen modes on OS X, would
it be feasible to include in CVS?

I realize those fades are there to hide any flickering that might occur,
but they’re painfully long, especially when trying to change resolutions
mid-game.

I don’t mind the fades (although I agree that the fade-from-black could
be a bit faster), but what could be improved is that changing between
two fullscreen modes shouldn’t switch to the desktop resolution in between.

-Christian

Well, yes, that would be even better. I assumed there was a reason that
wasn’t possible. Can anyone enlighten me? Or should I dive in and try
messing with code, and find out for myself?

Gregory

On Thu, 26 Jan 2006, Christian Walther wrote:

Gregory Smith wrote:

If I were to make a patch that lets you set an environment variable to
disable the fades when changing to/from fullscreen modes on OS X, would it
be feasible to include in CVS?

I realize those fades are there to hide any flickering that might occur,
but they’re painfully long, especially when trying to change resolutions
mid-game.

I don’t mind the fades (although I agree that the fade-from-black could be a
bit faster), but what could be improved is that changing between two
fullscreen modes shouldn’t switch to the desktop resolution in between.

-Christian


SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl

If I were to make a patch that lets you set an environment variable to
disable the fades when changing to/from fullscreen modes on OS X, would it
be feasible to include in CVS?

This is tracked as a bug now (Bug #100, hooray!), so we don’t forget to
figure out what would be appropriate to do with this issue:
https://bugzilla.libsdl.org/show_bug.cgi?id=100

–ryan.

Well, yes, that would be even better. I assumed there was a reason that
wasn’t possible. Can anyone enlighten me? Or should I dive in and try
messing with code, and find out for myself?

Go for it. :)
https://bugzilla.libsdl.org/show_bug.cgi?id=100

-Sam Lantinga, Senior Software Engineer, Blizzard Entertainment

OK, I removed a few of the fades, and removed the switching to
desktop between full screen modes, and it seems to work fine with no
flickering.

What am I missing here? Seems to work a lot better, but it could not
possibly have been that easy.

Gregory

On Jan 30, 2006, at 7:42 AM, Sam Lantinga wrote:

Well, yes, that would be even better. I assumed there was a reason that
wasn’t possible. Can anyone enlighten me? Or should I dive in and try
messing with code, and find out for myself?

Go for it. :)
https://bugzilla.libsdl.org/show_bug.cgi?id=100

-Sam Lantinga, Senior Software Engineer, Blizzard Entertainment

Index: video/quartz/SDL_QuartzVideo.m
===================================================================
RCS file: /home/sdlweb/libsdl.org/cvs/SDL12/src/video/quartz/SDL_QuartzVideo.m,v
retrieving revision 1.53
diff -r1.53 SDL_QuartzVideo.m
70c70
< static void QZ_UnsetVideoMode (_THIS);
---
> static void QZ_UnsetVideoMode (_THIS, BOOL to_desktop);
438c438
< static void QZ_UnsetVideoMode (_THIS) {
---
> static void QZ_UnsetVideoMode (_THIS, BOOL to_desktop) {
451d450
< int gamma_error;
454,455d452
< gamma_error = QZ_FadeGammaOut (this, &gamma_table);
<
475c472
<
---
>     if (to_desktop) {
486,488c483
<
< if (! gamma_error)
< QZ_FadeGammaIn (this, &gamma_table);
---
>     }
510d504
< int gamma_error;
517c511
< QZ_UnsetVideoMode (this);
---
>     QZ_UnsetVideoMode (this, false);
530d523
< gamma_error = QZ_FadeGammaOut (this, &gamma_table);
632,635d624
< /* Fade the display to original gamma */
< if (! gamma_error )
< QZ_FadeGammaIn (this, &gamma_table);
<
658c647
< ERR_NO_CAPTURE: if (!gamma_error) { QZ_FadeGammaIn (this, &gamma_table); }
---
> ERR_NO_CAPTURE:
688c677
< QZ_UnsetVideoMode (this);
---
>         QZ_UnsetVideoMode (this, true);
1505c1494
< QZ_UnsetVideoMode (this);
---
> QZ_UnsetVideoMode (this, true);

Nice work! From reading the code, it seems OK, and as far as I have
tested, it works with my application. I do have a few pieces of
criticism, though:

  • Please attach patches instead of pasting them. Mail programs tend to
    mangle them (remove spaces at the end of lines, insert line breaks, …).

OK, I removed a few of the fades…

  • A few of them? As far as I can see, you removed all of them! I don’t
    like that.

However, apparently the OS does some fading on its own: fade to black
before switching from fullscreen to a different resolution, and fade
from black after switching to fullscreen from a different resolution.
It’s much faster than the SDL fades (which is nice), but it’s not
sufficient.

Give me some time and I’ll try to identify all the cases that need to be
handled and work out what I propose to do.

Two problems I see at the moment:

  • In the current SDL (without your patch): I’d get rid (at least) of the
    fade-from-black after switching to a fullscreen mode. I’m not sure about
    2D, but with OpenGL it just fades from black to black because at that
    time there hasn’t been a chance to render anything yet (can some 2D user
    comment on that?). We should be able to let the OS fade handle this - it
    runs asynchronously while the game is already rendering (which is the
    proper thing to do, IMHO).

  • With your patch: When switching from fullscreen, I get garbage on the
    screen before the OS fade kicks in (this is with OpenGL). It looks like
    the previous frame buffer interpreted with an incorrect pitch. We should
    have SDL fade to black to hide that.

  • There’s a leftover comment on line 523.

  • The contents of the “if (to_desktop) {…}” (lines 473 - 482) should
    be indented.

… and it seems to work fine with no flickering.

Not flickering per se, but I do see some garbage (see above).

-Christian

I get this both with and without my patch, actually, in windowed mode
with OpenGL enabled. In fullscreen, or in 2D windowed mode, everything
looks perfect here with my patch. I’m running 10.4, though, so I may
have Apple enhancements not present in earlier versions of Mac OS X.

If you’re on one of those versions and can get it working better, go for
it.

If we need to use SDL’s fades at any point, there’s an SDL_Delay(10) in
them that should make them 100 ms long, but they feel much longer than
that. I don’t know why.

Gregory

On Wed, 2006-02-01 at 12:43 +0100, Christian Walther wrote:

  • With your patch: When switching from fullscreen, I get garbage on the
    screen before the OS fade kicks in (this is with OpenGL). It looks like
    the previous frame buffer interpreted with an incorrect pitch. We should
    have SDL fade to black to hide that.

I wrote:

Give me some time and I’ll try to identify all the cases that need to be
handled and work out what I propose to do.

OK, here we go. We basically have a state machine with 3 states:
windowed (w), fullscreen (f), and no video mode set (where we start and
return to before quitting) (n). Only the transitions to and from f are
interesting because the others don’t involve fading.

How things work in current SDL, without Gregory’s patch - in brackets
the issues we’d like to improve:

f->w:
fade to black, fade to desktop synchronously [window is not visible
during fade in, but appears afterwards]

f->f:
fade to black, fade to desktop, fade to black, fade in synchronously
[1. too many fades, 2. fading in synchronously means fading from black
to black, with the game content abruptly appearing afterwards]

f->n:
fade to black, fade to desktop

w->f:
n->f:
fade to black, fade in synchronously [i.e. from black to black, see above]

[in general, the fades are too slow]

How things work with Gregory’s patch, according to my observations (on
10.3.9, with OpenGL):

f->w:
fade to black, abrupt transition to desktop, window appears [garbage
is displayed while fading out (at least here, with OpenGL)]

f->f:
fade to black, fade in asynchronously [garbage]

f->n:
fade to black, abruptly to desktop [garbage]

w->f:
abruptly to black, fade in asynchronously

n->f:
abruptly to black, abruptly to game

[it may be a matter of personal preference, but I don’t like the abrupt
transitions]

How I think things should work:

f->w:
f->f:
f->n:
fade to black (to hide garbage), fade in (easier on the eyes)
asynchronously (so that actual game content fades in)

w->f:
n->f:
I’d recommend the same thing as above for the "easier on the eyes"
reason, but as there’s no garbage to hide, no fading could work too
(although it might flicker).

Fading to black should take 0.3 seconds, fading from black 0.5 seconds
(these are Apple’s default values, see the documentation cited below).

Any opinions on this? If people agree, and Gregory doesn’t beat me to
it, I can have a go at implementing it. We might throw out the
QZ_FadeGamma{In|Out}() functions entirely and use Apple’s Quartz fading
API instead (unless the observed garbage turns out to be caused by the
Quartz fading, which would mean we’d have to stick with the SDL fading).
It’s available since 10.2, and it can do asynchronous fading
(http://developer.apple.com/documentation/GraphicsImaging/Reference/Quartz_Services_Ref/Reference/reference.html).

-Christian

How I think things should work:

f->w:
f->f:
f->n:
fade to black (to hide garbage), fade in (easier on the eyes)
asynchronously (so that actual game content fades in)

w->f:
n->f:
I’d recommend the same thing as above for the "easier on the eyes"
reason, but as there’s no garbage to hide, no fading could work too
(although it might flicker).

Fading to black should take 0.3 seconds, fading from black 0.5 seconds
(these are Apple’s default values, see the documentation cited below).

That sounds OK, if that’s what Apple says to do. Think different, I guess.
Sounds like you’re seeing more garbage than I was.

I see garbage on w->w, so if we can fade in asynchronously we might as
well do it for all transitions? I’m less of a Mac user than I once was, so
I don’t know what the convention is.

Any opinions on this? If people agree, and Gregory doesn’t beat me to
it, I can have a go at implementing it. We might throw out the
QZ_FadeGamma{In|Out}() functions entirely and use Apple’s Quartz fading
API instead (unless the observed garbage turns out to be caused by the
Quartz fading, which would mean we’d have to stick with the SDL fading).
It’s available since 10.2, and it can do asynchronous fading
(http://developer.apple.com/documentation/GraphicsImaging/Reference/Quartz_Services_Ref/Reference/reference.html).

Asynchronous fading would be very good; having 0.5 more seconds to start
recreating my OpenGL context would be nice.

I’d appreciate it if you’d have a go at this. I don’t know anything about
Quartz.

Gregory

On Wed, 1 Feb 2006, Christian Walther wrote:

Gregory Smith wrote:

Fading to black should take 0.3 seconds, fading from black 0.5 seconds
(these are Apple’s default values, see the documentation cited below).

That sounds OK, if that’s what Apple says to do. Think different, I guess.

Well, it’s not that Apple says to do it like this, the values are just
mentioned as what they do by default, but why not do the same. Half a
second is about what I’d have chosen anyway, and the fade-in being
slower than the fade-out makes sense because brightening the screen
while the eye is dark-adapted blinds you, whereas there’s no trouble in
darkening the screen when the eye is light-adapted.

I see garbage on w->w, so if we can fade in asynchronously we might as
well do it for all transitions? I’m less of a Mac user than I once was, so
I don’t know what the convention is.

Doesn’t seem like a good idea to me. I’ve gotten used to the window
jumping around when manually resizing it, but fading the whole screen
out and in every time you resize the window? (Maybe your windows aren’t
resizable, but mine are.)

I don’t quite understand when you are seeing garbage in w->w (or w->f).
A w->w switch only takes a split second, there is no time for any
garbage to be visible, or is there? Is it that some time passes between
SDL_SetVideoMode() and the first SDL_GL_SwapBuffers() in your app? If
so, that could probably be cured by including a
glClear(GL_COLOR_BUFFER_BIT) in QZ_SetVideoWindowed(), as it exists in
QZ_SetVideoFullScreen().

I’d appreciate it if you’d have a go at this.

I already started :). What I found out so far is that the garbage I’m
seeing is caused by the OpenGL context being destroyed before the
resolution switch (and therefore before the fade out). Hiding this by
fading to black earlier (using Apple’s fading function) works fine. And,
switching resolutions without blackening does flicker. Badly.

-Christian

Doesn’t seem like a good idea to me. I’ve gotten used to the window
jumping around when manually resizing it, but fading the whole screen
out and in every time you resize the window? (Maybe your windows aren’t
resizable, but mine are.)

I’m not using resizable windows, and didn’t realize SDL_SetVideoMode would be
called continuously when doing so. In that case, fading is definitely bad.

I don’t quite understand when you are seeing garbage in w->w (or w->f).
A w->w switch only takes a split second, there is no time for any
garbage to be visible, or is there?

Hehe, it was w->w and f->w.

Is it that some time passes between SDL_SetVideoMode() and the first
SDL_GL_SwapBuffers() in your app? If so, that could probably be cured by
including a glClear(GL_COLOR_BUFFER_BIT) in QZ_SetVideoWindowed(), as it
exists in QZ_SetVideoFullScreen().

On reflection, I’m pretty sure it’s my fault I’m getting garbage there (I
don’t glClear until after all 100+MB of textures are reloaded).

I already started :). What I found out so far is that the garbage I’m
seeing is caused by the OpenGL context being destroyed before the
resolution switch (and therefore before the fade out). Hiding this by
fading to black earlier (using Apple’s fading function) works fine. And,
switching resolutions without blackening does flicker. Badly.

Sounds good, thanks for looking into this.

Gregory

On Wed, 1 Feb 2006, Christian Walther wrote:

Gregory Smith wrote:

I’m not using resizable windows, and didn’t realize SDL_SetVideoMode would be
called continuously when doing so. In that case, fading is definitely bad.

It’s not being called continuously, just once when the resize is
finished (by the application reacting to the SDL_VIDEORESIZE event). But
I think fading the whole screen to black and back even once is too
distracting. Fading the window contents only would be nice, but
implementing that is probably not worth the effort - it’s not as easy as
fading the whole screen by scaling the gamma tables.

Is it that some time passes between SDL_SetVideoMode() and the first
SDL_GL_SwapBuffers() in your app? If so, that could probably be cured by
including a glClear(GL_COLOR_BUFFER_BIT) in QZ_SetVideoWindowed(), as it
exists in QZ_SetVideoFullScreen().

On reflection, I’m pretty sure it’s my fault I’m getting garbage there (I
don’t glClear until after all 100+MB of textures are reloaded).

Duh, of course my app takes some time to reload textures too, now that I
think about it. And I don’t glClear either. But I don’t see garbage
during that time, just the standard gray striped window background. Have
to test this on 10.4 some time…

-Christian

So, here’s my take on a patch that implements what I proposed:
https://bugzilla.libsdl.org/show_bug.cgi?id=100

Gregory, can you test if this works for you (and meets your initial
expectations)?

-Christian

I like it!

Gregory

On Sat, 2006-02-04 at 15:58 +0100, Christian Walther wrote:

So, here’s my take on a patch that implements what I proposed:
https://bugzilla.libsdl.org/show_bug.cgi?id=100

Gregory, can you test if this works for you (and meets your initial
expectations)?

-Christian