C++ fade in and fade out transition help

I know how to fade a surface from transparent to opaque, but how do I do
that, then a fade-out, changing the background of my game.

On Mon, Dec 15, 2008 at 2:09 AM, charlie murphy wrote:

I know how to fade a surface from transparent to opaque, but how do I do
that, then a fade-out, changing the background of my game.

A really easy way (sometimes not so efficient, but it depends on the
hardware and video mode) to do a fade is to put a fully transparent
surface in front of everything, filled with black. Then change it from
transparent to opaque, change the background when it’s 100% opaque,
then fade it back to transparent. If there’s an accelerated backend
(OpenGL on SDL 1.3 would be the most reliable for this, I think?), it
should be decently fast, but if it’s done in software, it might suck.
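
For reference, a minimal sketch of that overlay approach with the SDL 1.2
API could look roughly like this (the screen surface and the render_scene()
callback that draws the current background are assumptions here, not
something from the post):

#include <SDL.h>

/* Fade to black, swap the background, then fade back in. */
void fade_transition(SDL_Surface *screen, void (*render_scene)(SDL_Surface *))
{
    SDL_Surface *black = SDL_CreateRGBSurface(SDL_SWSURFACE,
                                              screen->w, screen->h, 32,
                                              0, 0, 0, 0);
    SDL_FillRect(black, NULL, SDL_MapRGB(black->format, 0, 0, 0));

    /* Fade out: raise the overlay's per-surface alpha a bit each frame. */
    for (int a = 0; a <= 255; a += 5) {
        render_scene(screen);                       /* old background */
        SDL_SetAlpha(black, SDL_SRCALPHA, (Uint8)a);
        SDL_BlitSurface(black, NULL, screen, NULL);
        SDL_Flip(screen);
        SDL_Delay(10);
    }

    /* The screen is fully black here: change the background now. */

    /* Fade in: lower the alpha back down to zero. */
    for (int a = 255; a >= 0; a -= 5) {
        render_scene(screen);                       /* new background */
        SDL_SetAlpha(black, SDL_SRCALPHA, (Uint8)a);
        SDL_BlitSurface(black, NULL, screen, NULL);
        SDL_Flip(screen);
        SDL_Delay(10);
    }

    SDL_FreeSurface(black);
}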

Another way is to mess around with the physical palette if you’re in a
paletted mode (that’s what we do in Quadra), but if you’re not in a
paletted mode (or if it’s emulated), it’s also not the fastest thing
ever.
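
A rough sketch of that palette trick in SDL 1.2, assuming an 8-bit screen
set up with SDL_HWPALETTE and a saved copy of its original 256 colors (both
are assumptions, not something from the post):

#include <SDL.h>

/* Scale the physical palette toward black; step 255 = normal, 0 = black. */
void palette_fade_out(SDL_Surface *screen, const SDL_Color orig[256])
{
    SDL_Color faded[256];

    for (int step = 255; step >= 0; step -= 5) {
        for (int i = 0; i < 256; i++) {
            faded[i].r = (Uint8)(orig[i].r * step / 255);
            faded[i].g = (Uint8)(orig[i].g * step / 255);
            faded[i].b = (Uint8)(orig[i].b * step / 255);
        }
        /* SDL_PHYSPAL changes what the hardware displays without touching
           the logical palette used for blitting. */
        SDL_SetPalette(screen, SDL_PHYSPAL, faded, 0, 256);
        SDL_Delay(10);
    }
}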

I don’t remember if SDL can get and set the gamma, that’d be another
way that might work on more hardware (but only in fullscreen, I’d
guess). You’d have to be careful to make sure setting the gamma
actually works, otherwise you’ll end up with stupid delays in the game
where nothing happens (since the fade doesn’t work), and the user will
wonder what the heck is going on… ;-)


http://pphaneuf.livejournal.com/

Hi,

another way would be to write a pixel shader. But it only makes sense if
you are using OpenGL in your game.

--
Paulo
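
For illustration, the fragment shader for that can be very small; something
along these lines (shown as a C string; the tex/fade uniform names are made
up, and compiling it with glCreateShader()/glCompileShader() is left out):

/* Scales every fragment by a 'fade' uniform: 1.0 = normal scene, 0.0 = black. */
static const char *fade_fragment_shader =
    "uniform sampler2D tex;\n"
    "uniform float fade;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(tex, gl_TexCoord[0].st) * vec4(fade, fade, fade, 1.0);\n"
    "}\n";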


2009/3/24 Paulo Pinto:

another way would be to write a pixel shader. But it only makes sense if
you are using OpenGL in your game.

If you’re using OpenGL, you can also just multiply your glColor()
values by some fade modifier. If you’re always using full
intensity+opacity glColor(), obviously no multiplication is really
needed.

--
http://codebad.com/
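
A minimal sketch of that with immediate-mode OpenGL (the quad is just a
placeholder for whatever the game draws; with textured drawing you would
also want the texture environment set to GL_MODULATE so the color actually
scales the texels):

#include <SDL_opengl.h>

/* Draw one frame dimmed by 'fade' (1.0f = normal, 0.0f = black). */
void draw_frame(float fade)
{
    glClear(GL_COLOR_BUFFER_BIT);

    /* Everything drawn after this call is multiplied by the fade value. */
    glColor4f(fade, fade, fade, 1.0f);

    glBegin(GL_QUADS);
    glVertex2f(-1.0f, -1.0f);
    glVertex2f( 1.0f, -1.0f);
    glVertex2f( 1.0f,  1.0f);
    glVertex2f(-1.0f,  1.0f);
    glEnd();
}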

On Tue, Mar 24, 2009 at 10:49 AM, Pierre Phaneuf wrote:

I don’t remember if SDL can get and set the gamma, that’d be another
way that might work on more hardware (but only in fullscreen, I’d
guess). You’d have to be careful to make sure setting the gamma
actually works, otherwise you’ll end up with stupid delays in the game
where nothing happens (since the fade doesn’t work), and the user will
wonder what the heck is going on… ;-)

That wouldn’t be exactly like a linear fade, but the function you want is this:

/**
 * \fn int SDL_SetGamma(float red, float green, float blue)
 *
 * \brief Set the gamma correction for each of the color channels on
 *        the currently selected display.
 *
 * \return 0 on success, or -1 if setting the gamma isn’t supported.
 *
 * \sa SDL_SetGammaRamp()
 */
extern DECLSPEC int SDLCALL SDL_SetGamma(float red, float green, float blue);

--
http://codebad.com/
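
A rough sketch of using it for a fade, with the return value checked up
front so the game skips the fade instead of sitting in a pointless loop when
gamma control isn't available (and, as noted above, lowering the gamma
exponent darkens the picture but isn't a true linear fade to black):

#include <SDL.h>

int gamma_fade(void)
{
    /* Probe once: if gamma can't be set at all, don't bother fading. */
    if (SDL_SetGamma(1.0f, 1.0f, 1.0f) < 0)
        return -1;

    for (int i = 20; i >= 1; i--) {       /* darken: 1.0 down to 0.05 */
        float g = i / 20.0f;
        SDL_SetGamma(g, g, g);
        SDL_Delay(20);
    }

    /* ...change the background here... */

    for (int i = 1; i <= 20; i++) {       /* restore: back up to 1.0 */
        float g = i / 20.0f;
        SDL_SetGamma(g, g, g);
        SDL_Delay(20);
    }
    return 0;
}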

On Tue, Mar 24, 2009 at 10:21 PM, Donny Viszneki wrote:

That wouldn’t be exactly like a linear fade, but the function you want is this:
extern DECLSPEC int SDLCALL SDL_SetGamma(float red, float green, float blue);

What am I saying? SDL also has SDL_SetGammaRamp(), which should be able
to do a linear fade fine!

/**
 * \fn int SDL_SetGammaRamp(const Uint16 * red, const Uint16 * green,
 *                          const Uint16 * blue)
 *
 * \brief Set the gamma ramp for the currently selected display.
 *
 * \param red The translation table for the red channel, or NULL
 * \param green The translation table for the green channel, or NULL
 * \param blue The translation table for the blue channel, or NULL
 *
 * \return 0 on success, or -1 if gamma ramps are unsupported.
 *
 * Set the gamma translation table for the red, green, and blue channels
 * of the video hardware. Each table is an array of 256 16-bit quantities,
 * representing a mapping between the input and output for that channel.
 * The input is the index into the array, and the output is the 16-bit
 * gamma value at that index, scaled to the output color precision.
 *
 * \sa SDL_GetGammaRamp()
 */
extern DECLSPEC int SDLCALL SDL_SetGammaRamp(const Uint16 * red,
                                             const Uint16 * green,
                                             const Uint16 * blue);


http://codebad.com/
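
A rough sketch of a linear fade built on that (SDL 1.2 style): level runs
from 255 (normal picture) down to 0 (black) and back up again after the
background swap; the same ramp is used for all three channels, and it
assumes no other gamma correction is in effect:

#include <SDL.h>

int set_fade_level(Uint8 level)
{
    Uint16 ramp[256];

    for (int i = 0; i < 256; i++) {
        /* i * 257 is the identity ramp (0..255 -> 0..65535); scale it by level. */
        ramp[i] = (Uint16)((i * 257 * level) / 255);
    }
    return SDL_SetGammaRamp(ramp, ramp, ramp);
}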