1.2 SDL_BlitSurface and colorkeying (long)

[ Lots of rambling here, sorry. I wrote and debugged at the same time. ]
[ Skip to the end for a possible bug description. ]

In loki.open-source.sdl, you wrote:

Okay, turns out that wasn’t the problem in the end, the problem was that
the destination surface now has SDL_SRCALPHA set where previously it
didn’t. I’m not sure I understand the reasoning, but hey, problem solved.

oh no, don’t think you’ll get away that easily :)

SDL_SRCALPHA should have no bearing on the result when set on the
destination surface, or there is a bug in SDL. Did you try my patch,
or using DisplayFormatAlpha?

Actually, that’s because I cheated to start with and didn’t tell the whole story.

The texture in question is a cockpit for a flight sim game I’m working
on (yeah, I know, I know, everyone has a flight sim now, but it’s fun
and at least I’m not trying to hype it as something usable).

The textures (for backwards compatibility with an old flight sim engine)
are palettized PCX files, where color index 0 is the color key (i.e.,
transparent).

The cockpit init loads the PCX files as I specified in my original
code, using the SDL_BlitSurface alpha-fill/copy hack to set color 0
pixels to alpha=0. After this the image is split into sub-images with
power-of-two sizes for loading into OpenGL textures. It was at this
stage that the problem occurred: the original load ended up with images
with SDL_SRCALPHA set, so the blits to make the sub-tiles somehow
weren’t copying anything.

A code fragment that would show the problem is (with suitable gdb dumps
of surface formats and comments):

    #if SDL_BYTEORDER == SDL_LIL_ENDIAN
        static const Uint32 rmask = 0x000000FF;
        static const Uint32 gmask = 0x0000FF00;
        static const Uint32 bmask = 0x00FF0000;
        static const Uint32 amask = 0xFF000000;
    #else
        static const Uint32 rmask = 0xFF000000;
        static const Uint32 gmask = 0x00FF0000;
        static const Uint32 bmask = 0x0000FF00;
        static const Uint32 amask = 0x000000FF;
    #endif

    SDL_Surface *orig, *rgba, *tile;
    SDL_Rect src;

    orig = IMG_Load("FILENAME.PCX");

/*****************************************
(gdb) print *orig                     
$3 = {flags = 0x0, format = 0x9a678d8, w = 640, h = 480, pitch = 640, 
pixels = 0x49c17008, offset = 0, hwdata = 0x0,
    clip_rect = {x = 0, y = 0, w = 640, h = 480}, unused1 = 0xffffffff,
locked = 0, map = 0x8ed99b8, format_version = 0, refcount = 1}
(gdb) print *orig->format
$4 = {palette = 0x8ecd878, BitsPerPixel = 8 '\b',
BytesPerPixel = 1 '\001', Rloss = 8 '\b', Gloss = 8 '\b',
Bloss = 8 '\b', Aloss = 8 '\b', Rshift = 0 '\000', Gshift = 0 '\000',
Bshift = 0 '\000', Ashift = 0 '\000', Rmask = 0, Gmask = 0, Bmask = 0,
Amask = 0, colorkey = 0, alpha = 255 '?'}
******************************************/

    rgba = SDL_CreateRGBSurface(SDL_SWSURFACE, orig->w, orig->h, 32,
                                rmask, gmask, bmask, amask);

    /* Palettized images have color index 0 transparent.  We set the
     * color key (transparent color) here, and fill the alpha channel
     * in the dest with 0.  When we do the blit the color key pixels
     * aren't copied and retain the 0 alpha value. */
    SDL_SetColorKey(orig, SDL_SRCCOLORKEY, 0);
    SDL_FillRect(rgba, NULL, SDL_MapRGBA(rgba->format, 0, 0, 0, 0));
    SDL_BlitSurface(orig, NULL, rgba, NULL);

/*****************************************
(gdb) print *rgba
$5 = {flags = 0x10000, format = 0x9a67d50, w = 640, h = 480,
pitch = 2560, pixels = 0x4a613008, offset = 0, hwdata = 0x0,
clip_rect = {x = 0, y = 0, w = 640, h = 480}, unused1 = 0xffffffff,
locked = 0, map = 0x8ed99e0, format_version = 0, refcount = 1}
(gdb) print *rgba->format
$6 = {palette = 0x0, BitsPerPixel = 32 ' ', BytesPerPixel = 4 '\004', 
Rloss = 0 '\000', Gloss = 0 '\000', Bloss = 0 '\000', Aloss = 0 '\000',
Rshift = 0 '\000', Gshift = 8 '\b', Bshift = 16 '\020',
Ashift = 24 '\030', Rmask = 0xff, Gmask = 0xff00, Bmask = 0xff0000,
Amask = 0xff000000, colorkey = 0, alpha = 255 '?'}
******************************************/

    tile = SDL_CreateRGBSurface(SDL_SWSURFACE, 256, 256,
                                32, rmask, gmask, bmask, amask);
    src.x = 0; src.y = 0; src.w = 256; src.h = 256;
    SDL_BlitSurface(rgba, &src, tile, NULL);

After this final blit the “tile” surface is either all black or all
alpha=0 (I’m not 100% sure which). The SDL_SetAlpha doc says that in
this case:

The source is alpha-blended with the destination using the source
alpha channel. The alpha channel in the destination surface is
left untouched.

So I assume that the alpha=0 for every pixel is remaining in the
destination after the blit (I’m having trouble imagining a scenario where
this would be helpful, but I’m probably not thinking hard enough).
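
For what it’s worth, here is a minimal standalone sketch (separate from
the game code; the masks assume a little-endian RGBA layout) that seems
to confirm this reading - the destination alpha never changes:

#include <stdio.h>
#include <SDL/SDL.h>

int
main(int argc, const char *argv[])
{
    SDL_Surface *from, *to;
    Uint8 r, g, b, a;

    SDL_Init(SDL_INIT_VIDEO);
    from = SDL_CreateRGBSurface(SDL_SWSURFACE, 4, 4, 32,
                                0xFF, 0xFF00, 0xFF0000, 0xFF000000);
    to = SDL_CreateRGBSurface(SDL_SWSURFACE, 4, 4, 32,
                              0xFF, 0xFF00, 0xFF0000, 0xFF000000);

    /* opaque white source, fully transparent black destination */
    SDL_FillRect(from, NULL, SDL_MapRGBA(from->format, 255, 255, 255, 255));
    SDL_FillRect(to, NULL, SDL_MapRGBA(to->format, 0, 0, 0, 0));

    /* "from" was created with Amask != 0, so SDL_SRCALPHA is already
     * set: the blit alpha-blends the RGB channels but leaves the
     * destination alpha channel alone */
    SDL_BlitSurface(from, NULL, to, NULL);

    SDL_GetRGBA(*(Uint32 *)to->pixels, to->format, &r, &g, &b, &a);
    fprintf(stdout, "r=%d g=%d b=%d a=%d\n", r, g, b, a);
    /* prints r=255 g=255 b=255 a=0 -- alpha stayed at 0 */

    SDL_FreeSurface(from);
    SDL_FreeSurface(to);
    SDL_Quit();
    return 0;
}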

If I add:

rgba->flags &= ~SDL_SRCALPHA;

just before the final blit, all is fine.

Phew, that was a long message to satisfy curiosity. I guess the
difference I’m noticing is that SDL_SRCALPHA is ON by default now when
using SDL_CreateRGBSurface, whereas previously it was OFF unless I
specified it in the flags argument. Does that sound right?

This is with the RPMs of SDL 1.2. I haven’t tried the CVS version recently.

For reference, here is a small test-case that illustrates the “problem”:

#include <stdio.h>
#include <SDL/SDL.h>

static const Uint32 rm = 0x000000FF; static const Uint32 gm = 0x0000FF00;
static const Uint32 bm = 0x00FF0000; static const Uint32 am = 0xFF000000;

int
main(int argc, const char *argv[])
{
    SDL_Surface *s;

    SDL_Init(SDL_INIT_VIDEO);
    s = SDL_CreateRGBSurface(SDL_SWSURFACE, 100, 100, 32, rm, gm, bm, am);
    fprintf(stdout, "Surface flags were: %x\n", s->flags);
    SDL_FreeSurface(s);
    SDL_Quit();
    return 0;
}

I would have expected this to show 0x0, but it shows 0x10000 (SDL_SRCALPHA).

  • Mike

Actually, that’s because I cheated to start with and didn’t tell the whole story.

no wonder then

(gdb) print *rgba
$5 = {flags = 0x10000, format = 0x9a67d50, w = 640, h = 480,

very nice way of showing what is happening - wish more people did that.
Note that flags = SDL_SRCALPHA; this is because that flag is set
automatically by CreateRGBSurface when you specify an alpha channel
(that is, when Amask != 0)

If I add:

rgba->flags &= ~SDL_SRCALPHA;

just before the final blit, all is fine.

use SDL_SetAlpha to clear SDL_SRCALPHA instead - it takes care of
invalidating blit mappings and other internal state, so it should be safer
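
Something like this, for instance (just a sketch; passing 0 for the
flags argument turns per-pixel alpha blending off for subsequent blits):

/* disable alpha blending on blits from this surface; the per-surface
 * alpha value (255 = opaque) is ignored for plain RGBA copies and is
 * just a harmless default here */
SDL_SetAlpha(rgba, 0, 255);

/* now the final blit is a straight RGBA copy, alpha channel included */
SDL_BlitSurface(rgba, &src, tile, NULL);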

Phew, that was a long message to satisfy curiosity. I guess the
difference I’m noticing is that SDL_SRCALPHA is ON by default now when
using SDL_CreateRGBSurface, whereas previously it was OFF unless I
specified it in the flags argument. Does that sound right?

I’d guess it happened around 1.1.5 or so, when we changed a lot of the
alpha blit semantics. I’ll add a note to the docs of CreateRGBSurface[from]
to clarify

Note that flags = SDL_SRCALPHA; this is because that flag is set
automatically by CreateRGBSurface when you specify an alpha channel
(that is, when Amask != 0)

Okay, this is the bit that doesn’t make sense to me. Why is SDL_SRCALPHA
set if Amask != 0? In fact, why is SDL_SRCALPHA in the surface flags
at all? It doesn’t seem to be a property inherent to a surface,
but rather a behaviour for SDL_BlitSurface. It seems like it would
make more sense for SDL_SRCALPHA to be given in a “flags” argument to
SDL_BlitSurface than being a surface property. What am I missing? I
guess this must be an artifact of the hardware accel somehow, right?

rgba->flags &= ~SDL_SRCALPHA;
use SDL_SetAlpha to clear the SDL_SRCALPHA instead - it takes care to
invalidate blit mappings and other stuff so it should be safer

Is this the officially sanctioned way of removing SDL_SRCALPHA while
leaving the other flags alone?:

SDL_SetAlpha(rgba, rgba->flags & SDL_RLEACCEL, 0);

I’ll add a note to the docs of CreateRGBSurface[from] to clarify

Thanks.

  • Mike

Okay, this is the bit that doesn’t make sense to me. Why is SDL_SRCALPHA
set if Amask != 0?

I think it was Sam who put it in so I’ll let him answer. Anyway it’s just
the default and you can change it later so I don’t think it’s a big deal.
The online docs now point this out

In fact, why is SDL_SRCALPHA in the surface flags
at all? It doesn’t seem to be a property inherent to a surface,
but rather a behaviour for SDL_BlitSurface. It seems like it would
make more sense for SDL_SRCALPHA to be given in a “flags” argument to
SDL_BlitSurface than being a surface property.

my opinion as well and one of many things I would like to change for
an API revision, but in 1.2 we’re stuck with this

Is this the officially sanctioned way of removing SDL_SRCALPHA while
leaving the other flags alone?:

SDL_SetAlpha(rgba, rgba->flags & SDL_RLEACCEL, 0);

make it

SDL_SetAlpha(rgba, rgba->flags & (SDL_RLEACCEL | SDL_RLEACCELOK), 0);

and it should work. This is another quirk: when you set SDL_RLEACCEL
(either by calling SetAlpha or SetColorKey), SDL_RLEACCEL isn’t set
immediately; instead SDL_RLEACCELOK is set to mark the surface as
RLE-able, and at the next blit the surface is RLE-encoded and
SDL_RLEACCEL is set to mark it as such.
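
For example (a sketch, assuming “surf” is a colorkeyed software
surface and “screen” is the display surface):

SDL_SetColorKey(surf, SDL_SRCCOLORKEY | SDL_RLEACCEL, 0);
/* only SDL_RLEACCELOK is set at this point */
printf("ok=%d rle=%d\n",
       (surf->flags & SDL_RLEACCELOK) != 0,
       (surf->flags & SDL_RLEACCEL) != 0);     /* ok=1 rle=0 */

SDL_BlitSurface(surf, NULL, screen, NULL);     /* RLE-encodes surf */

/* now SDL_RLEACCEL is set as well */
printf("ok=%d rle=%d\n",
       (surf->flags & SDL_RLEACCELOK) != 0,
       (surf->flags & SDL_RLEACCEL) != 0);     /* ok=1 rle=1 */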

yes, this is another thing that should be cleaned up