Colorkey and GL renderer

As I mentioned this morning, when I use the GL renderer, all my sprites are suddenly showing their colorkeys. I’ve managed to trace this down to a bug in my own code where I was assigning the colorkey to the SDL_Surface after calling SDL_CreateTextureFromSurface for some sprites. This was apparently working correctly under the software renderer because the texture is based very closely on the underlying surface, or something like that.
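
In other words, the fix is just to swap the order: key the surface first, then build the texture from it. Roughly (a sketch with placeholder names; the sprite and renderer are assumed to exist already, and the exact call signatures may not match your headers):

    /* The colorkey has to be on the surface *before* the texture is
       created from it; setting it afterwards never reaches the texture. */
    SDL_SetColorKey(sprite, SDL_TRUE,                /* placeholder key color */
                    SDL_MapRGB(sprite->format, 255, 0, 255));
    texture = SDL_CreateTextureFromSurface(renderer, sprite);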

But fixing it makes things even stranger: If I’ve assigned a colorkey to the surface using SDL_SetColorKey before converting it to a texture, the resulting texture is a 24x16 rectangle of pure black. (The original image was 24x384!) Something is badly broken here…

Oops! That was at my end too. (Maybe this is my code’s
way of telling me it’s time for some cleanup?) Now I’m
just getting the same thing as before: the entire sprite
showing, including the colorkey.

So… why doesn’t SDL_CreateTextureFromSurface respect
colorkeys when creating OpenGL textures?

OK, I found out why. Hard to say whether or not this is by design,
though.

Inside SDL_CreateTextureFromSurface, if the surface you passed
in is not of the same pixel format as the texture you’re creating, it
does the following:

    dst = SDL_ConvertSurface(surface, &dst_fmt, 0);
    if (dst) {
        SDL_UpdateTexture(texture, NULL, dst->pixels, dst->pitch);
        SDL_FreeSurface(dst);
    }

Inside SDL_ConvertSurface, it carefully translates colorkey data into
per-pixel alpha. It even says,

    /* This is needed when converting for 3D texture upload */

Then it sets the blend mode to take advantage of the per-pixel alpha:

    SDL_SetSurfaceBlendMode(convert, SDL_BLENDMODE_BLEND);

…and then it returns, uploads the texture from the converted surface,
and throws the converted surface (and its blend mode) away. All this
has been run in a conditional block, of course, so it won’t necessarily
have executed.
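
Incidentally, you can check that translation from the application
side: convert a keyed surface to an alpha format yourself and the
keyed pixels come back with alpha 0. A quick sketch (using the
SDL_ConvertSurfaceFormat convenience call, if your tree has it, and
assuming pixel (0,0) holds the key color):

    /* "keyed" is your colorkeyed sprite surface. */
    SDL_Surface *converted =
        SDL_ConvertSurfaceFormat(keyed, SDL_PIXELFORMAT_ARGB8888, 0);
    if (converted) {
        Uint8 r, g, b, a;
        Uint32 pixel;

        SDL_LockSurface(converted);
        pixel = *(Uint32 *) converted->pixels;   /* pixel (0,0) */
        SDL_UnlockSurface(converted);

        SDL_GetRGBA(pixel, converted->format, &r, &g, &b, &a);
        /* a comes back 0 for the keyed pixels */
        SDL_FreeSurface(converted);
    }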

After that conditional block, the following code runs unconditionally.
“surface” here is the original surface that was passed in.

    {
        Uint8 r, g, b, a;
        int blendMode;
        int scaleMode;

        SDL_GetSurfaceColorMod(surface, &r, &g, &b);
        SDL_SetTextureColorMod(texture, r, g, b);

        SDL_GetSurfaceAlphaMod(surface, &a);
        SDL_SetTextureAlphaMod(texture, a);

        SDL_GetSurfaceBlendMode(surface, &blendMode);
        SDL_SetTextureBlendMode(texture, blendMode);

        SDL_GetSurfaceScaleMode(surface, &scaleMode);
        SDL_SetTextureScaleMode(texture, scaleMode);
    }

It’s using the original surface’s blend mode, which was set (or not set)
as appropriate for the original surface’s pixel format, instead of the
new one that was calculated inside SDL_ConvertSurface specifically
for this purpose.

This suggests an obvious workaround: If you know about this, you can
make sure to set the blend mode that will be correct for the texture on
the original surface, and then it works. But I don’t really like that
solution. It doesn’t feel right. It’s requiring you to change one value to
something that may be inappropriate in its context, in order to achieve
a side effect in another context later on.
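
For the record, the workaround amounts to one extra call before
creating the texture. A sketch (placeholder names again; the sprite
and renderer are assumed to exist already):

    /* Set the blend mode the *texture* will need on the original
       surface, so the SDL_GetSurfaceBlendMode/SDL_SetTextureBlendMode
       pair at the end of SDL_CreateTextureFromSurface copies BLEND
       across instead of the keyed surface's own mode. */
    SDL_SetColorKey(sprite, SDL_TRUE,                /* placeholder key color */
                    SDL_MapRGB(sprite->format, 255, 0, 255));
    SDL_SetSurfaceBlendMode(sprite, SDL_BLENDMODE_BLEND);   /* the workaround */
    texture = SDL_CreateTextureFromSurface(renderer, sprite);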

So there’s the problem. Does anyone have any ideas as to how to fix
it?

Examining that code, I think the unconditional block should be taking
the parameters from ‘dst’ if it did the surface conversion.

Can you try that and see if it does the right thing?
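
Something like this, as an untested sketch of the shape of that
change (the SDL_FreeSurface(dst) call would also have to move below
this block):

    {
        /* Read the attributes from whichever surface was actually
           uploaded: the converted copy if one was made, otherwise the
           original. */
        SDL_Surface *src = dst ? dst : surface;
        Uint8 r, g, b, a;
        int blendMode;
        int scaleMode;

        SDL_GetSurfaceColorMod(src, &r, &g, &b);
        SDL_SetTextureColorMod(texture, r, g, b);

        SDL_GetSurfaceAlphaMod(src, &a);
        SDL_SetTextureAlphaMod(texture, a);

        SDL_GetSurfaceBlendMode(src, &blendMode);
        SDL_SetTextureBlendMode(texture, blendMode);

        SDL_GetSurfaceScaleMode(src, &scaleMode);
        SDL_SetTextureScaleMode(texture, scaleMode);
    }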

-Sam Lantinga, Founder and President, Galaxy Gameworks LLC