SDL 2.0 and PNG alpha

The TL;DR of this question is: how do you load images in SDL 2.0 while
preserving their alpha values?

I recently attempted to migrate a small project of mine from SDL 1.2 to
SDL 2.0, and so far I have everything working save for the alpha blending.
I’ve not actually taken advantage of the texture renderer; I’m still
building each frame from surfaces in software, then pushing the result out
to the GPU.

The way I’ve handled alpha blending previously is that my PNG images
contained the per-pixel alpha data. I loaded surfaces using IMG_Load() from
the SDL_image library, and then optimized them, as explained in tutorials,
by doing something like:
optimizedImage = SDL_DisplayFormatAlpha( loadedImage );
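
In full, the pattern I was following looked roughly like this (a rough sketch from memory; the file name is just a placeholder):

// SDL 1.2: convert the loaded image to the display format, keeping per-pixel alpha
SDL_Surface* loadedImage = IMG_Load("Image.png");   // placeholder file name
SDL_Surface* optimizedImage = NULL;

if (loadedImage)
{
    optimizedImage = SDL_DisplayFormatAlpha(loadedImage);
    SDL_FreeSurface(loadedImage);   // the unconverted copy is no longer needed
}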

From what I understand, SDL_DisplayFormatAlpha() uses an internal variable
that stores the display format, calls the surface-conversion routine, and
somehow folds the alpha channel into all of this.

The function is gone from SDL 2.0, though, so I improvised based on my
understanding by doing:
optimizedSurface = SDL_ConvertSurface( loadedSurface, Format, NULL );

where Format was obtained from: Format = Screen->format;
and Screen is an SDL_Surface tied to the window SDL 2.0 gives you:
SDL_Surface* Screen = SDL_GetWindowSurface( window );
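
Put together, my SDL 2.0 attempt looks roughly like this (a sketch; window creation and error handling omitted):

// SDL 2.0 improvisation: convert to the window surface's format
SDL_Surface* Screen = SDL_GetWindowSurface(window);   // window created earlier
SDL_Surface* loadedSurface = IMG_Load("Image.png");   // placeholder file name
SDL_Surface* optimizedSurface = NULL;

if (loadedSurface)
{
    // Screen->format carries no alpha mask, which seems to be where the alpha is lost
    optimizedSurface = SDL_ConvertSurface(loadedSurface, Screen->format, 0);
    SDL_FreeSurface(loadedSurface);
}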

Unfortunately, this causes me to lose the alpha values from my images. I’ve
tried going without the function, and therefore without the optimization.
Everything works, including alpha, though the framerate slows to an
unplayable crawl.

Even if I’m to start using the more advanced features of SDL 2.0, I still
have the fundamental problem of loading these images correctly. Solutions
and insight are greatly appreciated.

Do you set the blend mode before rendering?

You can use the SDL_ConvertSurface() function (yes, it’s still in SDL2) to convert a surface into the same pixel format as the screen surface.

Example code:

SDL_Surface* pTemporarySurface = NULL;

// Create a temporary surface
pTemporarySurface = IMG_Load("Image.png");

if(pTemporarySurface)
{
    // Create the image surface in the same pixel format as the window
    m_pImageSurface = SDL_ConvertSurface(pTemporarySurface, m_pScreen->format, 0);

    // The temporary surface is no longer needed once it has been converted
    SDL_FreeSurface(pTemporarySurface);
}

Then you just render the converted image surface as usual. The alpha channel in the PNG image is preserved.
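
Blitting it onto the window surface could look something like this (a sketch; the destination position and the m_pWindow variable are assumptions on my part):

// Blit the converted surface onto the window surface and present it
SDL_Rect DestRect = {100, 100, 0, 0};   // only x/y are used for the destination by SDL_BlitSurface
SDL_BlitSurface(m_pImageSurface, NULL, m_pScreen, &DestRect);
SDL_UpdateWindowSurface(m_pWindow);     // m_pWindow: the SDL_Window the screen surface came from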

When you want to switch to using SDL_Textures only, you can do it like this:

Example code:

SDL_Texture* m_pTexture = NULL;

m_pTexture = IMG_LoadTexture(m_pRenderer, "Image.png");

if(m_pTexture)
{
    SDL_Rect Rect = {0, 0, 0, 0};

    SDL_QueryTexture(m_pTexture, NULL, NULL, &Rect.w, &Rect.h);

    SDL_RenderCopy(m_pRenderer, m_pTexture, NULL, &Rect);
}

The IMG_LoadTexture() function is a function in the SDL_image library that creates an SDL_Texture from an image file. The alpha channel in the PNG image is preserved when using this function as well.
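
As the earlier reply hints, if the alpha still does not show up you can also request alpha blending on the texture explicitly before the SDL_RenderCopy() call (a small sketch; textures created from surfaces with an alpha channel should already have this blend mode set):

// Make sure the texture is drawn with alpha blending
SDL_SetTextureBlendMode(m_pTexture, SDL_BLENDMODE_BLEND);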


Hello SDL community,

I know this is a draft:
https://wiki.libsdl.org//SDL_GameControllerGetButton/

BUT, how and why can this function expect an enum
https://wiki.libsdl.org/SDL_GameControllerButton?highlight=(\bCategoryGameController\b)|(CategoryEnum)
as an argument?
For me, a simple integer (which represents SDL_CONTROLLER_BUTTON_Y, for
example) would make more sense. (Still, the compile error says: "invalid
conversion from 'int' to 'SDL_GameControllerButton'".)
Where (i.e. in which file of the SDL source) is SDL_GameControllerButton
defined anyway?

Thanks a lot in advance.

Cheers
Julian

include/SDL_gamecontroller.h

What’s wrong with an enum, anyway? You’re only supposed to pass values
from that enum. If you ever attempt to pass something else, you’re asking
for trouble. The enum is there explicitly to state the valid values. The
only place where using an integer would make sense is to store the value
in a file, and in that case you can do an explicit cast (assuming C++,
since in C the implicit conversion would be valid).
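
For example, reading a button index back from a file could look like this (a sketch; the variable names, including controller and stored_button, are made up):

int stored_button = 3;  /* e.g. an index read back from a config file */

/* Cast back to the enum type explicitly (required in C++) */
Uint8 pressed = SDL_GameControllerGetButton(controller,
                    (SDL_GameControllerButton)stored_button);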

For the record, this is the enum:

typedef enum
{
    SDL_CONTROLLER_BUTTON_INVALID = -1,
    SDL_CONTROLLER_BUTTON_A,
    SDL_CONTROLLER_BUTTON_B,
    SDL_CONTROLLER_BUTTON_X,
    SDL_CONTROLLER_BUTTON_Y,
    SDL_CONTROLLER_BUTTON_BACK,
    SDL_CONTROLLER_BUTTON_GUIDE,
    SDL_CONTROLLER_BUTTON_START,
    SDL_CONTROLLER_BUTTON_LEFTSTICK,
    SDL_CONTROLLER_BUTTON_RIGHTSTICK,
    SDL_CONTROLLER_BUTTON_LEFTSHOULDER,
    SDL_CONTROLLER_BUTTON_RIGHTSHOULDER,
    SDL_CONTROLLER_BUTTON_DPAD_UP,
    SDL_CONTROLLER_BUTTON_DPAD_DOWN,
    SDL_CONTROLLER_BUTTON_DPAD_LEFT,
    SDL_CONTROLLER_BUTTON_DPAD_RIGHT,
    SDL_CONTROLLER_BUTTON_MAX
} SDL_GameControllerButton;

Hmm, that is unfortunate. So the fake display surface in SDL 2 does not
have an alpha channel, and images converted to its format lose their alpha…

Well, maybe you can try constructing your own format to see if the
performance is still affected? I would try making a copy of the screen
surface’s format, then adding an alpha mask (which might be a guess for now).

Something like:
SDL_PixelFormat my_format = *Screen->format;
my_format.Amask = 0x000000ff; // A guess… Try 0xff000000 too?
optimizedSurface = SDL_ConvertSurface( loadedSurface, &my_format, NULL );

On Windows, SDL 2 always uses SDL_PIXELFORMAT_RGB888 for the display
surface (see
src/video/windows/SDL_windowsframebuffer.c:WIN_CreateWindowFramebuffer).
So if you want to do a little more work to get the mask right without
guessing, then you can try copying and adjusting the logic in
src/video/SDL_pixels.c:SDL_PixelFormatEnumToMasks to produce a reasonable
Amask from that format enum.
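
Something else that might be worth trying (a sketch; it assumes SDL_PIXELFORMAT_ARGB8888 is the RGB888 display layout with an alpha channel added, and reuses the loadedSurface/optimizedSurface names from before):

// Convert directly to a known 32-bit format with alpha
optimizedSurface = SDL_ConvertSurfaceFormat(loadedSurface, SDL_PIXELFORMAT_ARGB8888, 0);

// Or derive the masks from the format enum instead of guessing them:
int bpp;
Uint32 Rmask, Gmask, Bmask, Amask;
SDL_PixelFormatEnumToMasks(SDL_PIXELFORMAT_ARGB8888, &bpp, &Rmask, &Gmask, &Bmask, &Amask);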

Of course, the most obvious way forward is to stop using
SDL_GetWindowSurface(). I personally have an older project for which I use
my own constructed “screen” surface, then upload that surface each frame
and then render it. Or you can use the render API or something else.
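
In rough outline, that per-frame upload could look something like this (a sketch with made-up names and a fixed 640x480 size; the renderer and the software blitting code are assumed, and error handling is omitted):

// One-time setup: my own ARGB "screen" surface for the software blits,
// plus a streaming texture of the same size to upload it into
SDL_Surface* screen = SDL_CreateRGBSurface(0, 640, 480, 32,
                          0x00ff0000, 0x0000ff00, 0x000000ff, 0xff000000);
SDL_Texture* frame = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ARGB8888,
                          SDL_TEXTUREACCESS_STREAMING, 640, 480);

// Each frame: blit everything into 'screen' in software, then upload and present
SDL_UpdateTexture(frame, NULL, screen->pixels, screen->pitch);
SDL_RenderClear(renderer);
SDL_RenderCopy(renderer, frame, NULL, NULL);
SDL_RenderPresent(renderer);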

Good luck!
Jonny D

Ok, so you recommend doing something like this:


bool getButtonState(unsigned short int button) {
    return (SDL_GameControllerGetButton(this->gamecontroller,
                (SDL_GameControllerButton)button) > 0);
}
??

You’re only supposed to pass
values from that enum anyway. If you ever attempt to pass something
else you’re asking for trouble.
What? What kind of trouble? What if my controller has more than 15 buttons?

OK, I think I know what your actual problem is.

2014-10-12 14:55 GMT-03:00, Julian Winter :

What? What kind of trouble? What if my controller has more than 15 Buttons?

The Game Controller API is made specifically for controllers that are
modeled on the Xbox and PlayStation ones - mostly so SDL manages the
mappings for you. If you want to support any other kind of controller,
you should be using the Joystick API instead, which has no limit on the
number of buttons, axes or hats (but then you have to figure out the
mappings yourself, although you can use both APIs together to get
mappings for the most common controllers while still supporting
other ones).
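
For example, opening the devices with both APIs could look roughly like this (a sketch; error handling omitted):

/* Open each device as a game controller when a mapping is known,
   otherwise fall back to the raw Joystick API */
for (int i = 0; i < SDL_NumJoysticks(); ++i) {
    if (SDL_IsGameController(i)) {
        SDL_GameController* pad = SDL_GameControllerOpen(i);
        /* read buttons with SDL_GameControllerGetButton(pad, SDL_CONTROLLER_BUTTON_*) */
    } else {
        SDL_Joystick* joy = SDL_JoystickOpen(i);
        int buttons = SDL_JoystickNumButtons(joy);   /* no 15-button limit here */
        /* read them with SDL_JoystickGetButton(joy, n) */
    }
}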

@Sik: Thanks!

Manually setting Format.Amask does work, but as with not converting at all,
it kills the performance. Thing is, reading the code in 1.2, this is
exactly what SDL_DisplayFormatAlpha() did, without impacting performance.
