I want to better understand SDL_SetAlpha

Hello,
I am writing to ask for some help understanding why my code works
the way it does. I am writing a simple pong clone and decided I wanted
to use alpha channels for the ball and paddle images. So I created the
images with the alpha channels and saved them as PNGs. Then, in my program:

// Load the image
SDL_Surface *temp;
temp = IMG_Load(filename.c_str());

SDL_SetAlpha(temp, 0, 0);

// Construct and store the new SpriteFrame
sf = new SpriteFrame( SDL_DisplayFormatAlpha(temp), delay );

m_Frame.push_back(sf);
// Destroy the temporary image
SDL_FreeSurface(temp);

This code does what I want it to, but I don't exactly understand
why. As you can see, I am using SDL_image to load in the image. I
looked at the SDL_image source a bit and it appears to detect per-pixel
alpha channels and call SDL_SetAlpha accordingly. So at first I
thought, “then why do I need to call SDL_SetAlpha at all?” And I’m
still not sure why. The SDL docs led me to think that calling
SDL_SetAlpha with the mask set to 0 causes the alpha channel of the
surface it is called on to be completely ignored, but if I call it with
SDL_SRCALPHA it doesn’t blit properly. I’m assuming it’s my
misunderstanding of the various blit types, i.e. RGBA -> RGB, etc.
That’s another question of mine; is the screen surface RGB or RGBA? I’m
thinking RGB but I couldn’t really find any confirmation on that. My
thoughts on the above code, however, are that SDL_SetAlpha perhaps has
to do with how the alpha channels of BOTH surfaces in the blit are
combined, rather than just the surface it is called on? I’m really
confused I think. Any help would be greatly appreciated; maybe I just
misunderstand alpha, but I thought I understood the basic idea and just
misunderstand SDL_SetAlpha. I looked at the SDL source code as well;
both the header and the implementation.

Matthew Hurne

I looked at the man page on the subject. This relevant excerpt
corroborates the behavior you expect:

“If SDL_SRCALPHA is not passed as a flag then all alpha information is
ignored when blitting the surface.”

However you didn’t tell us any of the following:
Your operating system
Your SDL build-time version
Your SDL run-time version
Your screen surface color depth

Another big problem here is that you’re very vague. For instance, you
say when you use SDL_SRCALPHA with SDL_SetAlpha, “it doesn’t blit
properly.” Well what does that mean, exactly?

Also, just so we don’t waste a lot of time over nothing, did you try
passing SDL_SRCALPHA as the second or third SDL_SetAlpha() parameter?


First of all thanks for replying, I didn’t think anyone would. It’s my
first post to the list so I’m sorry it was vague and incomplete. I
looked for some kind of “guide” for first-posters so I wouldn’t piss
anyone off but couldn’t find one.

Donny Viszneki wrote:

I looked at the man page on the subject. This relevant excerpt
corroborates the behavior you expect:

“If SDL_SRCALPHA is not passed as a flag then all alpha information is
ignored when blitting the surface.”

Yes, that is entirely relevant. “The SDL docs led me to think that
calling SDL_SetAlpha with the mask set to 0 causes the alpha channel of
the surface it is called on to be completely ignored.”

However you didn’t tell us any of the following:
Your operating system
Your SDL build-time version
Your SDL run-time version
Your screen surface color depth

Windows XP
Run-time:
    SDL 1.2.8 (SDL-1.2.8-win32.zip)
    SDL_image 1.2.4 (SDL_image-1.2.4-win32.zip)
Build-time:
   SDL 1.2.8 (SDL-devel-1.2.8-mingw32.tar.gz)
   SDL_image 1.2.4 (SDL_image-devel-1.2.4-VC6.zip)
I'm using Dev-C++, so my compiler is MinGW.
The screen surface is a software surface and its bpp matches the
desktop screen depth (in theory at least): 32-bit:

SDL_SetVideoMode(800, 600, 0, SDL_ANYFORMAT | SDL_SWSURFACE);

However, I see most of that as being fairly unimportant since I
don’t think SDL is doing anything incorrectly - i.e. it’s not a bug. It’s
just my lack of understanding. I could decide that “hey at least it
works, who cares what it means” but I’d prefer not to.

Another big problem here is that you’re very vague. For instance, you
say when you use SDL_SRCALPHA with SDL_SetAlpha, “it doesn’t blit
properly.” Well what does that mean, exactly?

Sorry about that. If I call SDL_SetAlpha(surface, SDL_SRCALPHA, 0)
then interestingly enough I get a ball or paddles with transparent
corners and a bit of a curve at each corner, and then the ball/paddle
itself is all opaque white. It seems as if it uses a colorkey for the
pixels which are completely masked by the alpha channel. In addition,
the sprites leave trails behind on the screen surface.
So some of the more specific questions I have:

  1. Is the screen surface RGB or RGBA?
  2. Is the image surface created by loading a PNG with an alpha
     channel using SDL_image RGB or RGBA, and does SDL_image call any or
     all of SDL_SetColorKey, SDL_SetAlpha, SDL_DisplayFormat,
     SDL_DisplayFormatAlpha, and if so, which and when?
  3. What is the SRC in SDL_SRCALPHA, anyway?
  4. Why do I have to call SDL_SetAlpha(surface, 0, 0) to have the
     sprite surface’s semi-opaque pixels blend their colors with the screen
     surface? To me, SDL_SetAlpha(surface, 0, 0) seems to mean “just draw it
     as if it doesn’t have an alpha channel”, but that’s not what it appears
     to do, so clearly I don’t understand.

Thanks…I hope I’ve been more clear.

Matt

Sorry about that. If I call SDL_SetAlpha(surface, SDL_SRCALPHA,
0) then interestingly enough I get a ball or paddles with transparent
corners and a bit of a curve at each corner, and then the ball/paddle
itself is all opaque white. It seems as if it uses a colorkey for
the pixels which are completely masked by the alpha channel. In
addition, the sprites leave trails behind on the screen surface.

What blitting technique are you using to get rid of trails?

Every sprite has a draw(), loadBG() and replaceBG() method. draw()
blits the sprite surface to the screen. loadBG() blits the screen
surface to a bgreplacement surface owned by the sprite using the
sprite’s rect (so the location and dimensions of where the sprite is
about to be drawn). replaceBG() puts back the screen surface data from
bgreplacement before moving the sprite. It works fine using images
without alpha or using the PNGs with alpha channels if I call
SDL_SetAlpha(surface, 0, 0).

So some of the more specific questions I have:
Is the screen surface RGB or RGBA?

See for yourself. Here are some SDL_video.h excerpts to get you going:

typedef struct SDL_Surface {
    Uint32 flags;              /* Read-only */
    SDL_PixelFormat *format;   /* Read-only */
    . . .

typedef struct SDL_PixelFormat {
    SDL_Palette *palette;
    Uint8 BitsPerPixel;
    Uint8 BytesPerPixel;
    . . .
    Uint32 Rmask;
    Uint32 Gmask;
    Uint32 Bmask;
    Uint32 Amask;
    . . .

Using that you should be able to gather this information from your
screen surface. Although officially the behavior is supposed to be

Ok, I added the following to the SpriteFrame’s constructor, which
receives the surface in question:

if (m_Image->flags & SDL_SRCALPHA == 0) {
    cout << "The frame surface is RGB" << endl;
} else {
    cout << "The frame surface is RGBA" << endl;
}

After running the program, my stdout.txt had "The frame surface is RGBA"
printed twice (there are only two sprite frames, the ball and the
paddle). If I changed the call to SDL_SetAlpha(temp, SDL_SRCALPHA, 0),
I get the same thing. Interesting. Though I’m not well-practiced with
flags, so please let me know if that test is useless. I used a similar
test on the screen surface and it told me it was RGBA; but that seems
strange to me, so I bet my test is bunk anyway.

  Is the image surface created by loading a PNG with an alpha
channel using SDL_image RGB or RGBA, and does SDL_image call any or
all of SDL_SetColorKey, SDL_SetAlpha, SDL_DisplayFormat,
SDL_DisplayFormatAlpha, and if so, which and when?

Wow, this surprised me. I had assumed that SDL_Image never set any of
my rendering flags. But I guess it makes sense to set the color key
for some image file formats, because some image file formats support
specifying a transparent color key!

~/Desktop/SDL_image-1.2.4$ grep SDL_DisplayFormat *.c
~/Desktop/SDL_image-1.2.4$ grep SDL_SetAlpha *.c
~/Desktop/SDL_image-1.2.4$ grep SDL_SetColorKey *.c
IMG_gif.c: SDL_SetColorKey(image, SDL_SRCCOLORKEY,
Gif89.transparent);
IMG_lbm.c: SDL_SetColorKey( Image, SDL_SRCCOLORKEY,
bmhd.tcolor );
IMG_png.c: SDL_SetColorKey(surface, SDL_SRCCOLORKEY, ckey);
IMG_tga.c: SDL_SetColorKey(img, SDL_SRCCOLORKEY, ckey);
IMG_xpm.c: SDL_SetColorKey(image,
SDL_SRCCOLORKEY, pixel);

Reading some of the context of the line from IMG_png.c, it would seem
that if your PNG is in a paletted color mode, it uses the function
png_get_tRNS() to retrieve the transparent color key from the PNG
image. So the PNG editor you’re using may be storing a color key, but
I can’t imagine that it’s storing both an alpha channel AND using a
paletted color mode.

I wanted to assume that SDL_image didn’t mess with colorkey or alpha
either, until I couldn’t figure out what was going on and took a look at
the source like you did. Trouble is, I don’t fully understand it. :)

I created the images in Paint Shop Pro, which allowed me to create an
alpha channel when saving as PNG using a mask. I wasn’t positive if PSP
was doing things properly or not, so I downloaded AlphaMix
(http://www.schaik.com/png/alphamix.html), loaded in their sample image
which they claim has an alpha channel to see how AlphaMix displayed it,
and then loaded in my sprite surfaces to compare. Everything seemed to
work perfectly, what was supposed to be transparent was transparent and
what wasn’t supposed to be transparent wasn’t. PSP has an “image
information” dialog which states “Number of alphas: 0” for my sprite
images, but it also says that for the image I tested that came with
AlphaMix, so I don’t think that indicates any problem with the images
themselves.

You can also have AlphaMix ignore the alpha channel. Interestingly
enough, the images looked the same in AlphaMix not displaying their
alpha channels as they do in the game when I call SDL_SetAlpha(temp,
SDL_SRCALPHA, 0) rather than SDL_SetAlpha(temp, 0, 0).

  What is the SRC in SDL_SRCALPHA, anyway?

The SRC in question is the source of the blit. In our case,
SDL_SRCALPHA is meant to imply that “when blitting, respect the alpha
data from the source surface.” Similarly named blending modes exist in
DirectDraw, Direct3D, and OpenGL.

So SDL_SetAlpha sets the SDL_SRCALPHA flag (or whatever flag the surface
uses) of the surface parameter given, and when that surface is blitted
onto another surface the blit respects the alpha data of that surface,
correct? Or at least in theory?

It does NOT mean “when blitting, respect the alpha data of the surface
we are blitting on to,” correct? I wouldn’t expect it to mean that, but
I thought I’d ask.

  Why do I have to call SDL_SetAlpha(surface, 0, 0) to have the
sprite surface semi-opaque pixels blend their colors with the screen
surface? To me, SDL_SetAlpha(surface, 0, 0) seems to mean “just draw
it as if it doesn’t have an alpha channel”, but that’s not what it
appears to do, so clearly I don’t understand.

You’re correct, so something else must be going wrong. In particular,
I observed but forgot to comment on this earlier:

// Load the image
SDL_Surface *temp;
temp = IMG_Load(filename.c_str());
SDL_SetAlpha(temp, 0, 0);
// Construct and store the new SpriteFrame
sf = new SpriteFrame( SDL_DisplayFormatAlpha(temp), delay );

Why are you calling SetAlpha on the surface that gets freed? I’m not
sure how SDL should behave in this situation.

Well, here’s what’s happening, or at least what I think is happening:

  1. SDL_image loads the image file and creates a new surface with the
    image data, and returns that ( IMG_Load() )
  2. I call SDL_SetAlpha on that surface.
  3. I call SDL_DisplayFormatAlpha() on that same surface, which creates a
    new surface that is a copy of the temporary surface, but in the pixel
    format of the screen including an alpha channel.
  4. The SpriteFrame constructor copies the pointer for the surface
    from the call to SDL_DisplayFormatAlpha for its m_Image surface pointer,
    so now the SpriteFrame is pointing to the new surface created by
    SDL_DisplayFormatAlpha.
  5. I need to call FreeSurface on the temporary surface, since I don’t
    need it anymore.

Just in case I was confused, I commented out the call to FreeSurface,
but the way the program works is unaffected.

Also you didn’t seem to notice this question:

Also, just so we don’t waste a lot of time over nothing, did you try
passing SDL_SRCALPHA as the second or third SDL_SetAlpha() parameter?

I implied my answer to that in the first paragraph of my response: “If I
call SDL_SetAlpha(surface, SDL_SRCALPHA, 0) then interestingly enough I
get a ball or…”, sorry I wasn’t more clear. I passed SDL_SRCALPHA as
the second parameter, flags. My understanding is that the third
parameter is for per-surface alpha, from 0 to 255, 0 being transparent
and 255 fully opaque. I also understand that SDL will use either
per-pixel alpha or per-surface, but not both.

if (m_Image->flags & SDL_SRCALPHA == 0) {
    cout << "The frame surface is RGB" << endl;
} else {
    cout << "The frame surface is RGBA" << endl;
}

After running the program, my stdout.txt had "The frame surface is RGBA"
printed twice (there are only two sprite frames, the ball and the
paddle). If I changed the call to SDL_SetAlpha(temp, SDL_SRCALPHA, 0),
I get the same thing. Interesting. Though I’m not well-practiced with
flags, so please let me know if that test is useless.

Your if-expression is incorrect. To fix it, you need parentheses
around the bitwise-and expression:

((m_Image->flags & SDL_SRCALPHA) == 0)

The way you have it now is read by the C compiler as:

(m_Image->flags & (SDL_SRCALPHA == 0))

… which will always be false.

b

Brian Raiter wrote:

Your if-expression is incorrect. To fix it, you need parentheses
around the bitwise-and expression:

((m_Image->flags & SDL_SRCALPHA) == 0)

Haha, yup, I thought something was probably wrong with that. Thanks!

Matt