Transparency not that transparent

Hi,
I’m working on a demo for a program that will emulate
cel-animation layers, so it needs to stack a series
of partially transparent surfaces up and display them.

The obvious (I thought) way to do this is by making
all the surfaces alpha surfaces. Using SDL_image, I
loaded PNG file resources to do this.

However, I have found that the PNG resources are not
all consistent (probably my use of ImageMagick) in that
some use the Alpha channel and some use Color Key to
do transparency. Alpha channel appears to work better
with the SGE rotations I’m using, so I’d like them
to all be in an Alpha channel format.

What I really want is to simply use the colorkey
information to reconstruct the missing alpha channel.
It’s probably possible to fix this by fixing the
resources themselves, but it’d be more robust if my
program could handle either type. But doing this
is not so transparent, if you’ll pardon the pun. :slight_smile:

In principle, one ought to be able to do it by simply
setting the alpha value of every pixel to either
fully transparent or fully opaque based on whether
the RGB values match the color key. This seems
fairly tricky, though, especially since both
RGB and RGBA are represented as Uint32. I assume the
color key is probably just RGB?

I can imagine breaking each pixel into RGBA values and
comparing channel-by-channel, then reconstructing the
pixel and re-inserting it in the image. But that sounds
terminally slow. If I understood the internals of
the format, I’d do it with bitwise math of some kind,
but of course, I want the code to stay portable.
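For what it's worth, the bitwise version can be kept portable by working through the surface's channel masks instead of hard-coding a layout. A minimal sketch of the per-pixel test in plain C (no SDL here; the function name and mask parameters are my own illustration, not SDL API):

```c
#include <stdint.h>

/* Set the alpha bits of one 32-bit pixel: fully transparent if its
   RGB part matches the colour key, fully opaque otherwise.
   rgb_mask is Rmask|Gmask|Bmask and amask is the alpha mask, so no
   particular channel layout is assumed. */
static uint32_t key_to_alpha(uint32_t pixel, uint32_t colorkey,
                             uint32_t rgb_mask, uint32_t amask)
{
    if ((pixel & rgb_mask) == (colorkey & rgb_mask))
        return pixel & rgb_mask;        /* alpha bits cleared */
    return (pixel & rgb_mask) | amask;  /* alpha bits set */
}
```

Because only the masks are used, the same code works regardless of endianness or channel order.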

I’ve tried using the SDL_DisplayFormat() and
SDL_DisplayFormatAlpha() functions, but they seem to
be doing weird additional stuff, because they mess up
the colors of the images (and the alpha channel doesn’t
seem to be as expected, either). This might have
something to do with the 16 bpp screen mode it has to
be converted to for display, but I don’t have color
problems if I eliminate this step (but no transparency
either).

I’d really appreciate some advice about how to solve
this from someone who’s been using this longer than
I have (i.e. anyone :slight_smile: ).

Thanks!–
Terry Hancock
@Terry_Hancock

In principle, one ought to be able to do it by simply setting the alpha
value of every pixel to either fully transparent or fully opaque based
on whether the RGB values match the color key. This seems fairly
tricky, though, especially since both RGB and RGBA are represented
as Uint32. I assume the color key is probably just RGB?

The easiest way to do this I found was to simply copy the Surface into
a similar-sized Surface filled with alpha = 0. This way the color-keyed
pixels won’t be copied and will retain their 0 alpha value, and the
non-color-keyed pixels will get the correct colors and the default alpha.

C code something like this (may not be optimal; suggestions for improvement
gratefully accepted):

SDL_Surface *orig, *rgba;

orig = IMG_Load(filename);

rgba = SDL_CreateRGBSurface(SDL_SWSURFACE, orig->w, orig->h,
			    orig->format->BitsPerPixel,
			    orig->format->Rmask, orig->format->Gmask,
			    orig->format->Bmask, orig->format->Amask);

/* May need this if your image loader hasn't set the color key */
/* SDL_SetColorKey(orig, SDL_SRCCOLORKEY, colorkeyvalue); */

SDL_FillRect(rgba, NULL, SDL_MapRGBA(rgba->format, 0, 0, 0, 0));

SDL_BlitSurface(orig, NULL, rgba, NULL);
SDL_FreeSurface(orig);

This might be coloured by the fact that I needed to change the image
format anyway for OpenGL (my version of this code uses new depth and
Xmask params).

I guess you could do something like this if your Surface is already in the
right format:

foreach pixel
	SDL_GetRGB(pixel, format, &r, &g, &b);
	if (SDL_MapRGB(format, r, g, b) == format->colorkey)
		pixel = SDL_MapRGBA(format, r, g, b, 0);
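Spelled out over a raw 32-bit pixel buffer, that loop might look like the sketch below (plain C, not SDL API; in a real program you would bracket it with SDL_LockSurface()/SDL_UnlockSurface() and take the colour key and masks from surface->format):

```c
#include <stddef.h>
#include <stdint.h>

/* Clear the alpha bits of every pixel whose RGB part matches the
   colour key; amask selects the alpha bits of the pixel format. */
static void clear_keyed_alpha(uint32_t *pixels, size_t count,
                              uint32_t colorkey, uint32_t amask)
{
    for (size_t i = 0; i < count; i++)
        if ((pixels[i] & ~amask) == (colorkey & ~amask))
            pixels[i] &= ~amask;   /* matching pixel -> alpha = 0 */
}
```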

Cheers,

  • Mike

However, I have found that the PNG resources are not
all consistent (probably my use of ImageMagick) in that
some use the Alpha channel and some use Color Key to
do transparency.

SDL_image will load 8-bit images with transparency (tRNS chunks) into
colourkeyed surfaces, as SDL doesn’t support an alpha channel for
indexed images.

In principle, one ought to be able to do it by simply
setting the alpha value of every pixel to either
fully transparent or fully opaque based on whether
the RGB values match the color key. This seems
fairly tricky, though, especially since both
RGB and RGBA are represented as Uint32. I assume the
color key is probably just RGB?

SDL can handle both 24-bit and 32-bit RGB surfaces (in the latter case,
addressing is simplified at the cost of wasting 8 bits/pixel).
What format SDL_image generates varies, but you can always convert it
to a format of your choice.

Easiest would be to do the following:

  1. Create an empty RGBA surface with SDL_CreateRGBSurface(). It will initially
    be zero everywhere. Use any mask you like, for instance
    (0x00ff0000, 0x0000ff00, 0x000000ff, 0xff000000).
  2. Blit the colourkeyed image to your surface. All non-transparent pixels
    will have their destination alpha set to fully opaque.
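The net effect of those two steps can be modelled in a few lines of plain C (no SDL here; a colourkeyed blit simply skips key-matching pixels, and copied pixels land fully opaque; the mask and names are my own illustration):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define AMASK 0xFF000000u   /* illustrative alpha mask */

/* Model of blitting a colourkeyed RGB source into a fresh RGBA
   destination: skipped pixels keep the destination's alpha of 0,
   copied pixels arrive with full alpha. */
static void keyed_blit(const uint32_t *src, uint32_t *dst, size_t n,
                       uint32_t colorkey)
{
    memset(dst, 0, n * sizeof *dst);   /* step 1: fresh surface is zero */
    for (size_t i = 0; i < n; i++)     /* step 2: colourkeyed blit */
        if (src[i] != colorkey)
            dst[i] = src[i] | AMASK;
}
```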

I’ve tried using the SDL_DisplayFormat() and
SDL_DisplayFormatAlpha() functions, but they seem to
be doing weird additional stuff, because they mess up
the colors of the images (and the alpha channel doesn’t
seem to be as expected, either).

I would appreciate a minimal example of your problems, as
SDL_DisplayFormatAlpha() really should do the above. I’ll have a look
at it.

orig = IMG_Load(filename);

rgba = SDL_CreateRGBSurface(SDL_SWSURFACE, orig->w, orig->h,
orig->format->BitsPerPixel,
orig->format->Rmask, orig->format->Gmask,
orig->format->Bmask, orig->format->Amask);

you don’t want to use the format of ‘orig’ if it’s an indexed surface

SDL_FillRect(rgba, NULL, SDL_MapRGBA(rgba->format, 0, 0, 0, 0));

this shouldn’t be necessary - SDL_CreateRGBSurface zeroes the surface
by default (too much code relies on it so we might just as well document
it)

ok, here’s a patch (against current CVS) that:

  1. changes SDL_ConvertSurface() to not preserve a colourkey to the new
    surface if SDL_SRCCOLORKEY isn’t present in the flags argument, AND
    the destination has an alpha channel. In this case, it translates the
    colourkey into an alpha channel
  2. makes SDL_DisplayFormatAlpha() use this to translate colourkeyed
    surfaces to RGBA surfaces

I doubt this breaks anybody’s code. Comments?

(don’t worry Martin, I’ll write the needed doc changes if needed)

--- SDL_video.c	Wed Feb 21 14:39:42 2001
+++ /afs/nada.kth.se/home/f91/f91-men/SDL_video.c	Thu Mar 15 14:34:12 2001
@@ -870,17 +870,12 @@
 
 	    default:
 		/* We have no other optimised formats right now. When/if a new
-		   optimised alpha format is written, add the converter here. */
+		   optimised alpha format is written, add the converter here */
 		break;
 	}
 	format = SDL_AllocFormat(32, rmask, gmask, bmask, amask);
 	flags = SDL_PublicSurface->flags & SDL_HWSURFACE;
-#ifdef AUTORLE_DISPLAYFORMAT
-	flags |= (surface->flags & (SDL_SRCCOLORKEY|SDL_SRCALPHA));
-	flags |= SDL_RLEACCELOK;
-#else
-	flags |= surface->flags & (SDL_SRCCOLORKEY|SDL_SRCALPHA|SDL_RLEACCELOK);
-#endif
+	flags |= surface->flags & (SDL_SRCALPHA | SDL_RLEACCELOK);
 	converted = SDL_ConvertSurface(surface, format, flags);
 	SDL_FreeFormat(format);
 	return(converted);

--- SDL_surface.c	Sat Feb 10 14:06:52 2001
+++ /afs/nada.kth.se/home/f91/f91-men/SDL_surface.c	Thu Mar 15 14:34:24 2001
@@ -726,8 +726,14 @@
 	/* Save the original surface color key and alpha */
 	surface_flags = surface->flags;
 	if ( (surface_flags & SDL_SRCCOLORKEY) == SDL_SRCCOLORKEY ) {
-		colorkey = surface->format->colorkey;
-		SDL_SetColorKey(surface, 0, 0);
+		/* Convert colourkeyed surfaces to RGBA if requested */
+		if((flags & SDL_SRCCOLORKEY) != SDL_SRCCOLORKEY
+		   && format->Amask) {
+			surface_flags &= ~SDL_SRCCOLORKEY;
+		} else {
+			colorkey = surface->format->colorkey;
+			SDL_SetColorKey(surface, 0, 0);
+		}
 	}
 	if ( (surface_flags & SDL_SRCALPHA) == SDL_SRCALPHA ) {
 		alpha = surface->format->alpha;
Mike Battersby wrote:

In principle, one ought to be able to do it by simply setting the alpha
value of every pixel to either fully transparent or fully opaque based
on whether the RGB values match the color key. This seems fairly
tricky, though, especially since both RGB and RGBA are represented
as Uint32. I assume the color key is probably just RGB?

The easiest way to do this I found was to simply copy the Surface into
a similar-sized Surface filled with alpha = 0. This way the color-keyed
pixels won’t be copied and will retain their 0 alpha value, and the
non-color-keyed pixels will get the correct colors and the default alpha.

Thank you – that sounds like the right solution for me.

I’m really inexperienced with this stuff – I haven’t done computer
graphics programming for a LONG time (10 yrs?). And of course,
everything has changed. Apparently for the better, but I’m still
kind of lost. :slight_smile:

As for the actual code, it will be in the AutoManga CVS towards the
end of the week – I’m going to check it in whether it works or not.
Of course, it doesn’t do much, but you can browse to the source file
online and see what I’m doing with it (perhaps I should say “how
I’m fumbling with it”):

CVS: amdemo/automanga2.cpp

Thanks to you and Mattias Engdegård for the replies!–
Terry Hancock
@Terry_Hancock

I haven’t extensively checked over the code, but the logic makes perfect
sense to me.

  • Mike

On 15 Mar 2001 06:11:40 -0800, Mattias Engdegård wrote:

ok, here’s a patch (against current CVS) that:

  1. changes SDL_ConvertSurface() to not preserve a colourkey to the new
    surface if SDL_SRCCOLORKEY isn’t present in the flags argument, AND
    the destination has an alpha channel. In this case, it translates the
    colourkey into an alpha channel
  2. makes SDL_DisplayFormatAlpha() use this to translate colourkeyed
    surfaces to RGBA surfaces

I doubt this breaks anybody’s code. Comments?

orig = IMG_Load(filename);
rgba = SDL_CreateRGBSurface(SDL_SWSURFACE, orig->w, orig->h,
orig->format->BitsPerPixel,
orig->format->Rmask, orig->format->Gmask,
orig->format->Bmask, orig->format->Amask);

you don’t want to use the format of ‘orig’ if it’s an indexed surface

No, of course you are right. I didn’t check fully into the behaviour of
color keying – I thought the original poster implied that he had an RGBA
surface already and wanted to replace the color key pixels, but I’ve only
ever used color keying with indexed surfaces (is it even valid for
RGB(A) surfaces?) hence my fudging of this code for the reply.

That leads to another annoying question, in that all the sample code I’ve
seen for OpenGL has to test the machine endianness to set up the masks
for the correct format. Is there an easier way of doing this:

#if SDL_BYTEORDER == SDL_LIL_ENDIAN
    static const Uint32 rmask = 0x000000FF;
    static const Uint32 gmask = 0x0000FF00;
    static const Uint32 bmask = 0x00FF0000;
    static const Uint32 amask = 0xFF000000;
#else
    static const Uint32 rmask = 0xFF000000;
    static const Uint32 gmask = 0x00FF0000;
    static const Uint32 bmask = 0x0000FF00;
    static const Uint32 amask = 0x000000FF;
#endif
SDL_CreateRGBSurface(...);

SDL_FillRect(rgba, NULL, SDL_MapRGBA(rgba->format, 0, 0, 0, 0));

this shouldn’t be necessary - SDL_CreateRGBSurface zeroes the surface by
default (too much code relies on it so we might just as well document it)

That’s handy to know.

Cheers,

  • Mike

Mike Battersby wrote:

No, of course you are right. I didn’t check fully into the behaviour of
color keying – I thought the original poster implied that he had an RGBA
surface already and wanted to replace the color key pixels, but I’ve only
ever used color keying with indexed surfaces (is it even valid for
RGB(A) surfaces?) hence my fudging of this code for the reply.

I’ve been doing some experimenting (also banging the table a lot, but
nevermind). What I’ve got is a surface with BOTH an Alpha channel and
a Colorkey. Looks like the colorkey contains the information I want,
and I’m not sure where the alpha channel came from. When you blit
this image to an RGBA surface as recommended, both are transferred,
but the flags are flipped so that color keying is turned off in the
new image and the alpha channel is preserved. I don’t know if that’s
what should happen, but it’s annoying the heck out of me right
now. I’ll come back to it after I’ve cooled off a little and give
a more complete report.

I’ve also written some code snippets for displaying this stuff –
I’m thinking about making an sdl_debug.c with utilities for
instrumenting the code. Sound useful? I’ve currently got it displaying
the colorkey and alpha in separate channels so you can see both
at once. I need to clean it up a bit, though – tomorrow.

Good night folks! :slight_smile:

Terry–
Terry Hancock
@Terry_Hancock

[…]

No, of course you are right. I didn’t check fully into the behaviour of
color keying – I thought the original poster implied that he had an RGBA
surface already and wanted to replace the color key pixels, but I’ve only
ever used color keying with indexed surfaces (is it even valid for
RGB(A) surfaces?)

Yes.

BTW, chroma keying (the method used with video; done with analog circuitry
once upon a time) is basically “RGB color keying”, although with some
multidimensional fuzz factor.

hence my fudging of this code for the reply.

That leads to another annoying question, in that all the sample code I’ve
seen for OpenGL has to test the machine endianness to set up the masks
for the correct format. Is there an easier way of doing this:

#if SDL_BYTEORDER == SDL_LIL_ENDIAN
    static const Uint32 rmask = 0x000000FF;
    static const Uint32 gmask = 0x0000FF00;
    static const Uint32 bmask = 0x00FF0000;
    static const Uint32 amask = 0xFF000000;
#else
    static const Uint32 rmask = 0xFF000000;
    static const Uint32 gmask = 0x00FF0000;
    static const Uint32 bmask = 0x0000FF00;
    static const Uint32 amask = 0x000000FF;
#endif

SDL_CreateRGBSurface(…);

Possibly, code wise, but I don’t think you can avoid it if you really want to
stick to the hardware surface format…

As to the source level, how about something like

static const Uint32 rmask = BE2NE(0xFF000000);
static const Uint32 gmask = BE2NE(0x00FF0000);
static const Uint32 bmask = BE2NE(0x0000FF00);
static const Uint32 amask = BE2NE(0x000000FF);

where BE2NE() is a “big endian to native endian” conversion.
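One way BE2NE() could be written (a sketch only; David leaves the implementation open, and note that initialising a static const as above would actually require BE2NE() to be a macro rather than a function in C):

```c
#include <stdint.h>

/* "Big endian to native endian": identity on big-endian hosts,
   byte swap on little-endian ones. A runtime probe keeps the sketch
   self-contained; SDL_BYTEORDER would decide this at compile time. */
static uint32_t BE2NE(uint32_t x)
{
    const union { uint32_t u; uint8_t b[4]; } probe = { 0x01020304u };
    if (probe.b[0] == 0x01)   /* big-endian: value is already native */
        return x;
    return (x >> 24) | ((x >> 8) & 0x0000FF00u)
         | ((x << 8) & 0x00FF0000u) | (x << 24);
}
```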

That would work on more exotic endian formats as well, although I hardly
think you’ll ever see anything but LE and BE on current platforms. I’d be
interested in pointers to current platforms with “exotic” endian formats, if
there are any! (I’m not interested in the PDP-11 and other collector’s items. :wink: )

//David

.- M A I A -------------------------------------------------.
| Multimedia Application Integration Architecture           |
| A Free/Open Source Plugin API for Professional Multimedia |
`---------------------> http://www.linuxaudiodev.com/maia -'

.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-----------------------------------> david at linuxdj.com -'

On Friday 16 March 2001 02:08, Mike Battersby wrote:

Mike Battersby wrote:

No, of course you are right. I didn’t check fully into the behaviour of
color keying – I thought the original poster implied that he had an RGBA
surface already and wanted to replace the color key pixels, but I’ve only
ever used color keying with indexed surfaces (is it even valid for
RGB(A) surfaces?) hence my fudging of this code for the reply.

I’ve been doing some experimenting (also banging the table a lot, but
nevermind).

Poor table… :wink:

What I’ve got is a surface with BOTH an Alpha channel and a Colorkey.

Hmm… Might make sense if you’re about to construct a surface from various
sources, all with alpha channels.

Looks like the colorkey contains the information I want,
and I’m not sure where the alpha channel came from. When you blit
this image to an RGBA surface as recommended, both are transferred,

I’m not sure “transferred” is the right term here; a color key is not a
channel or anything like that (unless we’re talking about RLE or similar
encoding); it’s just a value that tells the blitter which pixels not to
transfer to the target.

Hence, wherever there are pixels with an RGB(A?) value that matches the color
key, the target surface will be unchanged. (I think, but I’m not entirely
sure about what SDL does in this case.)

but the flags are flipped so that color keying is turned off in the
new image and the alpha channel is preserved.

Which alpha channel is preserved?

I don’t know if that’s
what should happen, but it’s annoying the heck out of me right
now.

Well… Is the target surface color keyed? (Not that I think that would
affect how data is blitted to it, though… Am I missing something?)

Also note that if you actually apply the alpha channel (to the RGB
channels, rather than just copying it) during any intermediate blit, the
colors will change, thus breaking the color keying. (Kind of obvious,
considering that the color key is a full pixel value, rather than a single
component of HSV or similar color code…)

I’ve also written some code snippets for displaying this stuff –
I’m thinking about making an sdl_debug.c with utilities for
instrumenting the code. Sound useful?

Yeah, but I’m not sure I understand exactly what you’re talking about… :wink:

I’ve currently got it displaying
the colorkey and alpha in separate channels so you can see both
at once.

Sounds handy.

//David

On Friday 16 March 2001 04:41, Terry Hancock wrote:

[…] but I’ve only
ever used color keying with indexed surfaces (is it even valid for
RGB(A) surfaces?)

yes, but if you have an RGBA surface with SDL_SRCALPHA and SDL_SRCCOLORKEY
both set, then the alpha channel “wins” and colourkeying isn’t used

Is there an easier way of doing this:

#if SDL_BYTEORDER == SDL_LIL_ENDIAN
static const Uint32 rmask = 0x000000FF;
static const Uint32 gmask = 0x0000FF00;
static const Uint32 bmask = 0x00FF0000;
static const Uint32 amask = 0xFF000000;
#else
static const Uint32 rmask = 0xFF000000;
static const Uint32 gmask = 0x00FF0000;
static const Uint32 bmask = 0x0000FF00;
static const Uint32 amask = 0x000000FF;
#endif

not really, since the OpenGL format is endian-dependent if you consider
each pixel a 32-bit integer (which SDL does), or endian-independent if you
think of it as a sequence of bytes (which OpenGL does)
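That distinction is easy to demonstrate in a few self-contained lines (plain C, not SDL code): the same 32-bit value presents a different first byte in memory depending on the host.

```c
#include <stdint.h>
#include <string.h>

/* SDL treats a pixel as one 32-bit integer; OpenGL treats it as a byte
   sequence. Which byte comes first in memory depends on endianness. */
static uint8_t first_byte_in_memory(uint32_t pixel)
{
    uint8_t bytes[sizeof pixel];
    memcpy(bytes, &pixel, sizeof pixel);   /* memcpy avoids aliasing issues */
    return bytes[0];
}
```

With rmask = 0x000000FF on a little-endian machine, red is the first byte OpenGL sees, which is why the SDL masks must flip with SDL_BYTEORDER while GL_RGBA never changes.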

I’ve been doing some experimenting (also banging the table a lot, but
nevermind). What I’ve got is a surface with BOTH an Alpha channel and
a Colorkey. Looks like the colorkey contains the information I want,
and I’m not sure where the alpha channel came from.

SDL_SRCCOLORKEY and SDL_SRCALPHA control the behaviour of SDL_BlitSurface(),
basically like this:

if (source surface has SDL_SRCALPHA set) {
    if (source surface has alpha channel (that is, format->Amask != 0))
        blit using per-pixel alpha, ignoring any colour key (*)
    else {
        if (source surface has SDL_SRCCOLORKEY set)
            blit using the colour key AND the per-surface alpha value (*)
        else
            blit using the per-surface alpha value
    }
} else {
    if (source surface has SDL_SRCCOLORKEY set)
        blit using the colour key (*)
    else
        ordinary opaque rectangular blit
}

The blitters marked with (*) can use RLE acceleration

SDL_SRCCOLORKEY and SDL_SRCALPHA control the behaviour of SDL_BlitSurface(),
basically like this:

Thanks Mattias, that was really helpful. :slight_smile:

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

Great summary, thanks. Can we add this to the SDL_BlitSurface doc page?

  • Mike

On 16 Mar 2001 02:21:43 -0800, Mattias Engdegård wrote:

SDL_SRCCOLORKEY and SDL_SRCALPHA control the behaviour of SDL_BlitSurface(),
basically like this:

ok, here’s a patch (against current CVS) that:

  1. changes SDL_ConvertSurface() to not preserve a colourkey to the new
    surface if SDL_SRCCOLORKEY isn’t present in the flags argument, AND
    the destination has an alpha channel. In this case, it translates the
    colourkey into an alpha channel
  2. makes SDL_DisplayFormatAlpha() use this to translate colourkeyed
    surfaces to RGBA surfaces

Thanks, it’s applied!

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

Hi,
It’s me again. :wink:

Let me make sure I understand what’s supposed to happen in the
following two cases:

  1. A surface – either 8-bit/palette OR 32-bit/RGB with
    SDL_SRCCOLORKEY set (and SDL_SRCALPHA not set) is
    blitted to a 32-bit RGBA surface with SDL_SRCALPHA
    set (and SDL_SRCCOLORKEY not set). E.g.:

    BlitSurface(source, NULL, dest, NULL);

    It shouldn’t matter what the status of SDL_HWSURFACE
    or SDL_SWSURFACE is, or whether the surface is a display
    surface or not.

    The resulting image should be an alpha-channel RGBA surface
    in which the alpha channel reflects the transparency that
    the color key did in the original. However this resulting
    image is not color-keyed (i.e. SDL_SRCCOLORKEY will continue
    not to be set).

  2. A 32-bit RGBA surface (with SDL_SRCALPHA – possibly the
    result from the above operation) is blitted onto another
    32-bit RGBA surface which may be blank/transparent, or may
    have previous RGBA data in it.

    In the transparent case, the result should be essentially
    identical to the source RGBA image. In the case where
    there was previously data, the previous (dest) data would
    show through wherever the source image is transparent
    (in my particular example, there’s no partial transparency,
    but there could be in principle).

But I can’t seem to make this happen. I wrote a routine to
copy the Color Key to Alpha channel (a pixel-by-pixel copy
which is not optimized) which seems to work, but the blit didn’t
do it.

The most baffling thing to me though, is the second step: I
can’t seem to get the alpha channel to be copied – it always
seems to give an alpha channel that’s 0xFF everywhere. The
resulting surface also doesn’t seem to have the information I
blitted to it at all, because blitting it to the screen and
displaying it shows nothing.

Have I hit a bug, or have I just dropped the ball here
somewhere?

The code is at:
http://cvs.sourceforge.net/cgi-bin/cvsweb.cgi/amdemo/?cvsroot=automanga

The relevant files are:

Makefile
blits.c
seesdl.c
seesdl.h
Background/arbor.set1.c1.fount.gif

“blits” is a demo code extracting the problem code from my
application (which was actually in automanga2.cpp).

The seesdl.c/h is the debugging module I promised to share
in my previous posts. It gives you two handy functions for
instrumenting the code:

void seeSDL_info(SDL_Surface *surface, char * title_text);
Dumps basic info about a surface (size, flags, etc) to stdout.

void seeSDL_transparency(SDL_Surface *surface);
Maps the alpha-channel and color-key transparency patterns
into the green and blue image channels and displays them.

void seeSDL_screen(SDL_Surface *);
Use this to tell seeSDL_transparency() where to display. It’s
up to you to make sure you can see the output, but the display
function updates the screen after blitting the data to it.

These use the SGE library as well as SDL for pixel operations
(e.g. as used in key2alpha() ).

Libraries:
SDL 1.1.8 (Debian Woody libsdl1.1 and libsdl1.1-dev )
SDL Image 1.1.0 (Debian Woody libsdl-image1.1 and
libsdl-image1.1-dev )
SGE 1.2.24 (tar.gz from source)

Well, I think that’s all the information. If anyone has any
idea what I’m doing wrong, please let me know. I’m pretty
inexperienced with SDL, so anything’s possible.

Thanks,
Terry–
Terry Hancock
@Terry_Hancock

  1. A surface – either 8-bit/palette OR 32-bit/RGB with
    SDL_SRCCOLORKEY set (and SDL_SRCALPHA not set) is
    blitted to a 32-bit RGBA surface with SDL_SRCALPHA
    set (and SDL_SRCCOLORKEY not set). E.g.:

BlitSurface(source, NULL, dest, NULL);

It shouldn’t matter what the status of SDL_HWSURFACE
or SDL_SWSURFACE is, or whether the surface is a display
surface or not.

The resulting image should be an alpha-channel RGBA surface
in which the alpha channel reflects the transparency that
the color key did in the original. However this resulting
image is not color-keyed (i.e. SDL_SRCCOLORKEY will continue
not to be set).

right, the destination flags neither affect the blit operation, nor
are they modified. Note that the image must be zeroed for the above to
be true — destination pixels corresponding to transparent source
pixels aren’t zeroed, just skipped

  2. A 32-bit RGBA surface (with SDL_SRCALPHA – possibly the
    result from the above operation) is blitted onto another
    32-bit RGBA surface which may be blank/transparent, or may
    have previous RGBA data in it.

In the transparent case, the result should be essentially
identical to the source RGBA image. In the case where
there was previously data, the previous (dest) data would
show through wherever the source image is transparent
(in my particular example, there’s no partial transparency,
but there could be in principle).

since you are using SRCALPHA, the operation is to alpha-blend the
source onto the destination, as if the destination was opaque
(destination alpha channel is not used in the process and is kept
unchanged).
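Per colour channel, that alpha-blend amounts to roughly the following (an integer model in plain C; SDL's exact rounding and optimised blit paths may differ slightly):

```c
#include <stdint.h>

/* One colour channel of an SDL_SRCALPHA blit: blend source over the
   destination as if the destination were opaque. Destination alpha
   is not used and is left untouched. */
static uint8_t blend_channel(uint8_t src, uint8_t dst, uint8_t alpha)
{
    return (uint8_t)((src * alpha + dst * (255 - alpha)) / 255);
}
```

With alpha 0 the destination channel survives untouched, and with 255 the source replaces it outright; anything in between mixes the two.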

if you want to just copy the contents, alpha and all, to another
RGBA surface, make sure SDL_SRCALPHA is not set on the source
surface

I agree that the semantics are more than a bit muddy (see my earlier
message where I wrote the algorithm in pseudocode). Your questions are
not stupid; in fact, they are essential for us to understand the lib
ourselves and how people think when using it.