For a while, I had my project set up to work with either SDL 1.2 or
SDL 1.3 (mostly using the backwards compatibility layer for now).
Recently, as I started doing some more complex rendering, I noticed
differences in behaviour. To debug this, I’ve written a small
test program that exercises the different blit combinations. Tests
were made on X11 using SDL 1.2.14 and SDL 1.3-4903.
SDL 1.2 gives this:
SDL 1.3 initially gave this
until I removed a call to SDL_SetSurfaceRLE(vis, (SDL_RLEACCEL)); from
the render pipeline. Now it gives the following:
It’s mostly identical to SDL 1.2, except for the 4 tiles in the lower
right corner, which represent blitting from RGBA to RGBA surfaces. It
appears that SDL 1.2 handles the alpha channel differently from SDL
1.3 in that case.
Is there a way to replicate the behaviour of SDL 1.2? I tried the
different blend modes of SDL 1.3, but none of those came close. What’s
the algorithm used by SDL 1.2 for blitting?
And what’s up with the RLE? Is that a bug in the handling of RGB
surfaces with per-surface alpha?
P.S.: The test program is here:
Unfortunately, it is not self-contained but requires the rest of
Adonthell to run.
The SDL bits are implemented here (for SDL 1.2):
and here (for SDL 1.3):
Maybe that’s enough to spot what is going on.