This mail is addressed mainly to Sam and the other SDL core developers.
I have a question about the correctness of the SDL Amiga (and compatible
OSes) implementation; it may also be of interest for other big endian
systems.
AmigaOS (and MorphOS, a derivative system) runs on a big endian CPU but
often (though not always) uses little endian graphics cards.
The Amiga Display Database distinguishes these cases with different
identifiers (PIXFMT_RGB16, PIXFMT_RGB16PC, PIXFMT_RGB, PIXFMT_BGRA, …).
Currently the SDL port detects every pixel format and always FORCES the
screen surface (even if SDL_SWSURFACE is requested) to use the pixel
format that blits fastest to the gfx card.
So, SDL apps that DON’T rely on format->Xmask, format->Xshift and
format->Xloss (X = R, G, B) to determine the component layout end up
with wrong colors.
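To be clear about the distinction, a minimal sketch (nothing Amiga
specific; assume 'screen' is the surface returned by SDL_SetVideoMode()
at 16 bpp):

  /* Portable: derive the pixel value from the surface's own format,
     via SDL_MapRGB() or from the Xmask/Xshift/Xloss fields. */
  Uint16 red_ok  = (Uint16) SDL_MapRGB(screen->format, 255, 0, 0);
  Uint16 red_ok2 = (Uint16)
      (((255 >> screen->format->Rloss) << screen->format->Rshift)
       & screen->format->Rmask);

  /* Not portable: hard-codes the CPU-native RGB565 layout, so on a
     byte-swapped screen format red comes out as the wrong color. */
  Uint16 red_bad = 0xf800;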
I was actually thinking about switching this behaviour to a more
compatible, but slower, one: allocate SWSURFACEs with the native endian
pixel format (e.g. a red mask of 0xf800 on big endian CPUs) and then
convert them by hand with conversion tables when writing to the screen.
This will be somewhat slower, but not by much, since the inner loop
writing to video memory will be something like this (with src->depth ==
15|16 bpp):
*dest++ = cvt[*src++];
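To make the idea concrete, here is a rough sketch of what I mean,
assuming the simple case where the card format is just the byte-swapped
(PC) variant of the native RGB16 layout; the table would be filled once,
when the screen format is known:

  /* Sketch: cvt[] maps every possible 16bpp native-endian pixel value
     to the card's layout; a plain byte swap for RGB16 -> RGB16PC. */
  static Uint16 cvt[65536];
  int i;
  for (i = 0; i < 65536; i++)
      cvt[i] = (Uint16)((i << 8) | (i >> 8));

  /* ...and the per-pixel copy to video memory stays a table lookup: */
  while (src < srcend)
      *dest++ = cvt[*src++];

For 15 bpp (or a non-trivial component reordering) the table contents
would differ, but the inner loop stays the same.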
I fear similar problems are also present on linuxppc machines, since
those machines also use unmodified PCI/AGP cards (while Macs use cards
with the native endianness, AFAIK).
So the question (finally) is:
Is it better to be as compatible as possible (with a performance loss on
well written apps), or to be as fast as possible and blame badly written
apps for the color problems?
Bye,
Gabry