About SDL implementation on Big Endian systems with Little Endian gfx cards

This mail is addressed mainly to Sam and the other SDL core developers.

I have a question about the correctness of the SDL Amiga (and compatible
OS) implementation; it may also be of interest for other big endian
systems.

AmigaOS (and MorphOS, a derivative system) runs on a big endian CPU but
often (though not always) uses little endian graphics cards.

The Amiga Display Database identifies these with different identifiers
(PIXFMT_RGB16, PIXFMT_RGB16PC, PIXFMT_RGB, PIXFMT_BGRA, …).

Currently the SDL port detects every pixel format and always FORCES the
screen surface (even if SDL_SWSURFACE) to have the best pixel format for
blitting to the graphics card.

So, SDL apps that DON’T rely on format->Xmask, format->Xshift and
format->Xloss to determine the component bits end up with wrong colors.
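
To make the failure mode concrete, here is a minimal sketch (the helper
names are made up) contrasting a hardcoded RGB565 packer with one that
derives the layout from the SDL_PixelFormat fields, which is essentially
what SDL_MapRGB() does:

    #include "SDL.h"

    /* Broken on byte-swapped modes: assumes the classic RGB565 layout. */
    Uint16 pack_hardcoded(Uint8 r, Uint8 g, Uint8 b)
    {
        return (Uint16)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    /* Portable: derives shifts and losses from the surface's pixel
       format, so it stays correct whatever masks the backend reports. */
    Uint32 pack_portable(const SDL_PixelFormat *fmt, Uint8 r, Uint8 g, Uint8 b)
    {
        return ((Uint32)(r >> fmt->Rloss) << fmt->Rshift)
             | ((Uint32)(g >> fmt->Gloss) << fmt->Gshift)
             | ((Uint32)(b >> fmt->Bloss) << fmt->Bshift);
    }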

I was thinking about switching this behaviour to a more compatible, but
slower, one: allocating SWSURFACEs with the native endian pixel format
(e.g. 0xf800 for the red mask on big endian CPUs) and then swapping them
by hand with conversion tables.

This will be somewhat slower, but not by much, since writing to video
memory will be something like (with src->depth == 15 or 16 bpp):

*dest++ = cvt[*src++];
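
Fleshed out a bit (a rough sketch, the names are invented), the whole
compatible path for 16 bpp would be little more than a 64K byte-swap
table built once at init plus that loop:

    #include "SDL.h"

    static Uint16 cvt[65536];   /* one entry per possible 16 bpp pixel */

    /* Build the table once: cvt[p] is p with its two bytes exchanged,
       turning a native big endian pixel into the card's little endian
       layout (and vice versa). */
    static void build_swap_table(void)
    {
        Uint32 p;
        for (p = 0; p < 65536; p++)
            cvt[p] = (Uint16)((p >> 8) | ((p & 0xFF) << 8));
    }

    /* Copy one row of w pixels from the native endian shadow surface
       into the byte-swapped video memory. */
    static void blit_row(const Uint16 *src, Uint16 *dest, int w)
    {
        while (w--)
            *dest++ = cvt[*src++];
    }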

I fear similar problems are also present on Linux/PPC machines, since
these machines also use unmodified PCI/AGP cards (while Macs use cards
with the native endianness, AFAIK).

So the question (finally :) ) is:

Is it better to be as compatible as possible (with a performance loss on
well-written apps) or to be as fast as possible and blame badly written
apps for color problems? :)

Bye,
Gabry

Is it better to be as compatible as possible (with a performance loss on
well-written apps) or to be as fast as possible and blame badly written
apps for color problems? :)

This is a really good question. :)

I would probably include both codepaths, defaulting to the fast one, but
allow people to select the compatible one via an environment variable.
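
Something along these lines, perhaps (just a sketch, the variable name
is hypothetical), checked once at video init:

    #include <stdlib.h>

    /* Default to the fast path; setting the environment variable
       switches on the compatible one. */
    static int use_compatible_path(void)
    {
        const char *env = getenv("SDL_VIDEO_COMPAT_PIXELFORMAT");
        return (env != NULL && env[0] != '\0' && env[0] != '0');
    }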

See ya!
-Sam Lantinga, Software Engineer, Blizzard Entertainment

Gabriele Greco wrote:

This mail is addressed mainly to Sam and the other SDL core developers.

I have a question about the correctness of the SDL Amiga (and compatible
OS) implementation; it may also be of interest for other big endian
systems.

AmigaOS (and MorphOS, a derivative system) runs on a big endian CPU but
often (though not always) uses little endian graphics cards.

The Amiga Display Database identifies these with different identifiers
(PIXFMT_RGB16, PIXFMT_RGB16PC, PIXFMT_RGB, PIXFMT_BGRA, …).

Currently the SDL port detects every pixel format and always FORCES the
screen surface (even if SDL_SWSURFACE) to have the best pixel format for
blitting to the graphics card.

So, SDL apps that DON’T rely on format->Xmask, format->Xshift and
format->Xloss to determine the component bits end up with wrong colors.

I was thinking about switching this behaviour to a more compatible, but
slower, one: allocating SWSURFACEs with the native endian pixel format
(e.g. 0xf800 for the red mask on big endian CPUs) and then swapping them
by hand with conversion tables.

This will be somewhat slower, but not by much, since writing to video
memory will be something like (with src->depth == 15 or 16 bpp):

*dest++ = cvt[*src++];

I fear similar problems are also present on Linux/PPC machines, since
these machines also use unmodified PCI/AGP cards (while Macs use cards
with the native endianness, AFAIK).

No, there is no such thing as a little or big endian graphics card.
The low level pixel representation for a given card is fixed, whatever
the platform. But most graphics hardware can do the big/little endian
conversion for free during a blit operation, depending on how it is
configured (most ATI and NVIDIA cards can do that, for example).

The modification done to Mac cards is just the BIOS, btw (Open Firmware
won’t accept booting with an x86 video BIOS; it needs a special one).
[Hint: instead of buying an expensive “Mac” Radeon, buy a PC one with a
flashable BIOS and flash it with a Mac BIOS :)]

So the question (finally :) ) is:

Is it better to be as compatible as possible (with a performance loss on
well-written apps) or to be as fast as possible and blame badly written
apps for color problems? :)

So my question is:
What kind of hardware do you have that is unable to do the pixel
conversion and/or accelerate DMA ops? If your hardware can do both, you
can provide the most widespread kind of visuals at no speed cost.

That said, my opinion is that one should fix broken things where the
breakage is, not work around them in a library :) At least in the case
of free software, developers are quite open about such issues (send
patches, educate people not to make the same mistakes again and again,
maybe write a “how to support big endian platforms when working with
SDL” guide).

Stephane

Stephane Marchesin wrote:

So my question is:
What kind of hardware do you have that is unable to do the pixel
conversion and/or accelerate DMA ops? If your hardware can do both, you
can provide the most widespread kind of visuals at no speed cost.

The hardware may vary from a PPC motherboard with a “stock” PC Radeon
card, to the same motherboard with a Voodoo3 or Permedia2 card
(MorphOS, www.morphos.org), to a 68k system with a PCI bridge and a
Voodoo3/Permedia2/Cirrus/… card (classic AmigaOS, www.amiga.com), to a
stock Intel/AMD box (AROS, www.aros.org).

All those systems share the same API (the AmigaOS API) and work with the
same SDL codepath (video: cybergfx, audio: ahisound).

The problem is that I don’t have low level gfx primitives, and I DON’T
have any specific 15/16 bit blitting function in the OS (while I do have
an accelerated 24/32 bit to anything function).

Anyway, I think I’ll follow Sam’s advice and add an environment variable:

SDL_AMIGA_MAX_COMPATIBILITY

If this variable is enabled, the library will blit through a conversion
table for 15/16 bit modes and will return the same Rmask/Gmask/Bmask for
little endian and big endian 15/16 bit modes.
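
As a rough sketch of what that init-time decision could look like (the
function and parameter names are invented; masks shown for 16 bpp
RGB565 as read by a big endian CPU):

    #include <stdlib.h>
    #include "SDL.h"

    /* Pick the masks reported to applications for a 16 bpp screen and
       decide whether the blitter must byte-swap on the way to videomem. */
    static void choose_masks(int fmt_is_byteswapped, /* e.g. PIXFMT_RGB16PC */
                             Uint32 *Rmask, Uint32 *Gmask, Uint32 *Bmask,
                             int *swap_on_blit)
    {
        const char *env = getenv("SDL_AMIGA_MAX_COMPATIBILITY");
        int compat = (env != NULL && env[0] != '\0');

        if (compat || !fmt_is_byteswapped) {
            /* Native big endian RGB565: same masks for every mode. */
            *Rmask = 0xF800; *Gmask = 0x07E0; *Bmask = 0x001F;
            *swap_on_blit = compat && fmt_is_byteswapped;
        } else {
            /* Fast path: expose the byte-swapped layout directly and
               blit without any conversion. */
            *Rmask = 0x00F8; *Gmask = 0xE007; *Bmask = 0x1F00;
            *swap_on_blit = 0;
        }
    }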

Bye,
Gabry