SDL_DisplayFormat() loses color info

The screen mode is 800x600x16 under Linux, but I am having
problems using SDL_DisplayFormat() to convert surfaces for
faster blitting. Some color pixels on the original surface just
disappear in the converted surface. However, things are OK if
I blit the original surface onto the screen directly.

I have only tested on Linux at 16-bit color depth. Does anybody
else have this problem?

Regards,
.paul.

To add more info: the source image is read by IMG_LoadTyped_RW(…)
from a GIF file, which has a transparent colorkey. Could it be
that pixels whose color is close to the colorkey get converted to
the same value as the colorkey during SDL_ConvertSurface()?

If that is the case, why does this not happen when the original
surface is blitted directly to the screen (SDL_SWSURFACE)? How do
I prevent this behavior?

Regards,
.paul.

SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl

paul at theV.net wrote:

To add more info: the source image is read by IMG_LoadTyped_RW(…)
from a GIF file, which has a transparent colorkey. Could it be
that pixels whose color is close to the colorkey get converted to
the same value as the colorkey during SDL_ConvertSurface()?

That could well be it. There are in fact two effects: first, since
images from GIF files are indexed, the colourkey index can have the
same colour as another index. Second, the reduced colour precision
in 16-bit mode can cause two close but distinct colours to map to
the same 16-bit pixel value. (The 8bpp->16bpp conversion is in
general not lossless.)

If that is the case, why does this not happen when the original
surface is blitted directly to the screen (SDL_SWSURFACE)?

This is because the colourkey blitter from 8bpp to 16bpp operates on
8bpp pixels and colourkeys, so the aliasing problem does not occur.

How do I prevent this behavior?

Some solutions:

  • Use a more distinct colour key that doesn’t alias.
    Either do it with an image editor, or pick a 16-bit colour you know isn’t
    going to be used and fill a newly created surface with it (FillRect).
    Then colourkey-blit the loaded surface onto this surface, and set
    the colour key of the new surface to whatever colour you chose.

  • Use DisplayFormatAlpha to convert your 8bpp images to an RGBA
    image, with the colourkey represented as pixels with zero alpha.
    With RLE it should be as fast as a normal colourkeyed blit,
    although you cannot take advantage of hardware acceleration.
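A rough, untested sketch of both suggestions against the SDL 1.2
API (the function names, the choice of magenta as the safe key, and
the surface variables are my own assumptions, not code from the
thread):

```c
#include "SDL.h"

/* Suggestion 1: re-key onto a known-safe colour, already expressed in
   the screen's own pixel format, before converting for fast blits. */
SDL_Surface *rekey_and_convert(SDL_Surface *loaded, SDL_Surface *screen)
{
    SDL_Surface *tmp, *converted;
    Uint32 safe_key;

    /* New surface in the screen's format, so the key suffers no
       further precision loss. */
    tmp = SDL_CreateRGBSurface(SDL_SWSURFACE, loaded->w, loaded->h,
                               screen->format->BitsPerPixel,
                               screen->format->Rmask,
                               screen->format->Gmask,
                               screen->format->Bmask, 0);
    if (tmp == NULL)
        return NULL;

    /* Assumed: the image never uses pure magenta. */
    safe_key = SDL_MapRGB(tmp->format, 255, 0, 255);
    SDL_FillRect(tmp, NULL, safe_key);

    /* Colourkeyed blit: transparent pixels keep the fill colour. */
    SDL_BlitSurface(loaded, NULL, tmp, NULL);
    SDL_SetColorKey(tmp, SDL_SRCCOLORKEY | SDL_RLEACCEL, safe_key);

    converted = SDL_DisplayFormat(tmp);
    SDL_FreeSurface(tmp);
    return converted;
}

/* Suggestion 2: convert to RGBA; the colourkey becomes zero alpha,
   and RLE keeps the blit fast. */
SDL_Surface *convert_alpha(SDL_Surface *loaded)
{
    SDL_Surface *converted = SDL_DisplayFormatAlpha(loaded);
    if (converted != NULL)
        SDL_SetAlpha(converted, SDL_SRCALPHA | SDL_RLEACCEL, 255);
    return converted;
}
```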

Thank you very much!

That solves my problem. RLE + DisplayFormatAlpha does not seem to
affect the FPS so far; I'll take your word for it :)

Regards,
.paul.

On Tue, Jan 01, 2002 at 02:34:41PM +0100, Mattias Engdegård wrote: