SDL_image and gradients; some research:

“Ryan C. Gordon” wrote:

Just for reference, I tried the gradient in question (available from
http://www.icculus.org/tmp/example.png) with the following programs, with
my X-server in 16-bit color mode.

Gimp : smooth.
ImageMagick’s “display” : banding.
ee : banding, but different from ImageMagick’s.
showimage : Looked a lot like ee’s output.
Mozilla (nightly build from the 4th) : smooth.

None of these had a problem with the image when my Xserver was in 32-bit
mode (obviously).

Hmm, I looked at it in Netscape and there was banding too. Unfortunately
I can't test it at 32-bit color because XFree86 4.0 doesn't seem to like
my card in 32-bit mode.

The best bet would be to either convert the surface to 16-bit yourself if
using a 16-bit display, or patch SDL_ConvertSurface to dither it like
Gimp or Mozilla does. This is beyond the scope of a simple bugfix, though.
Anyone want to take a stab at it?

I don't think it would matter. As I said, I tried saving the file as an
8-bit png, 8-bit gif, and 8-bit xpm, and all produced the banding. However,
the problem shows up in Netscape as well as SDL_image, so it's obviously not
SDL's fault. It's just odd because 8-bit XPM files store a color per
pixel; you can even open the file in kwrite and see that no two adjacent
pixels are ever the same color, yet they are when viewed. Very
strange.

Digging into SDL_image, the PNG file is loaded into the (32-bit) surface
correctly, and retains the same colors per pixel that Gimp does, according
to the eyedropper tool run over each pixel in Gimp vs. looking at the
bytes of the surface in gdb.

The problem (if it IS a problem) is in SDL_blit.c, in the function
SDL_CalculateBlit(). Gimp jumps through more hoops to get the image to
look right on a 16-bit display. Gimp also has the luxury of taking as long
as it likes to do so.

SDL_ConvertSurface() gives the same results right now, but in terms of
the API it's more acceptable for SDL_ConvertSurface() to do extra
processing than it would be for SDL_BlitSurface(). For what it's worth,
showimage currently just calls SDL_BlitSurface() and lets that do the
conversion on the fly.

–ryan.

“John Garrison” wrote in message
news:3B477973.1EAF8FDF at visi.net

8-bit is an indexed mode, i.e. individual colors are stored as 24-bit
(with an image-wide limit of 256 unique colors).

--
Rainer Deyke (root at rainerdeyke.com)
Shareware computer games - http://rainerdeyke.com
"To stand in their ranks means to fight among enemies" - Abigor

Rainer Deyke wrote:

8-bit is an indexed mode, i.e. individual colors are stored as 24-bit (with
an image-wide limit of 256 unique colors).

Oh, you're right of course, I didn't think of that. Well, I suppose I'll
have to either ask the GIMP people how to save in 16-bit color or find a
program that can. Obviously if the GIMP can display it in 16 bits, the
image can be saved in 16 bits, no?

That depends on the file format. As far as I know, only TGA supports
16-bit color natively.

--
Olivier A. Dagenais - Software Architect and Developer

That's irrelevant: what he needs is dithering. He can still save the
image in any 24-bit format.

SDL should never dither (unless it can be accelerated in hardware :-) ).
It can be done in a platform-independent way by the user, and thus we
should not weigh down SDL for those not needing it.

This is a feature - SDL displays what you give it.

I'm amazed nobody has taken the 30 minutes to write a simple 24->16
Floyd-Steinberg dither and be done with it.