SDL_ConvertSurface weirdness

Greetings.

In SDL_Surface.c

SDL_Surface *SDL_ConvertSurface(..., ..., ...)

line 641

/* Copy the palette if any */
if ( format->palette ) {
  memcpy(convert->format->palette->colors,
		format->palette->colors,
		format->palette->ncolors * sizeof(SDL_Color));
}

convert->format->palette is NULL when the video mode is 16-bit full-screen
hardware, which causes an exception to occur. It results from the use of
SDL_CreateRGBSurface, which uses the screen depth instead of the requested
depth if the flags have SDL_HWSURFACE set (see SDL_Surface.c line 87).

When I pass an 8-bit surface to SDL_ConvertSurface, format->palette is
non-NULL, so it tries to set the palette on the new 16-bit surface and
fails because convert->format->palette is NULL.

This is a bug, because SDL_ConvertSurface should base its conversion on
the passed surface and not on the video surface.
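
A guarded copy along these lines (just a sketch, not necessarily the right
fix) would at least avoid the NULL dereference:

/* Copy the palette if any -- but only when the newly created surface
   actually ended up with a palette to copy into */
if ( format->palette && convert->format->palette ) {
  memcpy(convert->format->palette->colors,
         format->palette->colors,
         format->palette->ncolors * sizeof(SDL_Color));
}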

-dv

/* Copy the palette if any */
if ( format->palette ) {
  memcpy(convert->format->palette->colors,
         format->palette->colors,
         format->palette->ncolors * sizeof(SDL_Color));
}

Urgh, and I thought I went through that code already. There’s another
problem here: is palette->ncolors the number of colours actually used in
the palette, or the maximum palette size?

Several image loaders set palette->ncolors to the number of colours in
the colormap of the image, and this can be useful (testpalette uses it).
This means that SDL_SetPalette() (and SDL_SetColors()), which interpret
palette->ncolors as the maximum palette size, are doing the wrong thing.
They should use 2**(BitsPerPixel) instead. Right, Sam?
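
To make the mismatch concrete, a sketch (img and colors are hypothetical;
img is an 8-bit surface whose loader only declared 16 colormap entries):

/* the loader recorded "16 colours used" in ncolors */
img->format->palette->ncolors = 16;

/* SDL_SetColors() treats ncolors as the table size, so trying to
   install a full 256-entry palette gets clamped to 16 entries instead
   of to 1 << BitsPerPixel == 256 */
SDL_SetColors(img, colors, 0, 256);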

Urgh, and I thought I went through that code already. There’s another
problem here: is palette->ncolors the number of colours actually used in
the palette, or the maximum palette size?

It’s actually the maximum palette size, but I don’t see any reason not
to change the semantics to be the number of colors used, as long as the
size of the palette == 2**(BitsPerPixel) is a valid assumption.
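
The clamping in SDL_SetColors()/SDL_SetPalette() would then presumably be
derived from the depth rather than from palette->ncolors, something like
this sketch:

/* physical palette size comes from the depth; palette->ncolors is
   free to track how many entries are actually meaningful */
int maxcolors = 1 << surface->format->BitsPerPixel;

if ( firstcolor + ncolors > maxcolors ) {
    ncolors = maxcolors - firstcolor;
}
memcpy(&surface->format->palette->colors[firstcolor],
       colors, ncolors * sizeof(SDL_Color));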

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software