I’ve got an 8-bit BMP that SDL is loading as 24 bits for whatever
reason. I try to save it as a PNG, but the PNG encoder chokes on the
24-bit depth. Since the data in here can be represented in just 8 BPP
anyway, what’s the best way to convert the 24-bit SDL_Surface to an
8-bit SDL_Surface so I can save it? I know I’ve gotta call
SDL_ConvertSurface, but how do I get the right pixel format to pass in?
Are you loading the BMP through SDL, or SDL_image? Both should be able
to give you an 8-bit, paletted surface.
Is it possible you’re using SDL_image on Mac OS X? Because that tries to
use Apple’s decoders by default, and those don’t deal with palettes.
If you have to have 24-bit data, and you have to convert it to 8-bit,
you probably don’t want SDL_ConvertSurface()…it sort of assumes you
have two existing surfaces so you’ll have the SDL_PixelFormat you want
available. It’s really meant to be used for “I just loaded this .bmp
file to a surface, now I want it in the same format as the screen
surface I got from SDL_SetVideoMode() so I don’t have to convert it
every time I blit.”
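To make that concrete, here’s a rough sketch of that intended use in SDL 1.2: borrow the screen surface’s SDL_PixelFormat so later blits don’t pay a per-frame conversion. The function name and "image.bmp" path are placeholders, not anything from SDL.

```c
#include "SDL.h"

/* Sketch of SDL_ConvertSurface()'s intended use: convert a freshly
 * loaded surface to the screen's format. load_matching_screen() and
 * "image.bmp" are placeholder names for this example. */
SDL_Surface *load_matching_screen(SDL_Surface *screen)
{
    SDL_Surface *loaded = SDL_LoadBMP("image.bmp");
    if (loaded == NULL)
        return NULL;

    /* Reuse the screen's pixel format for the conversion. */
    SDL_Surface *converted =
        SDL_ConvertSurface(loaded, screen->format, SDL_SWSURFACE);
    SDL_FreeSurface(loaded);   /* done with the original */
    return converted;
}
```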
Fortunately, you can do the same thing as SDL_ConvertSurface() yourself,
without all the work it does for corner cases like colorkeys.
I’d try something like this:
surf24 = WhereverYouGotThe24BitSurface();
surf8 = SDL_CreateRGBSurface(SDL_SWSURFACE, surf24->w, surf24->h, 8,
                             0, 0, 0, 0);

// Create a palette here for however you want those 8 bits mapped.
// Let’s say you want this to be a gradient from black to white.
SDL_Color palette[256];
for (int i = 0; i < 256; i++) {
    palette[i].r = (Uint8) i;
    palette[i].g = (Uint8) i;
    palette[i].b = (Uint8) i;
    palette[i].unused = 0;
}
SDL_SetPalette(surf8, SDL_LOGPAL, palette, 0, 256);
// Now blit from the 24-bit surface to the 8-bit one. SDL will use the
// palette to map between the colors as closely as possible.
SDL_BlitSurface(surf24, NULL, surf8, NULL);
// You’re done with the 24-bit surface now. Free it if you own it.
// Use the 8-bit one as you see fit. Don’t forget to SDL_FreeSurface()
// it afterwards!