Pixelformat quirkiness

I just noticed (i.e. got bitten by :)) a quirk in the way that SDL deals
with 15- and 16-bit display modes.

The BitsPerPixel field in the SDL_PixelFormat struct is 16 for both
15-bit (e.g. RGB555) and 16-bit (e.g. RGB565) modes. You can still
distinguish between the modes using the colour bitmasks.
SDL_SetVideoMode() seems to take a bitsperpixel of 15 or 16 (and no
bit masks) to distinguish between 15- and 16-bit modes.
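
By way of illustration, a minimal SDL 1.2 sketch of the behaviour (assuming
the target actually offers a 15-bit mode; the mask values are the usual
RGB555/RGB565 ones):

```c
#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *screen;

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Ask for a 15-bit (RGB555) mode. */
    screen = SDL_SetVideoMode(640, 480, 15, SDL_SWSURFACE);
    if (screen == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* Reports 16 even though a 15-bit mode was requested. */
    printf("BitsPerPixel: %d\n", screen->format->BitsPerPixel);

    /* Only the masks tell the two formats apart:
       Gmask is 0x03E0 for RGB555 (5 bits), 0x07E0 for RGB565 (6 bits). */
    if (screen->format->Gmask == 0x03E0)
        printf("Looks like RGB555 (15-bit)\n");
    else if (screen->format->Gmask == 0x07E0)
        printf("Looks like RGB565 (16-bit)\n");

    SDL_Quit();
    return 0;
}
```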

Is this a bug or a quirk that should be documented?
Should BitsPerPixel in SDL_PixelFormat be 15 for 15-bit modes?

Ben.
--
Ben Campbell (ben.campbell at creaturelabs.com)
Programmer, Creature Labs.
http://www.creaturelabs.com

> I just noticed (i.e. got bitten by :)) a quirk in the way that SDL deals
> with 15- and 16-bit display modes.
>
> The BitsPerPixel field in the SDL_PixelFormat struct is 16 for both
> 15-bit (e.g. RGB555) and 16-bit (e.g. RGB565) modes. You can still
> distinguish between the modes using the colour bitmasks.

BitsPerPixel is not the depth (=the number of significant bits per pixel),
but the number of bits of storage that each pixel occupies.
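
For an RGB555 mode, for instance, the format fields typically look like this
(illustrative values):

    BitsPerPixel  = 16                                (storage bits per pixel)
    BytesPerPixel = 2                                 (storage bytes per pixel)
    Rmask = 0x7C00, Gmask = 0x03E0, Bmask = 0x001F    (15 significant bits)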

> SDL_SetVideoMode() seems to take a bitsperpixel of 15 or 16 (and no
> bit masks) to distinguish between 15- and 16-bit modes.

The terminology is admittedly a bit vague and mixed up here. I propose
we use the X11 terms: depth = bits actually used, and bpp = bits per
pixel = bits the pixel occupies in memory. So a 32bpp mode has depth=24 bits.

SDL_SetVideoMode() should take a depth parameter, not a bpp (otherwise
asking for 15 vs 16 makes no sense).
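
By way of illustration, a sketch of that distinction (mask_bits and
surface_depth are hypothetical helpers, not part of SDL): the depth can be
derived from the colour masks, while BitsPerPixel stays the storage size.

```c
#include "SDL.h"

/* Count the bits set in a mask (hypothetical helper, not an SDL call). */
static int mask_bits(Uint32 mask)
{
    int n = 0;
    while (mask) {
        n += mask & 1;
        mask >>= 1;
    }
    return n;
}

/* Depth in the X11 sense: bits actually used for colour.
   RGB555 -> 15, RGB565 -> 16, a typical 32 bpp RGB mode -> 24. */
static int surface_depth(const SDL_PixelFormat *fmt)
{
    return mask_bits(fmt->Rmask) + mask_bits(fmt->Gmask) +
           mask_bits(fmt->Bmask) + mask_bits(fmt->Amask);
}
```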

> Is this a bug or a quirk that should be documented?

Yes, it should probably be clarified. Martin! :)

> Should BitsPerPixel in SDL_PixelFormat be 15 for 15-bit modes?

No, I think not. It isn’t really important, is it? They only differ
in the masks anyway.