What are the valid values for bpp in SDL_SetVideoMode()?
I’ve been assuming that passing in 15 requests a 16-bit 555 mode and
passing in 16 requests a 16-bit 565 mode, but now I’m beginning to think
that 15 is not a legal value. That is, you just pass in 16 and 555 or
565 is selected based on your video card capabilities…
BTW, under linux, passing 15 into SDL_SetVideoMode() does in fact seem
to produce a 555 mode… Not sure anymore if this is correct
behaviour…
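Here's roughly the sort of check I mean (a minimal, untested sketch; the
640x480 size and SDL_SWSURFACE flag are just placeholder values):

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *screen;

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init: %s\n", SDL_GetError());
        return 1;
    }
    /* Ask for a 15-bit mode and see what the driver actually returns */
    screen = SDL_SetVideoMode(640, 480, 15, SDL_SWSURFACE);
    if (screen == NULL) {
        fprintf(stderr, "SDL_SetVideoMode: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }
    printf("BitsPerPixel: %d\n", screen->format->BitsPerPixel);
    printf("Rmask=%08x Gmask=%08x Bmask=%08x\n",
           (unsigned int)screen->format->Rmask,
           (unsigned int)screen->format->Gmask,
           (unsigned int)screen->format->Bmask);
    SDL_Quit();
    return 0;
}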
It’s actually a little fuzzy. If you pass in 16, the video driver will
pick 555 or 565 16-bit modes, depending on its capabilities. If you pass
in 15, you will get a 555 15-bit mode, but since the video driver usually
returns 555 and 565 as variants of 16-bit depth, it will often be emulated.
It should probably be cleaned up, one way or the other.
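One way to see which depth the driver would actually pick, before committing
to a mode, is SDL_VideoModeOK(). A rough sketch (the 640x480 resolution, the
SDL_FULLSCREEN flag, and the helper name are just illustrative):

#include "SDL.h"

/* Assumes SDL_Init(SDL_INIT_VIDEO) has already succeeded. */
static int would_emulate_15bpp(void)
{
    /* SDL_VideoModeOK() returns 0 if no mode works at any depth, otherwise
       the closest bits-per-pixel the driver supports natively. */
    int native = SDL_VideoModeOK(640, 480, 15, SDL_FULLSCREEN);
    return (native != 0 && native != 15);
}

If the native depth differs from the request, SDL_SetVideoMode() will emulate
the requested depth with a shadow surface unless SDL_ANYFORMAT is passed.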
555 and 565 are valid 16-bit modes, but 555 is often advertised as a
15-bit mode by the drivers. The ‘bpp’ in the video mode is the number of
bits used to represent a pixel. The number of bytes used to represent
a pixel is calculated from the formula: (bpp+7)/8
Thus, an 8-bit mode takes 1 byte, a 15-bit mode takes 2 bytes, 16-bit
takes 2 bytes, 24-bit mode takes 3 bytes, 32-bit mode takes 4 bytes, etc.
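In C integer arithmetic that rounding looks like this (a trivial
illustration, not an SDL API call):

    int bytes_per_pixel(int bpp)
    {
        return (bpp + 7) / 8;   /* 8->1, 15->2, 16->2, 24->3, 32->4 */
    }

The surface you get back also carries this value precomputed, in
screen->format->BytesPerPixel.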
The ordering of the color components within the bytes is not defined.
For example, a 32-bit mode might be XRGB, ARGB, ABGR, BGRA, GABR, etc.,
and a 16-bit mode might be X555, 565, or 555X, in either RGB or BGR
component order. The SDL blitters
are flexible enough to be able to handle really odd formats like
4444 ARGB, or even 13-bit modes, if a little slowly.
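Because of that, portable code shouldn't hard-code a component order.
Something along these lines (a sketch, assuming 'screen' is the surface
returned by SDL_SetVideoMode()) works for any layout:

    /* Build a pixel value without assuming where R, G and B live */
    Uint32 orange = SDL_MapRGB(screen->format, 0xFF, 0x80, 0x00);

    /* Or inspect the masks directly to discover the layout */
    if (screen->format->Rmask > screen->format->Bmask) {
        /* red is in the high bits, e.g. 565 RGB or XRGB */
    } else {
        /* blue is in the high bits, e.g. BGR-ordered modes */
    }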
So, for consistency, 555 modes should probably always be returned as
15-bit modes, although if you request a 16-bit mode you may get a
15-bit mode as it is functionally equivalent but may be much faster…
Hum.
Maybe inside the drivers, 555 modes can satisfy requests for either
15 or 16 bit modes, and the returned BitsPerPixel will reflect the
request.
Suggestions?
-Sam Lantinga, Lead Programmer, Loki Entertainment Software