SDL_SetVideoMode() question

What are the valid values for bpp in SDL_SetVideoMode()?

I’ve been assuming that passing in 15 requests a 16bit 555 mode and
passing in 16 requests a 16bit 565 mode, but now I’m beginning to think
that 15 is not a legal value. That is, you just pass in 16 and 555 or
565 is selected based on your video card capabilities…

BTW, under linux, passing 15 into SDL_SetVideoMode() does in fact seem
to produce a 555 mode… Not sure anymore if this is correct
behaviour…

Sam - many, many apologies for being so slack about getting back to you
about the windib patch!
Once this question is cleared up, I’ll be back onto it!

Ben.

--
Ben Campbell
Programmer, Creature Labs
ben.campbell at creaturelabs.com
http://www.creaturelabs.com

What are the valid values for bpp in SDL_SetVideoMode()?

I’ve been assuming that passing in 15 requests a 16bit 555 mode and
passing in 16 requests a 16bit 565 mode, but now I’m beginning to think
that 15 is not a legal value. That is, you just pass in 16 and 555 or
565 is selected based on your video card capabilities…

BTW, under linux, passing 15 into SDL_SetVideoMode() does in fact seem
to produce a 555 mode… Not sure anymore if this is correct
behaviour…

It’s actually a little fuzzy. If you pass in 16, the video driver will
pick 555 or 565 16-bit modes, depending on its capabilities. If you pass
in 15, you will get a 555 15-bit mode, but since the video driver usually
returns 555 and 565 as variants of 16-bit depth, it will often be emulated.

It should probably be cleaned up, one way or the other.
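
(For illustration only, here is a minimal SDL 1.2 sketch of requesting a depth
and then checking what the driver actually returned; the 640x480 size and the
SDL_SWSURFACE flag are just placeholders.)

    #include "SDL.h"
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        SDL_Surface *screen;

        if (SDL_Init(SDL_INIT_VIDEO) < 0) {
            fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
            return 1;
        }

        /* Ask for 16 bpp; the driver picks 555 or 565 as it sees fit. */
        screen = SDL_SetVideoMode(640, 480, 16, SDL_SWSURFACE);
        if (screen != NULL) {
            printf("got %d bpp, masks R=%#x G=%#x B=%#x\n",
                   screen->format->BitsPerPixel,
                   (unsigned)screen->format->Rmask,
                   (unsigned)screen->format->Gmask,
                   (unsigned)screen->format->Bmask);
        }

        SDL_Quit();
        return 0;
    }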

555 and 565 are valid 16-bit modes, but 555 is often advertised as a 15
bit mode by the drivers. The ‘bpp’ in the video mode is the number of
bits used to represent a pixel. The number of bytes used to represent
a pixel is calculated from the formula: (bpp+7)/8

Thus, an 8-bit mode takes 1 byte, a 15-bit mode takes 2 bytes, 16-bit
takes 2 bytes, 24-bit mode takes 3 bytes, 32-bit mode takes 4 bytes, etc.
The ordering of the color components within the bytes is not defined.
For example, a 32-bit mode might be XRGB, ARGB, ABGR, BGRA, GABR, etc.,
and a 16-bit mode might be X555, 565, 555X, RGB, BGR. The SDL blitters
are flexible enough to be able to handle really odd formats like
4444 ARGB, or even 13-bit modes, if a little slowly. :)
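
(To make the rounding explicit, here is that formula as plain C; the depths
are just the examples from the paragraph above.)

    #include <stdio.h>

    /* Bytes needed to store one pixel of a given bit depth: round up. */
    static int bytes_per_pixel(int bpp)
    {
        return (bpp + 7) / 8;
    }

    int main(void)
    {
        int depths[] = { 8, 13, 15, 16, 24, 32 };
        size_t i;

        for (i = 0; i < sizeof(depths) / sizeof(depths[0]); i++)
            printf("%2d bpp -> %d byte(s) per pixel\n",
                   depths[i], bytes_per_pixel(depths[i]));
        return 0;
    }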

So, for consistency, 555 modes should probably always be returned as
15-bit modes, although if you request a 16-bit mode you may get a
15-bit mode as it is functionally equivalent but may be much faster…
Hum.

Maybe inside the drivers, 555 modes can satisfy requests for either
15 or 16 bit modes, and the returned BitsPerPixel will reflect the
request.

Suggestions?
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

So, for consistency, 555 modes should probably always be returned as
15-bit modes, although if you request a 16-bit mode you may get a
15-bit mode as it is functionally equivalent but may be much faster…

We already have SDL_VideoModeOK() that I wish a lot more people would use.
If you ask for 16bpp you should get it, even if emulated. Same with 15bpp.

Uhm, that didn’t get through, did it? I’ll try again:

***********************************
*                                 *
*      Use SDL_VideoModeOK()      *
*                                 *
***********************************

:)

Especially in the PC world lots of people have come to assume 16bpp (if that’s
what they have) and hard-code their games for it, so anyone with a 32bpp
framebuffer will have a hard time (the conversion will make it slow).
Now the clever programmer uses SDL_VideoModeOK(), sets the depth
to what it recommends (if it’s within reasonable limits), and converts
the graphics accordingly. And everyone lived happily ever after.
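
(A sketch of that approach against the SDL 1.2 API; the resolution and
flags are placeholders.)

    #include "SDL.h"
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int best;
        SDL_Surface *screen;

        if (SDL_Init(SDL_INIT_VIDEO) < 0)
            return 1;

        /* Ask what depth the display would prefer for this mode. */
        best = SDL_VideoModeOK(640, 480, 16, SDL_SWSURFACE);
        if (best == 0) {
            fprintf(stderr, "640x480 is not available at any depth\n");
            SDL_Quit();
            return 1;
        }

        /* Set the mode at the recommended depth, then convert graphics
           to screen->format instead of assuming 16bpp everywhere. */
        screen = SDL_SetVideoMode(640, 480, best, SDL_SWSURFACE);
        if (screen != NULL)
            printf("running at %d bpp\n", screen->format->BitsPerPixel);

        SDL_Quit();
        return 0;
    }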

We already have SDL_VideoModeOK() that I wish a lot more people would use.
If you ask for 16bpp you should get it, even if emulated. Same with 15bpp.

My point is 555 is also a 16-bit pixel format.

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

“Sam Lantinga” wrote in message
news:E13gzub-0002PJ-00 at roboto.devolution.com

We already have SDL_VideoModeOK() that I wish a lot more people would
use.

If you ask for 16bpp you should get it, even if emulated. Same with
15bpp.

My point is 555 is also a 16-bit pixel format.

I agree. 555 is a 16-bit format with an unused bit. This is analogous to a
32 bit format with 24 bits of color and 8 unused bits. IMO, the bpp
argument should only be used for pixel size in bytes, not to guarantee
bits per rgb component. This could increase speed in some emulated modes,
since it would make 555 and 565 (with unused bits) perfectly legal 32 bit
modes, reducing the need for conversions.
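
(A small illustration of that point, using made-up packing helpers: both
layouts fill a whole pixel-sized word, some bits simply carry no colour.)

    #include <stdint.h>

    /* 555: five bits per component, top bit of the 16-bit word unused. */
    static uint16_t pack_555(uint8_t r, uint8_t g, uint8_t b)
    {
        return (uint16_t)(((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3));
    }

    /* XRGB8888: eight bits per component, top byte of the word unused. */
    static uint32_t pack_xrgb8888(uint8_t r, uint8_t g, uint8_t b)
    {
        return ((uint32_t)r << 16) | ((uint32_t)g << 8) | (uint32_t)b;
    }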

On a somewhat unrelated note, I would prefer it if the decision to use an rgb
mode versus a palette mode were independent of the choice of bits per
pixel.

--
Rainer Deyke (root at rainerdeyke.com)
Shareware computer games - http://rainerdeyke.com
"In ihren Reihen zu stehen heisst unter Feinden zu kaempfen" - Abigor

My point is 555 is also a 16-bit pixel format.

Yes, it’s a bit inconsistent to abuse the bpp parameter as a depth indicator.
So how do you propose a user should ask for 555 vs 565 modes? There is no
other channel for that right now. You can’t even add an optional parameter
to SDL_SetVideoMode since variadic functions might not have the same calling
conventions as non-variadic on all platforms.

It’s either overloading bpp, adding a new function call, or adding a new flag.

Sam Lantinga wrote:

What are the valid values for bpp in SDL_SetVideoMode()?

It’s actually a little fuzzy. If you pass in 16, the video driver will
pick 555 or 565 16-bit modes, depending on its capabilities. If you pass
in 15, you will get a 555 15-bit mode, but since the video driver usually
returns 555 and 565 as variants of 16-bit depth, it will often be emulated.

It should probably be cleaned up, one way or the other.

Yep. Agreed. I don’t even think it’s that important which convention is
selected, as long as it’s consistent across the different
drivers/platforms.

555 and 565 are valid 16-bit modes, but 555 is often advertised as a 15
bit mode by the drivers. The ‘bpp’ in the video mode is the number of
bits used to represent a pixel. The number of bytes used to represent
a pixel is calculated from the formula: (bpp+7)/8

Thus, an 8-bit mode takes 1 byte, a 15-bit mode takes 2 bytes, 16-bit
takes 2 bytes, 24-bit mode takes 3 bytes, 32-bit mode takes 4 bytes, etc.
The ordering of the color components within the bytes is not defined.
For example, a 32-bit mode might be XRGB, ARGB, ABGR, BGRA, GABR, etc.,
and a 16-bit mode might be X555, 565, 555X, RGB, BGR. The SDL blitters
are flexible enough to be able to handle really odd formats like
4444 ARGB, or even 13-bit modes, if a little slowly. :)

Practically, it probably wouldn’t matter too much if the more exotic
modes aren’t supported, but it would irk me a little - the support for it
is already there by the sounds of it, and the more common paths have
optimised implementations. Support for arbitrary pixelformats (even if
the odder ones are slow) is a worthy ambition.

Suggestions?
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

OK - arbitrary suggestion:
Maybe SDL_SetVideoMode() should treat 555 and 565 both as 16 bit modes
(15 would be an illegal value).
The app could then look at the surface pixelformat to determine which
actual 16bit format the hardware/sdl driver thought was ‘best’ (and the
app would convert graphics data accordingly).
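
(One way an app could do that check, as a sketch only; the green mask is
what distinguishes 565 from 555.)

    #include "SDL.h"

    /* After SDL_SetVideoMode() has succeeded, classify the 16-bit layout
       we actually got by looking at the surface's pixel format masks. */
    static const char *describe_16bit(const SDL_PixelFormat *fmt)
    {
        if (fmt->BitsPerPixel != 15 && fmt->BitsPerPixel != 16)
            return "not a 15/16-bit mode";
        if (fmt->Gmask == 0x07E0)   /* 6-bit green */
            return "RGB 565";
        if (fmt->Gmask == 0x03E0)   /* 5-bit green */
            return "RGB 555";
        return "some other 16-bit layout";
    }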

To specify that you want a particular pixelformat, I see that there are
a few options:

  1. change the args of SDL_SetVideoMode() to include a pixelformat arg
    instead of just bpp.
  2. add another function which could be called instead of
    SDL_SetVideoMode() when you need a specific pixelformat.
  3. add a function to be called before SDL_SetVideoMode(), to say which
    exact pixelformat you’d prefer for a given bpp (e.g. “when I say 16 bits,
    I mean RGB 555 format, please”).

None of these sound particularly great. I especially don’t like 1)
because it breaks compatibility and means more effort for the 99% of
apps which don’t have to worry about this stuff.

I think I’d go for 2. Maybe a function called SDL_SetVideoModePedantic(int w,
int h, SDL_PixelFormat *fmt, int flags) or similar.

Anyway, it’d be really nice to standardise SDL_SetVideoMode() behaviour
over the various drivers. It’s a pain in the neck if you get one result
under X11 and something else under win32. Not ideal for a cross-platform
lib.

Any opinions?

Ben.

--
Ben Campbell
Programmer, Creature Labs
ben.campbell at creaturelabs.com
http://www.creaturelabs.com