A question about SDL_DisplayFormat()

Hi guys!

I have a little question; maybe someone has the right answer.
Well, I set the video mode to 16 bits and I load a 32-bit surface, so their formats will quite possibly differ (in fact, the color depths are different…). I used SDL_DisplayFormat() or SDL_DisplayFormatAlpha(), depending on the kind of surface, but neither of them seems to change the color depth… when I check the BitsPerPixel value of my loaded surface after doing the display-format conversion, freeing the original surface, and keeping the one returned by SDL_DisplayFormat()… I still see 32 bits!

Before that, I thought that everything was converted (color depth too), but now I’m a little intrigued by this behaviour. Is something wrong? Must it be converted to 16 bits? Or is this normal behaviour?

Thank you very much in advance!

Cheers

Rober

I have to add an important note: that behaviour only occurs when my loaded surface has per-pixel alpha information… maybe because it needs the alpha component… Does it always store the surface as 32 bits?

I’m thinking it’s possible that the converted 32-bit surface stores the R, G, B pixel information in the screen format… but it needs a 32-bit pixel to store the alpha…

Could anyone clarify this aspect for me?

Thank you!

----- Original Message -----
From: Roberto Prieto
To: sdl at libsdl.org
Sent: Thursday, September 01, 2005 9:44 PM
Subject: [SDL] A question about SDL_DisplayFormat()…


SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl

You misunderstand (or else I do, and then I’m screwed).

AFAIK, the surfaces you load are left pretty much in the state they were in as
a file on disk.

It is only when blitting to the display surface occurs that conversion is
done, and, again as far as I have gathered, the conversion is temporary and
does not affect the blitted surface.

This is, needless to say, performance intensive.

The function

my_fast_new_surface = SDL_DisplayFormat( my_crappy_old_surface );

copies the surface, so the original is, again, untouched.

Maybe I misread you, but I hope this helps.

“Roberto Prieto” wrote:

I have to add an important note: that behaviour only occurs when my
loaded surface has per-pixel alpha information… maybe because it needs
the alpha component… Does it always store the surface as 32 bits?

yes.

I’m thinking it’s possible that the converted 32-bit surface stores the
R, G, B pixel information in the screen format… but it needs a
32-bit pixel to store the alpha…

exactly.