Behaviour of SDL_LoadBMP

Hi, I’m using SDL_LoadBMP to load a BMP into a surface for an OpenGL texture, and I’ve found that the pixel data is stored in BGR order.

Does this behaviour vary with the BMP file, or is the surface guaranteed to always be in BGR “mode”? Or do I need to implement a solution for every possible pixel mode? Maybe there’s a way to convert the surface to a standard format using SDL?

Thanks for your time.

Simon Ejsing, Systemudvikler
esoft ApS, http://www.esoft.dk
Skibhusvej 52C, DK-5000 Odense C.
Tlf: 70 222 466, Fax: 63 122 466


On Thursday 10 June 2004 10:17, Simon Ejsing wrote:

Hi, I’m using SDL_LoadBMP to load a BMP into a surface for an OpenGL texture, and I’ve found that the pixel data is stored in BGR order.

Does this behaviour vary with the BMP file, or is the surface guaranteed to always be in BGR “mode”? Or do I need to implement a solution for every possible pixel mode? Maybe there’s a way to convert the surface to a standard format using SDL?

Thanks for your time.

Normally it won’t vary much, but that is only because most of the BMPs you’ll find are uncompressed Windows BMP files. As soon as you run into the OS/2 format (which Windows applications sometimes produce!), RLE-compressed BMPs or BMPs with embedded JPEG data, you can face serious bugs.

There are two solutions:

The brutal, not-so-clean way:
Create a new SDL_Surface with the pixel format you want and the size of the surface returned by SDL_LoadBMP, then blit the BMP surface onto the new one, and finally free the original BMP surface to avoid wasting memory.
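
For example, here is a minimal sketch of that approach (the helper name load_bmp_as_rgb and the 24-bit little-endian RGB masks are assumptions of mine, not something from this thread):

#include "SDL.h"

/* Load a BMP and copy it into a surface with a known 24-bit RGB layout;
 * SDL_BlitSurface converts between pixel formats while copying. */
SDL_Surface *load_bmp_as_rgb(const char *filename)
{
	SDL_Surface *bmp, *rgb;

	bmp = SDL_LoadBMP(filename);
	if (!bmp)
		return NULL;

	/* New surface with the pixel format we actually want. */
	rgb = SDL_CreateRGBSurface(SDL_SWSURFACE, bmp->w, bmp->h, 24,
	                           0x000000FF, 0x0000FF00, 0x00FF0000, 0);
	if (rgb)
		SDL_BlitSurface(bmp, NULL, rgb, NULL);

	SDL_FreeSurface(bmp);	/* free the original BMP surface */
	return rgb;
}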

The cleaner but less well-known solution:
Use SDL_ConvertSurface, whose signature is the following:
SDL_Surface *SDL_ConvertSurface(SDL_Surface *src, SDL_PixelFormat *fmt, Uint32 flags);
You can pass the result of SDL_LoadBMP directly as src, but do not forget to free it after the conversion to avoid memory leaks!


Michel Nolard

On Thursday 10 June 2004 12:19, Michel Nolard wrote:

Normally it won’t vary much, but that is only because most of the BMPs you’ll find are uncompressed Windows BMP files. As soon as you run into the OS/2 format (which Windows applications sometimes produce!), RLE-compressed BMPs or BMPs with embedded JPEG data, you can face serious bugs.

That's what I thought, and also the reason I asked :)

There are two solutions:

The brutal, not-so-clean way:
Create a new SDL_Surface with the pixel format you want and the size of the surface returned by SDL_LoadBMP, then blit the BMP surface onto the new one, and finally free the original BMP surface to avoid wasting memory.

The cleaner but less well-known solution:
Use SDL_ConvertSurface, whose signature is the following:
SDL_Surface *SDL_ConvertSurface(SDL_Surface *src, SDL_PixelFormat *fmt, Uint32 flags);
You can pass the result of SDL_LoadBMP directly as src, but do not forget to free it after the conversion to avoid memory leaks!

I think I'll stick with the "correct" version :). How do I create a proper PixelFormat? Am I supposed to set all the fields in the struct, or can I load a "standard" one and then modify R/G/Bshift to represent an RGB image? Or is there a cleaner way to create an RGB PixelFormat?


Simon Ejsing, Systemudvikler
esoft ApS, http://www.esoft.dk
Skibhusvej 52C, DK-5000 Odense C.
Tlf: 70 222 466, Fax: 63 122 466


On Thursday 10 June 2004 13:07, Simon Ejsing wrote:

On Thursday 10 June 2004 12:19, Michel Nolard wrote:

Normally it won’t vary much, but that is only because most of the BMPs you’ll find are uncompressed Windows BMP files. As soon as you run into the OS/2 format (which Windows applications sometimes produce!), RLE-compressed BMPs or BMPs with embedded JPEG data, you can face serious bugs.

That’s what I thought, and also the reason I asked :)

There are two solutions:

The brutal, not-so-clean way:
Create a new SDL_Surface with the pixel format you want and the size of the surface returned by SDL_LoadBMP, then blit the BMP surface onto the new one, and finally free the original BMP surface to avoid wasting memory.

The cleaner but less well-known solution:
Use SDL_ConvertSurface, whose signature is the following:
SDL_Surface *SDL_ConvertSurface(SDL_Surface *src, SDL_PixelFormat *fmt, Uint32 flags);
You can pass the result of SDL_LoadBMP directly as src, but do not forget to free it after the conversion to avoid memory leaks!

I think I’ll stick with the “correct” version :). How do I create a proper PixelFormat? Am I supposed to set all the fields in the struct, or can I load a "standard" one and then modify R/G/Bshift to represent an RGB image? Or is there a cleaner way to create an RGB PixelFormat?

If you have an SDL_Surface which can be used as a reference, you can simply use its properties. Here is some pseudo-code:

SDL_Surface * ref;        /* any surface whose format you want to copy */
SDL_Surface * bmp;
SDL_Surface * background;

bmp = SDL_LoadBMP (filename);
background = SDL_ConvertSurface (bmp, ref->format, ref->flags);
SDL_FreeSurface (bmp);    /* free the original to avoid the leak */
bmp = 0;

NB: If you want to use the video surface as a reference (i.e. if you want the pixel format to be the same as the display frame buffer, for fast blitting), then you can use SDL_DisplayFormat (or SDL_DisplayFormatAlpha) instead of SDL_ConvertSurface. It is a kind of “call SDL_ConvertSurface and fill in the blanks in the parameter list” helper :)
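
For instance, a minimal sketch of that shortcut (this assumes SDL_SetVideoMode has already been called; the helper name load_for_blitting is made up for illustration):

#include "SDL.h"

/* Load a BMP and convert it to the display's pixel format for fast blits. */
SDL_Surface *load_for_blitting(const char *filename)
{
	SDL_Surface *bmp, *converted;

	bmp = SDL_LoadBMP(filename);
	if (!bmp)
		return NULL;

	converted = SDL_DisplayFormat(bmp);	/* same format as the frame buffer */

	SDL_FreeSurface(bmp);	/* keep only the converted copy */
	return converted;
}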

I hope this was clear, as I am not a native English speaker…


Michel Nolard

On Thursday 10 June 2004 13:49, Michel Nolard wrote:

If you have an SDL_Surface which can be used as a reference, you can simply use its properties. Here is some pseudo-code:

Unfortunately I only have my display surface, and that’s not what I want to use, because the images are used as OpenGL textures. To make it simple, I just want to load the image and store it in RGBA format (or RGB format, but that’s a trivial change). I can’t seem to find a way to create a pixel format for RGBA with a function call, so I’ve used the following:

SDL_Surface	*image, *dst_image;
SDL_PixelFormat	dst_format = {
	NULL,
	24,
	3,
	0xFF000000, 0x00FF0000, 0x0000FF00, 0x000000FF,	// Color masks (RGBA)
	0, 0, 0, 0,					// Color loss (RGBA)
	24, 16, 8, 0,					// Color shift (RGBA)
	0,
	0
};

debug("Loading BMP texture [%s]\n", filename);
if(!(image = SDL_LoadBMP(filename))) {
	debug("Could not open BMP image [%s]\n", filename);
	return false;
}

if(!(dst_image = SDL_ConvertSurface(image, &dst_format, 0))) {
	debug("Could not convert BMP surface to desired format\n");
	SDL_FreeSurface(image);
	return false;
}

...

To convert the texture to an RGB image. However, the colors of the newly created image are swapped (what was blue is now red), so I figure my PixelFormat is wrong. Also, I can’t figure out which flags to specify for the conversion; I guess it doesn’t actually matter, so I just pass 0 (it works, but the colors are wrong, which I guess has nothing to do with the flags).

Am I doing something wrong? Or have I missed the point completely? :)

SDL_Surface * ref;
SDL_Surface * bmp;
SDL_Surface * background;

bmp = SDL_LoadBMP (filename);
background = SDL_ConvertSurface (bmp, ref->format, ref->flags);
SDL_FreeSurface (bmp);
bmp = 0;

Thanks for the example; unfortunately I don't have any reference format :-/

NB: If you want to use the video surface as a reference (i.e. if you want the pixel format to be the same as the display frame buffer, for fast blitting), then you can use SDL_DisplayFormat (or SDL_DisplayFormatAlpha) instead of SDL_ConvertSurface. It is a kind of “call SDL_ConvertSurface and fill in the blanks in the parameter list” helper :)

I hope this was clear, as I am not a native English speaker…

Neither am I :)

Thanks for your help.


Simon Ejsing, Systemudvikler
esoft ApS, http://www.esoft.dk
Skibhusvej 52C, DK-5000 Odense C.
Tlf: 70 222 466, Fax: 63 122 466

Okay, I was doing something wrong: apparently the SDL_PixelFormat structure is defined in a different order than the man page specifies!!!

This is the correct way:
SDL_PixelFormat dst_format = {
	NULL,
	32,
	4,
	0, 0, 0, 0,					// Color loss (RGBA)
	0, 8, 16, 24,					// Color shift (RGBA)
	0x000000FF, 0x0000FF00, 0x00FF0000, 0xFF000000,	// Color masks (RGBA)
	0,
	0
};

However, the man page clearly specifies this:

NAME
SDL_PixelFormat - Stores surface format information

STRUCTURE DEFINITION
typedef struct{
SDL_Palette *palette;
Uint8 BitsPerPixel;
Uint8 BytesPerPixel;
Uint32 Rmask, Gmask, Bmask, Amask;
Uint8 Rshift, Gshift, Bshift, Ashift;
Uint8 Rloss, Gloss, Bloss, Aloss;
Uint32 colorkey;
Uint8 alpha;
} SDL_PixelFormat;

Which is WRONG! Someone should update the man pages to be consistent with the include files :-/

On Thursday 10 June 2004 14:11, Simon Ejsing wrote:

To convert the texture to an RGB image. However, the colors of the newly created image are swapped (what was blue is now red), so I figure my PixelFormat is wrong. Also, I can’t figure out which flags to specify for the conversion; I guess it doesn’t actually matter, so I just pass 0 (it works, but the colors are wrong, which I guess has nothing to do with the flags).

Am I doing something wrong? Or have I missed the point completely? :)
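
For completeness, one way to sidestep the field-order trap discussed above entirely is to let SDL fill in the SDL_PixelFormat for you: create a throwaway surface with SDL_CreateRGBSurface and hand its ->format to SDL_ConvertSurface. A minimal sketch (the helper name load_bmp_as_rgba and the little-endian RGBA masks are my own assumptions):

#include "SDL.h"

SDL_Surface *load_bmp_as_rgba(const char *filename)
{
	SDL_Surface *bmp, *tmp, *rgba;

	bmp = SDL_LoadBMP(filename);
	if (!bmp)
		return NULL;

	/* Throwaway 1x1 surface whose only purpose is its ->format member. */
	tmp = SDL_CreateRGBSurface(SDL_SWSURFACE, 1, 1, 32,
	                           0x000000FF, 0x0000FF00, 0x00FF0000, 0xFF000000);
	if (!tmp) {
		SDL_FreeSurface(bmp);
		return NULL;
	}

	rgba = SDL_ConvertSurface(bmp, tmp->format, SDL_SWSURFACE);

	SDL_FreeSurface(tmp);
	SDL_FreeSurface(bmp);
	return rgba;
}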


Simon Ejsing, Systemudvikler
esoft ApS, http://www.esoft.dk
Skibhusvej 52C, DK-5000 Odense C.
Tlf: 70 222 466, Fax: 63 122 466