SDL_LoadBMP problem

Hi there,

I had a problem with SDL_LoadBMP() the other day which I thought was a bit strange. I’ve been working on something that uses GL, and wanted to use SDL_LoadBMP to load in a file as a texture; at this point I came seriously unstuck when displaying the thing. I tried all kinds of flags for the texturing, and after that didn’t work I looked at the [RGBA]mask values in the surface->format structure, which told me that it was loading the BMP in RGB format. So I took SDL’s response as read and delved back into the GL documentation to find out what I’d done wrong. When I found that nothing was wrong, I dumped the contents of surface->pixels, only to discover that they were BGR ordered.

Why does SDL (at least 1.2.4; I haven’t tried 1.2.5) report the BMP’s colour ordering the wrong way round? Surely this has been a problem for some time, given the number of programs that make use of the library?

Thanks,

Giles Burdett

> I had a problem with SDL_LoadBMP() the other day which I thought was a bit strange. I’ve been working on something that uses GL, and wanted to use SDL_LoadBMP to load in a file as a texture; at this point I came seriously unstuck when displaying the thing. I tried all kinds of flags for the texturing, and after that didn’t work I looked at the [RGBA]mask values in the surface->format structure, which told me that it was loading the BMP in RGB format. So I took SDL’s response as read and delved back into the GL documentation to find out what I’d done wrong. When I found that nothing was wrong, I dumped the contents of surface->pixels, only to discover that they were BGR ordered.

SDL uses masks to represent the data as a native int in memory, while OpenGL
RGB/BGR refers to the bytes as ordered on disk. Both are reporting correctly.
If you want a fire-and-forget solution, take a look at test/testgl.c in the
SDL source archive.

See ya,
-Sam Lantinga, Software Engineer, Blizzard Entertainment

Hi Sam,

> SDL uses masks to represent the data as a native int in memory, while OpenGL
> RGB/BGR refers to the bytes as ordered on disk. Both are reporting
> correctly. If you want a fire-and-forget solution, take a look at
> test/testgl.c in the SDL source archive.

But SDL gave me these mask values for the BMP:

R: 111111110000000000000000 = 16711680 (0xFF0000)
G: 000000001111111100000000 =    65280 (0x00FF00)
B: 000000000000000011111111 =      255 (0x0000FF)

I understand this to mean “RGB ordering”, but SDL’s pixel array is
blatantly in “BGR ordering”. Or are you saying that a “native int in memory” on
the x86 is actually the other way around, and that I should (for some strange
reason) read the mask values backwards?

Thanks,

Giles Burdett