Pixel format queries with SDL_Image and OpenGL

Hi everyone,

I’m having problems deciphering the pixel format values that SDL_Image is
returning with the surfaces it creates. I’m trying to supply the pixel data
in these surfaces to OpenGL for texturing, and while I can make this work
without much of a problem, I seem to keep having to choose the OpenGL
texturing “format” and “type” parameters by hand, rather than relying on the
surface’s pixel format to work it out automatically.

Here’s an example of the pixel format returned from two slightly different
images, both in PNG format. The second image is identical to the first apart
from the addition of an alpha channel using the GIMP. Sorry for the info
pulled in from the docs - I just put it in my surface inspection routine to
remind me what I’m looking at.

graded background.png

SDL debug: Surface Flags…
SDL debug: Surface Pixel Format…
Surface dimensions: width=1024 pixels, height=1024 pixels
No palette information.
BitsPerPixel=24
The number of bits used to represent each pixel in a surface. Usually 8, 16,
24 or 32.
BytesPerPixel=3
The number of bytes used to represent each pixel in a surface. Usually one to
four.
Rmask=0xff, Gmask=0xff00, Bmask=0xff0000, Amask=0xff0000
Binary mask used to retrieve individual color values.
Rshift=0x0, Gshift=0x8, Bshift=0x10, Ashift=0x10
Binary left shift of each color component in the pixel value.
Rloss=0x0, Gloss=0x0, Bloss=0x0, Aloss=0x0
Precision loss of each color component (2^[RGBA]loss).
Colourkey=0 - Pixel value of transparent pixels.
Alpha=255 - Overall surface alpha value.

graded background alpha.png

SDL debug: Surface Flags…
SDL_SRCALPHA - Surface blit uses alpha blending
SDL debug: Surface Pixel Format…
Surface dimensions: width=1024 pixels, height=1024 pixels
No palette information.
BitsPerPixel=32
The number of bits used to represent each pixel in a surface. Usually 8, 16,
24 or 32.
BytesPerPixel=4
The number of bytes used to represent each pixel in a surface. Usually one to
four.
Rmask=0xff, Gmask=0xff00, Bmask=0xff0000, Amask=0xff0000
Binary mask used to retrieve individual color values.
Rshift=0x0, Gshift=0x8, Bshift=0x10, Ashift=0x10
Binary left shift of each color component in the pixel value.
Rloss=0x0, Gloss=0x0, Bloss=0x0, Aloss=0x0
Precision loss of each color component (2^[RGBA]loss).
Colourkey=0 - Pixel value of transparent pixels.
Alpha=255 - Overall surface alpha value.

Questions:

  1. Why does the graded background.png have no surface flags? Similarly why
    isn’t the alpha’d graphic bestowed with a software or hardware surface flag?

  2. Looking at the surface flags in the header file, why is SWSURFACE zero? How
    can I bitwise-and that to see if it is true or not for a surface?

This program is running on an x86, so I realise little-endianness applies
here. But:

  1. The masks in graded background.png seem OK for R, G, and B, given the
    number of bytes per pixel, but why is the alpha mask the same as the blue
    mask, and the alpha shift the same as the blue shift? Shouldn’t the alpha
    mask be zero if the surface has no alpha information?

  2. The masks in graded background alpha.png are the same as for the
    non-alpha’d version! How is that possible? Shouldn’t Amask be different from
    Bmask, and Ashift from Bshift? And also, to get this surface to texture
    correctly, I have to pass it to OpenGL with GL_BGRA and
    GL_UNSIGNED_INT_8_8_8_8_REV (which added together is the same as saying
    ARGB). That really doesn’t match up with the mask values. Does it?

I’m getting lost here guys - am I missing something or is SDL_Image messing
with my head? Please help!

Thanks,

Giles

Giles Burdett wrote:

Questions:

  1. Why does the graded background.png have no surface flags? Similarly why
    isn’t the alpha’d graphic bestowed with a software or hardware surface flag?

If there is no h/w flag, the surface is s/w

  2. Looking at the surface flags in the header file, why is SWSURFACE zero? How
    can I bitwise-and that to see if it is true or not for a surface?

if ((surface->flags & SDL_HWSURFACE) == SDL_SWSURFACE)
{
    printf("the surface is s/w\n");
}

This program is running on an x86, so I realise little-endianness applies
here. But:

  1. The masks in graded background.png seem OK for R, G, and B, given the
    number of bytes per pixel, but why is the alpha mask the same as the blue
    mask, and the alpha shift the same as the blue shift? Shouldn’t the alpha
    mask be zero if the surface has no alpha information?

  2. The masks in graded background alpha.png are the same as for the
    non-alpha’d version! How is that possible? Shouldn’t Amask be different from
    Bmask, and Ashift from Bshift? And also, to get this surface to texture
    correctly, I have to pass it to OpenGL with GL_BGRA and
    GL_UNSIGNED_INT_8_8_8_8_REV (which added together is the same as saying
    ARGB). That really doesn’t match up with the mask values. Does it?

I’m getting lost here guys - am I missing something or is SDL_Image messing
with my head? Please help!

I think you’re looking at the problem the wrong way. You’ll end up with
trouble trying to match the OpenGL surface format to the SDL_Surface
format (for example, if your surface format changes, or the surface has
a palette…)

What I always do is create a tmp SDL_Surface with the right OpenGL
format, and then blit my bmp on it, so that I’m assured to always feed
OpenGL with correct data, like in the following example :

glGenTextures(1, &tex_nb);
SDL_Surface* image = SDL_LoadBMP(filename);
width = image->w;
height = image->h;

SDL_SetColorKey(image, SDL_SRCCOLORKEY, SDL_MapRGB(image->format, 0, 0, 0));
SDL_Surface* alpha_image = SDL_CreateRGBSurface(SDL_SWSURFACE, width, height,
    32, 0x000000ff, 0x0000ff00, 0x00ff0000, 0xff000000);
SDL_BlitSurface(image, NULL, alpha_image, NULL);

glBindTexture(GL_TEXTURE_2D, tex_nb);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
    GL_RGBA, GL_UNSIGNED_BYTE, alpha_image->pixels);
SDL_FreeSurface(image);
SDL_FreeSurface(alpha_image);

There’s one issue with this method though, if your image has an alpha
channel you’ll lose it during blit so you have to blit the surface
yourself using getpixel/putpixel. I don’t have this limitation in my
example because I’m using a colorkeyed source surface.

Stephane

Rmask=0xff, Gmask=0xff00, Bmask=0xff0000, Amask=0xff0000

A tip: always pad mask output to eight characters. It’s easier to
understand them that way.

if ((surface->flags & SDL_HWSURFACE) == SDL_SWSURFACE)
{
    printf("the surface is s/w\n");
}

FWIW, I find

if( !(surface->flags & SDL_HWSURFACE) )

to be more obvious.

  1. The masks in graded background.png seem OK for R, G, and B, given the
    number of bytes per pixel, but why is the alpha mask the same as the blue
    mask, and the alpha shift the same as the blue shift? Shouldn’t the alpha
    mask be zero if the surface has no alpha information?

Yeah, it should. Could you narrow down the code to show this, and post
it to the list? Which versions of SDL and SDL_image are you using? Also,
please make the files you’re using to test with available.

What I always do is create a tmp SDL_Surface with the right OpenGL
format, and then blit my bmp on it, so that I’m assured to always feed
OpenGL with correct data, like in the following example :

This means an extra blit, which is annoying if you want loads to be as
fast as possible. I match up the masks of the image with known OpenGL
(or D3D) ones, and only re-blit the image if nothing matches. It’s
nontrivial, though.

There’s one issue with this method though, if your image has an alpha
channel you’ll lose it during blit so you have to blit the surface
yourself using getpixel/putpixel. I don’t have this limitation in my
example because I’m using a colorkeyed source surface.

All you have to do to prevent that is to turn SDL_SRCALPHA off:

SDL_SetAlpha( src, 0, SDL_ALPHA_OPAQUE );

On Sun, Mar 14, 2004 at 02:36:03PM +0100, Stephane Marchesin wrote:


Glenn Maynard

Giles Burdett wrote:

  1. Why does the graded background.png have no surface flags?
    Similarly why isn’t the alpha’d graphic bestowed with a software or
    hardware surface flag?

See below.

  2. Looking at the surface flags in the header file, why is SWSURFACE
    zero? How can I bitwise-and that to see if it is true or not for a
    surface?

SWSURFACE is simply the absence of HWSURFACE.

This program is running on an x86, so I realise little-endianness
applies here. But:

  1. The masks in graded background.png seem OK for R, G, and B, given
    the number of bytes per pixel, but why is the alpha mask the same as
    the blue mask, and the alpha shift the same as the blue shift?
    Shouldn’t the alpha mask be zero if the surface has no alpha
    information?

Probably because your inspection function is incorrect. Make sure you
really are checking the alpha mask instead of checking the blue mask twice.

  1. The masks in graded background alpha.png are the same as for the
    non-alpha’d version! How is that possible? Shouldn’t Amask be
    different from Bmask, and Ashift from Bshift?

See above.

--
Rainer Deyke - rainerd at eldwood.com - http://eldwood.com