8bpp BMP seg faults

Hello, I’m new to SDL and am currently trying to learn how to use SDL and
OpenGL together (under Linux for the time being). My problem is this: I’m
trying to load an 8 bit BMP to use as a texture on a GL entity. I’ve
successfully done this with a 24 bit BMP, but when I try to use an 8 bit
BMP it seg faults. I’ve tried SDL 1.1.7 and SDL 1.2.0 using the
SDL_LoadBMP( ) function and tried SDL_image 1.1.7 and SDL_image 1.2.0
using the IMG_Load( ) function with the same results. I’ve tried setting
my screen surface to 8 bit (as opposed to 16), no go. Here’s the code and
the offending lines:

/* Create storage space for the texture */
SDL_Surface *TextureImage[1]; 

/* Load The Bitmap, Check For Errors, If Bitmap's Not Found Quit */
if ( ( TextureImage[0] = SDL_LoadBMP( "data/crate.bmp" ) ) )
    {

    /* Set the status to true */
    Status = TRUE;

    /* Create The Texture */
    glGenTextures( 3, &texture[0] );

    /* Load in texture 1 */
    /* Typical Texture Generation Using Data From The Bitmap */
    glBindTexture( GL_TEXTURE_2D, texture[0] );

    /* Generate The Texture */
    glTexImage2D( GL_TEXTURE_2D, 0, 3, TextureImage[0]->w,
		  TextureImage[0]->h, 0, GL_RGB,
		  GL_UNSIGNED_BYTE, TextureImage[0]->pixels );

After that last line is where it bombs. Any suggestions? Thanks!

Ti Leggett
leggett at eecs.tulane.edu

I also forgot to mention that when I do load the 24 bit BMP, it seems the
red and blue colors are switched, i.e., blues come out brown and browns
come out blue. Is this normal? What’s the best way to swap them?
Switching the Rmask and Bmask in the SDL_PixelFormat structure of the
surface? Thanks!

On Fri, 15 Jun 2001, Ti Leggett wrote:


  /* Generate The Texture */
  glTexImage2D( GL_TEXTURE_2D, 0, 3, TextureImage[0]->w,
  	  TextureImage[0]->h, 0, GL_RGB,
  	  GL_UNSIGNED_BYTE, TextureImage[0]->pixels );

After that last line is where it bombs. Any suggestions? Thanks!

You are giving OpenGL texture data where each byte represents a pixel
(8-bit), but telling it to read three bytes for every pixel
(GL_RGB). glTexImage2D() is reading past the end of the surface’s pixel
data, and probably touching memory your application doesn’t own.

If you’re loading an 8-bit bitmap, you should use GL_COLOR_INDEX (not
GL_RGB) and set up a pixel map with glPixelMapfv().
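
Something along these lines (completely untested; it assumes your 8-bit
surface is in a variable called "image" and actually has a palette):

SDL_Palette *pal = image->format->palette;
GLfloat r[256], g[256], b[256];
int i;

for (i = 0; i < pal->ncolors; i++) {
    r[i] = pal->colors[i].r / 255.0f;
    g[i] = pal->colors[i].g / 255.0f;
    b[i] = pal->colors[i].b / 255.0f;
}

/* index -> RGB lookup tables used during the pixel transfer */
glPixelMapfv(GL_PIXEL_MAP_I_TO_R, pal->ncolors, r);
glPixelMapfv(GL_PIXEL_MAP_I_TO_G, pal->ncolors, g);
glPixelMapfv(GL_PIXEL_MAP_I_TO_B, pal->ncolors, b);

/* one byte per pixel this time, expanded through the maps above */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, image->w, image->h, 0,
             GL_COLOR_INDEX, GL_UNSIGNED_BYTE, image->pixels);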

That’s a pain, though.

Alternately, after loading the 8-bit bitmap, convert the SDL surface to 24
bit:

/* this is completely untested. */
SDL_Surface *bmp = SDL_LoadBMP("data/crate.bmp");

if ( (bmp != NULL) && (bmp->format->BitsPerPixel != 24) )
{
    SDL_Surface *tmp;
    SDL_PixelFormat fmt;

    memcpy(&fmt, bmp->format, sizeof (SDL_PixelFormat));
    fmt.palette = NULL;
    fmt.BitsPerPixel = 24;
    fmt.BytesPerPixel = 3;
    /* set the R/G/B masks and anything else... */

    tmp = SDL_ConvertSurface(bmp, &fmt, SDL_SWSURFACE);
    assert(tmp);
    SDL_FreeSurface(bmp);  /* also frees the old pixel data */
    bmp = tmp;
}

/* your GL code goes here, now with a surface guaranteed to be 24-bit, so you can use GL_RGB. */

Hmmm… that’s also a pain. :)

Talk amongst yourselves.

–ryan.

SDL_Surface *bmp = SDL_LoadBMP("data/crate.bmp");

if ( (bmp != NULL) && (bmp->format->BitsPerPixel != 24) )
[…]
/* your GL code goes here, now with a surface guaranteed to be 24-bit, so
you can use GL_RGB. */

You also have to verify that the format is correct. In general, SDL_LoadBMP
(and the various loaders in SDL_image) do not guarantee any particular pixel
format; the format is usually chosen to map well to the file format and may
not be what you want. Also, there are endianness issues.

So, the correct way of loading an RGB texture is something like this (untested):

Uint32 rmask, gmask, bmask;
/* define RGB masks for 24-bit R,G,B byte order */

#if SDL_BYTEORDER == SDL_BIG_ENDIAN
rmask = 0xff0000;
gmask = 0x00ff00;
bmask = 0x0000ff;
#else
rmask = 0x0000ff;
gmask = 0x00ff00;
bmask = 0xff0000;
#endif

SDL_Surface *img = IMG_Load(file); /* or SDL_LoadBMP() */
if(img->format->BytesPerPixel != 3
   || img->format->Rmask != rmask
   || img->format->Gmask != gmask
   || img->format->Bmask != bmask) {
    /* We need to convert the image */
    SDL_Surface *s = SDL_CreateRGBSurface(SDL_SWSURFACE, img->w, img->h,
                                          24, rmask, gmask, bmask, 0);
    SDL_BlitSurface(img, NULL, s, NULL);
    SDL_FreeSurface(img);
    img = s;
}
/* here call glTexImage2D with img->pixels as data */
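/* e.g. (untested), assuming a plain 2D texture target. SDL_CreateRGBSurface
   pads each row to a 4-byte boundary, which happens to match OpenGL's
   default GL_UNPACK_ALIGNMENT of 4: */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, img->w, img->h, 0,
             GL_RGB, GL_UNSIGNED_BYTE, img->pixels);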

Exercises for the reader:
a) add error handling above
b) generalise this for RGBA textures
c) and for 15/16bpp textures
d) allow colour-keyed image files to be loaded as RGBA textures