SDL/OpenGL and texture mapping

There seem to be several annoying issues (possibly
design oversights?) when using SDL with OpenGL. I’ve
been working on a general SDL/OpenGL image preparer
library trying to support my own projects, and these
are some of the things that have bitten me.

First, the image coordinate systems seemed to be
inverted. In OpenGL, if you imagine your 0,0
coordinate to be in the lower left corner of your
monitor, increasing the y value moves up the
screen, whereas SDL_Surfaces go the other
direction. You can either flip your SDL_Surface
somehow (swapping the rows around in the surface
buffer is the most direct way), or you can adopt the
upside-down convention in your OpenGL code and change
your vertex coordinates so you don't have to do
anything to the image itself. This might annoy some
OpenGL programmers if they ever have to go through
your code, though. And don't try to fake the flip by
rotating the surface 180 degrees: the horizontal
coordinates were already correct, so after the rotation
your x values will be in the wrong direction.

Second, image formats are not necessarily OpenGL
ready. The easiest thing to do (and the most common
thing) is to use RGB or RGBA formats. However, SDL
surfaces are not always in this format. Without the
SDL_image library, SDL gives you only BMP support.

Looking at just BMP for a moment: typically, BMPs are
either 8-bit or 24-bit formats (no 32-bit). 24-bit
formats are the easiest to work with because each pixel
is three bytes, one each for red, green, and blue.
However, there is a catch: SDL doesn't dictate what
order those bytes are in.
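SDL does expose the channel layout through each surface's SDL_PixelFormat masks, so one way to cope is to choose the GL format at load time. Here is a minimal sketch in plain C (the GL_RGB/GL_BGR values are hard-coded from gl.h so it compiles without the OpenGL headers, and it assumes a little-endian machine, where a B,G,R byte order puts the red mask in the high byte of the three):

```c
#include <stdint.h>

/* Values copied from <GL/gl.h> so this sketch compiles standalone. */
#define GL_RGB  0x1907
#define GL_BGR  0x80E0

/* Pick the GL "format" parameter for a 24-bit surface from its red
 * channel mask (e.g. surface->format->Rmask). Assumes little-endian:
 * bytes stored R,G,B give Rmask 0x000000FF; B,G,R give 0x00FF0000. */
static int gl_format_for_rmask(uint32_t rmask)
{
    return (rmask == 0x000000FF) ? GL_RGB : GL_BGR;
}
```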

So there is a bug in your code:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_width, m_height,
             0, GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels);

24-bit BMPs store their pixels in BGR order, not RGB.
(I think Microsoft did this to be annoying, but I can't
prove it.) This means that everything that should be
red looks blue, and vice versa. Most likely your
channels were swapped, not actually missing.

So one possible solution is to change the "format"
parameter to GL_BGR:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_width, m_height,
             0, GL_BGR, GL_UNSIGNED_BYTE, pBitmap->pixels);

However, the GL_BGR format parameter wasn't added until
OpenGL 1.2. And ironically, Microsoft stopped their
OpenGL support at 1.1, so you will be dependent on
your video card driver to make up the difference.

The other option is to swap the bytes yourself (or
create a new SDL_Surface that stores them) in RGB
order.
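If GL_BGR isn't available, the in-place swap is simple enough. A sketch in plain C (no SDL types; `pitch` plays the role of SDL_Surface's pitch field, i.e. bytes per row including any padding):

```c
#include <stddef.h>
#include <stdint.h>

/* In-place B<->R swap for a 24-bit image, turning BGR rows into RGB
 * (or vice versa -- the operation is its own inverse). */
static void swap_bgr_to_rgb(uint8_t *pixels, int width, int height, int pitch)
{
    for (int y = 0; y < height; y++) {
        uint8_t *row = pixels + (size_t)y * pitch;
        for (int x = 0; x < width; x++) {
            uint8_t tmp    = row[3 * x];     /* first byte of the pixel */
            row[3 * x]     = row[3 * x + 2]; /* third byte into first   */
            row[3 * x + 2] = tmp;            /* first byte into third   */
        }
    }
}
```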

For 8-bit images, your life gets harder because the
image uses a palette. I think you will have to manually
decode each pixel to get the RGB values, and then
place them in a new 24- (or 32-) bit surface which you
will pass to OpenGL.

If you start using other image formats via SDL_image,
then you need to worry about all kinds of different
ways the images are stored. For example, the JPEG you
tried was probably stored in RGB format, which is why it
worked for you. PNG also seems to be RGB. But TGA is
BGR. So it's a gamble. You should probably write your
code to handle any case if you want to use the
SDL_image library. SDL provides color masks and
SDL_GetRGB(A) to help you, but it’s still kind of
annoying to go through and it would have been nice if
SDL had written their Surface format with OpenGL in
mind.

Finally, there is an endianness issue. When you create a
new surface (say, to copy into for the correct RGB order
and vertical direction), you need to specify the
correct masks. The masks depend on the endianness of
your system. If you're not careful, this could also
cause your reds and blues to get mixed up. And it
will also affect portability if you ever move to a
system with a different endianness (e.g. PC to Mac). So
do something like this:

#include "SDL_byteorder.h"

#if SDL_BYTEORDER == SDL_BIG_ENDIAN
glimage = SDL_CreateRGBSurface(SDL_SWSURFACE,
        image->w, image->h, 32,
        0xFF000000, 0x00FF0000, 0x0000FF00, 0x000000FF);
#else
glimage = SDL_CreateRGBSurface(SDL_SWSURFACE,
        image->w, image->h, 32,
        0x000000FF, 0x0000FF00, 0x00FF0000, 0xFF000000);
#endif
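(In real code you should just use SDL's compile-time SDL_BYTEORDER macro, but for illustration, the same distinction can be probed at run time with a few lines of plain C:)

```c
#include <stdint.h>
#include <string.h>

/* Returns 1 on a big-endian machine, 0 on a little-endian one --
 * the same distinction SDL_BYTEORDER makes at compile time. */
static int is_big_endian(void)
{
    uint32_t probe = 0x01020304;
    uint8_t first;
    memcpy(&first, &probe, 1);  /* look at the lowest-addressed byte */
    return first == 0x01;       /* big-endian stores the high byte first */
}
```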

One more note: SDL_Surfaces will not help you with
"colorkey" transparency (i.e. your image format
specifies that a certain color represents
transparency). If you want colorkey transparency,
you will have to implement it yourself. You will
need to go through every pixel (assuming the SDL
color key flag has been set), check whether the
color matches the colorkey, and if so set the alpha
value to 0 yourself in the surface you pass to OpenGL.
Remember that for transparency in OpenGL mode, you
probably should be using GL_RGBA (and not GL_RGB) and
make sure you have a 32-bit surface (unless you know
how to use GL_ALPHA or some of the other special
formats). If you don’t want to go to all this trouble,
then just use 32-bit format images with actual
transparency values already set.
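As a rough sketch of that colorkey loop in plain C (assuming 32-bit packed pixels with the alpha byte under the 0xFF000000 mask; a real version would read the masks and the key out of the surface's SDL_PixelFormat instead of hard-coding them):

```c
#include <stdint.h>

/* Set alpha to 0 where the RGB part of a pixel matches the colorkey,
 * and to fully opaque everywhere else. */
static void apply_colorkey(uint32_t *pixels, int count, uint32_t key_rgb)
{
    for (int i = 0; i < count; i++) {
        if ((pixels[i] & 0x00FFFFFF) == (key_rgb & 0x00FFFFFF))
            pixels[i] &= 0x00FFFFFF;   /* transparent: alpha = 0 */
        else
            pixels[i] |= 0xFF000000;   /* opaque: alpha = 255    */
    }
}
```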

-Eric

First, the image coordinate systems seemed to be
inverted. In OpenGL, if you imagine your 0,0
[snip]

Not really as complicated as you have described. As
David Olofson wrote in his glSDL.c, something like

glOrtho(0, width, height, 0, -1.0, 1.0);

will do the coordinate conversion for you. This works
seamlessly with all coordinates (including texture
coordinates) to let you write 2D applications using
a normal top-left origin.

For 8-bit images, your life gets harder because the
image uses a palette. I think you will have to manually
decode each pixel to get the RGB values, and then
place them in a new 24- (or 32-) bit surface which you
will pass to OpenGL.

A simple SDL_BlitSurface() to a 32-bit RGB surface
will do the conversion described above. The nice thing
about SDL is that programmers do not have to deal
with surface formats in much detail, but you still
need to understand what is really going on under
the hood.

One more note: SDL_Surfaces will not help you with
"colorkey" transparency (i.e. your image format
specifies that a certain color represents
transparency). If you want colorkey transparency,
you will have to implement it yourself. You will
need to go through every pixel (assuming the SDL
color key flag has been set), check whether the
color matches the colorkey, and if so set the alpha
value to 0 yourself in the surface you pass to OpenGL.

David Olofson also wrote this part in his glSDL
implementation. I really learned a lot by reading
his code; kudos to David!

formats). If you don’t want to go to all this trouble,
then just use 32-bit format images with actual
transparency values already set.

Yeah, that is actually recommended if you are coding
something specifically targeting OpenGL. And I say
why not: I use all textures in RGBA format uniformly
throughout my application even when the surface doesn't
need an alpha channel. It does not slow down the graphics
hardware in any significant way.

Regards,
.paul.

On Wed, Sep 11, 2002 at 02:19:15AM -0700, Eric Wing wrote:

Hiya,

EW> "but it's still kind of
EW> annoying to go through and it would have been nice if
EW> SDL had written their Surface format with OpenGL in
EW> mind."

No it wouldn’t. The surface “format” is a framebuffer and is the way
people have done graphics for years and years on any raster display.
Not the only way, but definitely the main way!

Having 0,0 at the bottom left isn’t the normal way in graphics and I’m
certainly glad the surface format isn’t done like that!

I wouldn’t be against having a separate surface type that can emulate
the OpenGL co-ordinate format, but that’s a separate issue. :)

Neil.

There seem to be several annoying issues (possibly
design oversights?) when using SDL with OpenGL. I’ve
been working on a general SDL/OpenGL image preparer
library trying to support my own projects, and these
are some of the things that have bitten me.

Well said, these are all accurate. SDL surfaces are designed with 2D
framebuffer graphics in mind, and reflect that. However there is good
code in testgl.c in the SDL source archive that shows how to convert a
general SDL surface into an OpenGL texture, solving all these problems.

For the lazy, I’ll repost the code here:

/* Quick utility function for texture creation */
static int power_of_two(int input)
{
    int value = 1;

    while ( value < input ) {
        value <<= 1;
    }
    return value;
}

GLuint SDL_GL_LoadTexture(SDL_Surface *surface, GLfloat *texcoord)
{
    GLuint texture;
    int w, h;
    SDL_Surface *image;
    SDL_Rect area;
    Uint32 saved_flags;
    Uint8  saved_alpha;

    /* Use the surface width and height expanded to powers of 2 */
    w = power_of_two(surface->w);
    h = power_of_two(surface->h);
    texcoord[0] = 0.0f;                    /* Min X */
    texcoord[1] = 0.0f;                    /* Min Y */
    texcoord[2] = (GLfloat)surface->w / w; /* Max X */
    texcoord[3] = (GLfloat)surface->h / h; /* Max Y */

    image = SDL_CreateRGBSurface(
            SDL_SWSURFACE,
            w, h,
            32,
#if SDL_BYTEORDER == SDL_LIL_ENDIAN /* OpenGL RGBA masks */
            0x000000FF,
            0x0000FF00,
            0x00FF0000,
            0xFF000000
#else
            0xFF000000,
            0x00FF0000,
            0x0000FF00,
            0x000000FF
#endif
            );
    if ( image == NULL ) {
        return 0;
    }

    /* Save the alpha blending attributes */
    saved_flags = surface->flags & (SDL_SRCALPHA|SDL_RLEACCELOK);
    saved_alpha = surface->format->alpha;
    if ( (saved_flags & SDL_SRCALPHA) == SDL_SRCALPHA ) {
        SDL_SetAlpha(surface, 0, 0);
    }

    /* Copy the surface into the GL texture image */
    area.x = 0;
    area.y = 0;
    area.w = surface->w;
    area.h = surface->h;
    SDL_BlitSurface(surface, &area, image, &area);

    /* Restore the alpha blending attributes */
    if ( (saved_flags & SDL_SRCALPHA) == SDL_SRCALPHA ) {
        SDL_SetAlpha(surface, saved_flags, saved_alpha);
    }

    /* Create an OpenGL texture for the image */
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D,
                 0,
                 GL_RGBA,
                 w, h,
                 0,
                 GL_RGBA,
                 GL_UNSIGNED_BYTE,
                 image->pixels);
    SDL_FreeSurface(image); /* No longer needed */

    return texture;
}

As you can see, it’s not quite so bad. :)
I think this still leaves the texture upside down, but I use glOrtho()
before displaying these textures. Look at testgl.c for all the details.

See ya,
-Sam Lantinga, Software Engineer, Blizzard Entertainment

Eric Wing wrote:

Second, image formats are not necessarily OpenGL
ready. The easiest thing to do (and the most common
thing) is to use RGB or RGBA formats. However, SDL
surfaces are not always in this format. Without the
SDL_image library, SDL gives you only BMP support.

Looking at just BMP for a moment: typically, BMPs are
either 8-bit formats or 24-bit formats (no 32-bit).

With this in mind, might it not be a good idea to add a TGA loading
function to the basic SDL core library? TGA is a fairly simple format so
it shouldn’t add much to the library size, and it supports alpha/32-bit
formats (supposedly), making it a better candidate for OpenGL users,
unless I’m mistaken?

--
Kylotan
http://pages.eidosnet.co.uk/kylotan

Personally, I’d like to see TGA loading if at all possible. :)

----- Original Message -----

From: kylotan@kylotan.eidosnet.co.uk (Kylotan)
To:
Sent: Wednesday, September 11, 2002 7:05 PM
Subject: Re: [SDL] SDL/OpenGL and texture mapping

Eric Wing wrote:

Second, image formats are not necessarily OpenGL
ready. The easiest thing to do (and the most common
thing) is to use RGB or RGBA formats. However, SDL
surfaces are not always in this format. Without the
SDL_image library, SDL gives you only BMP support.

Looking at just BMP for a moment: typically, BMPs are
either 8-bit formats or 24-bit formats (no 32-bit).

With this in mind, might it not be a good idea to add a TGA loading
function to the basic SDL core library? TGA is a fairly simple format so
it shouldn’t add much to the library size, and it supports alpha/32-bit
formats (supposedly), making it a better candidate for OpenGL users,
unless I’m mistaken?


Kylotan
http://pages.eidosnet.co.uk/kylotan


Now that Sam has posted his nifty SDL->OpenGL surface
converter, the only remaining piece of the puzzle is
the image inverter.

For me, Ortho2D isn’t always available since I’m
usually in 3D mode, and I frequently use pieces of
code that already assume normal OpenGL texture
coordinate conventions. (I’ve had some very
frightening results after integrating a Quake MD2
model loader into my stuff and the textures mapped to
all the wrong places.)

So here’s the code I use to flip the surfaces. I’d be
happy to see any bug corrections or speed
optimizations. :)

-Eric

/* Will modify the image array so it is vertically flipped.
 *
 * This is intended to be a helper function to comply with
 * OpenGL's y-coordinate system.
 *
 * Returns -1 on failure, 0 on success.
 *
 * Warning: The original data will be modified.
 *
 * Sample Usage: invert_image(image->pitch, image->h, image->pixels)
 */
static int invert_image(int pitch, int height, void *image_pixels)
{
    int index;
    void *temp_row;
    int height_div_2;

    temp_row = (void *)malloc(pitch);
    if(NULL == temp_row)
    {
        SDL_SetError("Not enough memory for image inversion");
        return -1;
    }
    /* if height is odd, we don't need to swap the middle row */
    height_div_2 = (int)(height * .5);
    for(index = 0; index < height_div_2; index++)
    {
        /* uses string.h */
        memcpy((Uint8 *)temp_row,
               (Uint8 *)(image_pixels) + pitch * index,
               pitch);

        memcpy((Uint8 *)(image_pixels) + pitch * index,
               (Uint8 *)(image_pixels) + pitch * (height - index - 1),
               pitch);
        memcpy((Uint8 *)(image_pixels) + pitch * (height - index - 1),
               temp_row,
               pitch);
    }
    free(temp_row);
    return 0;
}

/* Convenience wrapper for invert_image */
int SDL_InvertSurface(SDL_Surface *image)
{
    if(NULL == image)
    {
        SDL_SetError("Surface is NULL");
        return -1;
    }
    return( invert_image(image->pitch, image->h, image->pixels) );
}