SDL_SetColorKey and OpenGL textures

Hi,

I’m having problems with colorkeyed opengl textures. I can load and display the
textures without trouble but they look strange. See picture below.

http://www.student.lu.se/~nbi99mla/sdl-opengl-texture.png

The image is a simple 128x128 bitmap. The colorkey here is (255,0,255) on a
white background. It looks like some of the colorkey still remains. The following
code is used to load the texture:

GLuint loadTextureCK(char *filepath, int ckr, int ckg, int ckb)
{
    GLuint texture;
    SDL_Surface *imagesurface;
    SDL_Surface *tmpsurface;
    Uint32 colorkey;
    int w, h;

    imagesurface = SDL_LoadBMP(filepath);
    if (!imagesurface)
        return 0;

    w = imagesurface->w;
    h = imagesurface->h;

    /* create temporary surface with the correct OpenGL format */
    tmpsurface = SDL_CreateRGBSurface(SDL_SWSURFACE, w, h, 32,
#if SDL_BYTEORDER == SDL_LIL_ENDIAN
                                      0x000000FF,
                                      0x0000FF00,
                                      0x00FF0000,
                                      0xFF000000
#else
                                      0xFF000000,
                                      0x00FF0000,
                                      0x0000FF00,
                                      0x000000FF
#endif
                                      );
    if (!tmpsurface) {
        SDL_FreeSurface(imagesurface);
        return 0;
    }

    /* set colour key */
    colorkey = SDL_MapRGBA(tmpsurface->format, ckr, ckg, ckb, 0);
    SDL_FillRect(tmpsurface, NULL, colorkey);

    colorkey = SDL_MapRGBA(imagesurface->format, ckr, ckg, ckb, 0);
    SDL_SetColorKey(imagesurface, SDL_SRCCOLORKEY, colorkey);

    SDL_BlitSurface(imagesurface, NULL, tmpsurface, NULL);

    /* create OpenGL texture */
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, tmpsurface->pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    SDL_FreeSurface(imagesurface);
    SDL_FreeSurface(tmpsurface);

    return texture;
}

Anyone know what the problem is?

Thank you in advance,
Marcus

On Saturday 13 August 2005 14:30, Marcus Larsson fired a shotgun at the
keyboard and the following appeared:

Hi,

I’m having problems with colorkeyed opengl textures. I can load and display
the textures without trouble but they look strange. See picture below.

http://www.student.lu.se/~nbi99mla/sdl-opengl-texture.png

The image is a simple 128x128 bitmap. The colorkey here is (255,0,255) on a
white background. It looks like some of the colorkey still remains. The
following code is used to load the texture:

This looks like something with the texture itself, not an OpenGL problem. How
did you create the image? Take a good look at it in a paint editor and I
suspect you’ll find some of the pixels near the black/magenta borders don’t
quite match your color key, but are something like 249,0,249. Most editors
have an option to anti-alias edges when using various tools. It looks nice for
some things and is barely noticeable on others, but it doesn’t play well with
color keys. This is a common problem. Paint Shop has a checkbox for selecting
anti-aliasing; IIRC it defaults to “on” for most tools. Judging from the
frequency of the problem, I’d guess Photoshop also defaults to AA. Resizing
the image in most apps will anti-alias all edges as well.
One way to ensure things stay the way they are supposed to be is to use an
indexed color palette containing only black and magenta.

HTH,

-Matt Bailey

Matt Bailey <mattb rtccom.net> writes:

This looks like something with the texture itself, not an OpenGL problem. How
did you create the image? Take a good look at it in a paint editor and I
suspect you’ll find some of the pixels near the black/magenta borders don’t
quite match your color key, but are something like 249,0,249.

-Matt Bailey

That was my first guess as well, but according to GIMP there are only two unique
colors (black and magenta) in the image, so that can’t be the problem.

Marcus Larsson <trueregret yahoo.se> writes:

That was my first guess as well, but according to GIMP there are only two unique
colors (black and magenta) in the image, so that can’t be the problem.

I seem to have “solved” the problem now. When using GL_LINEAR to scale the
textures, OpenGL seems to use the color of the transparent pixels. If I change

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

to

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

it works. The textures look like crap when scaled though.

Marcus Larsson wrote:

I seem to have “solved” the problem now. When using GL_LINEAR to scale the
textures, OpenGL seems to use the color of the transparent pixels. If
I change GL_LINEAR to GL_NEAREST it works. The textures look like
crap when scaled though.

I think the easiest way around this problem is using an alpha channel
instead of a color key in the image file (once in OpenGL, you’re already
using one anyway, so there’s no difference at display time). Then you
can make the transparent pixels black as well (RGBA 0,0,0,0, that is),
or whatever color the borders of the non-transparent areas have.

Or, after loading the colorkeyed image into the intermediate RGBA
surface, fill the RGB components of each pixel at the border of the
transparent area with a color similar to its neighboring non-transparent
pixels (but keep its alpha component at 0).
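A rough sketch of that second approach in plain C, working directly on a raw
RGBA pixel buffer (SDL is left out, and `bleed_borders` is a made-up helper
name, not anything from the SDL API):

```c
#include <stdint.h>

/* For every fully transparent pixel that touches an opaque neighbor,
 * copy the neighbor's RGB while keeping the alpha at 0.  Linear
 * filtering then blends in a sensible border color instead of the
 * colorkey color.  px points to w*h RGBA pixels, 4 bytes each. */
static void bleed_borders(uint8_t *px, int w, int h)
{
    static const int dx[4] = { 1, -1, 0, 0 };
    static const int dy[4] = { 0, 0, 1, -1 };

    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            uint8_t *p = px + 4 * (y * w + x);
            if (p[3] != 0)          /* only touch transparent pixels */
                continue;
            for (int i = 0; i < 4; i++) {
                int nx = x + dx[i], ny = y + dy[i];
                if (nx < 0 || nx >= w || ny < 0 || ny >= h)
                    continue;
                const uint8_t *q = px + 4 * (ny * w + nx);
                if (q[3] != 0) {    /* opaque neighbor found */
                    p[0] = q[0];
                    p[1] = q[1];
                    p[2] = q[2];    /* alpha stays 0 */
                    break;
                }
            }
        }
    }
}
```

Since updated pixels keep alpha 0, the copy doesn’t cascade inward: only the
first ring of transparent pixels around each opaque area picks up a color,
which is all the filter ever samples at the edge.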

-Christian

Marcus Larsson wrote:

I’m having problems with colorkeyed opengl textures. I can load and
display the textures without trouble but they look strange. See
picture below.

[…]

/* create OpenGL texture */
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h,
0, GL_RGBA, GL_UNSIGNED_BYTE, tmpsurface->pixels);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

I barely know OpenGL, so this is just a wild guess: the last two lines of
this code apply a filter to the texture, don’t they? Could it be that
they alter your colors before the textures are sent to the visible
buffer and the colorkey algorithm is applied?

clemens

/* set colour key */
colorkey = SDL_MapRGBA(tmpsurface->format, ckr, ckg, ckb, 0);
SDL_FillRect(tmpsurface, NULL, colorkey);

colorkey = SDL_MapRGBA(imagesurface->format, ckr, ckg, ckb, 0);
SDL_SetColorKey(imagesurface, SDL_SRCCOLORKEY, colorkey);

My guess is you are uploading a texture that still has the colorkey
(since you filled tmpsurface with the colorkey color for some reason),
but an alpha of 0. When you then resize the texture with linear
filtering, it’s pulling in the purple colorkey texels with 0 alpha and
blending them with the adjacent black texels that have full alpha…and
you get something that’s a little purple on the edges.
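That blend is easy to check with plain arithmetic. Halfway between two texels,
GL_LINEAR averages them component-wise; `blend_mid` below is a made-up helper
approximating that (straight RGBA, no premultiplied alpha, as in the texture
being uploaded here):

```c
#include <stdint.h>

/* 50/50 component-wise blend of two RGBA texels: roughly what
 * GL_LINEAR produces at the midpoint between them when the texture
 * is stored as straight (non-premultiplied) RGBA. */
static void blend_mid(const uint8_t a[4], const uint8_t b[4], uint8_t out[4])
{
    for (int i = 0; i < 4; i++)
        out[i] = (uint8_t)((a[i] + b[i]) / 2);
}
```

Blending transparent magenta (255,0,255,0) with opaque black (0,0,0,255) gives
(127,0,127,127): a half-transparent purple texel, which is exactly the fringe
in the screenshot.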

Clear tmpsurface to something that’s entirely 0 (including the alpha
channel) and then do a colorkey blit from imagesurface. That way you end
up with a texture that’s really transparent wherever you had colorkey
pixels.
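A plain-C sketch of that idea, with SDL left out; `colorkey_blit` is a made-up
stand-in for what a colorkey blit into a zeroed RGBA buffer does, not an SDL
function:

```c
#include <stdint.h>
#include <string.h>

/* Start from an all-zero destination and copy only the pixels that do
 * NOT match the colorkey.  Keyed pixels end up as RGBA 0,0,0,0 instead
 * of "colorkey color with alpha 0", so linear filtering has nothing
 * purple left to blend in.  Both buffers hold npixels RGBA pixels. */
static void colorkey_blit(const uint8_t *src, uint8_t *dst, int npixels,
                          uint8_t kr, uint8_t kg, uint8_t kb)
{
    memset(dst, 0, (size_t)npixels * 4);   /* RGBA 0,0,0,0 everywhere */
    for (int i = 0; i < npixels; i++) {
        const uint8_t *s = src + 4 * i;
        if (s[0] == kr && s[1] == kg && s[2] == kb)
            continue;                      /* keyed pixel: stay transparent */
        uint8_t *d = dst + 4 * i;
        d[0] = s[0];
        d[1] = s[1];
        d[2] = s[2];
        d[3] = 255;                        /* copied pixels are opaque */
    }
}
```

In the loader at the top of the thread, this corresponds to filling tmpsurface
with SDL_MapRGBA(tmpsurface->format, 0, 0, 0, 0) instead of the colorkey color
before the SDL_BlitSurface call.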

–ryan.