Colorkey

Hi all!

My question is about colorkeying. I’ve been reading the glSDL
code, and the colorkey is implemented with alpha blending when
it’s used in OpenGL. Is there another way to do it?

I’m working on a 16 bpp X server and I’m trying to save some
bits from the alpha channel to dedicate them to the colour channels.
When I create a texture from a graphic resource with a colorkey
I have to use the internal format GL_RGB5_A1. For a normal RGB
graphic resource I use the GL_RGB5 format. When I display the first
one, the banding in the blue colour gradient is very visible! But with
the same image without a colorkey I get a perfectly uniform gradient.

Do you know a good tutorial, link, etc. about OpenGL texture
internal formats? The Red Book and similar references basically only
enumerate these values!

Any comment is welcome,
Thanks

jorgefm at cirsa.com wrote:

Hi all!

My question is about colorkeying. I’ve been reading the glSDL
code, and the colorkey is implemented with alpha blending when
it’s used in OpenGL. Is there another way to do it?

No. Well, not with OpenGL 1.2 (which is what glSDL targets) at least.
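
Either way the key has to travel through the texture’s alpha channel; the only fixed-function choice is whether you blend on it or alpha-test on it. Here is a sketch of the alpha-test variant, assuming the loader already set the keyed texels to alpha 0 (drawQuad() is a placeholder of mine, and the 0.5 threshold is arbitrary):

/* Sketch: colorkeyed texels were given alpha = 0 at load time.
   Instead of blending, the alpha test simply discards any fragment
   whose alpha falls below the reference value. */
glEnable( GL_TEXTURE_2D );
glEnable( GL_ALPHA_TEST );
glAlphaFunc( GL_GEQUAL, 0.5f );   /* keep fragments with alpha >= 0.5 */

drawQuad();                       /* your own textured-quad code */

glDisable( GL_ALPHA_TEST );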

I’m working on a 16 bpp X server and I’m trying to save some
bits from the alpha channel to dedicate them to the colour channels.

What do you expect from that? That the hardware itself will support
more bits? That’s not the case.
Which means that, as soon as the hardware gets the data, it has to be
in the one format it supports (usually that’s 565 for 16 bpp).

Plus, you say “some bits” but there is only one alpha bit to “save”.
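
For concreteness, this is how the two common 16 bpp texel layouts pack a pixel. The function names are mine, but the shift arithmetic is the standard 565 / 5551 packing:

#include <stdint.h>

/* 565: all 16 bits go to color; green gets the extra bit. */
uint16_t pack565( uint8_t r, uint8_t g, uint8_t b )
{
    return (uint16_t)( ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3) );
}

/* 5551: one bit is taken back for alpha, so every channel is left
   with only 2^5 = 32 levels. */
uint16_t pack5551( uint8_t r, uint8_t g, uint8_t b, uint8_t a )
{
    return (uint16_t)( ((r >> 3) << 11) | ((g >> 3) << 6) |
                       ((b >> 3) << 1) | (a >> 7) );
}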

When I create a texture from a graphic resource with a colorkey
I have to use the internal format GL_RGB5_A1. For a normal RGB
graphic resource I use the GL_RGB5 format. When I display the first
one, the banding in the blue colour gradient is very visible! But with
the same image without a colorkey I get a perfectly uniform gradient.

Do you know a good tutorial, link, etc. about OpenGL texture
internal formats? The Red Book and similar references basically only
enumerate these values!

What do you want to know, precisely?

I’m not sure what you’re trying to achieve here, but it doesn’t sound
like it’s possible at all.

Stephane

I only expected a way to do the colorkey in OpenGL similar to the
SDL way. I was thinking of setting a mask, a filter, a special
OpenGL extension, or the like, to block a specific color (the colorkey)
from reaching the color framebuffer. That way I would avoid going
through the alpha channel (via a blending operation) and would save
that bit.

On a 16 bpp X server the difference between a graphic file in 565
format and one in 555 format is visible (with 5 bits a channel has
only 2^5 = 32 levels, versus 64 for the 6-bit green of 565). In my app
I display the same graphic twice: once mapped onto a texture with the
GL_RGB5_A1 internal format (which is the case when I apply the
colorkey) and once mapped onto a texture with the GL_RGB5 format. In
the first case the banding in the color gradient is visible but the
colorkey color is filtered out, and in the second case the banding is
not visible but the colorkey is not filtered out.

My loading function is shown below. I hope this helps to clarify
my question :slight_smile:

Thanks,
Jorge

int TGSGraphicRes::LoadTexture( char *filename, TGSTexture *tex )
{
    SDL_Surface *image;
    SDL_Rect src;
    Uint32 saved_flags;
    Uint8 saved_alpha;
    GLint internal_format;
    GLenum format;

    // Try to load the surface from disk.
    SDL_Surface *surface = IMG_Load( filename );
    if( surface == NULL ) {
        TRACEMSG( "Error loading '%s'.\n", filename );
        return 0;
    }

    // Check if the surface has a transparent color defined.
    if( LoadColorKey( filename ) == 1 ) {
        Uint8 r, g, b;
        GetColorKey( r, g, b );
        SDL_SetColorKey( surface,
                         SDL_SRCCOLORKEY,
                         SDL_MapRGB( surface->format, r, g, b ) );
    }

    // Create a temporary surface in the RGBA byte order OpenGL expects.
    image = SDL_CreateRGBSurface( SDL_SWSURFACE,
                                  tex->size_w, tex->size_h,
                                  32,
#if SDL_BYTEORDER == SDL_LIL_ENDIAN /* OpenGL RGBA masks */
                                  0x000000FF,
                                  0x0000FF00,
                                  0x00FF0000,
                                  0xFF000000
#else
                                  0xFF000000,
                                  0x00FF0000,
                                  0x0000FF00,
                                  0x000000FF
#endif
                                );
    if( image == NULL ) {
        SDL_FreeSurface( surface );
        return 0;
    }

    // Save the alpha blending attributes.
    saved_flags = surface->flags & (SDL_SRCALPHA | SDL_RLEACCELOK);
    saved_alpha = surface->format->alpha;
    if( (saved_flags & SDL_SRCALPHA) == SDL_SRCALPHA ) {
        SDL_SetAlpha( surface, 0, 0 );
    }

    // Copy the surface into the GL texture image. Colorkeyed pixels are
    // skipped by the blit, so they keep the zeroed (alpha = 0) pixels of
    // the freshly created destination surface.
    src.x = tex->rect.x;
    src.y = tex->rect.y;
    src.w = tex->size_w;
    src.h = tex->size_h;
    SDL_BlitSurface( surface, &src, image, NULL );

    // Restore the alpha blending attributes.
    if( (saved_flags & SDL_SRCALPHA) == SDL_SRCALPHA ) {
        SDL_SetAlpha( surface, saved_flags, saved_alpha );
    }

    // Create an OpenGL texture for the image.
    glGenTextures( 1, &(tex->id) );
    glBindTexture( GL_TEXTURE_2D, tex->id );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP );

    // ***** THIS IS THE IMPORTANT CODE TO CHECK *****

    // Set the internal format to use. If the surface has alpha blending,
    // keep full precision; otherwise pick the cheapest format that still
    // covers the colorkey. The source data is always the 32-bit RGBA
    // image, so the pixel transfer format stays GL_RGBA.
    if( (saved_flags & SDL_SRCALPHA) == SDL_SRCALPHA ) {
        internal_format = GL_RGBA8;
        format = GL_RGBA;
    } else {
        if( m_colorkey_defined ) {
            internal_format = GL_RGB5_A1;
            format = GL_RGBA;
        } else {
            internal_format = GL_RGB5;
            format = GL_RGBA;
        }
    }

    // ***** END *****

    // Generate the texture.
    glPushClientAttrib( GL_CLIENT_PIXEL_STORE_BIT );
    glPixelStorei( GL_UNPACK_ROW_LENGTH, image->pitch /
                   image->format->BytesPerPixel );
    glTexImage2D( GL_TEXTURE_2D,
                  0,
                  internal_format,
                  image->w, image->h,
                  0,
                  format,
                  GL_UNSIGNED_BYTE,
                  image->pixels );
    glPopClientAttrib();

    // Free the surfaces.
    SDL_FreeSurface( image );
    SDL_FreeSurface( surface );

    return 1;
}

jorgefm at cirsa.com wrote:

I only expected a way to do the colorkey in OpenGL similar to the
SDL way. I was thinking of setting a mask, a filter, a special
OpenGL extension, or the like, to block a specific color (the colorkey)
from reaching the color framebuffer. That way I would avoid going
through the alpha channel (via a blending operation) and would save
that bit.

Yes, you can do that with some advanced multitexture/combiner
extensions. But what’s the point, compared to using an RGBA8 texture
and being done with it? RGBA8 will probably perform better speed-wise,
and it’s more portable.

Stephane

I initially thought that this could be easier! :slight_smile:

I have read that this can be achieved with fragment programs using
the ARB_vertex_program extension. Another way is to use a mask file
where the colorkey color is a 0 and the other colors are 1; an
example is NeHe tutorial number 20, as in the sketch below.
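
For reference, the two-pass trick from that tutorial looks roughly like this. It is a sketch in the spirit of NeHe lesson 20, not its code verbatim: mask_tex and sprite_tex are hypothetical texture ids, drawQuad() stands in for your own quad code, and note that for these blend functions the mask polarity is inverted: white (1) where the key color was, black (0) elsewhere, with the sprite texture black in the keyed area.

glEnable( GL_TEXTURE_2D );
glEnable( GL_BLEND );
glDisable( GL_DEPTH_TEST );

/* Pass 1: multiply the framebuffer by the mask, punching a black
   hole where the sprite will be opaque. */
glBlendFunc( GL_DST_COLOR, GL_ZERO );
glBindTexture( GL_TEXTURE_2D, mask_tex );
drawQuad();

/* Pass 2: add the sprite on top; its black (keyed) texels add
   nothing, so the framebuffer shows through there. */
glBlendFunc( GL_ONE, GL_ONE );
glBindTexture( GL_TEXTURE_2D, sprite_tex );
drawQuad();

glDisable( GL_BLEND );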

To show the exact problem I’ve attached a file with the different
outputs depending on the texture internal format. In file_rgb5
you can see the desired output, but there the colorkey is visible
(the pure green color 0,255,0). If I use GL_RGB5_A1 I get a
correct output without the colorkey color, but you can see the
banding in the blue gradient. And, lastly, with GL_RGBA8 the output
is correct again but now the banding in the blue gradient is even
more visible!

I’m using an Intel i865G with DRI enabled, and my X server is
running at 1024x768 and 16 bpp. The original file is a 24 bpp TGA
file.

Any comment?
Thanks.

jorgefm at cirsa.com wrote:

I initially thought that this could be easier! :slight_smile:

I have read that this can be achieved with fragment programs using
the ARB_vertex_program extension.

Yes, you can do that, but you need the ARB_fragment_program extension.
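
To give an idea, here is a rough, untested sketch of such a fragment program: it kills any fragment whose texel is within a small tolerance of the key color, which is passed in program.local[0]. The names keykill_fp and prog are mine, the 0.02 tolerance is arbitrary, and the glProgram* entry points have to be fetched through an extension loader (e.g. SDL_GL_GetProcAddress):

#include <string.h>   /* strlen */

static const char *keykill_fp =
    "!!ARBfp1.0\n"
    "TEMP texel, diff;\n"
    "TEX texel, fragment.texcoord[0], texture[0], 2D;\n"
    "SUB diff, texel, program.local[0];\n"
    "ABS diff, diff;\n"
    "DP3 diff.x, diff, { 1.0, 1.0, 1.0, 0.0 };\n"
    "SUB diff.x, diff.x, 0.02;\n"
    "KIL diff.xxxx;\n"               /* kill if sum of |diff| < 0.02 */
    "MOV result.color, texel;\n"
    "END\n";

GLuint prog;
glGenProgramsARB( 1, &prog );
glBindProgramARB( GL_FRAGMENT_PROGRAM_ARB, prog );
glProgramStringARB( GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                    (GLsizei)strlen( keykill_fp ), keykill_fp );
/* Pure green key (0,255,0) -> (0.0, 1.0, 0.0). */
glProgramLocalParameter4fARB( GL_FRAGMENT_PROGRAM_ARB, 0,
                              0.0f, 1.0f, 0.0f, 0.0f );
glEnable( GL_FRAGMENT_PROGRAM_ARB );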

Another way is to use a mask file where
the colorkey color is a 0 and the other colors are 1; an example is
NeHe tutorial number 20.

Yes, that’s about as heavy memory-wise as using RGBA directly, plus it
requires more fillrate to render.

To show the exact problem I’ve attached a file with the different
outputs depending on the texture internal format. In file_rgb5
you can see the desired output, but there the colorkey is visible
(the pure green color 0,255,0). If I use GL_RGB5_A1 I get a
correct output without the colorkey color, but you can see the
banding in the blue gradient. And, lastly, with GL_RGBA8 the output
is correct again but now the banding in the blue gradient is even
more visible!

Your file isn’t there, but I don’t see how RGBA8 can have a lower
quality than RGB5_A1.

Stephane