How to guarantee a surface is the right format when making a texture with OpenGL

Hi,

I've recently started using OpenGL with SDL and am having some problems. How
would I go about making sure a surface is in the right (i.e. the same) format
every time, no matter what? So far, this is what I do:

Load the image into a surface, then make another surface via
SDL_DisplayFormat(original_surface);

OK, now I have a surface with the same pixel depth, etc. as my screen.
Now I just need to get it into RGB format, so I make another surface via
SDL_CreateRGBSurface() and blit my DisplayFormat'd surface onto this newly
created surface. Great, now everything matches, and I just make the texture
with GL_RGBA as the format...

This works!! All my images load in properly, no matter what. Only here's the
problem: I lose my alpha channel!!! Why oh why am I losing my alpha channel?
I like it =)... Anyway, how would I go about doing this so I keep my alpha
channel? I want a way that works no matter what the image is. (If I skip the
second part of my method, i.e. I don't blit the surface onto an RGB surface,
only half my textures load in properly, but the ones that do keep their alpha
channel.) So I'm assuming blitting my surface onto the RGB surface makes me
lose the alpha channel, and YES, I pass it SDL_SRCALPHA|SDL_HWSURFACE... I
read on a site that I'm supposed to use PutPixel/GetPixel to do this, but I'm
not sure how that works... Can anyone explain or suggest a different method?
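(Editor's aside: the per-pixel approach that site describes boils down to a pitch-aware copy loop over the surface's pixel bytes. Below is a minimal sketch in plain C with hypothetical names — `repack`, the `pitch` padding, and the 32-bit RGBA byte layout are all assumptions for illustration; real code would read these from the SDL_Surface:)

```c
#include <assert.h>

typedef unsigned char UBYTE;

/* Hypothetical sketch: copy a w x h 32-bit RGBA image whose rows are
 * padded out to `pitch` bytes (as SDL surface rows can be) into a
 * tightly packed buffer, keeping (alpha = 1) or dropping (alpha = 0)
 * the fourth byte of each pixel. No SDL calls; plain arrays only. */
static void repack(const UBYTE *pixels, int w, int h, int pitch,
                   int alpha, UBYTE *out)
{
    long src, dst = 0;
    for (int y = 0; y < h; y++) {
        src = (long)y * pitch;              /* start of this row */
        for (int x = 0; x < w; x++) {
            for (int i = 0; i < 3 + alpha; i++)
                out[dst++] = pixels[src++]; /* copy R, G, B (and A) */
            if (!alpha)
                src++;                      /* skip the unused alpha byte */
        }
    }
}
```

The tightly packed output is what glTexImage2D expects by default, which is why blitting through an intermediate surface (or a loop like this) makes textures load consistently.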

Thanks a lot for any help!!!

I've recently started using OpenGL with SDL and am having some problems.
How would I go about making sure a surface is in the right (i.e. the same)
format every time, no matter what?

Only here's the problem: I lose my alpha channel!!! Why oh why am I
losing my alpha channel? I like it =)... Anyway, how would I go about
doing this so I keep my alpha channel?

Here's what I'm doing. It works for loading 32-bit PNG images as well as
24-bit ones.
(Error checks removed for clarity.)

pic = IMG_Load( filename ); // SDL_image function

// ----- Create a new surface of the desired type (RGBA) -----

#if SDL_BYTEORDER == SDL_BIG_ENDIAN
rmask = 0xff000000;
gmask = 0x00ff0000;
bmask = 0x0000ff00;
amask = 0x000000ff;
#else
rmask = 0x000000ff;
gmask = 0x0000ff00;
bmask = 0x00ff0000;
amask = 0xff000000;
#endif

// pic2 is just used for its pixel format
pic2 = SDL_CreateRGBSurface( SDL_SWSURFACE, 1, 1, 32,
rmask, gmask, bmask, amask);

pic3 = SDL_ConvertSurface( pic, pic2->format, SDL_SWSURFACE );

pt = CALLOC( struct Picture, 1 );
pt->RGB = malloc( pt->uwWidth * pt->uwHeight * (3 + alpha) );

// note: alpha is boolean, 0 for RGB or 1 for RGBA

long int src, dst = 0;
for( int y = 0; y < pic3->h; y++ )
{
    src = y * pic3->pitch;
    for( int x = 0; x < pic3->w; x++ )
    {
        for( int i = 0; i < 3 + alpha; i++ )
            pt->RGB[dst++] = ((UBYTE*)pic3->pixels)[src++];
        if( !alpha ) src++;
    }
}

On 31/05/2004, Graveyard Filla, you wrote:



Load the image into a surface, then make another surface via
SDL_DisplayFormat(original_surface);

OK, now I have a surface with the same pixel depth, etc. as my screen.

Now I just need to get it into RGB format, so I make another surface via
SDL_CreateRGBSurface() and blit my DisplayFormat'd surface onto this newly
created surface. Great, now everything matches, and I just make the texture
with GL_RGBA as the format...

This works!! All my images load in properly, no matter what. Only here's the
problem: I lose my alpha channel!!! Why oh why am I losing my alpha channel?
I like it =)... Anyway, how would I go about doing this so I keep my alpha
channel? I want a way that works no matter what the image is. (If I skip the
second part of my method, i.e. I don't blit the surface onto an RGB surface,
only half my textures load in properly, but the ones that do keep their alpha
channel.) So I'm assuming blitting my surface onto the RGB surface makes me
lose the alpha channel, and YES, I pass it SDL_SRCALPHA|SDL_HWSURFACE... I
read on a site that I'm supposed to use PutPixel/GetPixel to do this, but I'm
not sure how that works... Can anyone explain or suggest a different method?

Have you tried SDL_DisplayFormatAlpha? Basically it does the same thing as
SDL_DisplayFormat, but it preserves the alpha channel too.

Jorge

Graveyard Filla wrote:

Hi,

I've recently started using OpenGL with SDL and am having some problems.
How would I go about making sure a surface is in the right (i.e. the same)
format every time, no matter what? So far, this is what I do:

Load the image into a surface, then make another surface via
SDL_DisplayFormat(original_surface);

Well, SDL_DisplayFormat gives you the format of the video surface, not the
format of the OpenGL texture; they are not the same thing.
It's actually quite simple: there are three different pixel formats involved
when doing OpenGL:

  • the pixel format you use to hand OpenGL the pixels needed to create
    a new texture,
  • the pixel format in which the texture is stored on the card, and
  • the pixel format of the video surface, in which everything is rendered.

Now, what you're interested in is the first of these three. OpenGL is
designed so that you never have to deal with the other two.

OK, now I have a surface with the same pixel depth, etc. as my screen.
Now I just need to get it into RGB format, so I make another surface
via SDL_CreateRGBSurface() and blit my DisplayFormat'd surface onto
this newly created surface. Great, now everything matches, and I just
make the texture with GL_RGBA as the format...

This works!! All my images load in properly, no matter what. Only
here's the problem: I lose my alpha channel!!! Why oh why am I losing
my alpha channel? I like it =)... Anyway, how would I go about doing
this so I keep my alpha channel? I want a way that works no matter
what the image is. (If I skip the second part of my method, i.e. I
don't blit the surface onto an RGB surface, only half my textures load
in properly, but the ones that do keep their alpha channel.) So I'm
assuming blitting my surface onto the RGB surface makes me lose the
alpha channel, and YES, I pass it SDL_SRCALPHA|SDL_HWSURFACE... I read
on a site that I'm supposed to use PutPixel/GetPixel to do this, but
I'm not sure how that works... Can anyone explain or suggest a
different method?

All OpenGL versions support the RGBA8 texture format, so that's the format
I usually use to hand the pixels to OpenGL.
Now, to keep alpha during the blit, you can disable per-surface alpha
blending on the source surface; the blit will then copy the alpha channel
directly to the destination surface.
Here's how I proceed (this code takes an SDL_Surface called "image" as
input and creates a texture from it):

SDL_Surface *alpha_image;

SDL_SetAlpha(image, 0, 0);
width  = image->w;
height = image->h;
alpha_image = SDL_CreateRGBSurface(SDL_SWSURFACE, width, height, 32,
                                   0x000000ff, 0x0000ff00, 0x00ff0000,
                                   0xff000000);
SDL_BlitSurface(image, NULL, alpha_image, NULL);

glGenTextures(1, &tex_nb);
glBindTexture(GL_TEXTURE_2D, tex_nb);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);

if (power_2(width) && power_2(height))
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, alpha_image->pixels);
else
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA, width, height,
                      GL_RGBA, GL_UNSIGNED_BYTE, alpha_image->pixels);

SDL_FreeSurface(alpha_image);
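(Editor's note: power_2() isn't defined in the listing; presumably it checks whether a dimension is a power of two, since glTexImage2D on old OpenGL requires power-of-two sizes while gluBuild2DMipmaps rescales as needed. A minimal sketch of such a helper — the name and exact semantics are assumptions:)

```c
#include <assert.h>

/* Hypothetical helper assumed by the listing above: returns non-zero
 * iff n is a power of two. n & (n - 1) clears the lowest set bit, so
 * the expression is zero exactly when n has a single bit set. */
static int power_2(unsigned int n)
{
    return n != 0 && (n & (n - 1)) == 0;
}
```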

Stephane

THANK YOU!! It was just this single line, SDL_SetAlpha(img,0,0);, that made it
all work!! Thanks again!! Also, what does that line do? =) Thanks again to
everyone who helped!