SDL/OpenGL and texture mapping

Hi All,

Having a weird newbie problem here: I'm attempting to load a texture using
SDL_LoadBMP or IMG_Load and not getting the right results. Basically it
looks like I am losing the blue channel, or at least one of the channels
is not showing up. Here is some code (unoptimized); if required I can send
a screenshot of the image before and after I map it:

//SDL video has already been initialized.

SDL_Surface *pBitmap = IMG_Load("test.bmp");
if (pBitmap == NULL)
{
    LogManager::getInstance()->getLog("textureManager")->error(
        " Failed loading " + getName() + " : " + SDL_GetError());
    return;
}

// Generate a texture with the associated texture ID stored in the array.
glGenTextures(1, &m_textureID);

std::stringstream os;
os << "generated texture with id:" << m_textureID;
wright::util::LogManager::getInstance()->getLog("textureManager")->info(os.str());

glBindTexture(GL_TEXTURE_2D, m_textureID);

m_width  = pBitmap->w;
m_height = pBitmap->h;

os.str("");
os << "texture data: \nwidth:" << m_width
   << "\nheight:" << m_height
   << "\nbit depth:" << pBitmap->format->BitsPerPixel;
wright::util::LogManager::getInstance()->getLog("textureManager")->info(os.str());

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_width, m_height, 0, GL_RGB,
             GL_UNSIGNED_BYTE, pBitmap->pixels);

SDL_FreeSurface(pBitmap);
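A quick way to check what IMG_Load actually returned is to log the format
masks before uploading. This is only a sketch (printf instead of the
LogManager, and GL_BGR needs OpenGL 1.2 headers; older ones call it
GL_BGR_EXT): on x86 a 24-bit BMP usually comes back in BGR order, which
GL_RGB will read with red and blue swapped.

/* Sketch: inspect the surface's channel masks. For a 24-bit BMP on a
   little-endian machine, Rmask is typically 0x00FF0000, i.e. the bytes
   are laid out B,G,R in memory. */
printf("bpp=%d Rmask=%08x Gmask=%08x Bmask=%08x\n",
       pBitmap->format->BitsPerPixel,
       (unsigned)pBitmap->format->Rmask,
       (unsigned)pBitmap->format->Gmask,
       (unsigned)pBitmap->format->Bmask);

/* GL assumes rows are 4-byte aligned by default; if this surface's rows
   are tightly packed instead, tell GL so the image doesn't shear. */
if (pBitmap->pitch == pBitmap->w * pBitmap->format->BytesPerPixel)
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

/* If the masks say BGR, either convert the surface first or describe
   the data truthfully to GL: */
GLenum srcFormat = (pBitmap->format->Rmask == 0x000000FF) ? GL_RGB : GL_BGR;
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_width, m_height, 0,
             srcFormat, GL_UNSIGNED_BYTE, pBitmap->pixels);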

Byron Wright

Hello

I had just the same problem a few months ago; here are my solutions.
Feel free to suggest improvements!

Just say in your main code:
texture[0]=gload_mipmap("Box12b.jpg");
texture[1]=gload_texture("Box12b.jpg");

I know for a big project this would suck, but for learning OpenGL
and working around the problem it's easy.

Here's the code:

template <typename foo> void swap(foo &a, foo &b)
{
    foo temp = a;
    a = b;
    b = temp;
}

// Flip the image vertically (row by row), since OpenGL expects the
// first row of the pixel data to be the bottom of the texture.
void rotate_180d(SDL_Surface *image)
{
    int width  = image->w;
    int height = image->h;
    unsigned char *data = (unsigned char *)(image->pixels);
    int BytesPerPixel = image->format->BytesPerPixel;

    for (int i = 0; i < (height / 2); ++i)
        for (int j = 0; j < width * BytesPerPixel; j += BytesPerPixel)
            for (int k = 0; k < BytesPerPixel; ++k)
                swap(data[(i * width * BytesPerPixel) + j + k],
                     data[((height - i - 1) * width * BytesPerPixel) + j + k]);
}

SDL_Surface *prepare_image(SDL_Surface *sur)
{
    if (!sur)
    {
        cout << SDL_GetError() << endl;
        sur = IMG_Load("failpic.png");

        if (!sur)
            exit(-1);
    }

    // Target surface with a known 32-bit RGBA layout.
    SDL_Surface *image = SDL_CreateRGBSurface(SDL_HWSURFACE, sur->w, sur->h, 32,
                                              0x000000FF, 0x0000FF00,
                                              0x00FF0000, 0xFF000000);

    Uint32 saved_flags = sur->flags & (SDL_SRCALPHA | SDL_RLEACCELOK);
    Uint8  saved_alpha = sur->format->alpha;
    SDL_Rect area;
    area.x = 0;
    area.y = 0;
    area.w = sur->w;
    area.h = sur->h;

    // Turn off source alpha so the blit copies the alpha channel
    // instead of blending with it.
    if ((saved_flags & SDL_SRCALPHA) == SDL_SRCALPHA)
        SDL_SetAlpha(sur, 0, 0);

    SDL_BlitSurface(sur, &area, image, &area);
    SDL_FreeSurface(sur);

    rotate_180d(image);
    return image;
}

GLuint gload_texture(const char *filename)
{
    GLuint texture;
    SDL_Surface *image = prepare_image(IMG_Load(filename));

    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image->w, image->h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, image->pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    SDL_FreeSurface(image);
    return texture;
}

GLuint gload_mipmap(const char *filename)
{
    GLuint texture;
    SDL_Surface *image = prepare_image(IMG_Load(filename));

    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA, image->w, image->h,
                      GL_RGBA, GL_UNSIGNED_BYTE, image->pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);

    SDL_FreeSurface(image);
    return texture;
}

Hope this helps you :)

best regards

funthing

Could you briefly explain what this code does? It looks like you are
creating a surface with alpha for every image, blitting the image to it,
and using that to create a texture. It doesn't really explain why I am
losing a channel when creating the texture. Thank you for the code; I
will try it out and see if it fixes my problem.

-Byron

Maybe you should try converting your image to a proper 32-bit RGB
format first? Just create an RGB surface like this:

SDL_Surface *s;
Uint32 rmask, gmask, bmask, amask;
int bits = 32;

#if SDL_BYTEORDER == SDL_BIG_ENDIAN
rmask = 0xff000000;
gmask = 0x00ff0000;
bmask = 0x0000ff00;
amask = 0x000000ff;
#else
rmask = 0x000000ff;
gmask = 0x0000ff00;
bmask = 0x00ff0000;
amask = 0xff000000;
#endif
s = SDL_CreateRGBSurface(SDL_SWSURFACE, w, h,
bits, rmask, gmask, bmask, amask);

and then blit your image over, and use the resulting surface
as the source for your GL texture.
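Then the blit and upload might look something like this (just a sketch;
image stands for whatever surface IMG_Load returned):

/* NULL source/destination rects mean "the whole surface" in SDL 1.2. */
SDL_BlitSurface(image, NULL, s, NULL);

/* The pixel data now matches the masks above, so GL_RGBA describes it
   correctly regardless of what format the file was loaded in. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, s->w, s->h,
             0, GL_RGBA, GL_UNSIGNED_BYTE, s->pixels);

SDL_FreeSurface(image);
SDL_FreeSurface(s);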

Regards,
.paul.

Hi All,

Ok, after saving the file as a jpg (without changing any code) the
texture now displays correctly. The only format I can successfully use
as a texture is jpg. If someone could please explain why that is, I
would be very appreciative.


Ok, this combined with Matthias's code works beautifully. So my problem
is that even though I use SDL_LoadBMP or IMG_Load, the surface that comes
back is not a 32-bit image? Therefore I have to create the surface
manually and blit to it to get a 32-bit image?

-Byron


If you save your image as 24-bit, you will get 24-bit and thus no alpha
channel. I remember there is a way to switch alpha on manually, but I'm
not quite sure about this - have a look at the SDL_Surface documentation.
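If I remember right, the call is SDL_SetAlpha(); a minimal sketch,
assuming SDL 1.2:

/* Enable per-surface alpha for blits from this surface; 255 means fully
   opaque when the surface has no per-pixel alpha channel. */
SDL_SetAlpha(surface, SDL_SRCALPHA, 255);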

You also asked why there is a problem with the image loading routine...
That's an endianness problem - Intel and Motorola (for example) use
different methods of placing bytes in memory. Those methods are known as
little endian and big endian: one system stores the lowest byte on, let's
say, the right side of the block, the other on the left. So if you read
the same 32-bit value as RGBA, you get RGBA on one system and ABGR on the
other. That's the origin of the problem.
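A self-contained sketch (not from the thread) that makes the effect
visible:

#include <stdio.h>

int main(void)
{
    unsigned int pixel = 0x11223344;
    unsigned char *p = (unsigned char *)&pixel;

    /* Little-endian machines (Intel) print "44 33 22 11"; big-endian
       machines (Motorola/PowerPC) print "11 22 33 44". A fixed mask like
       0x000000FF therefore selects a different byte, and so a different
       colour channel, on each. */
    printf("%02x %02x %02x %02x\n", p[0], p[1], p[2], p[3]);
    return 0;
}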

Ciao
Arne

hi, folks

Could you briefly explain what this code does? It looks like you are
creating a surface with alpha for every image, blitting the image to it,
and using that to create a texture. It doesn't really explain why I am
losing a channel when creating the texture. Thank you for the code; I
will try it out and see if it fixes my problem.

Losing a channel is very rare, except for the alpha channel. As far as I
can remember, my first tries to load a texture with SDL failed too. The
texture looked freaky; I played around with GL_RGB and GL_RGBA but
nothing helped. Sometimes it worked, sometimes the texture looked freaky,
in most cases with a purple tint.

Then I found somewhere the solution to create a new surface with alpha,
intending to always use GL_RGBA. Next, SDL loads images exactly the other
way around: they would appear as a texture rotated 180 degrees, so you
have to rotate them 180°.

I said this code might not be the best way, but it's enough for my case
in order to learn OpenGL :)

Perhaps this, or the code, might help you.
If others have found a better solution, please tell me!

good luck :)

funthing

You can also look at the code in testgl.c in the test subdirectory of the
SDL source archive. :)

See ya,
-Sam Lantinga, Software Engineer, Blizzard Entertainment

My guess is that the problem with using surfaces returned by
SDL_LoadBMP() or IMG_Load() as a GL texture source mainly has to do with
the surface's pixel format. SDL may not return the ideal RGB or RGBA
format that GL expects. It is the same reason why SDL_DisplayFormat() or
SDL_DisplayFormatAlpha() speeds up blitting to the main screen surface.
I am not very sure that, after you enable OpenGL with SDL,
SDL_DisplayFormat() will do the right conversion work to ease texture
loading for you... can somebody verify and confirm this?

But judging from what David Olofson has done in his glSDL
implementation, he also creates a proper RGB/RGBA surface and blits the
image over before feeding it to GL as a texture.

Whether you have an alpha channel or not seems to be a different issue here.
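If SDL_DisplayFormat() turns out not to help in GL mode,
SDL_ConvertSurface() against an explicitly described format should do the
same conversion work. A sketch under that assumption (rmask/gmask/bmask/
amask as in the earlier snippet, and loaded standing for the surface
IMG_Load returned):

/* Describe the target 32-bit RGBA layout with a throwaway 1x1 surface,
   then convert the loaded image into exactly that format. */
SDL_Surface *templ = SDL_CreateRGBSurface(SDL_SWSURFACE, 1, 1, 32,
                                          rmask, gmask, bmask, amask);
SDL_Surface *converted = SDL_ConvertSurface(loaded, templ->format,
                                            SDL_SWSURFACE);
SDL_FreeSurface(templ);
/* 'converted' now has a known channel order and can be uploaded with
   GL_RGBA / GL_UNSIGNED_BYTE. */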

Regards,
.paul.

On Tue, Sep 10, 2002 at 07:04:54PM +0200, Matthias wrote:

Then I found somewhere the solution to create a new surface with alpha,
intending to always use GL_RGBA. Next, SDL loads images exactly the other
way around: they would appear as a texture rotated 180 degrees, so you
have to rotate them 180°.

hmm, I wonder whether SDL has rotated the texture or you didn't set
your texture coordinates the way OpenGL would like ;)

Regards,
.paul.

On Wed, Sep 11, 2002 at 01:16:53PM +0800, paul at theV.net wrote:

I am not very sure that, after you enable OpenGL with SDL,
SDL_DisplayFormat() will do the right conversion work to ease texture
loading for you... can somebody verify and confirm this?

I can refute it. I raised this issue, Sam asked me to submit a patch, and
it seems to have gotten lost in the list noise.


Joseph Carter
No conceit in my family

* Caytln slaps Lisa
  catfight :P
  Watch it girl, I like that.
  :)
  figures :D


On Wed, Sep 11, 2002 at 01:18:44PM +0800, paul at theV.net wrote:

hmm, I wonder whether SDL has rotated the texture or you didn't set
your texture coordinates the way OpenGL would like ;)

Remember, OpenGL textures are bottom to top; SDL_Surfaces are top to
bottom. Flip the texture coordinates appropriately.
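A minimal sketch of that flip (not from the thread; it assumes a
conventional y-up view and immediate-mode GL, with x, y, w, h describing
the quad):

/* The SDL surface's top row is stored first, which GL treats as t = 0.
   Mapping t = 0 to the top vertices shows the image right side up. */
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(x,     y);      /* bottom-left  */
    glTexCoord2f(1.0f, 1.0f); glVertex2f(x + w, y);      /* bottom-right */
    glTexCoord2f(1.0f, 0.0f); glVertex2f(x + w, y + h);  /* top-right    */
    glTexCoord2f(0.0f, 0.0f); glVertex2f(x,     y + h);  /* top-left     */
glEnd();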


Joseph Carter
Only l33t on Thursdays

I still think you guys are nuts merging Q and QW. :P
Of course we're nuts. Even John said so. =>
Zoid: we're nuts, but we're productive nuts :)


Ah, I got it. No wonder that when SDL surfaces are used in conjunction
with glOrtho(0, w, h, 0, -1.0, 1.0), all coordinates are in the same
order as in an upper-left-origin view. I was wrong in thinking glOrtho()
even flips the texture coordinates, my bad...
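For reference, that projection setup (a sketch, not quoted from anyone's
code):

/* Top-left-origin 2D view: x grows right, y grows down, matching SDL's
   own convention. In this view, putting texture coordinate (0,0) on a
   quad's top-left vertex shows an SDL surface right side up, because
   the surface's top row sits at t = 0. */
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, w, h, 0, -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();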

Regards,
.paul.