16-bit textures and SDL

Hello,

I have a problem using SDL to load 16-bit textures
:frowning:
I searched Google for many hours but didn’t find
anything…
Maybe some people on this list have encountered the same
problem?

Here is the code (the loaded texture is a power-of-two
one (256*256), and this code works when changed to
load 32-bit textures):

SDL_Surface *image;
SDL_Surface *newImage;
GLubyte *dataImage;
GLuint texImage;

int bpp=16;

image = IMG_Load(path);

newImage = SDL_CreateRGBSurface(SDL_SWSURFACE, 256, 256, bpp,
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
                                0xf000, 0x0f00, 0x00f0, 0x000f);
#else
                                0x000f, 0x00f0, 0x0f00, 0xf000);
#endif

int resultBlit = SDL_BlitSurface(image, NULL, newImage, NULL);

dataImage = (GLubyte *)malloc((bpp/8)*256*256);
memcpy(dataImage, newImage->pixels, (bpp/8)*256*256);

glGenTextures(1, &texImage);
glBindTexture(GL_TEXTURE_2D, texImage);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER,
GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
GL_LINEAR);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S,
GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T,
GL_CLAMP_TO_EDGE);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA4, 256, 256, 0,
GL_RGBA, GL_UNSIGNED_BYTE, dataImage);

The program hangs at “glTexImage2D”.

Thanks for your help ;)

gamedev srt wrote:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA4, 256, 256, 0,
GL_RGBA, GL_UNSIGNED_BYTE, dataImage);

Using GL_RGBA, GL_UNSIGNED_BYTE means that you’re telling OpenGL to read
4 unsigned bytes per pixel (RGBA8). But your SDL surface is in RGBA4
format, so glTexImage2D pulls 256*256*4 = 262144 bytes out of a buffer
that only holds 131072 bytes; that overread is most likely your crash.
You need something like:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA4, 256, 256, 0, GL_RGBA, GL_UNSIGNED_SHORT_4_4_4_4, dataImage);

GL_RGBA4 is just OpenGL’s internal texture format (i.e. how the texture is
stored on your video card).
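
One more thing worth checking while you’re in there (an aside; the sketch
below reuses your bpp, newImage and dataImage and just replaces the single
memcpy): SDL is allowed to pad each row of a surface, so copying
(bpp/8)*256*256 bytes in one go quietly assumes newImage->pitch is exactly
512. Copying row by row is safe either way:

int y;
int rowBytes = 256 * (bpp / 8);  /* 512 bytes per row of 4-4-4-4 pixels */
for (y = 0; y < 256; y++) {
    /* newImage->pitch is the real byte length of a surface row,
       including any padding SDL may have added */
    memcpy(dataImage + y * rowBytes,
           (Uint8 *)newImage->pixels + y * newImage->pitch,
           rowBytes);
}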

Stephane

Hello Stéphane,

Thanks very much for your help, now there are no more
crashes :wink:

The only remaining problem is that the images have wrong
colors; they look as if I had added a blue
filter over my screen. For example, white appears
blue :frowning:

Any clues?
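
One guess worth checking (not confirmed in this thread): GL_UNSIGNED_SHORT_4_4_4_4
is defined on the native 16-bit value (red in the top four bits), and SDL’s
channel masks for a 16-bpp surface apply to that same native 16-bit value, so
the SDL_BYTEORDER #if isn’t needed here. On a little-endian machine the #else
branch puts red in the low nibble, which OpenGL then reads as alpha, scrambling
the channels. Creating the surface with one set of masks on both endiannesses
should keep the channels matched to GL_UNSIGNED_SHORT_4_4_4_4:

newImage = SDL_CreateRGBSurface(SDL_SWSURFACE, 256, 256, 16,
                                0xf000,  /* R: bits 12-15 */
                                0x0f00,  /* G: bits 8-11  */
                                0x00f0,  /* B: bits 4-7   */
                                0x000f); /* A: bits 0-3   */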

