SDL BMP loader

I am trying to use SDL_LoadBMP to load a .bmp file and then bind it to
an OpenGL texture. I think the problem is that glTexImage2D() wants an
unsigned integer (a GLenum) for the pixel format, but SDL_Surface’s
format member is a pointer to an SDL_PixelFormat. Calling

SDL_Surface* texture;

glTexImage2D(GL_TEXTURE_2D, 0, 3, texture->w, texture->h, 0,
             texture->format, GL_UNSIGNED_BYTE, texture->pixels);

does nothing. Can anyone give me a hand here?
Thanks

On Aug 1, 2004, at 11:02 PM, Daniel Roberts wrote:

glTexImage2D(GL_TEXTURE_2D, 0, 3, texture->w, texture->h, 0,
             texture->format, GL_UNSIGNED_BYTE, texture->pixels);

You misunderstood the format parameter, and also the format member of
the SDL_Surface structure.


First, the format member of SDL_Surface, because it’s the shorter of
the two. The format member is a pointer to an SDL_PixelFormat
structure containing:
SDL_Palette *palette;
Uint8  BitsPerPixel;
Uint8  BytesPerPixel;
Uint32 Rmask, Gmask, Bmask, Amask;
Uint8  Rshift, Gshift, Bshift, Ashift;
Uint8  Rloss, Gloss, Bloss, Aloss;
Uint32 colorkey;
Uint8  alpha;

You don’t want to pass an essentially arbitrary value here (what would
a memory pointer have to do with the color format enumeration you’re
after?).
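To make that concrete, here is a minimal sketch (the helper name is
mine, not SDL’s) that picks the enum by inspecting the fields above
instead of passing the pointer; it assumes a little-endian machine,
where an Rmask of 0x000000ff means the bytes sit in memory as
R,G,B(,A):

#include "SDL.h"
#include <GL/gl.h>

/* Hypothetical helper: choose a GL format enum from the surface's
   pixel format fields rather than passing the struct pointer.
   GL_BGR/GL_BGRA need OpenGL 1.2 headers; see later in the thread. */
GLenum gl_format_for(const SDL_Surface *s)
{
    if (s->format->BytesPerPixel == 4)
        return (s->format->Rmask == 0x000000ff) ? GL_RGBA : GL_BGRA;
    return (s->format->Rmask == 0x000000ff) ? GL_RGB : GL_BGR;
}

Here is what the man page says format will actually accept: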

    format          Specifies the format of the pixel data. Must be one
                    of GL_COLOR_INDEX, GL_DEPTH_COMPONENT, GL_RED,
                    GL_GREEN, GL_BLUE, GL_ALPHA, GL_RGB, GL_RGBA,
                    GL_BGR, GL_BGRA, GL_LUMINANCE, or
                    GL_LUMINANCE_ALPHA.

If I want to create a 32-bit RGBA texture, I pass GL_RGBA.
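Applied to your call, fixing just this parameter would look like the
following (internalFormat is left as you had it; that is the next
topic, and this assumes texture actually points at a loaded 32-bit
surface whose pixels really are laid out as RGBA):

glTexImage2D(GL_TEXTURE_2D, 0, 3, texture->w, texture->h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, texture->pixels);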


Second, the internalFormat parameter (the third argument) of
glTexImage2D, from the man page:

    internalFormat  Requests the internal storage format of the texture
                    image. The most current version of the SGI
                    implementation of GLU does not check this value for
                    validity before passing it on to the underlying
                    OpenGL implementation. A value that is not accepted
                    by the OpenGL implementation will lead to an OpenGL
                    error. The benefit of not checking this value at the
                    GLU level is that OpenGL extensions can add new
                    internal texture formats without requiring a
                    revision of the GLU implementation. Older
                    implementations of GLU check this value and raise a
                    GLU error if it is not 1, 2, 3, or 4 or one of the
                    following symbolic constants: GL_ALPHA, GL_ALPHA4,
                    GL_ALPHA8, GL_ALPHA12, GL_ALPHA16, GL_LUMINANCE,
                    GL_LUMINANCE4, GL_LUMINANCE8, GL_LUMINANCE12,
                    GL_LUMINANCE16, GL_LUMINANCE_ALPHA,
                    GL_LUMINANCE4_ALPHA4, GL_LUMINANCE6_ALPHA2,
                    GL_LUMINANCE8_ALPHA8, GL_LUMINANCE12_ALPHA4,
                    GL_LUMINANCE12_ALPHA12, GL_LUMINANCE16_ALPHA16,
                    GL_INTENSITY, GL_INTENSITY4, GL_INTENSITY8,
                    GL_INTENSITY12, GL_INTENSITY16, GL_RGB, GL_R3_G3_B2,
                    GL_RGB4, GL_RGB5, GL_RGB8, GL_RGB10, GL_RGB12,
                    GL_RGB16, GL_RGBA, GL_RGBA2, GL_RGBA4, GL_RGB5_A1,
                    GL_RGBA8, GL_RGB10_A2, GL_RGBA12 or GL_RGBA16.

If my program wants a 32-bit RGBA texture, then I can pass 4 here. I
think I can also pass GL_RGBA or GL_RGBA8 here, but I’m not completely
sure; ask somebody else if you want to know.

Looking at your call again: the 3 you passed here is the old-style way
of requesting three color components, so that argument is actually
legal; the 0 before it is the mipmap level, which is also fine for the
base image. The real problem is the format argument, as above.
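So for a 32-bit RGBA texture, either of these should be an equivalent
request as far as I know (the second just spells out eight bits per
channel explicitly):

glTexImage2D(GL_TEXTURE_2D, 0, 4, texture->w, texture->h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, texture->pixels);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, texture->w, texture->h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, texture->pixels);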


Donny Viszneki wrote:

glTexImage2D(GL_TEXTURE_2D, 0, 3, texture->w, texture->h, 0,
             texture->format, GL_UNSIGNED_BYTE, texture->pixels);

You misunderstood the format parameter, and also the format member of
the SDL_Surface structure.

This is the way I do it:

if (image->format->BytesPerPixel == 4)  /* has alpha? */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image->w, image->h, 0,
                 GL_BGRA, GL_UNSIGNED_BYTE, image->pixels);
else
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, image->w, image->h, 0,
                 GL_BGR, GL_UNSIGNED_BYTE, image->pixels);

The 0 in the second parameter is the mipmap detail level (0 is the
base image).
GL_RGBA or GL_RGB gives the number of color components: 4 or 3 in this
case.
image->w and image->h are the size of the image.
0 for border.
GL_BGRA or GL_BGR is the order in which the color components are
organized inside image->pixels (they are organized as BGR, not RGB).
GL_UNSIGNED_BYTE gives the size of each color component: 1 byte in
this case.
image->pixels is the image itself.
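Putting it all together, here is a minimal load-and-bind sketch in the
same spirit (the function name is mine, error handling is mostly
omitted, and it assumes SDL 1.2-era headers):

#include "SDL.h"
#include <GL/gl.h>

GLuint load_bmp_texture(const char *path)
{
    SDL_Surface *image = SDL_LoadBMP(path);
    GLuint tex = 0;

    if (image == NULL)
        return 0;                        /* check SDL_GetError() */

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    if (image->format->BytesPerPixel == 4)       /* has alpha? */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image->w, image->h, 0,
                     GL_BGRA, GL_UNSIGNED_BYTE, image->pixels);
    else
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, image->w, image->h, 0,
                     GL_BGR, GL_UNSIGNED_BYTE, image->pixels);

    SDL_FreeSurface(image);              /* GL keeps its own copy */
    return tex;
}

One caveat: this assumes each pixel row is tightly packed. BMP rows
are padded to 4 bytes, which happens to match GL’s default unpack
alignment, but if your surface’s pitch doesn’t line up you may need
glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before the upload.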



Lucas Clemente Vella wrote:

GL_BGRA or GL_BGR is the order in which the color components are
organized inside image->pixels (they are organized as BGR, not RGB).

Can you tell me what you need to be able to USE GL_BGR? I tried it,
and the compiler came back with an error stating (from what I could
tell) that GL_BGR wasn’t a viable option there. I don’t have the exact
error (it was a while ago, and I’ve since just written/copied a
function that swaps the B and R values), but if you know what I’m
talking about, cool; else no worries.

–Scott

Not sure if it is the same for you, but I found the definition in
glext.h originally, and #including it worked for me.
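For what it’s worth, if your gl.h predates OpenGL 1.2 you can also
define the two constants yourself as a stopgap; these are the standard
enum values from the spec:

#include <GL/gl.h>

#ifndef GL_BGR
#define GL_BGR  0x80E0
#endif
#ifndef GL_BGRA
#define GL_BGRA 0x80E1
#endif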
