SDL_Surface to OpenGL texture

Hi,

I am using SDL_Image to load a jpg image into an SDL_Surface and then I
convert it to an OpenGL texture.

I am able to render a bitmap image correctly when I use

glTexImage2D(GL_TEXTURE_2D, 0, 3, texture->w, texture->h, 0, GL_BGR,
             GL_UNSIGNED_BYTE, texture->pixels);

but when I render a jpg image I get a white box. For the jpg I am using

glTexImage2D(GL_TEXTURE_2D, 0, 3, texture->w, texture->h, 0, GL_RGB,
             GL_UNSIGNED_BYTE, texture->pixels);

I believe I am getting the GL_RGB argument or something else wrong. How can
I correctly determine the format so that the image renders correctly?

-Abhinav

Hi Abhinav,

There are a few things it could be. First, make sure that the
dimensions of your texture are powers of two, as those are the only
texture dimensions OpenGL supports (without extensions). If your
image data does not have power-of-two dimensions, you can instead
create an empty power-of-two texture with glTexImage2D and then load
your data into it with glTexSubImage2D. You can then use texture
coordinate tricks (see the sketch below) to hide the fact that your
texture has blank space on two sides.
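
To make that coordinate trick concrete, here is a rough, untested
sketch. The names imgW/imgH (original image size), potW/potH (padded
texture size), and x/y/w/h (where to draw) are just illustrative; the
idea is to sample only the part of the texture that holds image data:

float maxU = (float)imgW / (float)potW;  /* fraction of texture used in x */
float maxV = (float)imgH / (float)potH;  /* fraction of texture used in y */

glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2f(x,     y);
glTexCoord2f(maxU, 0.0f); glVertex2f(x + w, y);
glTexCoord2f(maxU, maxV); glVertex2f(x + w, y + h);
glTexCoord2f(0.0f, maxV); glVertex2f(x,     y + h);
glEnd();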

The second thing to be sure of is that your image data really has the
pixel format you're telling OpenGL it does. To be totally sure of
this, I would use SDL_CreateRGBSurface to create a new surface with
exactly the parameters you're passing to OpenGL, copy your original
image into this surface (blit), and then pass the converted surface's
pixel data to OpenGL.

The third thing to be sure of is that you're giving OpenGL the
parameters it needs to draw the texture. I've found (and this might
be in the spec somewhere) that textures don't work until you specify
the filtering mode with glTexParameteri.

On a little endian machine (x86), you’d want to do something like:

/* returns the next power of two >= x */
int pot(int x) {
    int val = 1;
    while (val < x) {
        val *= 2;
    }
    return val;
}

SDL_Surface *original = IMG_Load(myImagePath);
SDL_Surface *converted = SDL_CreateRGBSurface(0, original->w, original->h,
    24, 0x0000FF, 0x00FF00, 0xFF0000, 0x000000);
SDL_BlitSurface(original, NULL, converted, NULL);

GLuint textureID;
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
/* allocate an empty power-of-two texture, then upload the image into its corner */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, pot(converted->w),
    pot(converted->h), 0, GL_RGB, GL_UNSIGNED_BYTE, NULL);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, converted->w, converted->h,
    GL_RGB, GL_UNSIGNED_BYTE, converted->pixels);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

On a big endian machine (such as PowerPC on Mac) you’d want to swap
the order of the bytes in the bitmask passed to SDL_CreateRGBSurface.
I think this code is correct, though I haven’t tested it. I mostly
meant it to give you an idea of what to do.
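
For example (untested, but this is the usual SDL idiom), you can pick
the masks at compile time with SDL_BYTEORDER from SDL_endian.h so the
same code works on both:

#include "SDL_endian.h"

#if SDL_BYTEORDER == SDL_BIG_ENDIAN
Uint32 rmask = 0xFF0000, gmask = 0x00FF00, bmask = 0x0000FF;
#else
Uint32 rmask = 0x0000FF, gmask = 0x00FF00, bmask = 0xFF0000;
#endif

SDL_Surface *converted = SDL_CreateRGBSurface(0, original->w, original->h,
    24, rmask, gmask, bmask, 0);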

- Holmes


Thanks Holmes. That worked.

Now I am able to render both jpg and bmp files.

Now when I try the same thing in a 600x400 window, my image gets stretched in
the x coordinate. How can I stop this?
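
A likely cause, if the quad is drawn in the default -1..1 clip
coordinates, is that the non-square window stretches it. A rough,
untested sketch of one workaround is a pixel-based orthographic
projection, drawing the quad at the image's own size (winW/winH,
imgW/imgH, maxU/maxV are illustrative names, not from the code above):

glViewport(0, 0, winW, winH);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, winW, winH, 0, -1, 1);   /* 1 unit == 1 pixel, origin at top-left */
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

/* draw the quad at the image's own size so it keeps its aspect ratio */
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2f(0.0f,        0.0f);
glTexCoord2f(maxU, 0.0f); glVertex2f((float)imgW, 0.0f);
glTexCoord2f(maxU, maxV); glVertex2f((float)imgW, (float)imgH);
glTexCoord2f(0.0f, maxV); glVertex2f(0.0f,        (float)imgH);
glEnd();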
