SDL & OpenGL draw distorted textures

Hey,

I am not sure if this is the right place to ask, but it is connected
to the SDL library and there might be an issue.

In my code I have my own “Surface” class, which loads an SDL_Surface
via SDL_LoadBMP(). If I want to, I can also tell it to load the image
as an OpenGL texture. That is only done when I want to use the OpenGL
library in my game; if I don’t, the program never tries to do
anything OpenGL related.

My problem is that when I draw my images, some (but not all) look
distorted in OpenGL mode, yet in 2D (pure SDL) mode everything works fine.

A detail worth noting: I am programming on my MacBook. When I open
one of my BMP sprites in the Preview application, hit “Save”, and
then retry my program with OpenGL, that bitmap looks FINE in OpenGL.
The pictures I use came from a Windows PC, where I saved the images
with Paint Shop Pro 9 as BMP (8 bit).

So that makes me wonder:

  • Is there something wrong with SDL loading bitmaps on Mac OS X?
  • Is there something wrong with the OpenGL coupling combined with
    SDL? I am doing only the very basic stuff, and it works fine. Even in
    example code the same images look distorted, so I am 100% certain it’s
    not my code. (However, if you are curious, I can always post it.)

I don’t know if this occurs on other-than-Mac machines, but I am
going to find out this weekend. So far, I find it odd that re-saving
my images on the Mac solved the problem.

Thanks in advance,
Stefan Hendriks

http://www.fundynamic.nl

Stefan Hendriks wrote:

My problem is that when I draw my images, some (but not all) look
distorted in OpenGL mode, yet in 2D (pure SDL) mode everything works fine.

It looks like an error in your code that loads SDL_Surfaces into
OpenGL textures.

  • Is there something wrong with the OpenGL coupling combined with
    SDL? I am doing only the very basic stuff, and it works fine.

Unfortunately, SDL doesn’t provide glue code for loading OpenGL
textures. That code is not trivial if you want to handle all the
possibilities.

It’s better to post the code you are using, and possibly the image
you are trying to load.
Daniel K. O.

I can’t see why my method of loading an SDL_Surface into an OpenGL
texture should fail. This code was taken from a tutorial and works
fine. I can’t deliver the picture right now, but I can deliver the code:

void MMESurface::loadSurfaceOpenGL() {
	assert(surface != NULL);

	// Check that the image's width is a power of 2
	assert((surface->w % 2) == 0);

	// Also check if the height is a power of 2
	assert((surface->h % 2) == 0);

	// Have OpenGL generate a texture object handle for us
	glGenTextures( 1, &texture );

	// Bind the texture object
	glBindTexture( GL_TEXTURE_2D, texture );

	// Set the texture's stretching properties
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

	// Edit the texture object's image data using the
	// information SDL_Surface gives us
	glTexImage2D( GL_TEXTURE_2D, 0, 3, surface->w, surface->h, 0,
	              GL_BGR, GL_UNSIGNED_BYTE, surface->pixels );

	openGLed = true;
	Log("Surface loaded for OpenGL");
}

MMESurface class definition:

class MMESurface {
public:
	int getHeight();
	int getWidth();
	int getDepth();
	bool isOpenGL();

	void loadSurface(string file);
	void loadSurface(string file, bool openGL);

	SDL_Surface* getSurface();
	GLuint getTexture();

	MMESurface(SDL_Surface *surf);
	MMESurface();
	~MMESurface();

private:
	SDL_Surface *surface;
	GLuint texture;
	bool openGLed;  // loaded for OpenGL
};

and the drawing routine itself:

void MMESurface::loadSurfaceOpenGL() {
	assert(surface != NULL);

	// Check that the image's width is a power of 2
	assert((surface->w % 2) == 0);

	// Also check if the height is a power of 2
	assert((surface->h % 2) == 0);

	// Have OpenGL generate a texture object handle for us
	glGenTextures( 1, &texture );

	// Bind the texture object
	glBindTexture( GL_TEXTURE_2D, texture );

	// Set the texture's stretching properties
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

	// Edit the texture object's image data using the
	// information SDL_Surface gives us
	glTexImage2D( GL_TEXTURE_2D, 0, 3, surface->w, surface->h, 0,
	              GL_BGR, GL_UNSIGNED_BYTE, surface->pixels );

	openGLed = true;
	Log("Surface loaded for OpenGL");
}

Again, it works, just not with some surfaces. And without changing
the code (only by re-saving a BMP) it appears to work. Perhaps
Windows and Mac have slightly different habits of saving/loading BMP
files?
Stefan Hendriks

http://www.fundynamic.nl

Your assert() to verify that the dimensions are powers of two is wrong; it only checks that they are even.

OpenGL’s default packing and unpacking alignment is 4 bytes (see glPixelStorei), which would cause problems if your textures were as small as 2x2… but I’ll assume that’s not the case.
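
For illustration, a minimal sketch of relaxing that alignment before
the upload (glPixelStorei is the standard call for this):

	// Let texture rows in client memory start on any byte
	// boundary instead of the default 4-byte boundary.
	glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );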

Perhaps Windows and Mac have slightly different habits of
saving/loading BMP files?

That format is really a mess, very limited and without compression. Perhaps you could consider PNG, using either libpng or SDL_image? Otherwise, feel free to post a screenshot of the distortion.

Alexis


// Check that the image’s width is a power of 2
assert((surface->w % 2) == 0);
// Also check if the height is a power of 2
assert((surface->h % 2) == 0);

This only checks that the width or height is an even number, not that it’s
a power of 2.
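
A correct check tests that exactly one bit is set; a sketch using the
same asserts:

	// A positive value is a power of two exactly when clearing
	// its lowest set bit, w & (w - 1), leaves zero.
	assert( surface->w > 0 && (surface->w & (surface->w - 1)) == 0 );
	assert( surface->h > 0 && (surface->h & (surface->h - 1)) == 0 );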

glTexImage2D( GL_TEXTURE_2D, 0, 3, surface->w, surface->h, 0,
GL_BGR, GL_UNSIGNED_BYTE, surface->pixels );

OpenGL assumes that the whole picture is stored in one long chunk,
sequentially, with no gaps in between. But an SDL surface can have
padding bytes between the rows; check the pitch member of
SDL_Surface. (IIRC, OpenGL can be told about this by adjusting the
pixel transfer parameters.) So either set the necessary pixel
transfer parameters if there are padding bytes, or manually repack
the data into an intermediate buffer in the format OpenGL expects.
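
A sketch of the pixel transfer route, assuming the pitch is an exact
multiple of the pixel size:

	// Describe the surface's real row layout to OpenGL before
	// uploading; GL_UNPACK_ROW_LENGTH is measured in pixels.
	glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );
	glPixelStorei( GL_UNPACK_ROW_LENGTH,
	               surface->pitch / surface->format->BytesPerPixel );

(Remember to reset GL_UNPACK_ROW_LENGTH to 0 afterwards, since it is
global state.)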

In addition, this code assumes a specific pixel format for the SDL
surface, even though the color components within the surface could
actually be (more or less) arbitrarily arranged. In that case, either
use different format parameters to glTexImage2D, or repack the data
if no suitable parameter combination can be found.
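
For the first option, the masks tell you which format parameter
matches; a sketch for a 24 bpp surface, assuming a little-endian
machine (see the endianness discussion further down):

	// Rmask == 0x0000ff means the red byte comes first in
	// memory on little-endian, i.e. GL_RGB byte order.
	GLenum format = (surface->format->Rmask == 0x0000ff)
	                    ? GL_RGB : GL_BGR;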

In any case, the code above will work, but only under certain
circumstances. A completely general texture loading function would
have to take all the information in SDL_Surface into account.

Again, it works, just not with some surfaces. And without changing
the code (only by re-saving a BMP) it appears to work. Perhaps
Windows and Mac have slightly different habits of saving/loading BMP
files?

Yes, re-saving the BMP files might store them in a slightly different
fashion, which might happen to fulfill all the hidden assumptions in
the texture loading code.

The strange thing, however, is that I’d expect the BMP loading code
within SDL to work identically on all platforms.

// Martin

Thanks for the heads-up on the assertions. I wrote those in the very
late hours; I shouldn’t be doing that, I suppose.


The strange thing, however, is that I’d expect the BMP loading code
within SDL to work identically on all platforms.

Yes, that is what I would expect as well. And I think it is not SDL
that is at fault, since showing the images with SDL works fine. That
leaves only one option: creating a texture out of an SDL_Surface.

Since I’ve just started digging into OpenGL, I think that is where
the answer lies, as stated. I will dig into the glTexImage2D function
and see if I can find anything that might explain what is happening.
It sounds logical to me that it might be trying to texture a ‘gapped’
image.

Thanks for your help!


The strange thing, however, is that I’d expect the BMP loading code
within SDL to work identically on all platforms.

Yes, that is what I would expect as well. And I think it is not SDL
that is at fault, since showing the images with SDL works fine. That
leaves only one option: creating a texture out of an SDL_Surface.

Exactly. It could, however, be due to differences in endianness.

If the color channels are swapped, it’s an endian issue. If the rows of
the texture are shifted, it’s due to the row pitch.

Since I’ve just started digging into OpenGL, I think that is where
the answer lies, as stated. I will dig into the glTexImage2D function
and see if I can find anything that might explain what is happening.
It sounds logical to me that it might be trying to texture a ‘gapped’
image.

The pitch problem is, as said, quite trivial to remedy.

For the color channels, though, you should keep in mind how SDL
internally stores surfaces. For 24 bpp surfaces, the channels of each
pixel are stored as three consecutive bytes. The order of these three
bytes is determined by the Rmask/Gmask/Bmask fields in the pixel
format structure of the surface, interpreted in the native machine
endianness.

The behaviour of OpenGL, on the other hand, doesn’t change with
endianness. If you specify GL_BGR, GL_UNSIGNED_BYTE as the format and
type parameters to glTexImage2D, it interprets the first byte of each
pixel as blue, the second as green and the third as red. On
big-endian that equals an Rmask of 0x0000ff in SDL, and 0xff0000 on
little-endian.
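
That correspondence can be written down directly; a small sketch
using SDL’s byte-order macros:

	// The Rmask that corresponds to GL_BGR byte order on the
	// machine the code is compiled for.
	#if SDL_BYTEORDER == SDL_BIG_ENDIAN
	const Uint32 bgrRmask = 0x0000ff;
	#else
	const Uint32 bgrRmask = 0xff0000;
	#endif
	// Only upload with GL_BGR when the surface really is laid
	// out that way.
	bool surfaceIsBGR = (surface->format->Rmask == bgrRmask);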

Hope this helps point you in the right direction.

// Martin

(Note to Benjamin Li: You might have this same problem.)

Going back to the start of this thread, I think I can tell you the
reason for your original problem.

“The pictures I use came from a Windows PC, where I saved the images
with Paint Shop Pro 9 as BMP (8 bit).”

And from your code:

glTexImage2D( GL_TEXTURE_2D, 0, 3, surface->w, surface->h, 0,
                GL_BGR, GL_UNSIGNED_BYTE, surface->pixels );

Here you assume 24-bit-per-pixel data by specifying GL_BGR. However,
since your BMP was saved as 8-bit, your SDL_Surface is most likely in
an 8-bit-per-pixel format, which means two thirds of the data OpenGL
loads into its texture is garbage, and even the rest is interpreted
wrongly. You have to convert the loaded surface into a known format,
as Daniel K. O. described. The good news is that you only have to
create the intermediate surface in the correct format; SDL_BlitSurface
will do the conversion automatically, no matter what format the
source surface is in.
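
A minimal sketch of that conversion, assuming SDL 1.2 and a 24-bit
RGB target (the masks given here are for little-endian):

	// Create an intermediate surface in a known 24 bpp format...
	SDL_Surface *rgb = SDL_CreateRGBSurface( SDL_SWSURFACE,
	        surface->w, surface->h, 24,
	        0x0000ff, 0x00ff00, 0xff0000, 0 );
	// ...and let SDL convert the 8-bit source into it.
	SDL_BlitSurface( surface, NULL, rgb, NULL );
	// Honour possible row padding (see the pitch discussion above).
	glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );
	glPixelStorei( GL_UNPACK_ROW_LENGTH, rgb->pitch / 3 );
	glTexImage2D( GL_TEXTURE_2D, 0, 3, rgb->w, rgb->h, 0,
	              GL_RGB, GL_UNSIGNED_BYTE, rgb->pixels );
	SDL_FreeSurface( rgb );
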
Jukka-Pekka Manninen


Yes, exactly. I saw another post on the list with similar problems. I
come from the Allegro world; since I want to toy with palettes, the
pictures must be in 8-bit format. Meaning, if I want to draw them
with OpenGL, I should convert them to 24-bit pictures (right?).

I need them to be 8-bit because I am writing an RTS game, and
therefore I want the units to be team-colored first (using palette
color replacements) and then drawn on the screen. Normally I would
recreate a palette, apply it to the 8-bit bitmap, and then draw it on
my 16 (or 32) bit screen surface.

If I want to do this the OpenGL way, I assume I have to create a
temporary 24-bit picture to draw onto first (so it is colored right),
and then draw that picture with OpenGL.
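
In SDL 1.2 terms I imagine that would look roughly like this
(unitSurface, teamColors, firstIndex and nColors are made-up names):

	// Apply the team palette to the 8-bit unit sprite...
	SDL_SetColors( unitSurface, teamColors, firstIndex, nColors );
	// ...then blit into a 24-bit surface so the team colors are
	// baked in, and upload that surface as the OpenGL texture.
	SDL_BlitSurface( unitSurface, NULL, rgbSurface, NULL );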

A lot to think about, thanks for the advice.

-Stefan

I need them to be 8-bit because I am writing an RTS game, and
therefore I want the units to be team-colored first (using palette
color
[…]
If I want to do this the OpenGL way, I assume I have to create a
temporary 24-bit picture to draw onto first (so it is colored right),
and then draw that picture with OpenGL.

There are a couple of ways of doing this (“tinting”). I suggest you
check out “Techniques to Apply Team Colors to 3D Models”, Section 5.8
(p. 451) in Game Programming Gems 4, for a good article going through
the pros and cons of some different approaches.
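
One classic technique, as a rough sketch (not necessarily what the
article recommends): keep the team-colorable parts of the texture
grayscale and modulate them with the team color while drawing.
teamR/teamG/teamB below are placeholder values:

	// GL_MODULATE multiplies each texel by the current color.
	glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );
	glColor3f( teamR, teamG, teamB );
	// ... draw the textured quad as usual ...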


“There is no fitness function for ‘fun’” – John Hancock
Eddy L O Jansson | http://gazonk.org/~eloj