SDL and OpenGL 2D problems

Hi,

First of all, forgive me if these questions sound stupid. I have a fair
amount of experience with 2D SDL programming, but very little experience
with 3D and alpha channels. So, bear with me :)

Ok. I’ve started work on a GUI for a game project I’m working on. I’ve
planned the classes (C++) and have now moved into the implementation
phase. What I want to do is have a 2D GUI that can be drawn on top of
the 3D opengl rendering. From what I read on this list, SDL_OPENGLBLIT
is NOT an option. No problem - I’ve been experimenting with
glDrawPixels as an alternative.

Using this method, I can load a BMP and draw it using the GL_BGR format,
and I can load a PNG and draw it using the GL_RGBA format. My main
problem comes when I want to create my own GUI pixel data in memory
(with alpha support). To do this I have tried the following:

---//CUT HERE//---

data = SDL_CreateRGBSurface(
    SDL_SWSURFACE,
    100,                            //width
    100,                            //height
    screen->format->BitsPerPixel,   //copy the screen's depth
    screen->format->Rmask,          //...and its channel masks
    screen->format->Gmask,
    screen->format->Bmask,
    screen->format->Amask
);

SDL_SetAlpha(data, SDL_SRCALPHA, 255); //Make the surface opaque

---//CUT HERE//---

I then try to draw it using the following code (adapted from testgl.c):

---//CUT HERE//---

SDL_GL_Enter2DMode();
glRasterPos2i(200,200);
glPixelZoom(1.0f, -1.0f); //Flip the surface so it’s the right way up
glDrawPixels(data->w, data->h, GL_RGBA, GL_UNSIGNED_BYTE, data->pixels);
SDL_GL_Leave2DMode();

---//CUT HERE//---

Now when I run the app, the area where the surface should be displayed
is corrupted (like noise on a TV). I originally had this problem with
images (i.e. trying to draw a PNG as an RGB instead of RGBA), but
changing the format flag makes no difference.

With this in mind, I’m looking for answers to the following:

  1. Is glDrawPixels appropriate for my needs? Keep in mind that I need to
    change the pixel data after creation (for example render a real time
    game map). I had considered using textures, but I thought constant
    re-binding would be inefficient. Am I right to assume this?

  2. Is there a way to add an alpha channel to a surface that originally
    had none? This would allow me to draw all images/surfaces using the RGBA
    format flag.

  3. Is there a way to find out if an image is BGR (or BGRA) and then
    convert it to the more supported RGBA - or do I have to write my own
    functions (if so any tips?)?

  4. Do I need to convert the image surface data to the screen format
    using SDL_DisplayFormat?

  5. If I use SDL_SetAlpha to set the alpha, will these changes be
    displayed by OpenGL, or does SDL keep alpha info separate from pixel data?

  6. Can someone clarify something for me concerning OpenGL pixel types. I
    know that GL_UNSIGNED_BYTE is basically a 16-bit pixel representation,
    but what would I use signed (such as GL_BYTE) pixel formats for (as far
    as I understand, pixels can’t have negative colour values). In fact, now
that I’ve got myself confused - perhaps someone could also refresh my memory on
    how pixel data is stored for each bit depth. For example, when using 32
    bit colour, are 8 bits allocated for each of the R, G, B, and Alpha
channels? If so, how does this work for 16 bit colour (4 bits each?)

Whew! I appreciate the time anyone takes to read this (and reply), and
hopefully you can help me.

Thanks,
Lance

Hi,

First of all, forgive me if these questions sound stupid. I have a
fair amount of experience with 2D SDL programming, but very little
experience with 3D and alpha channels. So, bear with me :)

Ok. I’ve started work on a GUI for a game project I’m working on.
I’ve planned the classes (C++) and have now moved into the
implementation phase. What I want to do is have a 2D GUI that can
be drawn on top of the 3D opengl rendering. From what I read on
this list, SDL_OPENGLBLIT is NOT an option. No problem - I’ve
been experimenting with glDrawPixels as an alternative.

That’s almost as bad as SDL_OPENGLBLIT on some platforms. It’s often
unaccelerated, and regardless, it means every single blit includes a
system RAM -> VRAM transfer. Whether it’s done with busmaster DMA or
the CPU, it’s slower than rendering from a VRAM resident texture.
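
If you render the GUI from a texture instead, the transfer happens
once (or only when the GUI actually changes), and drawing it each
frame is nearly free. An untested sketch of that route; gui_tex is a
texture you would have created earlier with glTexImage2D(), and the
128x128 size is just an example (pre-1.2 GL wants power-of-two
texture dimensions):

---//CUT HERE//---

//Assumes SDL_GL_Enter2DMode() has set up an orthographic projection
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBindTexture(GL_TEXTURE_2D, gui_tex);
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2i(200, 200);
glTexCoord2f(1.0f, 0.0f); glVertex2i(328, 200);
glTexCoord2f(1.0f, 1.0f); glVertex2i(328, 328);
glTexCoord2f(0.0f, 1.0f); glVertex2i(200, 328);
glEnd();
glDisable(GL_BLEND);
glDisable(GL_TEXTURE_2D);

---//CUT HERE//---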

Using this method, I can load a BMP and draw it using the GL_BGR
format, and I can load a PNG and draw it using the GL_RGBA format.
My main problem comes when I want to create my own GUI pixel data
in memory (with alpha support). To do this I have tried the
following:

---//CUT HERE//---

data = SDL_CreateRGBSurface(
    SDL_SWSURFACE,
    100,                            //width
    100,                            //height
    screen->format->BitsPerPixel,   //copy the screen's depth
    screen->format->Rmask,          //...and its channel masks
    screen->format->Gmask,
    screen->format->Bmask,
    screen->format->Amask
);

SDL_SetAlpha(data, SDL_SRCALPHA, 255); //Make the surface opaque

That doesn’t matter unless you’re going to use SDL blitting functions
on the surface. It doesn’t affect the surface pixel data directly.

---//CUT HERE//---
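
More importantly, screen->format->Amask is normally 0, so the call
above copies a pixel format with no alpha channel at all - and if the
screen happens to be 16 bpp, telling glDrawPixels() the data is
GL_RGBA + GL_UNSIGNED_BYTE would read it as exactly the kind of noise
you describe. An untested sketch of a surface with a real 32 bit RGBA
layout (these are the standard masks from the SDL documentation; the
#if keeps the bytes in R,G,B,A memory order on both endians):

---//CUT HERE//---

Uint32 rmask, gmask, bmask, amask;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    rmask = 0xff000000; gmask = 0x00ff0000;
    bmask = 0x0000ff00; amask = 0x000000ff;
#else
    rmask = 0x000000ff; gmask = 0x0000ff00;
    bmask = 0x00ff0000; amask = 0xff000000;
#endif
data = SDL_CreateRGBSurface(SDL_SWSURFACE, 100, 100, 32,
                            rmask, gmask, bmask, amask);

---//CUT HERE//---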

I then try to draw it using the following code (adapted from
testgl.c):

---//CUT HERE//---

SDL_GL_Enter2DMode();
glRasterPos2i(200,200);
glPixelZoom(1.0f, -1.0f); //Flip the surface so it’s the right way up
glDrawPixels(data->w, data->h, GL_RGBA, GL_UNSIGNED_BYTE, data->pixels);
SDL_GL_Leave2DMode();

---//CUT HERE//---

Now when I run the app, the area where the surface should be
displayed is corrupted (like noise on a TV). I originally had this
problem with images (i.e. trying to draw a PNG as an RGB instead of
RGBA), but changing the format flag makes no difference.

With this in mind, I’m looking for answers to the following:

  1. Is glDrawPixels appropriate for my needs? Keep in mind that I
    need to change the pixel data after creation (for example render a
    real time game map). I had considered using textures, but I thought
    constant re-binding would be inefficient. Am I right to assume
    this?

Rebinding costs virtually nothing on most cards. The cost is in
transferring the data to texture memory - and there’s no way to get
around that as long as you need to modify the textures.

Note that you can update part of a texture; you don’t have to
transfer the whole texture after every change. Also, more
interestingly, you don’t have to retransfer the texture unless you’ve
actually changed it. That is, unless you need to completely redraw
every pixel of your 2D stuff every frame, glDrawPixels() is a
complete waste of resources.
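
An untested sketch of such a partial update; gui_tex and the dirty
rectangle x, y, w, h are placeholders, and pixels points at tightly
packed RGBA data for just that rectangle:

---//CUT HERE//---

glBindTexture(GL_TEXTURE_2D, gui_tex);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1); //no row padding assumed
glTexSubImage2D(GL_TEXTURE_2D, 0, x, y, w, h,
                GL_RGBA, GL_UNSIGNED_BYTE, pixels);

---//CUT HERE//---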

  2. Is there a way to add an alpha channel to a surface that
    originally had none? This would allow me to draw all
    images/surfaces using the RGBA format flag.

I’m not even sure you can count on alpha blending working at all with
glDrawPixels()… AFAIK, it’s not really meant for that kind of
stuff. If you want OpenGL to really do something with the data,
“procedural textures” is the term you’re looking for - and that’s
just a way of using normal textures.

  3. Is there a way to find out if an image is BGR (or BGRA) and then
    convert it to the more supported RGBA - or do I have to write my
    own functions (if so any tips?)?

If you’re loading using SDL_image or something else that generates
proper SDL surfaces, just check the pixel format of the surface. Also
note that BGR and RGB look different depending on CPU endianness.
(Which matters if you want your code to be portable across CPU types.)
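
An untested sketch of that check, assuming a 32 bpp surface on a
little-endian CPU (GL_BGRA needs OpenGL 1.2, or the EXT_bgra
extension on older GL):

---//CUT HERE//---

GLenum fmt = 0;
if (surface->format->BytesPerPixel == 4) {
    if (surface->format->Rmask == 0x000000ff)
        fmt = GL_RGBA; //bytes in memory: R, G, B, A
    else if (surface->format->Bmask == 0x000000ff)
        fmt = GL_BGRA; //bytes in memory: B, G, R, A
}

---//CUT HERE//---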

  4. Do I need to convert the image surface data to the screen format
    using SDL_DisplayFormat?

No, but you need to convert it into something that OpenGL understands.
If you require OpenGL 1.2 anyway, I think you have a lot more options
here, as it can handle most sensible formats directly. (Obviously,
the driver will often have to convert anyway, as 3D h/w rarely
supports more than a few hardcoded formats internally.)
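
One way that should work is to blit everything onto a surface you
created with known masks. Untested sketch; image is your loaded
surface, and rmask/gmask/bmask/amask are as in the earlier sketch:

---//CUT HERE//---

SDL_Surface *rgba = SDL_CreateRGBSurface(SDL_SWSURFACE,
                                         image->w, image->h, 32,
                                         rmask, gmask, bmask, amask);
SDL_SetAlpha(image, 0, 255); //plain copy; don't blend during the blit
SDL_BlitSurface(image, NULL, rgba, NULL);

---//CUT HERE//---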

  5. If I use SDL_SetAlpha to set the alpha, will these changes be
    displayed by OpenGL, or does SDL keep alpha info separate from
    pixel data?

OpenGL treats alpha pretty much like SDL; as a fourth channel, in
parallel with R, G and B. SDL_SetAlpha() has little to do with the
actual alpha channels, though. Only SDL blitters care about it.
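
If you want per-pixel alpha, write it into the pixel data itself.
Untested sketch, with data being the 32 bpp RGBA surface from the
earlier sketch, and x, y placeholders:

---//CUT HERE//---

Uint32 *pixels = (Uint32 *)data->pixels;
//A translucent red pixel; SDL_MapRGBA() packs the channels
//according to the surface's masks, alpha included.
pixels[y * (data->pitch / 4) + x] =
    SDL_MapRGBA(data->format, 255, 0, 0, 128);

---//CUT HERE//---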

  6. Can someone clarify something for me concerning OpenGL pixel
    types. I know that GL_UNSIGNED_BYTE is basically a 16-bit pixel
    representation, but what would I use signed (such as GL_BYTE) pixel
    formats for (as far as I understand, pixels can’t have negative
    colour values).

I would think you need two bytes per pixel for 16 bit formats,
though… Note that the data format and the pixel format in OpenGL
calls are not the same thing.

In fact, now that I’ve got myself confused -
perhaps someone could also refresh my memory on how pixel data is stored
for each bit depth. For example, when using 32 bit colour, are 8
bits allocated for each of the R, G, B, and Alpha channels?

Yep. Order can be RGBA or BGRA or whatever, though - it differs
between drivers, platforms and video cards.

If so,
how does this work for 16 bit colour (4 bits each?)

Yes, typically. It can also be ABGR 1:5:5:5 or something; again,
hardware dependent. Though drivers should understand the standard
formats defined by the OpenGL version they claim to support, so you
shouldn’t have to worry about the details, unless you’re dealing with
high bandwidth procedural textures.
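
For reference, OpenGL 1.2’s packed types spell those 16 bpp layouts
out explicitly. Untested sketch showing two alternatives; w, h,
pixels565 and pixels4444 are placeholders, each pixel one Uint16:

---//CUT HERE//---

//5:6:5, no alpha channel
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
             GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels565);
//4:4:4:4, 4 bit alpha
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
             GL_RGBA, GL_UNSIGNED_SHORT_4_4_4_4, pixels4444);

---//CUT HERE//---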

Whew! I appreciate the time anyone takes to read this (and reply),
and hopefully you can help me.

Well, I think the OpenGL gurus around here know more about the details
of OpenGL and pixel formats; I just wrote glSDL.

//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`---------------------------> http://olofson.net/audiality -'
--- http://olofson.net --- http://www.reologica.se ---

On Wednesday 29 January 2003 13.12, Lance Duivenbode wrote:

There are almost as many possibilities for getting noise on the screen as
there are for getting black :)

Did you check that glDrawPixels draws a region of memory known to be filled
with zero as black, and -1 as white? (For this test you want GL_BLEND off.)
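
Untested sketch of that test; a buffer of zeros should come out
solid black and a buffer of 0xff bytes solid white. Noise in either
case means the format/type arguments don’t match the buffer’s actual
layout:

---//CUT HERE//---

unsigned char buf[100 * 100 * 4];
memset(buf, 0x00, sizeof(buf)); //then repeat with 0xff
glDisable(GL_BLEND);
glRasterPos2i(200, 200);
glDrawPixels(100, 100, GL_RGBA, GL_UNSIGNED_BYTE, buf);

---//CUT HERE//---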

Regards,

Daniel

On Wednesday 29 January 2003 13:12, Lance Duivenbode wrote:

Now when I run the app, the area where the surface should be displayed
is corrupted (like noise on a TV). I originally had this problem with
images (i.e. trying to draw a PNG as an RGB instead of RGBA), but
changing the format flag makes no difference.