SDL & OpenGL

Hi,

I’d like to know if OpenGL support in SDL is nearly perfect (just maybe
some little things will be changed), or if it is in an early development
stage.
I’ve tried it and I found it very good, but who knows, I’m not a guru…

Thanks,
PH.

PH-Neutre wrote on 07 Jun 2000:

Hi,

I’d like to know if OpenGL support in SDL is nearly perfect (just maybe
some little things will be changed), or if it is in an early development
stage. I’ve tried it and I found it very good, but who knows, I’m not a guru…

It’s very usable. There are a few issues on win32 (mainly with 3dfx cards),
but other than that, it’s perfect.

  • Andreas–
    Check out my 3D lightcycle game: http://www.gltron.org
    More than 60’000 Downloads of the latest version (0.53)

Is there a fix in 1.1.3 for the fullscreen voodoo3 win32 crashing problem???

Ben> ----- Original Message -----

From: Andreas Umbach [mailto:marvin@dataway.ch]
Sent: 07 June 2000 22:05
To: sdl at lokigames.com
Subject: Re: [SDL] SDL & OpenGL

PH-Neutre wrote on 07 Jun 2000:

Hi,

I’d like to know if OpenGL support in SDL is nearly perfect (just maybe
some little things will be changed), or if it is in an early development
stage. I’ve tried it and I found it very good, but who knows, I’m not a
guru…

It’s very usable. There are a few issues on win32 (mainly with 3dfx cards),
but other than that, it’s perfect.

  • Andreas

    Check out my 3D lightcycle game: http://www.gltron.org
    More than 60’000 Downloads of the latest version (0.53)

Ben Burns wrote on 09 Jun 2000:

Is there a fix in 1.1.3 for the fullscreen voodoo3 win32 crashing problem???

Yeah, use glut, as I do. (SCNR).

Seriously. AFAIK it hasn’t been solved yet. Manuel Moos (the
Armagetron author) wrote me that he just did

#ifdef WIN32
// disable DirectX by default; it causes problems with some boards.
if ( !getenv("SDL_VIDEODRIVER") ) {
    putenv("SDL_VIDEODRIVER=windib");
}
#endif

before initializing SDL and it worked fine on Voodoo Banshee / Voodoo3
and i740 (the cards that gave him problems). The fundamental solution
would probably be that someone with more win32 knowledge (and time) than
I sat down and wrote a cool wrapper that lets you choose your OpenGL
lib at runtime (like Quake II / Quake III do). I hoped that Chris
Hecker (www.glsetup.com) would do it, but http://www.glsetup.com/dev.htm
doesn’t have anything yet.

  • Andreas–
    Check out my 3D lightcycle game: http://www.gltron.org
    More than 60’000 Downloads of the latest version (0.53)

Is there a fix in 1.1.3 for the fullscreen voodoo3 win32 crashing problem???

Ben

I probably should have mailed this one earlier, but I wanted to make
sure it is wanted/needed/correct first: on the GLSetup pages
at http://www.glsetup.com/ they say that a number of cards (including
the Voodoo3) can’t work in OpenGL if the screen is initialized with
DirectX functions. Just disabling it with the lines

#ifdef WIN32
// disable DirectX by default; it causes problems with some boards.
if ( !getenv("SDL_VIDEODRIVER") ) {
    putenv("SDL_VIDEODRIVER=windib");
}
#endif

before initializing SDL in your OpenGL app should solve the problem (well,
it worked for me…)

And if you want to know what I’ve got to do with SDL: I use it
for my game Armagetron at

http://pluto.spaceports.com/~zman/armagetron/tron.html

(I didn’t announce it here earlier because it’s too similar to Andreas’
game glTron :-D)

Manuel Moos mmoos at ix.urz.uni-heidelberg.de

On Fri, 9 Jun 2000, Ben Burns wrote:

I’m glad to hear there is finally an explanation for the odd Voodoo
crash bug. Don’t feel too bad about having a project similar to
Andreas’, Manuel; when we first heard about your project in the IRC
channel we were all teasing Andreas about starting our own tron
projects. He took it rather well, too.

Wesley Poole
AKA Phoenix Kokido
Tired of hiding behind an on-line only identity…
members.xoom.com/kokido
@Wes_Poole

Manuel Moos wrote:

Is there a fix in 1.1.3 for the fullscreen voodoo3 win32 crashing
problem???

On Fri, 9 Jun 2000, Ben Burns wrote:

Ben

I probably should have mailed this one earlier, but I wanted to make
sure it is wanted/needed/correct first: on the GLSetup pages
at http://www.glsetup.com/ they say that a number of cards (including
the Voodoo3) can’t work in OpenGL if the screen is initialized with
DirectX functions. Just disabling it with the lines

#ifdef WIN32
// disable DirectX by default; it causes problems with some boards.
if ( !getenv("SDL_VIDEODRIVER") ) {
    putenv("SDL_VIDEODRIVER=windib");
}
#endif

before initializing SDL in your OpenGL app should solve the problem (well,
it worked for me…)

And if you want to know what I’ve got to do with SDL: I use it
for my game Armagetron at

http://pluto.spaceports.com/~zman/armagetron/tron.html

(I didn’t announce it here earlier because it’s too similar to Andreas’
game glTron :-D)

Manuel Moos mmoos at ix.urz.uni-heidelberg.de

I’m glad to hear there is finally an explanation for the odd Voodoo
crash bug. Don’t feel too bad about having a project similar to
Andreas’, Manuel; when we first heard about your project in the IRC
channel we were all teasing Andreas about starting our own tron
projects. He took it rather well, too.

You did see him cry :/
Poor lad.

:)

Martin

On Fri, 9 Jun 2000, Phoenix Kokido wrote:

Bother! said Odo, as Pooh ran out and vaporized him.

Thanks to the two people who gave me answers to this… I shall be playing
with that when I get home from work :)

Ben> ----- Original Message -----

From: Manuel Moos [mailto:mmoos@ix.urz.uni-heidelberg.de]
Sent: 09 June 2000 13:33
To: sdl at lokigames.com
Subject: RE: [SDL] SDL & OpenGL

On Fri, 9 Jun 2000, Ben Burns wrote:

Is there a fix in 1.1.3 for the fullscreen voodoo3 win32 crashing
problem???

Ben

I probably should have mailed this one earlier, but I wanted to make
sure it is wanted/needed/correct first: on the GLSetup pages
at http://www.glsetup.com/ they say that a number of cards (including
the Voodoo3) can’t work in OpenGL if the screen is initialized with
DirectX functions. Just disabling it with the lines

#ifdef WIN32
// disable DirectX by default; it causes problems with some boards.
if ( !getenv("SDL_VIDEODRIVER") ) {
    putenv("SDL_VIDEODRIVER=windib");
}
#endif

before initializing SDL in your OpenGL app should solve the problem (well,
it worked for me…)

And if you want to know what I’ve got to do with SDL: I use it
for my game Armagetron at

http://pluto.spaceports.com/~zman/armagetron/tron.html

(I didn’t announce it here earlier because it’s too similar to Andreas’
game glTron :-D)

Manuel Moos mmoos at ix.urz.uni-heidelberg.de

#ifdef WIN32
if ( !getenv("SDL_VIDEODRIVER") )
{
    putenv("SDL_VIDEODRIVER=windib");
}
#endif

I just thought I should point out that this is not, as it might seem, an ugly hack
to get it working. The Microsoft driver conformance tests for OpenGL do not
require that the driver can render to DirectDraw surfaces (SDL uses DirectDraw
surfaces by default), and therefore there are a bunch of drivers that do not work
that way, and even more that are more or less unstable.
There is no possibility (short of getting in bed with Billy-boy) for SDL to fix
this, so the only good way to do it is to use GDI (hopefully SDL will in later
versions take some kind of OpenGL hint and use GDI automatically).

I just thought I should point out that this is not, as it might seem, an ugly hack
to get it working. The Microsoft driver conformance tests for OpenGL do not
require that the driver can render to DirectDraw surfaces

SDL itself doesn’t require that a DirectDraw surface be present for GL
rendering. SDL gets the GL device context from the window, which
theoretically shouldn’t be bound to DirectDraw.

Is it just the fact that a DirectDraw primary surface has been created
that is causing the 3Dfx GL to crash? If so, it would be very easy to
prevent the creation of the DirectDraw surfaces when a GL mode is being
set. In fact this should probably be done anyway, to conserve system
resources. Can anyone verify that this would solve the problem?

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

I just thought I should point out that this is not, as it might seem, an ugly hack
to get it working. The Microsoft driver conformance tests for OpenGL do not
require that the driver can render to DirectDraw surfaces

SDL itself doesn’t require that a DirectDraw surface be present for GL
rendering. SDL gets the GL device context from the window, which
theoretically shouldn’t be bound to DirectDraw.

Is it just the fact that a DirectDraw primary surface has been created
that is causing the 3Dfx GL to crash? If so, it would be very easy to
prevent the creation of the DirectDraw surfaces when a GL mode is being
set. In fact this should probably be done anyway, to conserve system
resources. Can anyone verify that this would solve the problem?

Well, I can’t check it, but the GLSetup guys seem to have done that
and they say on http://www.glsetup.com/ihvs/3dfx/voodoo3/index.htm :

Developers: The Voodoo3 ICD does not draw if you use
DirectDraw::SetDisplayMode to switch the display mode.

this function (I assume it’s the same as IDirectDraw2_SetDisplayMode())
is only called in the function DX5_SetVideoMode() in SDL_dx5video.c, so
replacing it there with the corresponding normal function (if that’s
possible) if the GL flag is set could solve the problem. Removing only the
surface creation would not work, I guess.

Hope that helps,

Manuel Moos

On Wed, 14 Jun 2000, Sam Lantinga wrote:

Developers: The Voodoo3 ICD does not draw if you use
DirectDraw::SetDisplayMode to switch the display mode.

this function (I assume it’s the same as IDirectDraw2_SetDisplayMode())
is only called in the function DX5_SetVideoMode() in SDL_dx5video.c, so
replacing it there with the corresponding normal function (if that’s
possible) if the GL flag is set could solve the problem. Removing only the
surface creation would not work, I guess.

Hope that helps,

Yes it does, thanks.

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

O.K. it looks like my last message got munched - let me try again…

When using the combined SDL & OpenGL video mode (aka SDL_OPENGLBLIT),
it seems alpha information contained in an SDL surface is lost when
blitting to the screen. The effect is that SDL surfaces and OpenGL
renderings do not blend at all. Is this a bug? A limitation? A planned
feature? A silly thing to ask for?

Again, any insight would be greatly appreciated…

Cheers,

-Smitty–


  • HOMEOWNER ENVY: Feelings of jealousy generated in the young and the
    disenfranchised when facing gruesome housing statistics.
    _____________________________________________________
    Dave Schmitz Interact-TV
    smitty at interact-tv.com http://www.interact-tv.com
    Ph: 720.406.9399 Fax: 720.406.8424

Dave Schmitz wrote:

O.K. it looks like my last message got munched - let me try again…

When using the combined SDL & OpenGL video mode (aka SDL_OPENGLBLIT),
it seems alpha information contained in an SDL surface is lost when
blitting to the screen. The effect is that SDL surfaces and OpenGL
renderings do not blend at all. Is this a bug? A limitation? A planned
feature? A silly thing to ask for?

It’s a limitation, but it’s not silly to ask for it. I’d like the
same thing…

-Ray

When using the combined SDL & OpenGL video mode (aka SDL_OPENGLBLIT),
it seems alpha information contained in an SDL surface is lost when
blitting to the screen. The effect is that SDL surfaces and OpenGL
renderings do not blend at all. Is this a bug? A limitation? A planned
feature? A silly thing to ask for?

It’s a limitation, but it’s not silly to ask for it. I’d like the
same thing…

I already wrote in an earlier reply that it is possible, though it is
unclear what the original poster wanted. If you just want to copy the
pixels including the alpha values, make sure SDL_SRCALPHA isn’t set.

Since no one seems to be biting my question about porting from GLUT to SDL,
here’s the same question boiled down to its essence:

Does calling SDL_SetVideoMode() more than once cause the pre-existing OpenGL
context to be destroyed and a new one created? Would this be the cause of
losing texture objects between window resize and fullscreen toggles?

-Blake

I’m surprised nobody else has answered yet; I saw your message several
hours ago and flagged it for later reply when I wasn’t in the middle of
about six other things…

SDL_SetVideoMode() will destroy the old context and create a new one. You
must re-upload textures and the like, as you would normally expect when
working with an OpenGL context. I’d need to look over GLUT again to see
what it does for you to make this easier, but I suspect you could do the
same yourself and clone that functionality without much real pain beyond
making sure that your code is restartable. For a large application, it
might in fact be less painful.

On Mon, Apr 22, 2002 at 10:00:05PM -0700, Blake Senftner wrote:

Since no one seems to be biting my question about porting from GLUT to SDL,
here’s the same question boiled down to its essence:

Does calling SDL_SetVideoMode() more than once cause the pre-existing OpenGL
context to be destroyed and a new one created? Would this be the cause of
losing texture objects between window resize and fullscreen toggles?


Joseph Carter Certified free software nut

But IANAL, of course.

IANAL either. My son is, but if I asked him I might get an answer I
wouldn’t want to hear.

“Here’s my invoice.” ? =D


SDL_SetVideoMode() will destroy the old context and create a new one. You
must reupload textures and the like as you would normally expect for
working with an OpenGL context. I’d need to look over GLUT again to see
what it does for you to make this easier, but I suspect you could do the
same yourself and clone that functionality without much real pain beyond
making sure that your code is restartable. For a large application,
possibly it would be less painful, in fact.




Interesting that my older GLUT program does not lose its OpenGL context
between window resize or fullscreen toggle events… at least on win32, my
development platform.

<…brief time passage…>

I just looked over what GLUT does: (win32 logic only, but both fullscreen
and resize events)

#if defined(_WIN32)
    if ( workMask & GLUT_FULL_SCREEN_WORK ) {
        DWORD s;
        RECT r;

        GetWindowRect(GetDesktopWindow(), &r);
        s = GetWindowLong(window->win, GWL_STYLE);
        s &= ~WS_OVERLAPPEDWINDOW;
        s |= WS_POPUP;
        SetWindowLong(window->win, GWL_STYLE, s);
        SetWindowPos(window->win,
            /*HWND_TOP, <-- safer? */
  HWND_TOPMOST, /* is better, but no windows atop it */
            r.left, r.top,
            r.right-r.left, r.bottom-r.top,
            SWP_FRAMECHANGED);

        /* This hack causes the window to go back to the right position
        when it is taken out of fullscreen mode. */
        {
            POINT p;

            p.x = 0;
            p.y = 0;
            ClientToScreen(window->win, &p);
            window->desiredConfMask |= CWX | CWY;
            window->desiredX = p.x;
            window->desiredY = p.y;
        }
    } else {
        RECT changes;
        POINT point;
        UINT flags = SWP_NOACTIVATE | SWP_NOMOVE | SWP_NOOWNERZORDER |
            SWP_NOSENDCHANGING | SWP_NOSIZE | SWP_NOZORDER;
        DWORD style;

        GetClientRect(window->win, &changes);
        style = GetWindowLong(window->win, GWL_STYLE);

        /* Get rid of fullscreen mode, if it exists */
        if ( style & WS_POPUP ) {
            style &= ~WS_POPUP;
            style |= WS_OVERLAPPEDWINDOW;
            SetWindowLong(window->win, GWL_STYLE, style);
            flags |= SWP_FRAMECHANGED;
        }

        /* If this window is a toplevel window, translate the 0,0 client
        coordinate into a screen coordinate for proper placement. */
        if (!window->parent) {
            point.x = 0;
            point.y = 0;
            ClientToScreen(window->win, &point);
            changes.left = point.x;
            changes.top = point.y;
        }
        if (window->desiredConfMask & (CWX | CWY)) {
            changes.left = window->desiredX;
            changes.top = window->desiredY;
            flags &= ~SWP_NOMOVE;
        }
        if (window->desiredConfMask & (CWWidth | CWHeight)) {
            changes.right = changes.left + window->desiredWidth;
            changes.bottom = changes.top + window->desiredHeight;
            flags &= ~SWP_NOSIZE;
            /* XXX If overlay exists, resize the overlay here, ie.
            if (window->overlay) ... */
        }
        if (window->desiredConfMask & CWStackMode) {
            flags &= ~SWP_NOZORDER;
            /* XXX Overlay support might require something special here. */
        }

        /* Adjust the window rectangle because Win32 thinks that the x, y,
        width & height are the WHOLE window (including decorations),
        whereas GLUT treats the x, y, width & height as only the CLIENT
        area of the window. Only do this to top level windows
        that are not in game mode (since game mode windows do
        not have any decorations). */
        if (!window->parent && window != __glutGameModeWindow) {
            AdjustWindowRect(&changes, style, FALSE);
        }

        /* Do the repositioning, moving, and push/pop. */
        SetWindowPos(window->win,
            window->desiredStack == Above ? HWND_TOP : HWND_BOTTOM,
            changes.left, changes.top,
            changes.right - changes.left, changes.bottom - changes.top,
            flags);

        /* Zero out the mask. */
        window->desiredConfMask = 0;
    }

I notice that there is no destruction of the window context, simply resize
logic for the window. I understand that for other operating systems simple
resize logic like this may not be possible. Or is it? I have never been much
of an expert on creating windows and such on any operating system (most of
my career has been on consoles).

For now I can easily add the logic to my system to delete all the texture
objects and recreate them upon resize/fullscreen events. But I would also
think that for many of the hobbyist programmers who are now using SDL,
this type of maintenance would be the reason that they don’t support
resizable windows.

Is it possible for the range of OS that SDL supports to handle resize and
fullscreen events without destroying and recreating the OpenGL context?

Also, when the OpenGL context is destroyed, is all the memory associated
with that context freed (such as display lists, texture objects,
pBuffers…) or are all those items left dangling? (Still consuming memory.)

Thanks for any info,
-Blake

----- Original Message -----
From: knghtbrd@bluecherry.net (Joseph Carter)
To:
Sent: Tuesday, April 23, 2002 12:38 AM
Subject: Re: [SDL] SDL & OpenGL