OpenGL textures and SDL_SetVideoMode()

My game uses SDL and OpenGL.
I have an options screen to change the resolution/depth and to switch
between fullscreen and windowed. To apply the changes I call
SDL_SetVideoMode() again, which seems to be the official method.
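
For reference, the call I make looks roughly like this (just a sketch - the
"Options" struct and apply_video_options() are my own hypothetical helpers,
not anything from SDL):

    #include "SDL.h"
    #include <stdio.h>

    /* Hypothetical settings chosen on my options screen */
    struct Options { int width, height, bpp, fullscreen; };

    static SDL_Surface *apply_video_options(const struct Options *opts)
    {
        Uint32 flags = SDL_OPENGL;
        if (opts->fullscreen)
            flags |= SDL_FULLSCREEN;

        /* Re-set the video mode with the new resolution/depth */
        SDL_Surface *screen = SDL_SetVideoMode(opts->width, opts->height,
                                               opts->bpp, flags);
        if (!screen)
            fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        return screen;
    }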

The implication seems to be that SDL_SetVideoMode destroys and recreates
the GL context, so it looks like I’d have to rebuild my textures.
Is this always the case?
And if not, how do I tell that I’m in a situation where I have to
rebuild my textures?

Do I have to call glGenTextures() again, or do I reuse just the existing
texture names?

Is the behaviour the same on all platforms?
(There was some discussion in the list archives about Windows losing
surfaces, but maybe that’s a completely separate issue)…

Thanks,
Ben.

Everything from me in this mail is IIRC - I’m not sure if it’s all correct - it should be, but I didn’t test it.

My game uses SDL and OpenGL.
I have an options screen to change the resolution/depth and to switch
between fullscreen and windowed. To apply the changes I call
SDL_SetVideoMode() again, which seems to be the official method.

You should also call SDL_QuitSubsystem(SDL_VIDEO) and SDL_InitSubsystem(SDL_VIDEO) before SDL_SetVideoMode() on all platforms, except Linux, where in some versions you don’t need to do it.

The implication seems to be that SDL_SetVideoMode destroys and recreates
the GL context, so it looks like I’d have to rebuild my textures.
Is this always the case?
And if not, how do I tell that I’m in a situation where I have to
rebuild my textures?

Do I have to call glGenTextures() again, or do I reuse just the existing
texture names?

Probably not; I read a thread on GameDev where someone claimed it works for him even when he reuses the old texture names.

HTH

Koshmaar

Koshmaar wrote:

You should also call SDL_QuitSubsystem(SDL_VIDEO) and
SDL_InitSubsystem(SDL_VIDEO) before SDL_SetVideoMode() on all platforms,
except Linux, where in some versions you don’t need to do it.

No, you don’t have to.

Regards,

Xavier

Koshmaar wrote:

Ben Campbell wrote:

I have an options screen to change the resolution/depth and to switch
between fullscreen and windowed. To apply the changes I call
SDL_SetVideoMode() again, which seems to be the official method.

You should also call SDL_QuitSubsystem(SDL_VIDEO) and
SDL_InitSubsystem(SDL_VIDEO) before SDL_SetVideoMode() on all
platforms, except Linux, where in some versions you don’t
need to do it.

OK, that makes sense.
Is there any reason I can’t just call SDL_QuitSubsystem() and then
SDL_InitSubsystem() on all platforms anyway?

The implication seems to be that SDL_SetVideoMode destroys and recreates
the GL context, so it looks like I’d have to rebuild my textures.
Is this always the case?
And if not, how do I tell that I’m in a situation where I have to
rebuild my textures?

Do I have to call glGenTextures() again, or do I reuse just the existing
texture names?

Probably not; I read a thread on GameDev where someone claimed it works for him even when he reuses the old texture names.

Hmmm… Guess we’re out of the SDL realm here and talking about OpenGL…
I’d have thought that if the context is destroyed, all the texture
objects go with it (although I know there is a way to tell contexts to
share textures).
I could imagine that a lot of OpenGL drivers implement glGenTextures and
glDeleteTextures as null operations and just have a big internal array
of texture pointers - this would explain why the guy on GameDev could
use old (unallocated) texture names… but really I’d like to know what
the ‘proper’ answer is here…
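
The only cross-platform check I can think of would be to ask GL directly
after the call - something along these lines (untested, just a sketch; I
haven’t verified how every driver behaves):

    #include "SDL.h"
    #include "SDL_opengl.h"

    /* After SDL_SetVideoMode(), ask GL whether an old name still refers to
       a live texture object. If the context really was destroyed, this
       should come back GL_FALSE and we know we have to re-upload. */
    static int texture_survived(GLuint old_texture_name)
    {
        return glIsTexture(old_texture_name) == GL_TRUE;
    }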

Ben.

OK, that makes sense.
Is there any reason I can’t just call SDL_QuitSubsystem() and then
SDL_InitSubsystem() on all platforms anyway?

Whoops, sorry, Xavier was right - you don’t need to reinitialize the video subsystem, just call SDL_SetVideoMode() and it should be fine. I checked it a few minutes ago with my engine and it worked. However…

Hmmm… Guess we’re out of the SDL realm here and talking about OpenGL…
I’d have thought that if the context is destroyed, all the texture
objects go with it (although I know there is a way to tell contexts to
share textures).
I could imagine that a lot of OpenGL drivers implement glGenTextures and
glDeleteTextures as null operations and just have a big internal array
of texture pointers - this would explain why the guy on GameDev could
use old (unallocated) texture names… but really I’d like to know what
the ‘proper’ answer is here…

… after toggling fullscreen, the textures were deleted. First I thought there was something wrong with my code, but then I found this:

http://www.gamedev.net/community/forums/topic.asp?topic_id=175510
http://www.gamedev.net/community/forums/topic.asp?topic_id=272440

So, the OpenGL context is destroyed => the textures are deleted => you have to reload them :-/

Koshmaar

On Thu, 9 Jun 2005 19:10:26 +0200,
"Koshmaar" wrote:

… after toggling fullscreen, the textures were deleted. First I thought
there was something wrong with my code, but then I found this:

http://www.gamedev.net/community/forums/topic.asp?topic_id=175510
http://www.gamedev.net/community/forums/topic.asp?topic_id=272440

So, the OpenGL context is destroyed => the textures are deleted => you have to
reload them :-/

The problem is that OpenGL context destruction is platform-dependent.
You may be on a platform where a simple resize doesn’t destroy the
context but changing the bpp does. So SDL should inform us when the
context has been destroyed, which would avoid some #ifdefs in the code.
I proposed adding a specific event a while ago, but nobody answered.

--
Patrice Mandin
WWW: http://membres.lycos.fr/pmandin/
Linux, Atari programmer
Specialty: development, games

Patrice Mandin wrote:

On Thu, 9 Jun 2005 19:10:26 +0200,
"Koshmaar" wrote:

… after toggling fullscreen, the textures were deleted. First I thought
there was something wrong with my code, but then I found this:

http://www.gamedev.net/community/forums/topic.asp?topic_id=175510
http://www.gamedev.net/community/forums/topic.asp?topic_id=272440

So, the OpenGL context is destroyed => the textures are deleted => you have to
reload them :-/

The problem is that OpenGL context destruction is platform-dependent.
You may be on a platform where a simple resize doesn’t destroy the
context but changing the bpp does. So SDL should inform us when the
context has been destroyed, which would avoid some #ifdefs in the code.
I proposed adding a specific event a while ago, but nobody answered.

Ahh - yep. Platform-specific differences. Thanks for the explanation!

I think you’re right that some sort of notification is in order. However
I’d feel a little uneasy about the idea of a context-lost event, because
it’s not immediately obvious when it’ll get handled.
For example, the chances are that the event won’t be caught and handled
until the app next does some event processing. So, when
SDL_SetVideoMode() returns, your recovery code (to recreate your
textures, display lists, vertex programs etc) hasn’t yet run, so you
can’t immediately start drawing. In fact there could be other events
ahead of the context-lost notification in the event queue, and you could
get odd behaviour if they try drawing with textures that no longer exist…

Another solution would be to enforce the context destroy/recreate on
every SDL_SetVideoMode() call, even if it wasn’t strictly required by
specific platforms. So you’d know that your textures etc. had been
zapped every time you called SDL_SetVideoMode(). I don’t really know
what the implications of this are, and it does feel a little bit silly,
but maybe it’s worth it to achieve consistent behaviour across platforms…
(not sure I like this idea much either :-)

The route I plan to go down for now is to assume the context will be
recreated when SDL_SetVideoMode() is called. So before I make the call
I’ll destroy all my textures, and recreate them after the call returns.
This should work fine no matter what SDL_SetVideoMode() does to the
context…
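
In code the plan looks something like this (a rough sketch - the
texture_names/texture_count bookkeeping and reload_all_texture_images()
are my own hypothetical functions, not SDL or GL calls):

    #include "SDL.h"
    #include "SDL_opengl.h"

    extern GLuint texture_names[];   /* names I generated earlier */
    extern int texture_count;
    extern void reload_all_texture_images(void);  /* glGenTextures + re-upload */

    static int change_mode(int w, int h, int bpp, Uint32 flags)
    {
        /* Throw the old textures away while the old context is still valid */
        glDeleteTextures(texture_count, texture_names);

        if (!SDL_SetVideoMode(w, h, bpp, flags | SDL_OPENGL))
            return -1;

        /* Recreate everything against whatever context we now have */
        reload_all_texture_images();
        return 0;
    }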

I wouldn’t mind adding some of this stuff to the SDL documentation - is
there a definitive place for docs? There are man pages and HTML under
CVS, and also the web-based doc project and wiki. Is there a 'master'
copy somewhere, or should I just make changes/submit patches to all of them?

Ben.

On Fri, 10 Jun 2005 10:34:37 +0100,
Ben Campbell wrote:

The problem is that OpenGL context destruction is platform-dependent.
You may be on a platform where a simple resize doesn’t destroy the
context but changing the bpp does. So SDL should inform us when the
context has been destroyed, which would avoid some #ifdefs in the code.
I proposed adding a specific event a while ago, but nobody answered.

Ahh - yep. Platform-specific differences. Thanks for the explanation!

I think you’re right that some sort of notification is in order.
However I’d feel a little uneasy about the idea of a context-lost
event, because it’s not immediately obvious when it’ll get handled.
For example, the chances are that the event won’t be caught and
handled until the app next does some event processing. So, when
SDL_SetVideoMode() returns, your recovery code (to recreate your
textures, display lists, vertex programs etc) hasn’t yet run, so you
can’t immediately start drawing. In fact there could be other events
ahead of the context-lost notification in the event queue, and you
could get odd behaviour if they try drawing with textures that no
longer exist…

Another solution would be to enforce the context destroy/recreate on
every SDL_SetVideoMode() call, even if it wasn’t strictly required by
specific platforms. So you’d know that your textures etc. had been
zapped every time you called SDL_SetVideoMode(). I don’t really know
what the implications of this are, and it does feel a little bit
silly, but maybe it’s worth it to achieve consistent behaviour across
platforms… (not sure I like this idea much either :-)

The route I plan to go down for now is to assume the context will be
recreated when SDL_SetVideoMode() is called. So before I make the call
I’ll destroy all my textures, and recreate them after the call
returns. This should work fine no matter what SDL_SetVideoMode() does
to the context…

Maybe we could add an SDL_NEWOPENGLCONTEXT flag to SDL_Surface->flags?

If it is set, an OpenGL context has been created (on first run) or the
context has been destroyed and recreated (mode change on win32); it is left
at 0 if the OpenGL context was not touched (x11 or some other platform).

--
Patrice Mandin
WWW: http://membres.lycos.fr/pmandin/
Linux, Atari programmer
Specialty: development, games

Patrice Mandin wrote:

On Fri, 10 Jun 2005 10:34:37 +0100,
Ben Campbell wrote:

The route I plan to go down for now is to assume the context will be
recreated when SDL_SetVideoMode() is called. So before I make the call
I’ll destroy all my textures, and recreate them after the call
returns. This should work fine no matter what SDL_SetVideoMode() does
to the context…

Maybe we could add an SDL_NEWOPENGLCONTEXT flag to SDL_Surface->flags?

If it is set, an OpenGL context has been created (on first run) or the
context has been destroyed and recreated (mode change on win32); it is left
at 0 if the OpenGL context was not touched (x11 or some other platform).

That sounds like a reasonable approach - it makes it really easy to
detect when the context changes.
The client app would have to clear the flag itself, but that can be
documented, and it does nail down an exact detection method that would
be uniform across platforms…
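
If I understand the proposal, client code would look something like this
(entirely hypothetical - SDL_NEWOPENGLCONTEXT doesn’t exist in SDL today,
and reload_gl_resources() is my own function):

    extern void reload_gl_resources(void);  /* rebuild textures, lists, etc. */

    static void set_mode_and_recover(int w, int h, int bpp, Uint32 flags)
    {
        SDL_Surface *screen = SDL_SetVideoMode(w, h, bpp, SDL_OPENGL | flags);

        /* Hypothetical: the proposed flag would only be set when the driver
           had to (re-)create the GL context during this call. */
        if (screen && (screen->flags & SDL_NEWOPENGLCONTEXT)) {
            reload_gl_resources();
            screen->flags &= ~SDL_NEWOPENGLCONTEXT;  /* client clears it itself */
        }
    }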

I might have a dig about in the SDL source this weekend and give this a go.

Anyone else have any comments/improvements on this idea?

Ben.

On Fri, 10 Jun 2005 16:22:25 +0100,
Ben Campbell wrote:

Maybe we could add an SDL_NEWOPENGLCONTEXT flag to SDL_Surface->flags?

If it is set, an OpenGL context has been created (on first run) or
the context has been destroyed and recreated (mode change on win32);
it is left at 0 if the OpenGL context was not touched (x11 or some
other platform).

That sounds like a reasonable approach - it makes it really easy to
detect when the context changes.
The client app would have to clear the flag itself, but that can be
documented, and it does nail down an exact detection method that would
be uniform across platforms…

You can let SDL_SetVideoMode() in SDL_video.c always clear the flag
before calling the driver’s GL_MakeCurrent() function. And the various
video drivers will set the needed bit whenever they (re-)create the
OpenGL context.

From the application’s point of view, the screen’s SDL_Surface
information should be treated as read-only.

I might have a dig about in the SDL source this weekend and give this
a go.

It should be quite easy to do.

--
Patrice Mandin
WWW: http://membres.lycos.fr/pmandin/
Linux, Atari programmer
Specialty: development, games