Where have all my textures gone?

Hi,

Firstly, sorry if this is an old topic; I joined the list this
morning, so I’m way behind.

I’m working on an OpenGL/SDL game which employs texture mapping heavily.
I’m getting weird behavior under Windows using VC++ 6 that I’m not seeing
in Linux.

In my game loop, I have a key, ‘f’, which sets the screen resolution to
800x600 pixels using whatever color depth SDL originally chose from the
startup resolution of 1024x768. The function called is just
SDL_SetVideoMode.
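Roughly, the handler looks like this (a sketch; `bpp` and `flags` stand in for whatever values my real code captured at startup):

```c
/* Sketch of the 'f' key handler described above. The bpp and flags
 * parameters are placeholders for the values saved at startup. */
#include <SDL.h>

void handle_key(SDL_keysym *key, int bpp, Uint32 flags)
{
    if (key->sym == SDLK_f) {
        /* Drop from the startup 1024x768 to 800x600, keeping the
         * color depth SDL originally chose. */
        SDL_SetVideoMode(800, 600, bpp, flags | SDL_OPENGL);
    }
}
```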

In Linux, this works perfectly. My OpenGL scene gets redrawn perfectly at
the new resolution, with all the textures intact, etc. The scene just looks
like a smaller version of the original. With the exact same source code
compiled with VC++ 6 on Windows 2000, however, the behavior of the ‘f’
key is quite different. It causes my OpenGL scene to
be resized to the lower resolution all right, but now all the textures
for the polygons in the scene are gone! Is this a glitch in the SDL
implementation for MS Windows, or have I made a subtle memory-related SDL
programming error?

Finally, I have noticed a similar glitch in the Linux implementation of
SDL when switching resolutions in fullscreen mode. When the program
quits, SDL_Quit() does not reset the XFree86 resolution to what it was
before the SDL app changed it, so my XFree86 server is left in a
different video mode than the one it started in. This alteration of X,
of course, only happens if I elect to run the game in fullscreen mode,
since windowed mode doesn’t affect the screen resolution itself. In any
case, this is highly undesirable behavior. How have people worked
around this?

For both issues, note that I have installed the most recent (Aug 1) CVS
version of the SDL libraries and headers.

Thanks,
Curtis

Hi,

Firstly, sorry if this is an old topic; I joined the list this
morning, so I’m way behind.

Tried searching the list archives?

I’m working on an OpenGL/SDL game which employs texture mapping heavily.
I’m getting weird behavior under Windows using VC++ 6 that I’m not seeing
in Linux.

In Linux, this works perfectly. My OpenGL scene gets redrawn perfectly at
the new resolution, with all the textures intact, etc. The scene just looks
like a smaller version of the original. With the exact same source code
compiled with VC++ 6 on Windows 2000, however, the behavior of the ‘f’ key
is quite different. It causes my OpenGL scene to be resized to the lower
resolution all right, but now all the textures for the polygons in the
scene are gone! Is this a glitch in the SDL implementation for MS Windows,
or have I made a subtle memory-related SDL programming error?

The Windows code for SDL is likely destroying and recreating the OpenGL
context when you switch resolutions. You should release all of your
textures before calling SDL_SetVideoMode(), then reload them after it
returns. You should also save/restore any other OpenGL state you had
before, like the viewing transformation, etc.

On Sunday, August 4, 2002, at 02:44 PM, Curtis Cooper wrote:

Hello and Welcome to the SDL mailing list!

In WGL (OpenGL on Windows), when you recreate your rendering context,
all the OpenGL textures (and I think also display lists, states, etc.)
are lost. SDL_SetVideoMode() recreates the rendering context. (It must -
don’t blame SDL.) If I remember correctly, you must then recreate your
textures and other stuff.
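A minimal sketch of that recreate step (the SDL 1.2 and GL calls are real; `load_all_textures()`, `setup_gl_state()`, and the texture array are placeholders for your own loading code):

```c
#include <SDL.h>
#include <SDL_opengl.h>

#define NUM_TEXTURES 16
static GLuint tex_ids[NUM_TEXTURES];

/* Placeholders for your own code. */
extern void load_all_textures(GLuint *ids, int n);
extern void setup_gl_state(int w, int h);

void switch_resolution(int w, int h, int bpp, Uint32 flags)
{
    /* Free the old texture objects while the old context is valid. */
    glDeleteTextures(NUM_TEXTURES, tex_ids);

    /* On Windows this destroys and recreates the GL context. */
    SDL_SetVideoMode(w, h, bpp, flags | SDL_OPENGL);

    /* Rebuild everything that was bound to the old context. */
    setup_gl_state(w, h);
    load_all_textures(tex_ids, NUM_TEXTURES);
}
```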

About the resolution in X in Linux, look at the archive
http://www.libsdl.org/pipermail/sdl/2002-August/thread.html
"SDL not returning to proper resolution" appeared just lately…
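As a general belt-and-braces measure (a sketch, not something from that thread): registering SDL_Quit() with atexit() right after SDL_Init() helps ensure the video mode is restored even when the program exits unexpectedly:

```c
#include <stdio.h>
#include <stdlib.h>
#include <SDL.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Run SDL_Quit() on every exit path, so the cleanup that
     * restores the XFree86 video mode is not skipped. */
    atexit(SDL_Quit);

    /* ... set video mode, run the game loop ... */
    return 0;
}
```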

RK.
