Linux GL+FullScreen+nVidia problem

Problem: OpenGL fullscreen on Linux/nVidia segfaults in
SDL_GL_SwapBuffers().

Setup: Red Hat Linux 7.1 + official patches
- kernel-2.4.2-2
- glibc-2.2.2-10
- binutils-2.10.91.0.2-3
- gcc-2.96-81
- XFree86-4.0.3-5
SDL-1.2.1 (tried both the RPMs and compiling from source)
nVidia OpenGL drivers 1.0-1251

Detail:

This simple program causes a segv inside SDL_GL_SwapBuffers. The same
program with the SDL_FULLSCREEN flag removed does not. Worse, it gives
the nVidia drivers some sort of headache: after the failure they segv in
the same call even for non-fullscreen SDL apps, until the X server is
restarted.

I’m sure this worked for me at one stage, but I don’t know what caused
it to stop working. It still fails when run against the SDL-1.1.7 shared
lib shipped with Red Hat 7.1 so I don’t think it’s an SDL upgrade that
broke it.

Questions:

- Is this a (known?) nVidia driver bug? An X bug?

- Does the test program work for anyone else under Linux?

- Have I done something stupid?

- Can anyone provide a minimal non-SDL equivalent I can
  try / use as a bug-report?

As you might expect, this is super-annoying to debug.

  • Mike

/-------------------- buggered.c --------------------/
#include <SDL/SDL.h>
#include <GL/gl.h>

int
main(int argc, char *argv[])
{
    SDL_Surface *window;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    /* Ask for a double-buffered context, then a fullscreen GL mode. */
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    window = SDL_SetVideoMode(640, 480, 0, SDL_OPENGL | SDL_FULLSCREEN);
    if (window == NULL)
        goto cleanup;

    /* Clear to black and swap; the swap is where the segv happens. */
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapBuffers();

cleanup:
    SDL_Quit();
    return 0;
}
/----------------------------------------------------/


Well, I have written some very small programs for school purposes using
SDL/OpenGL on NVidia. They always worked well on this hardware:

  • Creative Labs TNT1, 16 MB
  • Hercules Guillemot TNT2, 32 MB
  • ASUS GeForce (I cannot remember if it was a GeForce 1 or 2)

SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

In fact, my own code does not use this attribute (but I keep using
SDL_GL_SwapBuffers and it works without any crashes). Try removing this
line.

window = SDL_SetVideoMode(640, 480, 0, SDL_OPENGL|SDL_FULLSCREEN);

Test afterward whether the depth you actually got is 32-bit. I may be
wrong, but it seems you may need to request a 24-bit depth with a TNT1
card (e.g. SDL_SetVideoMode(640, 480, 24, SDL_OPENGL|SDL_FULLSCREEN)).
My TNT1 is a bit buggy in 2D (not OpenGL) 32-bit modes under X.
I noticed on my old TNT1 that if you ask for a 24-bit window, you get a
proper SDL_Surface (24-bit depth) while internally OpenGL works in a
32-bit 3D hardware mode, giving you an OpenGL alpha channel you can use.
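
A minimal sketch of that check, written as a standalone variant of
buggered.c (the file name is made up, and this is untested on the
hardware in question):

/-------------------- depthcheck.c --------------------/
#include <stdio.h>
#include <SDL/SDL.h>

/* Request a 24-bit fullscreen GL mode, per the TNT1 suggestion above,
   and report the depth SDL actually granted. */
int
main(int argc, char *argv[])
{
    SDL_Surface *window;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    window = SDL_SetVideoMode(640, 480, 24, SDL_OPENGL | SDL_FULLSCREEN);
    if (window != NULL)
        printf("granted depth: %d bpp\n", window->format->BitsPerPixel);
    SDL_Quit();
    return 0;
}
/-------------------------------------------------------/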

Have you also tried setting SDL_ANYFORMAT? Have you tried using
SDL_OPENGLBLIT (even though things ought to end up working the same
way)? Both variations are sketched below.
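
For reference, here are the two variations as drop-in replacements for
the SDL_SetVideoMode line in buggered.c (both flags exist in SDL 1.2;
this is only a sketch of the calls, not a tested fix):

/* Let SDL pick whatever pixel format the hardware prefers. */
window = SDL_SetVideoMode(640, 480, 0,
                          SDL_OPENGL | SDL_FULLSCREEN | SDL_ANYFORMAT);

/* Or the blit-capable GL surface (normally slower, but worth a test). */
window = SDL_SetVideoMode(640, 480, 0,
                          SDL_OPENGLBLIT | SDL_FULLSCREEN);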

----- Original Message -----
From: mib@roadrunner.its.unimelb.edu.au (Mike Battersby)
To:
Sent: Friday, June 22, 2001 5:21 AM
Subject: [SDL] Linux GL+FullScreen+nVidia problem

NVIDIA+SDL seem to have a problem when you call SDL_GL_SwapBuffers
without having flushed the graphics, so you should always do:
glFlush();
SDL_GL_SwapBuffers();

In addition, you do not initialize OpenGL correctly. Read the SDL
documentation for an example of how to initialize OpenGL: you must call
SDL_GL_SetAttribute to set the RGB color sizes, the Z-buffer size
(SDL_GL_DEPTH_SIZE), and double buffering (which is required for
SDL_GL_SwapBuffers to work).
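
A sketch of that fuller initialization, as it would slot into buggered.c
in place of the single SDL_GL_SetAttribute call (the attribute names are
SDL 1.2's; the specific sizes are only illustrative):

SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
window = SDL_SetVideoMode(640, 480, 0, SDL_OPENGL | SDL_FULLSCREEN);
/* ... draw ... */
glFlush();              /* flush before swapping, as suggested above */
SDL_GL_SwapBuffers();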

Also, passing 0 as the bpp has not always worked for me. I would
recommend passing the bpp of your desktop, or using SDL_GetVideoInfo()
(or something similar) to get your desktop's BPP.


NVIDIA+SDL seem to have a problem when you call SDL_GL_SwapBuffers without
having flushed the graphics so you should always do:
glFlush();
SDL_GL_SwapBuffers();

Weird. I think on Windows, at least, swapping buffers implies a glFlush
with the NVIDIA drivers. But adding glFlush() before the swap-buffers
call is safe (as in, it won't hamper performance).

  • Daniel Vogel, Programmer, Epic Games Inc.

Mike Battersby wrote:

Problem: OpenGL fullscreen on Linux/nVidia segfaults in
SDL_GL_SwapBuffers().

This simple program causes a segv inside SDL_GL_SwapBuffers. The same
program with the SDL_FULLSCREEN flag removed does not. Worse than that,
it causes the nVidia drivers some sort of headache and they segv in the
same call even for non-fullscreen SDL apps after the failure until the
X server is re-started.

Thanks to all those that responded. The program I posted was the
simplest one I could make that reproduced the problem. I had already
tried using the SDL_GL_SetAttribute calls (as my app normally does), and
also getting the BPP from the SDL_GetVideoInfo return, but none of those
things were the problem.

As it turns out the problem isn’t an SDL problem, but something in
either X or the nVidia drivers. I can’t figure out why, but when using
KDM (the KDE xdm replacement) to start my X server I get two copies of
the NV-GLX extension loaded (as reported in xdpyinfo), which is causing
this problem.
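
For anyone who wants to check their own server, here is a small non-SDL
sketch using plain Xlib (XListExtensions returns the same extension list
xdpyinfo prints; the file name is made up):

/-------------------- nvglxcount.c --------------------/
#include <stdio.h>
#include <string.h>
#include <X11/Xlib.h>

/* Count how many times NV-GLX appears in the server's extension list. */
int
main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    char **ext;
    int i, n, count = 0;

    if (dpy == NULL)
        return 1;
    ext = XListExtensions(dpy, &n);
    for (i = 0; i < n; i++)
        if (strcmp(ext[i], "NV-GLX") == 0)
            count++;
    printf("NV-GLX listed %d time(s)\n", count);
    XFreeExtensionList(ext);
    XCloseDisplay(dpy);
    return 0;
}
/-------------------------------------------------------/

(Build with: cc nvglxcount.c -o nvglxcount -lX11)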

If I use ‘startx’ or regular xdm it doesn’t happen, and the app works
correctly. I guess I should try to figure out why kdm causes this
problem but all of my debugging steps so far have shown no differences
(same X startup call, command line options, env vars, etc.).

Bloody hell.

  • Mike

I just checked on my system, and xdpyinfo reported 20 instances of
NV-GLX. I am also using KDM (but not for long, now that I know). I suspect
that each time I log off I gather another copy.

Oddly, my SDL-based GL program is working just fine when I run it.

-Ray


It’s only fullscreen video modes that cause the crash. Thanks for the
confirmation, though; it’s nice to know somebody else has the same
symptoms.

  • Mike

Is there anywhere I can download an older version of the nVidia drivers
that works with SDL?
