OpenGL X Crash

OK,

I’ve got a TNT2 and a Voodoo 2 on the same computer, so I can test
the two drivers on the same system.

The results: all the answers to the original message are right:

  • The NVIDIA driver ( mine is utah-glx ) is buggy,
  • The 3Dfx driver is good.

And the problem is that a good NVIDIA driver doesn’t seem to be coming …

And just a little other thing :

You said :
“Now, I realize that this isn’t a ‘normal’ game application because it’s
updating the screen way too fast.”

Yes and no: when you start a game, there is often a menu, and that isn’t
any more complex than your code!

That’s all.

Nicox.

Hi everybody…

Here’s the trouble I’m having (I’m including the following code as an
example):

#include <stdio.h>
#include <string.h>
#include <stdlib.h>

#include <SDL.h>
#include <GL/gl.h>
#include <GL/glu.h>

int main() {
    SDL_Init( SDL_INIT_VIDEO );

    SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 5 );
    SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 5 );
    SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 5 );
    SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    if ( SDL_SetVideoMode( 800, 600, 16, SDL_OPENGL ) == NULL ) {
        fprintf( stderr, "Couldn't set GL mode: %s\n", SDL_GetError() );
        exit( 1 );
    }

    glViewport( 0, 0, 800, 600 );
    glClearColor( 1.0, 1.0, 1.0, 1.0 );
    glMatrixMode( GL_PROJECTION );
    glLoadIdentity();
    gluPerspective( 90, 800.0/600.0, 1, 100 );  /* note: 800/600 would truncate to 1 */
    glMatrixMode( GL_MODELVIEW );
    glLoadIdentity();

    while ( 1 ) {
        glLoadIdentity();
        glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );

        glTranslatef( -1.5, 0.0, -6.0 );

        glBegin( GL_POLYGON );
        glColor3f( 1.0, 0.0, 0.0 );
        glVertex3f( 0.0, 1.0, 0.0 );
        glColor3f( 0.0, 1.0, 0.0 );
        glVertex3f( 1.0, -1.0, 0.0 );
        glColor3f( 0.0, 0.0, 1.0 );
        glVertex3f( -1.0, -1.0, 0.0 );
        glEnd();

        SDL_GL_SwapBuffers();

        SDL_Event event;
        if ( SDL_PollEvent( &event ) > 0 ) {
            if ( event.type == SDL_KEYDOWN )
                break;
        }
    }

    SDL_Quit();
    return 0;
}

As you can see it’s pretty boilerplate.

Now, the problem is that running this program and hitting a key will crash X
and dump me unceremoniously back to XDM. Now, I realize that this isn’t a
"normal" game application because it’s updating the screen way too fast, and
indeed if I insert a sleep(1) after the rendering code then everything is
fine. However, I’m left wondering why this is crashing X, and furthermore
what the smallest delay is that keeps it from crashing… perhaps
if I simply inserted a delay before quitting the application it would be OK;
however, I haven’t tested this, and I’m still wondering why this would be
necessary, since it isn’t documented anywhere.

Anyway, just thought I’d give you guys something to mull over. My system is
an Intel Pentium II/400, TNT2 (using the older, more stable (?) nvidia glx
driver), glibc 2.1.

Thanks in advance,

      -Will Weisser


On Tue, Mar 07, 2000 at 11:56:41AM -0500, Will Weisser wrote the message above.

This is a bug in your X/GL driver. The program works fine on my 3DFX.

    -m.

Programmer, Loki Entertainment Software
http://lokigames.com/~briareos/
"I wrote a song about dental floss, but did anyone’s teeth get cleaner?"
    - Frank Zappa, re: the PMRC

In article <8a3cb3$s6f$1 at ftp.lokigames.com>, “Will Weisser” wrote:

Now, the problem is that running this program and hitting a key will crash X
and dump me unceremoniously back to XDM. Now, I realize that this isn’t a
"normal" game application because it’s updating the screen way too fast, and
indeed if I insert a sleep(1) after the rendering code then everything is
fine. However, I’m left wondering why this is crashing X, and furthermore
what the smallest delay is that keeps it from crashing… perhaps
if I simply inserted a delay before quitting the application it would be OK;
however, I haven’t tested this, and I’m still wondering why this would be
necessary, since it isn’t documented anywhere.

I don’t know why this code is crashing, but it certainly isn’t related to a missing
delay. I’m writing a visualization plugin that is similar to your code in that it
creates a rendering thread with an endless loop that does the drawing.

I compiled your program on my machine and it doesn’t crash X. Since my
system specs are mostly similar to yours (PIII/733, glibc 2.1), except for
the OpenGL drivers (I’m using software Mesa and Mesa/3dfx), I guess it is your
OpenGL drivers that can’t cope with the stress.

--------------------------------------------------------------
Christian Zander ** Nöckersberg 76 ** 45257 Essen ** Germany
email: mbox at minion.de ** www: www.minion.de ** icq#: 5322926

Will Weisser wrote (message quoted in full above):

I assume you mean the X server available on the nvidia web page. In this case
it’s garbage. Any use of glx quickly leads to either a server lockup or crash
for me. Under Nvidia + Mesa 3.0 the application I’m developing would instantly
crash the X server; I changed to Mesa 3.1 and now it will run, but it locks the X
server if I switch from full screen to windowed.

I’ve given up on Nvidia’s accelerated glx and am using software Mesa now: slow
but sure. Hopefully something decent and easy to install will be available
shortly after the expected release of XFree86 4.0 this month. I don’t have much
confidence in Nvidia’s ability to write good drivers; the Windows drivers for
the TNT I’ve used on my brother’s machine are fast but buggy (by using different
versions you can at least pick which bugs you want to live with).

Yeah, I struggled with NVIDIA’s crappy glx implementation at first, but
now I’m running Utah-GLX with TNT2 Ultra drivers under X 3.3.6. The agp
support is STILL non-existent, but at least I have hardware acceleration
under 32bpp.

  • Get XFree86 3.3.6 from www.xfree86.org
  • Get Utah-GLX from SourceForge - documentation is great!
  • Don’t forget to grab the Mesa-3.2 CVS code from Utah-GLX
  • Compile X 3.3.6 first, then Utah-GLX, etc.
  • As a bonus, apps already compiled for Mesa don’t need to be recompiled
    (via a symlink to the accelerated libraries)
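That last bullet, reusing Mesa-linked binaries, boils down to pointing the library name the apps were linked against at the accelerated build. The file names and directory below are made up for illustration (a real install would use the actual Mesa/Utah-GLX library names in a system library directory):

```shell
# Scratch-directory demonstration of the symlink trick.
mkdir -p /tmp/glx-link-demo
cd /tmp/glx-link-demo

# Stand-in for the accelerated library a Utah-GLX build would produce.
touch libGL-accelerated.so.1

# Applications that were linked against libGL.so.1 now resolve to it.
ln -sf libGL-accelerated.so.1 libGL.so.1
ls -l libGL.so.1
```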

Like I said, I had no luck getting RIVA agp modules to work, but hey - the
GLX docs say it doesn’t work. This is IMHO MUCH better than
software-only Mesa – quake-gl (quake.sourceforge.net) looks VERY good!!

Marcus

--- "M. R. Brown" wrote (see his message above):

It is very unlikely that the Utah-GLX riva implementation will ever
support either AGP or DMA. This is due to a lack of programming
information for the riva chipsets (there is some but it isn’t much and
it doesn’t cover AGP or DMA). To get those pieces the only hope is to
wait until nvidia releases their closed source DRI driver. On the other
hand, ?Nathanial Hand? who is maintaining and improving the Utah-GLX
riva code is doing a terrific job at it and has squashed quite a lot of
bugs in the driver compared to the one found on nvidia’s web site (he’s
also made some improvements too).

=====
Jason Platt.

“In theory: theory and practice are the same.
In practice: they aren’t.”

ICQ# 1546328



M. R. Brown wrote in message
news:Pine.OSF.3.96.1000307171645.5592A-100000 at alpha3.csd.uwm.edu
(quoted in full above):

I tried the utah-glx stuff, including recompiling the kernel with the new
agp stuff, downloading everything from CVS, etc., etc., and it didn’t appear
to be working for me. Applications would just render in software, and the
module would give me an error like “unknown mga chipset” (why it was
looking for an MGA chipset when I have a TNT card, I have no clue). Maybe
I’ll try again soon… thanks for the help.

  -W.W.

“M. R. Brown” wrote (see his message above):

I assume (hope) that if I install the latest glxMesa-*-1.i586.rpm from
http://matroxusers.com/driver/linux.html, that is all I need to get my TNT working?
I presume it’s not really necessary to recompile X.

Dana

Yeah, you probably don’t want to recompile X, I just like to get down and
dirty to see how everything works.

Also, I might have spoken too soon (concerning the drivers). I downloaded
gltron 0.59 beta and compiled it with mikmod support.
When I run gltron, I get all the menus, but when I go to start a new game X
crashes – hard. No keyboard, mouse, etc., and I have to use SysRq to
unmount and reboot. The same crash happens with Tux: A Quest for Herring.
Other GL apps, e.g. QuakeForge-GL, work fine, so I dunno.

I guess it’s up to NVIDIA to give us some decent 3D-hardware support
(don’t get me wrong Utah-GLX is good!).

Don’t forget the module (glx.so) section in your XF86Config file.
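For reference, the module section he means looks something like this in an XFree86 3.3.x XF86Config (assuming glx.so is in the server’s module path; adjust if yours lives elsewhere):

```
Section "Module"
    Load "glx.so"
EndSection
```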

Marcus
