NVIDIA & SDL_GL_SwapBuffers

System:
RH 7.3
SDL 1.2.4
GeForce4 Go (driver version 1.0-2960)

If I use a binary compiled before I installed the NVIDIA GLX and kernel
binary distributions (not the sources), the program works fine and is accelerated,
or seg faults, depending on the program. If I use a binary compiled after
installing the NVIDIA packages, I get garbage in the window.

I’ve found in my apps that there seems to be a problem with SDL_GL_SwapBuffers,
in that I can’t even write a program that simply clears the buffer and then flips
(it compiles, but I get the same garbage in the window).

I have tried putting glFlush() right before the call, as suggested in a list email
from a year ago, but it doesn’t help.
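
For the record, the failing test is essentially just this (a minimal sketch of the
kind of program I mean; the clear color and window size are arbitrary, and it is
built with something like gcc testgl.c `sdl-config --cflags --libs` -lGL):

    #include <SDL.h>
    #include <GL/gl.h>

    int main(int argc, char *argv[])
    {
        if (SDL_Init(SDL_INIT_VIDEO) < 0)
            return 1;

        SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
        if (SDL_SetVideoMode(640, 480, 0, SDL_OPENGL) == NULL) {
            SDL_Quit();
            return 1;
        }

        glClearColor(0.0f, 0.0f, 1.0f, 1.0f);  /* solid blue, easy to spot */
        glClear(GL_COLOR_BUFFER_BIT);
        glFlush();                  /* the suggested workaround; no effect here */
        SDL_GL_SwapBuffers();
        SDL_Delay(3000);            /* hold the window so the result is visible */

        SDL_Quit();
        return 0;
    }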

If it helps, I get the same results with a GeForce3 Ti 500 on the same setup.

Has anyone else experienced this?
More importantly, do you know of a workaround?

-bart

I had the exact same problem, but I don’t know exactly what caused it, because I
switched to Gentoo, and on Gentoo I don’t have the problem.

I think I started using the NVIDIA includes, and after that it didn’t work, but I
simply made my switch to Gentoo a little earlier than I otherwise would have,
and problem solved. :)

Try uninstalling the NVIDIA drivers and see if it works with the original nv
driver. If it does, take the NVIDIA source RPMs, build your own RPMs from them,
and install those (take both the kernel and GLX sources).

I’ve experienced this on Debian, but it was Debian’s fault, not NVIDIA’s.
Do other OpenGL binaries work, such as Quake 3 or precompiled versions of
games like Armagetron? It could very well be a problem with your build
environment.

Additionally, does the ldd output of your binary contain a line like
libGL.so.1.something? If not, I know exactly what the problem is. More
information is required to accurately diagnose your problem.
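
For example (the binary name is a placeholder, and the path and load address
will differ on your system), a healthy dynamically linked binary shows
something like:

    $ ldd ./testgl | grep libGL
            libGL.so.1 => /usr/lib/libGL.so.1 (0x40014000)

If that line is missing entirely, the shared libGL was never linked in at all.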

Joseph Carter <-- That boy needs therapy

c++: the power, elegance and simplicity of a hand grenade

The ldd output does not list libGL for the apps which fail to work (I wish I
had thought to look at that).

Furthermore, from what I can tell, normal precompiled OpenGL apps/games
work just fine, as does any app I compile that uses GLUT to handle the
windowing/surface creation, even if it uses SDL for other things. (I did
some deeper research after you tipped me off about the build environment.)

-bart> ----- Original Message -----

rm /usr/X11R6/lib/libGL.a

I’d bet this fixes your problem, too. Distributions should NEVER ship a static
libGL.a, and if they do ship it, it absolutely should NOT be in /usr/X11R6,
according to the Linux OpenGL ABI standard. Individual distributions seem
unwilling or unable to read or follow that standard. Concerns about shipping
basically useless static libraries, and more importantly about putting them
in the wrong directory, fall on deaf ears as usual.
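
To make the failure mode concrete (a sketch; the exact link line depends on
your Makefile): a typical SDL/OpenGL link looks something like

    gcc -o testgl testgl.c `sdl-config --cflags --libs` -L/usr/X11R6/lib -lGL

where -L/usr/X11R6/lib is commonly added for the X11 libraries. ld searches
-L directories in order, so -lGL is satisfied by the first libGL it finds; if
that is the static /usr/X11R6/lib/libGL.a, the non-NVIDIA GL code gets baked
into the binary, which is exactly why ldd shows no libGL.so.1 line. Remove
the .a, relink, and -lGL falls through to the shared /usr/lib/libGL.so.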

Basically, the people who handle OpenGL libraries for the various dists
are all morons. That’s a hostile attitude to take, but they are breaking
things intentionally, they know it, and they don’t give a damn. I wrote
DynGL 3 the way I did, so that it could be used by anybody in any project,
specifically to help combat stupidity in vendors’ treatment of OpenGL
libraries. I have maintained that I’m not just referring to Microsoft, and
have taken some flak for accusing Linux distributions of being no more
intelligent about it than the boys in Redmond. I’m right, however, and
you’ve just found (more) proof of it.
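
The DynGL approach, roughly, is to stop linking against -lGL entirely and
resolve the GL entry points at runtime; SDL 1.2 exposes the hooks you need.
A minimal sketch (this is not DynGL’s actual code, and error handling is
trimmed):

    #include <SDL.h>
    #include <GL/gl.h>   /* only for types and constants; no link dependency */

    /* GL entry points resolved at runtime instead of at link time */
    typedef void (*pfn_glClearColor)(GLclampf, GLclampf, GLclampf, GLclampf);
    typedef void (*pfn_glClear)(GLbitfield);
    static pfn_glClearColor qglClearColor;
    static pfn_glClear      qglClear;

    int main(int argc, char *argv[])
    {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_GL_LoadLibrary("libGL.so.1");   /* must precede SDL_SetVideoMode */
        SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
        SDL_SetVideoMode(640, 480, 0, SDL_OPENGL);

        qglClearColor = (pfn_glClearColor)SDL_GL_GetProcAddress("glClearColor");
        qglClear      = (pfn_glClear)SDL_GL_GetProcAddress("glClear");

        qglClearColor(0.0f, 0.0f, 1.0f, 1.0f);
        qglClear(GL_COLOR_BUFFER_BIT);
        SDL_GL_SwapBuffers();
        SDL_Delay(3000);

        SDL_Quit();
        return 0;
    }

Since the binary is built without -lGL, a stray static libGL.a cannot sneak
into the link, no matter what the distribution ships.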

Joseph Carter Intelligent backside at large

our local telco has admitted that someone “backed into a
button on a switch” and took the entire ATM network down
hopefully now routers are designed better, so the “network
off” switch is on the back
