Stencil Buffer under Linux?

Hi,

I’m trying to use a stencil buffer on my Linux box (Nvidia GF2). But whenever I call “SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 1);”, the program stops with “Couldn’t find matching GLX Visual”. Do I need more than this SDL_GL_SetAttribute(), or what’s wrong?
Display depth is 16 bits.

Thanks in advance, Enrico

You may need a 24/32 bit display depth to get a stencil buffer with the Linux drivers, not sure.

On Tue, 2003-02-18 at 08:03, Enrico Zschemisch wrote:

Hi,

I’m trying to use a stencil buffer on my Linux box (Nvidia GF2). But whenever I call “SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 1);”, the program stops with “Couldn’t find matching GLX Visual”. Do I need more than this SDL_GL_SetAttribute(), or what’s wrong?
Display depth is 16 bits.
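
Purely as an illustration of the advice above (this is a sketch, not code from the original messages): a minimal SDL 1.2 setup that requests 24/32 bit colour, a depth buffer and a stencil buffer before SDL_SetVideoMode() could look like this. The window size and exact bit depths are arbitrary.

/* Minimal sketch (SDL 1.2 era): ask for a stencil buffer together with a
 * 24/32 bit visual.  All SDL_GL_SetAttribute() calls must happen before
 * SDL_SetVideoMode(). */
#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
    SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 1);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    /* 32 instead of 16 here: with a 16 bpp mode the driver may have no
     * stencil-capable GLX visual to offer. */
    if (SDL_SetVideoMode(640, 480, 32, SDL_OPENGL) == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    SDL_Quit();
    return 0;
}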

Enrico Zschemisch wrote:

I’m trying to use a stencil buffer on my Linux box (Nvidia GF2). But whenever I call “SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 1);”, the program stops with “Couldn’t find matching GLX Visual”. Do I need more than this SDL_GL_SetAttribute(), or what’s wrong?
Display depth is 16 bits.

In a 16 bit display depth you cannot use the stencil buffer; it only works at 32 bpp. Note also that the behaviour of glXChooseVisual is very different from the win32 ChoosePixelFormat. ChoosePixelFormat picks the pixel format that best matches the given pfd, whereas glXChooseVisual only succeeds if it finds a pixel format that meets your requirements. So if you ask for a 1 bit stencil buffer and no such visual exists, it will fail, while on win32 the same request succeeds and returns a pixel format with no stencil bits.
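
One way an application can cope with this difference (a sketch under the same SDL 1.2 assumptions, not something from Ignacio's post; the helper name open_gl_window is made up) is to retry without the stencil request when the first call fails:

/* Sketch: request a stencil buffer first, then fall back to no stencil,
 * since glXChooseVisual fails outright instead of returning the closest
 * match the way ChoosePixelFormat does. */
#include <stdio.h>
#include "SDL.h"

static SDL_Surface *open_gl_window(int w, int h, int bpp)
{
    SDL_Surface *screen;

    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);
    SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 1);

    screen = SDL_SetVideoMode(w, h, bpp, SDL_OPENGL);
    if (screen == NULL) {
        /* No stencil-capable visual available: retry without stencil bits. */
        fprintf(stderr, "no stencil visual (%s), retrying without stencil\n",
                SDL_GetError());
        SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 0);
        screen = SDL_SetVideoMode(w, h, bpp, SDL_OPENGL);
    }
    return screen;
}

If the fallback path is taken, the stencil-based effect can simply be skipped at run time.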

Ignacio Castaño

Hello,

On Tue, 18 Feb 2003 16:18:12 +0100, Ignacio Castaño wrote:

In a 16 bit display depth you cannot use the stencil buffer; it only works at 32 bpp.
Ok, thanks, works now :-)
But can anybody explain to me why the stencil buffer only works with a 32 bit display depth? I don’t understand this …

bye Enrico

On Tuesday 18 February 2003 20:21, Enrico Zschemisch wrote:

Ignacio Castaño wrote:

In a 16 bit display depth you cannot use the stencil buffer; it only works at 32 bpp.

Ok, thanks, works now :-)
But can anybody explain to me why the stencil buffer only works with a 32 bit display depth? I don’t understand this …

It’s a graphics card and driver specific thing. The NVidia people just chose
not to support a stencil buffer at 16 bit. They kind of have a point, since
stencil buffers are usually used in conjunction with a multi-pass rendering
algorithm, and at 16 bit colour depth, multi-pass rendering begins to give
wrong results pretty quickly. 16 bit rendering is dead for 3D applications.

cu,
Nicolai
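To get a feel for why multi-pass rendering degrades at 16 bit (a back-of-the-envelope illustration, not part of the original discussion): a 16 bpp framebuffer stores only 5 or 6 bits per colour channel, so every blending pass re-quantizes the result to 32 or 64 levels, and small per-pass contributions can be rounded away entirely. For example:

/* Accumulate the same small per-pass contribution, quantizing to 5 bits
 * (16 bpp) versus 8 bits (32 bpp) per channel after every pass, and compare
 * with the exact sum.  Compile with -lm. */
#include <stdio.h>
#include <math.h>

static double quantize(double value, int bits)
{
    double levels = (double)((1 << bits) - 1);
    return floor(value * levels + 0.5) / levels;  /* round to nearest level */
}

int main(void)
{
    double exact = 0.0, fb16 = 0.0, fb32 = 0.0;
    const double contribution = 0.013;  /* arbitrary per-pass light term */
    int pass;

    for (pass = 1; pass <= 8; pass++) {
        exact += contribution;
        fb16 = quantize(fb16 + contribution, 5);
        fb32 = quantize(fb32 + contribution, 8);
    }

    /* The 5 bit channel never accumulates anything, since 0.013 rounds to
     * zero at every pass, while the 8 bit channel tracks the exact sum
     * much more closely. */
    printf("exact %.4f   5-bit %.4f   8-bit %.4f\n", exact, fb16, fb32);
    return 0;
}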

Enrico Zschemisch wrote:

But can anybody explain to me why the stencil buffer only works with a 32 bit display depth? I don’t understand this …

It’s a hardware limitation on some GeForce cards. I think that newer cards do not suffer from this. On the GeForce2 I believe that the only accelerated pixel formats are (color=16, depth=16) and (color=32, depth=24, stencil=8). It seems that the hardware has been optimized to run on those configurations only.

But someone from nvidia should be able to clarify this better than me.
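
For what it’s worth, you can query back what the driver actually gave you once the GL window is up. This is a sketch using the standard SDL 1.2 SDL_GL_GetAttribute() call; the helper name report_gl_attributes is invented, and the values reported will of course depend on the card and driver.

/* After SDL_SetVideoMode(..., SDL_OPENGL) has succeeded, ask SDL what the
 * chosen GLX visual actually provides; it may differ from what was requested. */
#include <stdio.h>
#include "SDL.h"

void report_gl_attributes(void)
{
    int red = 0, green = 0, blue = 0, depth = 0, stencil = 0;

    SDL_GL_GetAttribute(SDL_GL_RED_SIZE, &red);
    SDL_GL_GetAttribute(SDL_GL_GREEN_SIZE, &green);
    SDL_GL_GetAttribute(SDL_GL_BLUE_SIZE, &blue);
    SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &depth);
    SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, &stencil);

    printf("got R%d G%d B%d, %d depth bits, %d stencil bits\n",
           red, green, blue, depth, stencil);
}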

Ignacio Castaño

On Wed 19 Feb 03 00:38, Ignacio Castaño wrote:

Enrico Zschemisch wrote:

But can anybody explain to me why the stencil buffer only works with a 32 bit display depth? I don’t understand this …

It’s a hardware limitation on some GeForce cards. I think that newer cards do not suffer from this. On the GeForce2 I believe that the only accelerated pixel formats are (color=16, depth=16) and (color=32, depth=24, stencil=8). It seems that the hardware has been optimized to run on those configurations only.

From the sound of it, they decided to use a few z-buffer bits for the stencil.

Regards,

Daniel
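
That would match the usual D24S8 arrangement. Purely as an illustration (the real layout is internal to the hardware and driver, and pack_d24s8 is an invented helper), 24 depth bits and 8 stencil bits can share one 32 bit word like this:

/* Illustration only: pack/unpack a 24 bit depth value and an 8 bit stencil
 * value into one 32 bit word, as in the common D24S8 framebuffer layout. */
#include <stdio.h>
#include <stdint.h>

static uint32_t pack_d24s8(uint32_t depth24, uint8_t stencil)
{
    return ((depth24 & 0x00FFFFFFu) << 8) | stencil;
}

int main(void)
{
    uint32_t word = pack_d24s8(0x00ABCDEFu, 0x5A);

    printf("packed  : 0x%08X\n", word);
    printf("depth   : 0x%06X\n", (word >> 8) & 0x00FFFFFFu);
    printf("stencil : 0x%02X\n", word & 0xFFu);
    return 0;
}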

On Wed, 2003-02-19 at 00:38, Daniel Phillips wrote:

From the sound of it, they decided to use a few z-buffer bits for the stencil.

Sounds like they like to have complete pixels that are either 32 bits (16 + 16) or 64 bits (32 + 32). Makes sense, their internal busses are probably optimized for multiples of 32 bits.

Bob Pendleton


