Forcing 24/32 bit depth/z-buffer

I’m having problems forcing OpenGL to initialise with a custom depth-buffer size. The main problem is on Windows 98, but it also shows up on Linux. On Windows I cannot force the depth buffer to be greater than the default of 16 bits. I’ve tried 24 bit (which is my default on Linux) and 32 bit; both cause kernel32 errors on Windows. I get a Win16 message box popping up, which I find highly amusing. It appears after I’ve actually started OpenGL.
And although I’m happy with the default on Linux, I tried it there too, and it just crashes out through the SDL parachute as it initialises.

Are there any known bugs here? Am I being dense? Would anyone like to help me? If so, I’ll provide some more information. Are there any tutorials on this? I basically used the same code as I found in the SDL documentation (example 2-7, “Using OpenGL With SDL”).

SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, z_depth );  /* z_depth = requested depth-buffer size, e.g. 16, 24 or 32 */

That’s the key call.

Any ideas appreciated.
bill
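
For reference, a minimal sketch of the pattern in question, assuming SDL 1.2: request a depth size before SDL_SetVideoMode, then ask SDL_GL_GetAttribute afterwards to see how many bits the driver actually granted. The 24-bit request and the 640x480 windowed mode are placeholder values, not taken from Bill's code.

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    int requested = 24;   /* placeholder: the depth size we would like */
    int granted = 0;

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* GL attributes must be set before the video mode is created */
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, requested);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    if (SDL_SetVideoMode(640, 480, 0, SDL_OPENGL) == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* the driver may silently hand back fewer bits than requested */
    SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &granted);
    printf("depth buffer: requested %d bits, got %d bits\n", requested, granted);

    SDL_Quit();
    return 0;
}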

Hi William!
As this is not really a problem in democoding (16 bits are enough for demos most of the time), I’ve never tried it myself. But when I read your post, I played around a little with my demo in its current state.
I tried it with these parameters:
SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 32 );
// alternatively: SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 24 );
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
schgrien = SDL_SetVideoMode( 800, 600, 32, SDL_OPENGL | SDL_FULLSCREEN );

Result: no problems on Windows 2000 Pro (5.00.2195), not even a drop in the framerate …
So much for the test on Win2k (I could also try it on WinME, but not right now, rather in a week when I’m back at my parents’ …).
Hope this narrows the problem down a bit, since you didn’t write anything about Win2k.
Cheers,
St0fF.
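
Building on the snippets above, a minimal sketch of one possible fallback, again assuming SDL 1.2: try progressively smaller depth sizes until SDL_SetVideoMode accepts a mode. The 32/24/16 order, the 800x600 fullscreen mode, and the assumption that an unsupported request comes back as a NULL surface rather than a crash are all guesses, not something tested in this thread.

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    static const int depths[] = { 32, 24, 16 };  /* assumed fallback order */
    SDL_Surface *screen = NULL;
    int i;

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* keep lowering the requested depth size until a mode is accepted */
    for (i = 0; i < 3 && screen == NULL; i++) {
        SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, depths[i]);
        SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
        screen = SDL_SetVideoMode(800, 600, 32, SDL_OPENGL | SDL_FULLSCREEN);
        if (screen == NULL)
            fprintf(stderr, "%d-bit depth refused: %s\n", depths[i], SDL_GetError());
    }

    if (screen == NULL) {
        fprintf(stderr, "no usable OpenGL mode found\n");
        SDL_Quit();
        return 1;
    }

    /* ... render here ... */

    SDL_Quit();
    return 0;
}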

OK, thanks.
I shall try vigorously commenting stuff out until I get it working. If not, I’ll be back sometime.

thanks

bill