The future of SDL and GL

Well, since the SDL/GLX port is definitely well under way (I’ll let Kenton
talk about that, if he so chooses), I’ve found myself thinking about the
future of OpenGL support in SDL… in particular, how best to implement it.

At present, just for building purposes, the GLX-dependent code is
stuffed in SDL_X11VIDEO.C, and in order to activate it you need to define
USE_GLX. This is an unacceptable solution to me, largely because in order to
switch from SDL’s nice 2D support to GL’s
3Dsupportyesbut2Dsupportsucks mode (according to Kenton’s latest compile,
testsprite runs at 10 FPS for him! GAH!), you need to recompile the
library. I believe this is a stopgap measure. What I am personally looking
at is having an SDL_USE_GLX environment variable. That way, if the user
wants to use GLX for his card, he defines that. Or, conversely, if a
programmer wants to enable GLX functions, he calls putenv() and away he
goes.
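
Just to make the idea concrete, here’s a rough sketch of how that might
look from the application side. Treat it as a sketch only: SDL_USE_GLX is
the proposed variable name, not anything SDL actually reads today.

    #include <stdio.h>
    #include <stdlib.h>
    #include "SDL.h"

    int main(int argc, char *argv[])
    {
        /* Request the GLX path at runtime instead of rebuilding the
           library with USE_GLX.  SDL_USE_GLX is only the proposed name. */
        putenv("SDL_USE_GLX=1");

        if (SDL_Init(SDL_INIT_VIDEO) < 0) {
            fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
            return 1;
        }

        /* ... set a video mode and make GL calls as usual ... */
        SDL_Quit();
        return 0;
    }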

Taking this one step further, and assuming that I get around to doing what I
want to do (and I think Kenton does as well, not sure though), which is
creating more GL drivers for other environments (Windows, BeOS, GGI, Glide,
etc.), then you’d want to set SDL_USE_GL instead. That would turn the
support on or off. Mesa works a lot like this when you want to toggle 3Dfx
support. The other option is to build in a query_gl and init_gl function.
Off the top of my head, I don’t know how hard that would be; I suspect a
hell of a lot harder than the putenv() approach.
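
For comparison, the driver-side check for the more general SDL_USE_GL
variable could be as small as this; gl_requested() is a made-up helper
name, not anything in the current source:

    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical internal helper: decide at runtime whether the video
       driver should take its GL path, replacing the compile-time USE_GLX
       ifdefs. */
    static int gl_requested(void)
    {
        const char *val = getenv("SDL_USE_GL");
        return (val != NULL && strcmp(val, "0") != 0);
    }

A query_gl/init_gl pair would make the same decision an explicit API call
rather than an environment check, which is presumably where the extra work
comes in.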

Anybody have any really strong thoughts on this particular issue, or should
we just plough resolutely onwards? (This will be a major issue, since I know
a lot of people who want to be able to use SDL instead of GLUT(ton) for
their gaming needs.)

Nicholas Vining (Mordred)
e-mail: vining at pacificcoast.net
icq: 20782003