SDL + GLX = Coolness... and you might just get it

I sat down for three hours this evening after a pleasant night out with
three young ladies and patiently started ripping apart SDL to see how it
works. My compliments to you, Sam, for developing such an easily expandable
engine, although some documentation may well be useful. Everything appears
to be very nicely laid out, and easy to expand.

I also had a gander at the glX specifications and sort of plodded my way
through the archives, and am pleased to report that, (A) yes, combining SDL
and glX is possible, and (B) yes, I could probably do it. AFAICT what is
needed is to rip the video code right out of the X driver and to plug in glX
code, since the input/threading/audio code doesn’t seem like it would tangle
with glX.

The problems are first and foremost the issues surrounding converting
bitmaps on the fly, but these can probably be dealt with by blitting them in
certain modes and by using Hermes a lot, the end result of which would be
rather an interesting kludge… but if you don’t mind putting up with poor
performance on that end, glX can be integrated.
Besides, if the 3D is fast and furious, who’d care about the bitmaps?

Anyways, this will probably be my sort of Christmas-break project. Usually,
come break times, I hole myself up with a hefty supply of Coke, tea, coffee,
sugar, and chips, go nocturnal, and bash out code in 8-hour shifts until
burnout ensues, usually a month later. My one problem with doing it,
however, is that I’m not sure how good glX support is for my second hand
3Dfx Voodoo 1, so if anybody’d like to buy me a Matrox (which AFAIK are
supported very well) that would speed up development time. I doubt
anybody will, though. I do know that John Carmack was threatening to hand
out video cards to anybody who wanted to become a gl Nazi and monitor
standards, which is very nifty of him.

The only other concern I have, which maybe somebody like Sam Lantinga can
answer, is whether there will be any objections from the Loki Software
higher-ups if I do this myself and attempt to get it integrated into SDL as
a whole. I’m serious about doing this, mainly because I’d like to use SDL
for my own sinister GL projects!

Nicholas Vining (Mordred)
e-mail: vining at pacificcoast.net
icq: 20782003

We know who you are. We know where you live :).

Seriously, we were going to do this ourselves (along with some crazy
dlopen stuff for GL libraries…), but just haven’t gotten to it. If
you’d like to take a swing at it, more power to you. SDL is LGPL, and
it’s Sam’s creation, not Loki’s.

m.

On Mon, Dec 13, 1999 at 10:47:19PM -0800, Nicholas Vining wrote:

The only other concern I have, which maybe somebody like Sam Lantinga can
answer, is whether there will be any objections from the Loki Software
higher-ups if I do this myself and attempt to get it integrated into SDL as a whole? I’m


Programmer, Loki Entertainment Software
http://lokigames.com/~briareos/

“I wrote a song about dental floss, but did anyone’s teeth get cleaner?”
- Frank Zappa, re: the PMRC

We know who you are. We know where you live :).

Good. Hire me. :)

Seriously, we were going to do this ourselves (along with some crazy
dlopen stuff for GL libraries…), but just haven’t gotten to it. If
you’d like to take a swing at it, more power to you. SDL is LGPL, and
it’s Sam’s creation, not Loki’s.

Right. Well, Kenton and I are working on it, and alpha versions should be
available soon. I’m gonna skip out of class in order to do this. Damn, it sucks
being a high school student; no time for hacking projects. Anyways, if you
folks have any requests about how it should be implemented, any code, or anything
else, please do let me or Kenton know and we’ll try to integrate them. At least
I will; I can’t speak for Kenton.


Nich

vining at pacificcoast.net wrote:

Right. Well, Kenton and I are working on it, and alpha versions should be
available soon. I’m gonna skip out of class in order to do this. Damn, it sucks
being a high school student; no time for hacking projects. Anyways, if you
folks have any requests about how it should be implemented, any code, or anything
else, please do let me or Kenton know and we’ll try to integrate them. At least
I will; I can’t speak for Kenton.

I will be glad to respond to any questions as well. :)

Things look good so far. I’m basically taking the x11 driver and adapting it to
glx for now. I believe that I will be able to use various GL routines to get
hardware acceleration of many of the things that the X11 driver doesn’t appear to
accelerate. Overall, it will probably be slower for 2D graphics than standard
X11, however.

-Kenton Varda

hi,

Kenton Varda wrote:

Things look good so far. I’m basically taking the x11 driver and adapting it to
glx for now. I believe that I will be able to use various GL routines to get
hardware acceleration of many of the things that the X11 driver doesn’t appear to
accelerate. Overall, it will probably be slower for 2D graphics than standard
X11, however.

Depends on how you choose to mix GL and SDL. If you do every blit via a
little triangle strip you will end up being faster if much alpha is
involved or when there are few texture changes. But if there are many
blits, the texture changes let the framerate drop considerably (at least
with my TNT and ClanLib’s GL target).

--
Daniel Vogel

666 @ http://grafzahl.de

Depends on how you choose to mix GL and SDL. If you do every blit via a
little triangle strip you will end up being faster if much alpha is
involved or when there are few texture changes. But if there are many
blits, the texture changes let the framerate drop considerably (at least
with my TNT and ClanLib’s GL target).

An interesting idea, and one definitely worth taking a look at. I suspect
that the best route would be to query hardware capabilities, and then work
from there. I had already decided that the best way to render bitmaps would
probably be to texture map them into place. But actually drawing colored
areas? With RLE bitmaps you’d get the best speed increase…


Daniel Vogel

No relation to Jeff Vogel?

Nicholas Vining (Mordred)
e-mail: vining at pacificcoast.net
icq: 20782003

----- Original Message -----
From: 666@grafzahl.de (Daniel Vogel)
To: sdl at lokigames.com
Date: Tuesday, December 14, 1999 12:48 PM
Subject: Re: [SDL] Re: SDL + GLX = Coolness… and you might just get it.

Nicholas Vining wrote:

Depends on how you choose to mix GL and SDL. If you do every blit via a
little triangle strip you will end up being faster if much alpha is
involved or when there are few texture changes. But if there are many
blits, the texture changes let the framerate drop considerably (at least
with my TNT and ClanLib’s GL target).

An interesting idea, and one definitely worth taking a look at. I suspect
that the best route would be to query hardware capabilities, and then work
from there. I had already decided that the best way to render bitmaps would
probably be to texture map them into place. But actually drawing colored
areas? With RLE bitmaps you’d get the best speed increase…

There is not much choice when you want to blit to a GLX context:

  • mixing X11 with GL (doubt it is fast - never tried)
  • using glDrawPixels (slow)
  • using texture mapping
  • mesa hack: using the mesa fb

Of these, the texture mapping variant should be the fastest and the
glDrawPixels method the most straightforward. The problem with texture
mapping is the maximum texture size, e.g. 256 on a Voodoo, which will
lead to splitting images up into many textures (many texture changes).

No relation to Jeff Vogel?

No relation.

--
Daniel Vogel

666 @ http://grafzahl.de