As Nick mentioned (can I call you Nick? :), I have managed to render
something using SDL, GL, and testsprite. Key word is: something. It
certainly doesn’t look like happy yellow faces. The problem right
now is that I’m not sure what the correct values are for the "format"
and "type" parameters of glDrawPixels for 16-bit rendering. None of the
formats in my GL reference book work. If anyone knows the correct
values, please tell me.
As for the speed, it is extremely slow right now (10 FPS at best,
sometimes as low as 4!). There are several reasons for this:
(1) Currently, I am updating the entire framebuffer every frame.
(2) I am using true GLX with a TNT2 card, which does not support direct
rendering at this time.
(3) I have not implemented any sort of hardware surfaces yet.
So, basically, the entire image is being rendered in software, packed,
stuffed in a pipe over to the X server, unpacked, and then manually
copied into the framebuffer, every frame! This will all be fixed,
though, so fear not. I will not stop optimizing this until Civ: CTP
is at least somewhat playable through it. On the bright side, I am able
to render 3D images over and under the 2D objects.
As for having to recompile every time you want to change modes: I do
plan on making GL a runtime option at some later point. Ideally, an
extra flag would be added somewhere in the API which turns on GL
support. An environment variable could also work, but it would hinder
portability of SDL-based apps. Sam, do you think a GL option/flag could
be added to the video initialization functions somehow? The only other
option would be to have two copies of the library available on every
system: one with GL and one without. Of course, GL will always be a
compile-time option as well, so that you can remove it altogether if you
don’t have the GL headers or libraries or whatnot.
And Loki, what’s up with my $160 order taking a week and a half to
arrive!? (J/K, it was because of Q3A I’m sure. no problem.)
QOTD: “Now if only my X server would quit crashing…”