Howdy,
Sorry to be a hoser, as the answer to my question is either insanely easy
or nonexistent, but I've got to ask: do 16-bit gradients always look like
crap? I've attached a sample program to demonstrate what I mean. If you
run your desktop at 32-bit depth, the first example will appear to have
distinct rows, whereas the other two will be smooth. In 16-bit mode, all
three passes produce blocky results. I haven't tried this in 8-bit mode,
so I can't comment on that. The code itself attempts to run in 32-bit mode.
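For reference, here is a rough sketch of the kind of test I mean (not the
attached gradient.c itself, just an illustration assuming SDL 1.2): ask for
a 32-bit surface and fade red one row at a time, so every row should be a
distinct shade.

/* Hypothetical sketch, assuming SDL 1.2 -- not the attached gradient.c. */
#include <stdio.h>
#include <stdlib.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *screen;
    int y;

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    atexit(SDL_Quit);

    /* Ask for 32-bit; SDL may hand back the desktop's 16-bit depth instead. */
    screen = SDL_SetVideoMode(256, 256, 32, SDL_SWSURFACE);
    if (screen == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        return 1;
    }

    /* One row per red level: 256 shades at 8 bits per channel. */
    for (y = 0; y < 256; y++) {
        SDL_Rect row = { 0, (Sint16)y, 256, 1 };
        SDL_FillRect(screen, &row, SDL_MapRGB(screen->format, (Uint8)y, 0, 0));
    }
    SDL_Flip(screen);
    SDL_Delay(3000);
    return 0;
}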
I think I understand why the gradients look so crummy in 16-bit mode:
assuming a 565 pixel format, you only have 2^5 = 32 possibilities for red
(the color I am fading). In full-blown 32-bit, or an appropriately
paletted 8-bit mode, you have 2^8 = 256 possibilities for red.
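A quick back-of-the-envelope check of that: truncating an 8-bit red value
to 5 bits, the way a 565 surface would, collapses every run of eight
neighbouring values into one displayable shade, so a 256-row fade becomes
32 visible bands.

#include <stdio.h>

int main(void)
{
    int r8, distinct = 0, prev = -1;
    for (r8 = 0; r8 < 256; r8++) {
        int r5 = r8 >> 3;   /* keep only the top 5 bits, as a 565 mode does */
        if (r5 != prev) {
            distinct++;
            prev = r5;
        }
    }
    /* Prints 32: each block of 8 consecutive red values maps to the
     * same shade, which shows up on screen as a visible band. */
    printf("distinct red levels at 5 bits: %d\n", distinct);
    return 0;
}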
Regardless, as I sit here looking at the smooth gradient background Gnome
has generated for me, with my X display set to 16-bit color depth, I can't
help but wonder if I am doing something wrong. I also note that OpenGL
doesn't seem to have any problem generating smooth gradients, and I'm not
certain it is using my desktop's 16-bit color depth.
Any insight, if available, would be cherished.
Peace,
=Pete
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed…
Name: gradient.c
URL: http://lists.libsdl.org/pipermail/sdl-libsdl.org/attachments/20030315/d8c12e4e/attachment-0007.asc