16bit Gradients

Howdy,

Sorry to be a hoser, as the answer to my question is either insanely easy
or nonexistent, but I’ve got to ask: do 16-bit gradients always look like
crap? I’ve attached a sample program to demonstrate what I mean. If you
run your desktop at 32-bit depth, the first example will appear to have
distinct rows, whereas the other two will be smooth. In 16-bit mode, all
passes will produce blocky results. I haven’t tried this in 8-bit mode,
so I can’t comment. The code itself attempts to run in 32-bit mode.

I think I understand why the gradients look so crummy in 16-bit mode:
assuming a 5-6-5 configuration, you only have 2^5 = 32 possibilities
for red (the color I am fading). In full-blown 32-bit, or an
appropriately paletted 8-bit mode, you have 2^8 = 256 possibilities
for red.
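
To make that concrete, here is a small sketch (plain C, assuming a
5-6-5 layout; this is not the attached gradient.c) of how an 8-bit red
value is squeezed into 5 bits and expanded back, which is what produces
the visible steps:

#include <stdio.h>

/* Quantize an 8-bit red value to 5 bits and expand it back to 8,
 * the way a 16-bit (5-6-5) framebuffer effectively stores it. */
static unsigned char quantize_red(unsigned char r8)
{
    unsigned char r5 = r8 >> 3;               /* keep the top 5 bits: 0..31 */
    return (unsigned char)((r5 << 3) | (r5 >> 2));  /* expand back to 0..255 */
}

int main(void)
{
    int x;
    /* A 256-step red ramp collapses onto 32 distinct values, so every
     * run of 8 adjacent steps lands on the same displayed color. */
    for (x = 0; x < 256; x++)
        printf("%3d -> %3d\n", x, quantize_red((unsigned char)x));
    return 0;
}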

Regardless, as I sit here looking at the smooth gradient background
Gnome has generated for me, with my X windows set to 16-bit color depth,
I can’t help but wonder if I am doing something wrong. I also note that
OpenGL doesn’t seem to have any problems generating smooth gradients,
and I’m not certain it is using my desktop’s 16-bit color depth.

Any insight, if available, would be cherished.

Peace,
=Pete
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed…
Name: gradient.c
URL: http://lists.libsdl.org/pipermail/sdl-libsdl.org/attachments/20030315/d8c12e4e/attachment-0007.asc

On Sat, Mar 15, 2003 at 04:18:30AM -0500, Peter wrote:

I think I understand why the gradients look so crummy in 16-bit mode:
assuming a 5-6-5 configuration, you only have 2^5 = 32 possibilities
for red (the color I am fading). In full-blown 32-bit, or an
appropriately paletted 8-bit mode, you have 2^8 = 256 possibilities
for red.

You need to dither your images, instead of simply dropping the extra bits.

I posted some code for this recently; do an archive search for “SDL_dither”.
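
(For illustration only, this is not the SDL_dither code mentioned
above, just a rough sketch of one common approach: a 4x4 ordered dither
down to 5-6-5, which adds a small position-dependent bias before
truncating so the lost low bits turn into a fine pattern instead of
bands.)

#include <stdint.h>

/* 4x4 Bayer threshold matrix, values 0..15. */
static const int bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Convert one 8-bit-per-channel pixel at (x, y) to 5-6-5 with an
 * ordered dither instead of a plain truncation. */
static uint16_t dither_565(int x, int y, int r, int g, int b)
{
    int t  = bayer4[y & 3][x & 3];
    int dr = r + (t >> 1);  if (dr > 255) dr = 255;  /* 5 bits: bias 0..7 */
    int dg = g + (t >> 2);  if (dg > 255) dg = 255;  /* 6 bits: bias 0..3 */
    int db = b + (t >> 1);  if (db > 255) db = 255;

    return (uint16_t)(((dr >> 3) << 11) | ((dg >> 2) << 5) | (db >> 3));
}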

Regardless, as I sit here looking at the smooth gradient background
Gnome has generated for me, with my X windows set to 16-bit color depth,
I can’t help but wonder if I am doing something wrong. I also note that
OpenGL doesn’t seem to have any problems generating smooth gradients,
and I’m not certain it is using my desktop’s 16-bit color depth.

There are two factors here: framebuffer color depth and texture color
depth.

With 32-bit textures and a 16-bit framebuffer, almost all current 3D
hardware will dither on the fly for free. This can be done at device
resolution, instead of texture resolution, which often gives dramatically
better results--in fact, if you’re not doing a lot of multipass blending,
it can be hard for most people to tell it from 32-bit rendering. This is
one of those bonuses you get for using OpenGL in 2D apps--it can take
advantage of higher resolutions in various ways, even if your program
doesn’t have the texture resolution to give full detail.

If you load 16-bit textures, you need to dither yourself at load time.
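
Roughly, the 32-bit-texture / 16-bit-framebuffer setup above looks
something like this with SDL 1.2’s OpenGL support (a sketch only, error
handling omitted):

#include <SDL.h>
#include <SDL_opengl.h>

/* Request a 16-bit (5-6-5) OpenGL framebuffer. */
static void init_video(int w, int h)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_GL_SetAttribute(SDL_GL_RED_SIZE,   5);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 6);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,  5);
    SDL_SetVideoMode(w, h, 16, SDL_OPENGL);

    glEnable(GL_DITHER);   /* normally on by default; be explicit */
}

/* Upload the currently bound texture at full 8-bit precision, even
 * though the framebuffer is only 16-bit; the hardware dithers the
 * result down to the screen depth when it draws. */
static void upload_rgba8(const void *pixels, int w, int h)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}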

(I’m aware that you probably weren’t looking for a dissertation on
OpenGL for 2D apps; it’s just relevant to a lot of people on this
list. :)


Glenn Maynard

Reading this message, it seems to me that there are a lot of people who
ask the same questions repeatedly. Is it possible for people to add
useful information like this to the tutorials page at SDL.org? The
documentation there is very sparse when it comes to doing useful things
with SDL.

Sof.T

On Sat, 15 Mar 2003 04:49:43 -0500, “Glenn Maynard” <g_sdl at zewt.org>
said:

You need to dither your images, instead of simply dropping the extra
bits.

I posted some code for this recently; do an archive search for
"SDL_dither".

Lo! That is it!

With 32-bit textures and a 16-bit framebuffer, almost all current 3D
hardware will dither on the fly for free. This can be done at device
resolution, instead of texture resolution, which often gives dramatically
better results--in fact, if you’re not doing a lot of multipass blending,
it can be hard for most people to tell it from 32-bit rendering. This is
one of those bonuses you get for using OpenGL in 2D apps--it can take
advantage of higher resolutions in various ways, even if your program
doesn’t have the texture resolution to give full detail.

Ok, I understand, then, how OpenGL takes advantage of this. But Gnome?
I should really just look at the source code if I want the answer to
this question, but you don’t suppose my desktop is rendered with OpenGL?
That would strike me as a little… hardcore…

(I’m aware that you probably weren’t looking for a dissertation on
OpenGL for 2D apps; it’s just relevant to a lot of people on this
list. :)

Actually, I’m quite interested in OpenGL for 2D. Even in 2D mode, hi-res
graphics programming munches resources, and hardware acceleration is a
big plus (I clearly remember the days when game consoles had graphics
that put PCs to shame). Plus, you have access to nifty settings like
gamma correction and such, which can be used to, say, fade in and out,
whereas with SDL et al. the best you can do is blit your image and a
decreasingly opaque surface to the screen. Then again, a software
implementation of OpenGL can be drastically slower than SDL. Something
to ponder when you have nothing better to do, I guess.
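
(For what it’s worth, SDL 1.2 itself can do a whole-screen fade through
its gamma support; a rough sketch, assuming the driver actually supports
gamma ramps:)

#include <SDL.h>

/* Fade the whole display to black by ramping the hardware gamma down.
 * SDL_SetGamma affects the entire screen and fails (returns -1) where
 * gamma ramps aren't supported, so fall back to alpha blits there. */
static void fade_out(Uint32 duration_ms)
{
    Uint32 start = SDL_GetTicks();
    Uint32 now;

    while ((now = SDL_GetTicks()) < start + duration_ms) {
        float level = 1.0f - (float)(now - start) / (float)duration_ms;
        if (SDL_SetGamma(level, level, level) < 0)
            return;                  /* no gamma support on this target */
        SDL_Delay(10);
    }
    SDL_SetGamma(0.0f, 0.0f, 0.0f);  /* fully black */
}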

Thanks so much for the help!

=Pete

On Sat, 15 Mar 2003 13:01:38 -0000, “SofT” <sof.t at tesco.net> said:

Reading this message, it seems to me that there are a lot of people who
ask the same questions repeatedly. Is it possible for people to add
useful information like this to the tutorials page at SDL.org? The
documentation there is very sparse when it comes to doing useful things
with SDL.

Yeah, seriously, searching the mailing list isn’t that helpful when you
don’t even know what to search for. I thought my problem was my linear
interpolation. I think it would be a good idea to set up a network of
little tutorials that explain specific techniques, and/or how to tie
other techniques together to do something useful. In my case, tutorials
should be written on per-pixel imaging, linear and bilinear
interpolation, dithering, and creating good gradients using the
previously mentioned techniques. Since each technique might have
applications outside of a given area (i.e. gradients), it would be wise
to make their tutorials functional on their own.

I’ll take the initiative and put together a series of tutorials on
the topics I just mentioned. If anyone else is interested, pipe up.
This could be a very useful resource, not just for SDL users, but for
aspiring graphics dweebs of all ilks.

Peace,
=Pete

On Sat, Mar 15, 2003 at 01:03:41PM -0500, Peter wrote:

Ok, I understand, then, how OpenGL takes advantage of this. But Gnome?
I should really just look at the source code if I want the answer to
this question, but you don’t suppose my desktop is rendered with OpenGL?
That would strike me as a little… hardcore…

Gradients in desktop widgets are generated algorithmically, so they can
be dithered at the same time for very little cost.

Actually, dithering bitmaps is really cheap, too (at least for simple
dithering algorithms), but it’s still too slow for most gaming uses.
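
As an illustration of that idea (a sketch only, plain C, red channel
only): generate the gradient straight into a 5-6-5 buffer and diffuse
the quantization error along each row as you go, so the dithering costs
almost nothing extra.

#include <stdint.h>

/* Build a vertical red gradient directly into a 5-6-5 buffer, dithering
 * while generating: each row carries its quantization error across the
 * pixels, so the ideal 8-bit level comes out as a mix of the two nearest
 * 5-bit levels instead of a solid band.
 * pitch is in pixels (uint16_t units), not bytes. */
static void red_gradient_565(uint16_t *pixels, int w, int h, int pitch)
{
    int x, y;

    for (y = 0; y < h; y++) {
        int ideal = (h > 1) ? (y * 255) / (h - 1) : 0;  /* 0..255 */
        int err = 0;

        for (x = 0; x < w; x++) {
            int red = ideal + err;
            int r5  = red >> 3;                    /* stored 5-bit value */
            err = red - ((r5 << 3) | (r5 >> 2));   /* what was lost      */
            pixels[y * pitch + x] = (uint16_t)(r5 << 11);
        }
    }
}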


Glenn Maynard