Advice on which color format to use (software vs. OpenGL)

First of all, a little explanation. I’m upgrading the internal
framebuffer of the Stella emulator. Right now, it uses a maximum of 256
colors, but these colors are stored in a CLUT (color lookup table) in
32-bit RGB format. The emulation framebuffer itself stores indices into
the color table, not actual color values.
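To make the scheme concrete, here is a minimal sketch of that layout (names invented for illustration, not actual Stella code): one 8-bit palette index per pixel, plus a 256-entry CLUT of packed 32-bit 0x00RRGGBB colors.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical sketch: the emulation framebuffer stores one 8-bit
// palette index per pixel, and a 256-entry CLUT maps each index to a
// packed 32-bit 0x00RRGGBB color value.
struct IndexedFrameBuffer {
    std::vector<uint8_t> pixels;   // palette indices, one per pixel
    uint32_t clut[256];            // color lookup table, 32-bit RGB

    IndexedFrameBuffer(int w, int h) : pixels(w * h, 0) {
        for (int i = 0; i < 256; ++i) clut[i] = 0;
    }

    // Resolve a pixel to its actual 32-bit color via the CLUT.
    uint32_t colorAt(std::size_t i) const { return clut[pixels[i]]; }
};
```

One nice property of this layout: changing a single CLUT entry recolors every pixel that references it, without touching the pixel buffer at all.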

I wish to create a Framebuffer class that extends the emulation
framebuffer (for reasons too hard to explain, I don’t want to modify the
internal emulation framebuffer itself). It can use dirty-rectangle
updates to speed up changes to the new Framebuffer.

My problem is that I really want to use OpenGL for the rendering, but I
don’t want to make it an absolute requirement. So software rendering
should also be possible, and it should work in 16/24/32-bit screen modes.

I keep hitting a dead end here. Since I’m creating a new Framebuffer
class from scratch, I want to design it to suit both software and OpenGL
hardware rendering. If I went with OpenGL only, I’d just stick with the
32-bit data, and that would be it. Handling software rendering alone in
the different color modes wouldn’t be too bad either: I could create an
SDL_Color palette at program start and use it to update the screen.
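The palette-at-startup idea would look roughly like this. The RGBColor struct here just mirrors SDL_Color’s r/g/b fields so the sketch compiles without SDL; in real code you would fill an SDL_Color[256] array the same way and hand it to SDL_SetColors(). The function names are my own invention.

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for SDL_Color (r, g, b byte fields) so this sketch is
// self-contained; swap in the real SDL_Color in actual code.
struct RGBColor { uint8_t r, g, b; };

// Split a packed 0x00RRGGBB value into separate byte channels.
inline RGBColor unpackRGB(uint32_t c) {
    RGBColor out;
    out.r = (c >> 16) & 0xFF;
    out.g = (c >> 8) & 0xFF;
    out.b = c & 0xFF;
    return out;
}

// Build all 256 palette entries from the 32-bit CLUT in one pass,
// done once at program start.
inline void buildPalette(const uint32_t clut[256], RGBColor palette[256]) {
    for (int i = 0; i < 256; ++i) palette[i] = unpackRGB(clut[i]);
}
```

After this one-time pass, software updates only ever move 8-bit indices around and let SDL apply the palette.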

The problem occurs when I want to support both options. For the SDL_Color
and software rendering path to work, the buffer would need to hold color
indices (into the color table). But for OpenGL to work, the framebuffer
would need to hold actual color values (in 32-bit format). So I don’t
know how to proceed.

If I make the Framebuffer hold 32-bit values, then on every software
update I’d have to convert from 32-bit to the current screen bpp (which
could be relatively expensive). And if I don’t use 32-bit color, then
OpenGL mode is out.
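To illustrate the cost I’m worried about: in a 16-bit screen mode, a 32-bit buffer would force something like the following shift-and-mask work on every pixel of every dirty region (function name invented; RGB565 is one common 16-bit layout):

```cpp
#include <cassert>
#include <cstdint>

// Per-pixel conversion a 32-bit buffer would pay on each software
// update in a 16-bit mode: pack 0x00RRGGBB down to RGB565
// (5 bits red, 6 bits green, 5 bits blue).
inline uint16_t packRGB565(uint32_t c) {
    uint16_t r = (c >> 19) & 0x1F;  // top 5 bits of red
    uint16_t g = (c >> 10) & 0x3F;  // top 6 bits of green
    uint16_t b = (c >> 3)  & 0x1F;  // top 5 bits of blue
    return (r << 11) | (g << 5) | b;
}
```

With an index buffer, by contrast, this work could be done once per palette entry (a 256-entry 16-bit table) and each pixel would cost only a table lookup.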

What do you recommend?
Steve