Why are DirectX 8-bit hardware-surface screen palette flashes so slow? I'll tell you

I reported this as a bug on the SDL bug tracker years ago but no action has ever been taken on it. So, I decided to sniff around in the source myself.

Seems when you change the palette on a DX 8-bit surface, SDL believes you want to automatically remap the surface into the new palette. I could see how this would be very useful for graphic surfaces that are going to be blitted to the screen. However, it is simply disastrous when applied to the screen itself.

Here’s what it does currently:

  1. Remaps every entry of the old palette to the closest color in the new palette using SDL_FindColor, building a lookup table. This alone is not going to be a fast operation.

  2. Locks the surface. Since the surface is a DirectX hardware surface, this involves copying the entire contents of the screen out of video memory. Not only that, but it will also cause the program to block until the next hardware screen refresh, so the amount of lag suffered is largely dependent upon the timing of the call (and is thus highly inconsistent).

  3. Remaps the entire surface via the lookup table generated in step 1.

By the time all this has been done, the program has sometimes been stalled for as long as half a second.
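
To make the cost concrete, here is a rough sketch in C of the work described above. It is not the actual SDL source; find_closest() and remap_screen() are my own illustrative names standing in for SDL's internal SDL_FindColor and remapping code:

    #include "SDL.h"

    /* Step 1 helper: nearest color in the new palette by squared RGB distance
     * (SDL does this with its internal SDL_FindColor). */
    static Uint8 find_closest(const SDL_Palette *pal, SDL_Color c)
    {
        int best = 0;
        unsigned best_dist = (unsigned)-1;
        int i;
        for (i = 0; i < pal->ncolors; i++) {
            int dr = pal->colors[i].r - c.r;
            int dg = pal->colors[i].g - c.g;
            int db = pal->colors[i].b - c.b;
            unsigned dist = (unsigned)(dr*dr + dg*dg + db*db);
            if (dist < best_dist) { best_dist = dist; best = i; }
        }
        return (Uint8)best;
    }

    static void remap_screen(SDL_Surface *screen, const SDL_Color oldpal[256])
    {
        Uint8 lut[256];
        Uint8 *row;
        int i, x, y;

        /* Step 1: build a 256-entry lookup table from old indices to new ones. */
        for (i = 0; i < 256; i++)
            lut[i] = find_closest(screen->format->palette, oldpal[i]);

        /* Step 2: lock the surface.  For a DirectX hardware surface this copies
         * the whole framebuffer out of video memory and can block until the
         * next vertical refresh. */
        if (SDL_LockSurface(screen) < 0)
            return;

        /* Step 3: rewrite every pixel through the lookup table. */
        row = (Uint8 *)screen->pixels;
        for (y = 0; y < screen->h; y++, row += screen->pitch) {
            for (x = 0; x < screen->w; x++)
                row[x] = lut[row[x]];
        }

        SDL_UnlockSurface(screen);
    }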

In games like Doom, when a palette flash is performed, no remapping of the screen is necessary. The palettes are all orthogonal to one another, so it is expected that the screen contents will remain unchanged and only the palette will be different. The same technique was used throughout DOS games and demos, in particular for fade transitions. Just to make the impact of this problem clear, it can be felt just as easily in DOSBox when it is run fullscreen with a game such as Strife. Palette-only updates were a basic tenet of VGA programming, so the resulting problem is not restricted to Doom ports.
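
For illustration, a Doom-style damage flash only recomputes the palette and uploads it; the pixels already on screen are expected to keep their values. A minimal sketch (the blend weights and function name are my own, not Doom's actual code):

    #include "SDL.h"

    /* Blend the base palette toward red by 'intensity' (0..255) and upload it.
     * The framebuffer is never touched; only what each index displays changes. */
    static void damage_flash(SDL_Surface *screen, const SDL_Color base[256],
                             int intensity)
    {
        SDL_Color flashed[256];
        int i;

        for (i = 0; i < 256; i++) {
            flashed[i].r = (Uint8)(base[i].r + ((255 - base[i].r) * intensity) / 255);
            flashed[i].g = (Uint8)(base[i].g - (base[i].g * intensity) / 255);
            flashed[i].b = (Uint8)(base[i].b - (base[i].b * intensity) / 255);
            flashed[i].unused = 0;
        }

        /* Under SDL 1.2 this is the call a port would make -- and on a DirectX
         * 8-bit hardware screen surface it is exactly what triggers the
         * remapping described above. */
        SDL_SetColors(screen, flashed, 0, 256);
    }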

I believe that SDL needs some way to exempt the screen surface from this undocumented remapping operation so that palette sets can be done efficiently again regardless of the video driver in use. If programmers want smooth transitions between non-orthogonal palettes on screen, they can still do them the old-fashioned way: by fading to a color common to the two palettes, such as index 0, before changing the contents of the screen.
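
For completeness, that old-fashioned transition looks roughly like this; lerp_palette(), fade_transition(), and the step counts are illustrative names and values of mine, not anything SDL provides:

    #include "SDL.h"

    /* Step the palette from 'from' to 'to' over 'steps' frames. */
    static void lerp_palette(SDL_Surface *screen, const SDL_Color from[256],
                             const SDL_Color to[256], int steps)
    {
        SDL_Color pal[256];
        int s, i;

        for (s = 1; s <= steps; s++) {
            for (i = 0; i < 256; i++) {
                pal[i].r = (Uint8)(from[i].r + (to[i].r - from[i].r) * s / steps);
                pal[i].g = (Uint8)(from[i].g + (to[i].g - from[i].g) * s / steps);
                pal[i].b = (Uint8)(from[i].b + (to[i].b - from[i].b) * s / steps);
                pal[i].unused = 0;
            }
            SDL_SetColors(screen, pal, 0, 256);
            SDL_Delay(16);   /* roughly one frame per step */
        }
    }

    /* Fade to a color shared by both palettes (index 0 here), change the
     * screen contents while everything displays that color, then fade up. */
    static void fade_transition(SDL_Surface *screen, const SDL_Color oldpal[256],
                                const SDL_Color newpal[256],
                                void (*redraw)(SDL_Surface *))
    {
        SDL_Color flat[256];
        int i;

        for (i = 0; i < 256; i++)
            flat[i] = oldpal[0];

        lerp_palette(screen, oldpal, flat, 16);
        redraw(screen);
        SDL_Flip(screen);
        lerp_palette(screen, flat, newpal, 16);
    }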

On Fri, Oct 17, 2008 at 2:01 AM, James Haley wrote:

> I believe that SDL needs some way to exempt the screen surface from this
> undocumented remapping operation so that palette sets can be done
> efficiently again regardless of the video driver in use. If programmers want
> smooth transitions between non-orthogonal palettes on screen, they can still
> do them the old-fashioned way: by fading to a color common to the two
> palettes, such as index 0, before changing the contents of the screen.

It was a bit surprising the first time I encountered it, but it is documented:

http://www.libsdl.org/cgi/docwiki.cgi/SDL_SetPalette

What you want is to change the physical palette rather than the logical palette.
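
For anyone hitting the same thing, a minimal sketch of what that looks like with the SDL 1.2 API, assuming an 8-bit fullscreen hardware surface created with SDL_HWPALETTE:

    #include "SDL.h"

    /* Update only the physical (hardware) palette; the logical palette, and
     * therefore the meaning of the pixel values already on screen, is left
     * untouched. */
    static void set_screen_palette(SDL_Surface *screen, SDL_Color colors[256])
    {
        SDL_SetPalette(screen, SDL_PHYSPAL, colors, 0, 256);
    }

    /* By contrast, SDL_SetColors() is equivalent to passing
     * SDL_LOGPAL | SDL_PHYSPAL, which is the case discussed above. */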


http://pphaneuf.livejournal.com/