DirectX fullscreen palette lag

I noticed that when my Doom port uses the DirectX video driver in fullscreen mode, palette flash effects now frequently cause an irritating lag. That is, the entire game sometimes stops and waits for the palette to change. I am hoping there is some way to lessen this or avoid it entirely. It's pretty awful.

James Haley

Just thought I would mention that I found a work-around for my problem. The game still renders in 8-bit, but the 8-bit buffer is converted and copied to a 32-bit screen surface using a precalculated CLUT (color lookup table). This avoids any lag due to hardware palette sets.
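The conversion itself is just one table lookup per pixel. Here is a minimal sketch (the names are illustrative, not the actual port's code), assuming a 256-entry table of colors already packed in the screen's 32-bit pixel format:

    /* 8-on-32 blit sketch: clut[] holds 256 precalculated 32-bit pixels. */
    #include <stdint.h>
    #include <stddef.h>

    void BlitTo32(const uint8_t *src, uint32_t *dest, size_t pixels,
                  const uint32_t clut[256])
    {
        size_t i;
        for (i = 0; i < pixels; ++i)
            dest[i] = clut[src[i]];   /* one table lookup per pixel */
    }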

Evidently the SDL API is too coarse, relative to the way DirectX handles palette changes, to perform them efficiently during the game loop. I haven't looked at the source, but my guess is that the palette call blocks until the next hardware refresh, which would explain the irritating but inconsistent lag. If the code were bare DirectDraw, I assume there would be a way to defer the palette set to an appropriate time instead of blocking the moment my I_SetPalette function makes the call.
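In the 8-on-32 path, a palette change can instead be handled entirely in software by rebuilding the lookup table, so nothing ever waits on the display driver. A minimal sketch of what I_SetPalette might look like in that mode, assuming SDL 1.2's SDL_MapRGB and the usual 768-byte Doom RGB palette (the 'screen' and 'clut' names are illustrative, not the port's actual code):

    /* Hypothetical I_SetPalette for 8-on-32 mode: rebuild the software
       lookup table rather than setting the hardware palette. */
    #include <stdint.h>
    #include "SDL.h"

    extern SDL_Surface *screen;   /* 32-bit display surface */
    static uint32_t clut[256];

    void I_SetPalette(const uint8_t *palette)
    {
        int i;
        for (i = 0; i < 256; ++i)
        {
            clut[i] = SDL_MapRGB(screen->format,
                                 palette[i * 3 + 0],
                                 palette[i * 3 + 1],
                                 palette[i * 3 + 2]);
        }
        /* No SDL_SetPalette call, so nothing blocks on a hardware refresh. */
    }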

Unfortunately this required me to add another weird configuration variable to the program, to select running in 8-on-32-bit mode. :P We hope to augment this with true 32-bit rendering eventually, but there are still many obstacles in our way (DOOM and its relatives use a lot of colormaps, translucency tables, palette swaps, etc. that are arduous to emulate flexibly in true color).

James Haley