I have a video card that interfaces to a special monitor. The card only
supports 16-bit color (max). I wrote a C app using SDL that creates a 16-bit
color 640x480 screen. The app runs fine on a Linux system using 32-bit
color. Running it under GDM with the 16-bit card screws up the color
palette. I'm assuming GDM changes the color palette and is causing this
issue. When I run the app from a virtual console (GDM off or on), the colors
are correct.
My questions: Is there any way to reset the palette using SDL in 16-bit
mode? I saw there is an 8-bit palette-change capability. If not, is there any
way to force GDM not to change the palette? I would run the app outside
of GDM, but there are frequent "hiccups" even with the priority set almost
to maximum. It's a real-time, graphically intensive app.