Lars Brubaker wrote:
This has been a big problem for me and for several other developers on
Windows.
Thanks for the reply!
Is it just me, or does Windows go out of its way to be annoying?
The big problem for the windib driver is (I think) detecting which is
the ‘best’ pixelformat - ie what is the desktop running in.
I already know which formats I can actually use, because it’s purely
software - defined by win32 and not some hardware feature.
So SDL_SetVideoMode() is easy - it’s just the SDL mode query
functions which are tricky (especially given my ignorance of the
internal layout of SDL).
You can ask DirectX to do it with EnumDisplayModes and something like this
in the callback:
if (lpDDSurfaceDesc->ddpfPixelFormat.dwRGBBitCount == 16)
{
    if (lpDDSurfaceDesc->ddpfPixelFormat.dwGBitMask == 0x07E0)
    {
        // it's 565
    }
    else
    {
        // it's 555
    }
}
But the call in DirectX isn't reliable (many cards report the wrong value),
and you don't want to use DX anyway. The other problem is that there is no
way in Windows to query the RGB masks of the hardware.
Hmmm… I'd probably be happy requiring DirectX 3 to do the test - even
NT4 supports it once you get it service-packed up.
So, the only way that I know of to find out whether the display is actually
555 or 565 is to ask GDI to write a green pixel (or a red pixel), read it
back, and see which bits have been set. This works as long as you can get a
GDI surface on the device that you are writing to, which you can do on just
about everything but the old Voodoo cards.
Ouch.
I don’t suppose you could provide some pseudo-code? I gave up on GDI
coding at a very early stage, wrote a DIB-section wrapper, and stuck to
raw pixel access. So my knowledge of GDI calls is pretty flimsy. Off
the top of my head, I’d imagine I need to do something like this:
- Create a DC compatible with the desktop
- Create a small bitmap and select it into my DC
- Write a single red or green pixel using putpixel (or whatever
the name is)
- Ask for the raw pixel data back from the bitmap
Am I close? Can it even be done all offscreen?
What is the problem with the old Voodoo cards? I thought they simply
didn't support GDI at all… (in which case, no problem.)
Again, thanks for the info,
Ben.

--
Ben Campbell
Programmer, Creature Labs
ben.campbell at creaturelabs.com
http://www.creaturelabs.com