SDL with OpenGL and 16bit depth on WinXP uses MS GDI Generic

Hi

When I create a 16-bit OpenGL window (or fullscreen) with SDL (1.2.5) on
WinXP, the Microsoft GDI Generic driver is used instead of the graphics
adapter vendor’s driver. I can reproduce the problem on several WinXP
computers (other Windows versions don’t have this problem); 32-bit color
depth always works fine. I’m using the SDL_HWSURFACE | SDL_DOUBLEBUF |
SDL_OPENGL flags to create the window, and I check that the format is
available with SDL_VideoModeOK(). Does anybody have the same problem, or
can you give me advice on how to solve it?
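
(A minimal sketch of the setup described above, using the SDL 1.2 API;
the 800x600 resolution is assumed for illustration, and the GL_RENDERER
printout is what reveals the “GDI Generic” driver:)

#include <stdio.h>
#include "SDL.h"
#include "SDL_opengl.h"

int main(int argc, char *argv[])
{
    const Uint32 flags = SDL_HWSURFACE | SDL_DOUBLEBUF | SDL_OPENGL;
    int bpp;

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init: %s\n", SDL_GetError());
        return 1;
    }

    /* Returns the closest supported bpp for this mode, or 0. */
    bpp = SDL_VideoModeOK(800, 600, 16, flags);
    printf("SDL_VideoModeOK suggests %d bpp\n", bpp);

    if (SDL_SetVideoMode(800, 600, 16, flags) == NULL) {
        fprintf(stderr, "SDL_SetVideoMode: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* "GDI Generic" here means the Microsoft software renderer. */
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

    SDL_Quit();
    return 0;
}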

greetings
Milo Spirig

SDL is just using the generic function to “find a matching video mode”,
and doesn’t check the results, so if it gets an emulated video mode,
that’s what’s used. It’d probably be better to replace that generic
call with a manual iteration, and to either look for a best match
(tricky, since we don’t know what “best” is for a given application) or
to just fail when there’s no exact match. (The problem with the latter
is SDL doesn’t provide apps enough information to be able to recover
from this by choosing a different mode.)

Of course, best would be to have a layer to iterate video modes through
SDL and let the app be as smart as it needs to be, but I don’t think
that’ll get into SDL 1.2.
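
(A rough sketch of that manual iteration in plain Win32; this is not
actual SDL code, and find_exact_format is a made-up name. It fails
rather than falling back to an emulated format:)

#include <windows.h>

static int find_exact_format(HDC hdc, BYTE color_bits, BYTE depth_bits)
{
    PIXELFORMATDESCRIPTOR pfd;
    int i, count;

    /* DescribePixelFormat returns the number of formats (1-based). */
    count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);

    for (i = 1; i <= count; i++) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);

        if (pfd.dwFlags & PFD_GENERIC_FORMAT)
            continue;   /* emulated: skip it */
        if (!(pfd.dwFlags & PFD_SUPPORT_OPENGL) ||
            !(pfd.dwFlags & PFD_DRAW_TO_WINDOW) ||
            !(pfd.dwFlags & PFD_DOUBLEBUFFER))
            continue;
        if (pfd.cColorBits != color_bits || pfd.cDepthBits < depth_bits)
            continue;

        return i;       /* exact, non-emulated match */
    }
    return 0;           /* no exact match: fail, don't emulate */
}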

Glenn Maynard

Are you sure about the enumeration? I saw there’s a function called
SDL_GetVideoMode which should return the closest non-emulated video mode,
and my resolution of 800x600x16 seems to be OK. I know this resolution is
supported by my GeForce because I can set it manually through the WinAPI
(without SDL). As far as I know, OpenGL automatically uses hardware
acceleration when possible.

greetings
Milo Spirig

(Hmm, there’s SDL_ListModes which I didn’t know about; adjust previous
response accordingly …)
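
(For reference, the usual SDL_ListModes usage pattern; dump_modes is
just an illustrative wrapper:)

#include <stdio.h>
#include "SDL.h"

/* Enumerate fullscreen modes for the display's format (passing NULL
   as the format asks about the display itself). */
static void dump_modes(void)
{
    SDL_Rect **modes = SDL_ListModes(NULL, SDL_FULLSCREEN | SDL_OPENGL);
    int i;

    if (modes == (SDL_Rect **)-1) {
        printf("all resolutions available\n");
    } else if (modes == NULL) {
        printf("no modes available for these flags\n");
    } else {
        for (i = 0; modes[i]; i++)
            printf("%dx%d\n", modes[i]->w, modes[i]->h);
    }
}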

The problem is that Windows will often emulate unsupported modes.

Right now, WIN_GL_SetupWindow searches for a mode (ChoosePixelFormat),
sets it (SetPixelFormat), and if that succeeds, continues. However, the
mode it just set might be emulated (GL_pfd.dwFlags & PFD_GENERIC_FORMAT),
which means (according to the docs) that we’re not hardware accelerated.

(In practice, I’m not sure what it means. I’ve seen modes with this
flag end up with no HW accel, but I’ve also seen some that were
hardware accelerated …)
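
(A sketch of that check after the fact; format_kind is a made-up helper,
and the PFD_GENERIC_ACCELERATED test is how the docs distinguish the
partially accelerated MCD case from plain software:)

#include <windows.h>

static const char *format_kind(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int fmt = GetPixelFormat(hdc);

    if (!fmt || !DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd))
        return "unknown";

    if (!(pfd.dwFlags & PFD_GENERIC_FORMAT))
        return "vendor ICD (hardware)";
    if (pfd.dwFlags & PFD_GENERIC_ACCELERATED)
        return "generic but accelerated (MCD)";
    return "software (GDI Generic)";
}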


Glenn Maynard

Thank you Glenn. Then maybe it would help to delete the PFD_GENERIC_FORMAT
flag, because I can’t use an emulated mode in my game anyway. Where did you
find that flag? I searched for it, but in SDL_wingl.c it’s defined like this:
GL_pfd.dwFlags = (PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL);

greetings
Milo Spirig

You can’t just “delete” it. It’s returned from a query; it’s just
telling you the mode it received.
(WIN_GL_SetupWindow -> DescribePixelFormat)
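
(A short sketch of the distinction; the names here are illustrative,
not SDL’s. The dwFlags assignment is the request handed to
ChoosePixelFormat, while PFD_GENERIC_FORMAT appears in the descriptor
that DescribePixelFormat fills in afterwards:)

#include <string.h>
#include <windows.h>

static void show_query(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int fmt;

    /* The request, with the same flags as the SDL_wingl.c line: */
    memset(&pfd, 0, sizeof(pfd));
    pfd.nSize    = sizeof(pfd);
    pfd.nVersion = 1;
    pfd.dwFlags  = (PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL);
    fmt = ChoosePixelFormat(hdc, &pfd);

    /* The answer: DescribePixelFormat overwrites pfd.dwFlags with the
       chosen format's real flags, which is where PFD_GENERIC_FORMAT
       can show up even though nobody ever set it by hand. */
    DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
}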



Glenn Maynard