How to enable 2D hardware acceleration on Windows?

Hi,

Hardware acceleration doesn’t appear to get enabled. I’m using the
following code. What should I do to enable hardware acceleration?

#include <windows.h>
#include "SDL.h"

static SDL_Surface* g_screen;

int APIENTRY WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                     LPSTR lpCmdLine, int nCmdShow)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;
    g_screen = SDL_SetVideoMode(1920, 1200, 32,
                                SDL_DOUBLEBUF | SDL_FULLSCREEN | SDL_HWSURFACE);
    if (g_screen == NULL) {
        SDL_Quit();
        return 1;
    }
    const SDL_VideoInfo* vi = SDL_GetVideoInfo();
    SDL_Quit();
    return 0;
}

Recent versions of SDL, AFAIK, all use “windib” as the default video
backend, and thus will never use hardware acceleration. You would need
to use the DirectX backend to do that. I believe this can be
controlled using an environment variable
(http://www.libsdl.org/cgi/docwiki.cgi/SDL_5fenvvars). If you want to
know more about SDL and hardware, I highly recommend Bob Pendleton’s
SDL articles (http://www.linuxdevcenter.com/pub/a/linux/2003/08/07/sdl_anim.html)

On Wed, Apr 16, 2008 at 2:54 PM, Olaf van der Spek wrote:

Hi,

Hardware acceleration doesn’t appear to get enabled. I’m using the
following code. What should I do to enable hardware acceleration?

Recent versions of SDL, AFAIK, all use “windib” as the default video
backend, and thus will never use hardware acceleration. You would need

Ah. Why isn’t DX used by default?

to use the DirectX backend to do that. I believe this can be
controlled using an environment variable
(http://www.libsdl.org/cgi/docwiki.cgi/SDL_5fenvvars). If you want to
know more about SDL and hardware, I highly recommend Bob Pendleton’s
SDL articles (http://www.linuxdevcenter.com/pub/a/linux/2003/08/07/sdl_anim.html)

Thanks, I’ll have a look.

On Wed, Apr 16, 2008 at 5:00 PM, Brian <brian.ripoff at gmail.com> wrote:

It’s DirectX 5, which has a number of compatibility problems on certain
versions of Windows. (and it turns out using hardware surfaces with
direct pixel access is usually slower on modern hardware anyway)

SDL 1.3 is completely redesigned with a model much better suited for 3D
hardware acceleration.

Plus, you can always use OpenGL if you want cross-platform hardware
acceleration, but that has its own set of problems.

See ya!
-Sam Lantinga, Lead Software Engineer, Blizzard Entertainment

On Wed, Apr 16, 2008 at 5:00 PM, Brian <brian.ripoff at gmail.com> wrote:

Recent versions of SDL, AFAIK, all use “windib” as the default video
backend, and thus will never use hardware acceleration. You would need

Ah. Why isn’t DX used by default?

It’s DirectX 5, which has a number of compatibility problems on certain
versions of Windows.

Ah, that’s quite an old version. But I thought DX was quite backwards
compatible.

(and it turns out using hardware surfaces with
direct pixel access is usually slower on modern hardware anyway)

I’m aware of that, but I’m just using blits.
Even so, DX appears to be slower rather than faster already and I
don’t know why.

SDL 1.3 is completely redesigned with a model much better suited for 3D
hardware acceleration.

Note the 2D in the title. ;-)
When can a stable release of 1.3 be expected?

On Wed, Apr 16, 2008 at 5:24 PM, Sam Lantinga wrote:

On Wed, Apr 16, 2008 at 5:00 PM, Brian <brian.ripoff at gmail.com> wrote:

Plus, you can always use OpenGL if you want cross-platform hardware
acceleration, but that has its own set of problems.

See ya!
-Sam Lantinga, Lead Software Engineer, Blizzard Entertainment


SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org

In my experience, Olaf (I’ve been playing with SDL for about 2-3 years
now), SDL hardware surfaces are not worth the bother. Yeah, you can get
amazingly fast FPS rates (I’ve had > 9000!) which you’ll never really see, as
most monitors only refresh in the 60-120 Hz range, but the headaches of
dealing with hardware surfaces far outweigh the few benefits. And if you
want to do any alpha blending (so that your various sprites, buttons, and objects
on-screen don’t have jagged edges), hardware suddenly gets MUCH slower (that
9000 can drop down to 50 or lower, depending on how many sprites you’re
blitting), and you’re better off with using software surfaces anyway.
Just my 2 cents.
Anyway, you can use the SDL_putenv("SDL_VIDEODRIVER=directx"); command in
your program before you initialize SDL to use the directx driver - this will
enable you to experiment with hardware surfaces.
-Dave

----- Original Message -----

From: olafvdspek@gmail.com (Olaf van der Spek)
To: "A list for developers using the SDL library. (includes SDL-announce)"

Sent: Wednesday, April 16, 2008 10:20 AM
Subject: Re: [SDL] How to enable 2D hardware acceleration on Windows?

On Wed, Apr 16, 2008 at 5:00 PM, Brian <brian.ripoff at gmail.com> wrote:

Recent versions of SDL, AFAIK, all use “windib” as the default video
backend, and thus will never use hardware acceleration. You would need

Ah. Why isn’t DX used by default?

to use the DirectX backend to do that. I believe this can be
controlled using an environment variable
(http://www.libsdl.org/cgi/docwiki.cgi/SDL_5fenvvars). If you want to
know more about SDL and hardware, I highly recommend Bob Pendleton’s
SDL articles
(http://www.linuxdevcenter.com/pub/a/linux/2003/08/07/sdl_anim.html)

Thanks, I’ll have a look.



In my experience, Olaf , (I’ve been playing with SDL for about 2-3 years
now) SDL hardware surfaces are not worth the bother. Yeah, you can get
amazingly fast FPS rates(I’ve had > 9000!) which you’ll never really see, as

I don’t need 9000, but I’m already down to 30 - 80. And I haven’t put
that much on the screen yet.
How well does 2D software blitting scale on resolutions like 1280 x
960 and 1920 x 1200?

most monitors only refresh in the 60-120 hz range, but the headaches of
dealing with hardware surfaces far outweigh the few benefits. And if you

What’s the big problem with hardware surfaces, except having to reload
them sometimes and not really having fast direct access?

want to do any alpha blending (so that your various sprites, buttons, and objects
on-screen don’t have jagged edges), hardware suddenly gets MUCH slower (that
9000 can drop down to 50 or lower, depending on how many sprites you’re
blitting), and you’re better off with using software surfaces anyway.

Doesn’t DX support that kind of blitting in hardware as well? Why
would it get much slower?

On Wed, Apr 16, 2008 at 5:39 PM, David Olsen wrote:

want to do any alpha blending (so that your various sprites, buttons, and objects
on-screen don’t have jagged edges), hardware suddenly gets MUCH slower (that
9000 can drop down to 50 or lower, depending on how many sprites you’re
blitting), and you’re better off with using software surfaces anyway.

Doesn’t DX support that kind of blitting in hardware as well? Why
would it get much slower?
Because SDL doesn’t support it yet. As far as I know, the only support for that sort of stuff is through the SDL_gfx library, which is all software-based.

Ah. I guess I’d better look into using DX or OpenGL directly then.

On Wed, Apr 16, 2008 at 6:01 PM, Mason Wheeler wrote:

want to do any alpha blending (so that your various sprites, buttons, and objects
on-screen don’t have jagged edges), hardware suddenly gets MUCH slower (that
9000 can drop down to 50 or lower, depending on how many sprites you’re
blitting), and you’re better off with using software surfaces anyway.

Doesn’t DX support that kind of blitting in hardware as well?

Not in the 2D driver interface, that’s only available with 3D.

Why would it get much slower?

Because the alpha blending operation is a read + write operation: reading
individual pixels and writing them back over the bus stalls the graphics
pipeline and reduces it to system bus speeds.

See ya,
-Sam Lantinga, Lead Software Engineer, Blizzard Entertainment

Ah. I guess I’d better look into using DX or OpenGL directly then.

I’d recommend OpenGL, unless you’re Windows-only.

See ya,
-Sam Lantinga, Lead Software Engineer, Blizzard Entertainment

I don’t need 9000, but I’m already down to 30 - 80. And I haven’t put
that much on the screen yet.
How well does 2D software blitting scale on resolutions like 1280 x
960 and 1920 x 1200?

I’d be more curious as to how the surfaces you’re blitting are
arranged. On any system over a GHz you should have very few issues
with blitting. Is SDL having to convert surface formats for each
blit, etc etc etc. 1280x960 should be decent, but 1920x1200 … I
dunno. That’s a lot of memory to move around… and I’ve no experience
with trying to write a game on that size screen.

-Will

Ah, that’s quite an old version. But I thought DX was quite backwards
compatible.

Mostly. :-)

Even so, DX appears to be slower rather than faster already and I
don’t know why.

On DX9 and above, I believe the 2D interface is emulated over 3D.

Note the 2D in the title. ;-)

Yes, I’m just talking about the underlying driver architecture. It’s
still a 2D API.

When can a stable release of 1.3 be expected?

Hopefully sometime this year. I have one more major revision of the
blitting system to work on, and we have a number of students joining
us for the Google Summer of Code, so by the end of the year I hope to
be iterating on release candidates.

You can check out a pre-release version here:
http://www.libsdl.org/tmp/SDL-1.3.zip

See ya!
-Sam Lantinga, Lead Software Engineer, Blizzard Entertainment

I don’t need 9000, but I’m already down to 30 - 80. And I haven’t put
that much on the screen yet.
How well does 2D software blitting scale on resolutions like 1280 x
960 and 1920 x 1200?

I’d be more curious as to how the surfaces you’re blitting are
arranged. On any system over a ghz you should have very few issues

I’d like to run 1280 x 960 on Athlon 1200 and Athlon XP 2500+ at high
frame rates.

with blitting. Is SDL having to convert surface formats for each
blit, etc etc etc.

All surfaces should be in the proper format (I’m using SDL_DisplayFormat).
It’s lots of tiny 16 x 16 blits though. Maybe it’s even overhead in
my own code.
I guess I should be profiling the app, but I think MS didn’t include a
profiler in my version of VC.

On Wed, Apr 16, 2008 at 6:25 PM, Will Langford wrote:

1280x960 should be decent, but 1920x1200 … I
dunno. That’s a lot of memory to move around… and I’ve no experience
with trying to write a game on that size screen.

if you take the sdl test sprite test app that does a bunch of smiley
faces… and tweak the code to do the size and bitdepth of screen
you’re looking for… and give that a go… might be a decent place
to start for ‘where bottleneck lies’. Ya can then also go about
changing the count, or size of the smiley face sprite, etc etc etc.

-Will

On Wed, Apr 16, 2008 at 11:58 AM, Olaf van der Spek wrote:

100 sprites:
C:\VC\SDL-1.2.13\test>Release\testsprite.exe -width 1920 -height 1200 -bpp 8 -fast
Screen is at 8 bits per pixel
Screen is in system memory
Sprite is in system memory
Sprite blit uses RLE acceleration
997.21 frames per second

C:\VC\SDL-1.2.13\test>Release\testsprite.exe -width 1920 -height 1200 -bpp 32 -fast
Screen is at 32 bits per pixel
Screen is in system memory
Sprite is in system memory
Sprite blit uses RLE acceleration
213.63 frames per second

32-bit is 4.66x slower than 8-bit, that’s a bit weird.

1000 sprites:
C:\VC\SDL-1.2.13\test>Release\testsprite.exe -width 1920 -height 1200 -bpp 8 -fast
Screen is at 8 bits per pixel
Screen is in system memory
Sprite is in system memory
Sprite blit uses RLE acceleration
160.18 frames per second

C:\VC\SDL-1.2.13\test>Release\testsprite.exe -width 1920 -height 1200 -bpp 32 -fast
Screen is at 32 bits per pixel
Screen is in system memory
Sprite is in system memory
Sprite blit uses RLE acceleration
88.03 frames per second

1.8x slower

10000 sprites:
C:\VC\SDL-1.2.13\test>Release\testsprite.exe -width 1920 -height 1200 -bpp 8 -fast
Screen is at 8 bits per pixel
Screen is in system memory
Sprite is in system memory
Sprite blit uses RLE acceleration
16.59 frames per second

C:\VC\SDL-1.2.13\test>Release\testsprite.exe -width 1920 -height 1200 -bpp 32 -fast
Screen is at 32 bits per pixel
Screen is in system memory
Sprite is in system memory
Sprite blit uses RLE acceleration
12.29 frames per second

1.35x slower

The second test blits 1000 x 32 x 32 = 1024000 pixels, which is less
than half the screen without taking overdraw into account.
The last test may blit a bit too much, but the frame rate is very low
already and that’s with just blitting.

I also don’t understand the relation between 8-bit and 32-bit.

On Wed, Apr 16, 2008 at 8:32 PM, Will Langford wrote:

if you take the sdl test sprite test app that does a bunch of smiley
faces… and tweak the code to do the size and bitdepth of screen
you’re looking for… and give that a go… might be a decent place
to start for ‘where bottleneck lies’. Ya can then also go about
changing the count, or size of the smiley face sprite, etc etc etc.

Annnd… although it’s a completely flawed way of thinking about it…
but… just an example of the sheer size of the numbers you’re talking about
when doing 1920x1200… at 32-bit, a screen is 8.79MB. If you redraw
the full screen at 60 Hz, you’re doing 527.35MB a second. And this is
presumably just drawing to a software surface, which still has yet to
be moved to video RAM via system API calls.

527MB… MHz loose correlation, etc. Granted, this doesn’t take into
account CPU cache, write combining, etc… but… it’s still a number
a bit beefier than you might have thought. Not to mention, you have
to push this 527MB/sec into system RAM, then across whatever
graphics bus is in your system (with associated stalls/delays).

I wonder what a commercial game (halflife ?) gets with software
rendering at 1920x1200 on your system. I think I’ve successfully
played an 8bit game at 1920x1200 on a similar system to what you’ve
got. ‘Swarm’ by Reflexive I believe. Can’t recall how it felt,
though.

1024x768, 1280x1024 (or other common medium-low res flat panel native
display) should be easy… but 1920 might be pushing it for your given
system :).

If you really want a 1920x1200 game, ya might consider building a 2D
engine within opengl. Lots of people have done similar… and you get
some handy ‘free’ effects in the process (hardware does it for ya).

-Will

Annnd… although it’s a completely flawed way of thinking about it…
but… just an example of the sheer size of the numbers you’re talking about
when doing 1920x1200… at 32-bit, a screen is 8.79MB. If you redraw
the full screen at 60 Hz, you’re doing 527.35MB a second. And this is
presumably just drawing to a software surface, which still has yet to
be moved to video RAM via system API calls.

527MB… MHz loose correlation, etc. Granted, this doesn’t take into
account CPU cache, write combining, etc… but… it’s still a number
a bit beefier than you might have thought. Not to mention, you have

Nah, I did the math. ;-)
You’ve got two reads and a write at least, so that’s 1.5 GB/s already.
That’s why I wanted 2D acceleration. Even a Windows desktop is slow
without 2D acceleration.

to push this 527MB/sec into system RAM, then across whatever
graphics bus is in your system (with associated stalls/delays).

I wonder what a commercial game (halflife ?) gets with software
rendering at 1920x1200 on your system. I think I’ve successfully

I don’t even want to know.

played an 8bit game at 1920x1200 on a similar system to what you’ve
got. ‘Swarm’ by Reflexive I believe. Can’t recall how it felt,
though.

1024x768, 1280x1024 (or other common medium-low res flat panel native
display) should be easy… but 1920 might be pushing it for your given
system :).

And I don’t even have a 30" monitor. ;-)

If you really want a 1920x1200 game, ya might consider building a 2D
engine within opengl. Lots of people have done similar… and you get
some handy ‘free’ effects in the process (hardware does it for ya).

Yes, OpenGL or DX is probably needed.

On Wed, Apr 16, 2008 at 9:45 PM, Will Langford wrote:

Hello !

If you really want a 1920x1200 game, ya might consider building a 2D
engine within opengl. Lots of people have done similar… and you get
some handy ‘free’ effects in the process (hardware does it for ya).

Yes, OpenGL or DX is probably needed.

David Olofson has written an OpenGL
wrapper-like thing for SDL (glSDL). This allows
you to use the normal SDL blit commands with OpenGL.

CU

Agreed that in order to get that kind of resolution, you need direct access
to the straightest and most efficient way possible to talk to the hardware,
i.e. OpenGL under Mac and DirectX under Windows. GPUs are fast as lightning
these days, but you need to be able to use shaders and textures in an
efficient way. I doubt SDL in its current state would be the right choice.

Regards,
Marc