glSDL backend test

Hello!

Just installed libsdl-1.2.8 + glSDL patch.

I tested it using supertux

opengl ~ 100 FPS
sdl-x11 ~ 70 FPS
sdl-gl ~ 45 FPS

./testsprite
Screen is at 32 bits per pixel
Screen is in system memory
Sprite is in system memory
Sprite blit uses RLE acceleration
471.92 frames per second

SDL_VIDEODRIVER=glSDL ./testsprite
glSDL videoinit
Screen is at 8 bits per pixel
Screen is in system memory
Sprite is in system memory
Sprite blit uses RLE acceleration
1.16 frames per second

Then I forced it to use an SDL_HWSURFACE

SDL_VIDEODRIVER=glSDL ./testsprite
glSDL videoinit
Screen is at 8 bits per pixel
Screen is in video memory
Sprite is in video memory
2622.03 frames per second
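
(The exact testsprite change isn't shown here, but "forcing a hardware surface" comes down to the flags passed to SDL_SetVideoMode(), and the returned surface's flags tell you what you actually got. A minimal fragment, assuming "SDL.h" and <stdio.h> are included and video is initialised - not the actual testsprite modification:)

    /* Sketch only: ask for a hardware display surface and report
       what SDL actually gave us. */
    SDL_Surface *screen = SDL_SetVideoMode(640, 480, 0, SDL_HWSURFACE);
    if (screen && (screen->flags & SDL_HWSURFACE))
        printf("Screen is in video memory\n");
    else
        printf("Screen is in system memory\n");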

Finally I edited supertux setup.cpp around line 675 to use SDL_HWSURFACE
and I got:

opengl ~ 100 FPS
sdl-x11 ~ 70 FPS
sdl-gl ~ 100 FPS

(opengl seems to be slowed down to 100 FPS…)

But when using sdl-gl some sprites are flickering badly - I don’t know
what the reason is.

Wouldn’t it make sense to use SDL_HWSURFACE as a default value when
using glSDL? Otherwise it’s painfully slow…

My system:

Gentoo, Kernel 2.6.9
Xorg 6.8.0 using a Radeon 9600 with the latest driver (finally ;) )

BTW: I’m not subscribed to the list.

Ciao,

Olaf

On Wednesday 19 January 2005 01.49, Olaf Leidinger wrote:

[…]

Finally I edited supertux setup.cpp around line 675 to use
SDL_HWSURFACE and I got:

opengl ~ 100 FPS
sdl-x11 ~ 70 FPS
sdl-gl ~ 100 FPS

(opengl seems to be slowed down to 100 FPS…)

Looks like retrace sync and a 100 Hz refresh rate… (That is, the way
it should work, but usually doesn’t.)

But when using sdl-gl some sprites are flickering badly - I don’t know
what the reason is.

You have to use SDL_DOUBLEBUF with SDL_HWSURFACE, or you’ll get a
single buffered display. (Unlike normal SDL 2D backends, glSDL can
actually do that, provided the OpenGL driver supports it - and most
do, in my experience.)
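
(In SDL 1.2 terms, that's just both flags in the SDL_SetVideoMode() call plus an SDL_Flip() per frame. A minimal sketch - the mode is arbitrary and the actual drawing is left out:)

    #include <stdio.h>
    #include "SDL.h"

    int main(int argc, char *argv[])
    {
        SDL_Surface *screen;
        int running = 1;

        if (SDL_Init(SDL_INIT_VIDEO) < 0) {
            fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
            return 1;
        }

        /* Double buffered hardware display surface. */
        screen = SDL_SetVideoMode(800, 600, 0,
                                  SDL_HWSURFACE | SDL_DOUBLEBUF);
        if (!screen) {
            fprintf(stderr, "SDL_SetVideoMode failed: %s\n",
                    SDL_GetError());
            SDL_Quit();
            return 1;
        }

        while (running) {
            SDL_Event ev;
            while (SDL_PollEvent(&ev))
                if (ev.type == SDL_QUIT)
                    running = 0;

            /* ... blit background and sprites to 'screen' here ... */

            /* With SDL_DOUBLEBUF this swaps buffers; without it you're
               drawing straight into the visible buffer, hence the
               flickering. */
            SDL_Flip(screen);
        }

        SDL_Quit();
        return 0;
    }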

Wouldn’t it make sense to use SDL_HWSURFACE as a default value when
using glSDL? Otherwise it’s painfully slow…

The problem is that most applications are doing it wrong. Hardware
accelerated backends are effectively disabled when using a software
display surface. (All the backend can do, if even that, is to
accelerate the update/flip blits from the shadow surface to the
actual h/w display surface.) glSDL is “just another backend” in that
regard; the only difference is that glSDL takes a much harder
performance hit from this kind of abuse than other backends.
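
(If you want to see what the backend you ended up with claims to accelerate, SDL_GetVideoInfo() spells it out. A small sketch, assuming the video subsystem is already initialised:)

    /* Sketch: what can this backend accelerate? */
    const SDL_VideoInfo *vi = SDL_GetVideoInfo();
    printf("hw surfaces available:       %d\n", vi->hw_available);
    printf("hw->hw blits accelerated:    %d\n", vi->blit_hw);
    printf("hw->hw colorkey accelerated: %d\n", vi->blit_hw_CC);
    printf("hw->hw alpha accelerated:    %d\n", vi->blit_hw_A);
    printf("sw->hw blits accelerated:    %d\n", vi->blit_sw);
    printf("video memory (kB):           %u\n", vi->video_mem);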

Now, of course it’s possible to cheat in order to have some
applications work well with glSDL, but this would break compatibility
with the SDL 2D API, and it would only work in a few special cases.

Many games that rely on s/w display surfaces do so for a reason: It’s
the only way they can get decent performance on most normal
backends! Plain blits and colorkey blits are (relatively) fast on all
backends, but as soon as you use alpha blending or custom blitters,
you run into performance problems, because the CPU ends up accessing
VRAM directly.

glSDL is the exception here; it actually does accelerate alpha blits -
but most existing games were written long before glSDL existed, and
back then, there was just no point in supporting backends with h/w
accelerated alpha, because there weren’t any around.
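
(So on glSDL it is worth preparing sprites the "textbook" way and letting the backend do the blending. A sketch - "sprite.bmp" is a placeholder, error checks are left out, and 'screen' is assumed to be the display surface returned by SDL_SetVideoMode():)

    /* Sketch: convert a sprite to the display format, then let SDL
       (and thus glSDL/OpenGL) do the alpha blending. */
    SDL_Surface *tmp = SDL_LoadBMP("sprite.bmp");
    SDL_Surface *sprite = SDL_DisplayFormat(tmp);  /* match screen format */
    SDL_FreeSurface(tmp);

    /* Per-surface alpha: whole sprite at ~50% opacity. Use
       SDL_DisplayFormatAlpha() instead if the image carries
       per-pixel alpha. */
    SDL_SetAlpha(sprite, SDL_SRCALPHA, 128);

    {
        SDL_Rect dst = { 100, 100, 0, 0 };
        SDL_BlitSurface(sprite, NULL, screen, &dst);
        SDL_Flip(screen);
    }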

So, what you can do is try adding SDL_HWSURFACE and SDL_DOUBLEBUF to
the flags, and see if that works. If it doesn’t (or runs insanely
slow instead of getting faster), it’s probably because the game uses
custom blitters that are intended to operate directly on the display
surface. Making custom blitters work with glSDL is possible, but you
have to refactor the code to use intermediate surfaces in the same
way you’d implement procedural textures in a 3D engine. Screen
distortion effects (ie offsetting pixels on the screen and such) are
not possible to implement efficiently this way, though.
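
(Roughly: instead of locking the display surface and writing pixels into it, the effect renders into a software surface of its own, which is then blitted - and, under glSDL, uploaded as a texture - like any other sprite. A sketch, assuming 'screen' is the display surface; error handling omitted:)

    /* Sketch: run a custom pixel effect on an intermediate s/w surface
       instead of on the display surface itself. */
    SDL_Surface *buf = SDL_CreateRGBSurface(SDL_SWSURFACE, 256, 256, 32,
                                            0x00ff0000, 0x0000ff00,
                                            0x000000ff, 0);
    int x, y;
    Uint32 *pixels;
    int pitch;

    if (SDL_MUSTLOCK(buf))
        SDL_LockSurface(buf);
    pixels = (Uint32 *)buf->pixels;
    pitch = buf->pitch / 4;            /* pitch is in bytes */
    for (y = 0; y < buf->h; ++y)
        for (x = 0; x < buf->w; ++x)   /* any procedural pattern */
            pixels[y * pitch + x] =
                SDL_MapRGB(buf->format, x ^ y, x, y);
    if (SDL_MUSTLOCK(buf))
        SDL_UnlockSurface(buf);

    /* Now it's just another surface; blit it to the screen as usual. */
    SDL_BlitSurface(buf, NULL, screen, NULL);
    SDL_Flip(screen);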

//David Olofson - Programmer, Composer, Open Source Advocate

.- Audiality -----------------------------------------------.
| Free/Open Source audio engine for games and multimedia. |
| MIDI, modular synthesis, real time effects, scripting,… |
`-----------------------------------> http://audiality.org -’
http://olofson.net
http://www.reologica.se

David Olofson wrote:

[…]

Finally I edited supertux setup.cpp around line 675 to use
SDL_HWSURFACE and I got:

opengl ~ 100 FPS
sdl-x11 ~ 70 FPS
sdl-gl ~ 100 FPS

(opengl seems to be slowed down to 100 FPS…)

Looks like retrace sync and a 100 Hz refresh rate… (That is, the way
it should work, but usually doesn’t.)

Actually, supertux limits the fps to 100 (it can make sense, especially
if you’re on laptop battery :).

[…]

Wouldn’t it make sense to use SDL_HWSURFACE as a default value when
using glSDL? Otherwise it’s painfully slow…

The problem is that most applications are doing it wrong. Hardware
accelerated backends are effectively disabled when using a software
display surface. (All the backend can do, if even that, is to
accelerate the update/flip blits from the shadow surface to the
actual h/w display surface.) glSDL is “just another backend” in that
regard; the only difference is that glSDL takes a much harder
performance hit from this kind of abuse than other backends.

Now, of course it’s possible to cheat in order to have some
applications work well with glSDL, but this would break compatibility
with the SDL 2D API, and it would only work in a few special cases.

Many games that rely on s/w display surfaces do so for a reason: It’s
the only way they can get decent performance on most normal
backends! Plain blits and colorkey blits are (relatively) fast on all
backends, but as soon as you use alpha blending or custom blitters,
you run into performance problems, because the CPU ends up accessing
VRAM directly.

glSDL is the exception here; it actually does accelerate alpha blits -
but most existing games were written long before glSDL existed, and
back then, there was just no point in supporting backends with h/w
accelerated alpha, because there weren’t any around.

So, what you can do is try adding SDL_HWSURFACE and SDL_DOUBLEBUF to
the flags, and see if that works. If it doesn’t (or runs insanely
slow instead of getting faster), it’s probably because the game uses
custom blitters that are intended to operate directly on the display
surface. Making custom blitters work with glSDL is possible, but you
have to refactor the code to use intermediate surfaces in the same
way you’d implement procedural textures in a 3D engine. Screen
distortion effects (ie offsetting pixels on the screen and such) are
not possible to implement efficiently this way, though.

As usual, all the relevant information for good 2D performance with
glSDL (and with SDL too) is there:
http://icps.u-strasbg.fr/~marchesin/sdl/glsdl.html

Stephane

On Thursday 19 May 2005 03.04, Stephane Marchesin wrote:

[…]

Looks like retrace sync and a 100 Hz refresh rate… (That is, the
way it should work, but usually doesn’t.)

Actually, supertux limits the fps to 100 (it can make sense,
especially if you’re on laptop battery :).

I see. I’m doing that in Kobo Deluxe as well, though it’s configurable
(anything from 1 through 200 fps + unlimited) and disabled by
default, as it makes animation less smooth on most machines I’ve
tried it on so far.
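
(Not the Kobo Deluxe code, just the basic shape of such a cap: sleep away whatever is left of each frame's time budget. 'running' and the per-frame work are placeholders.)

    /* Sketch: crude frame rate cap using SDL's millisecond timer. */
    Uint32 max_fps = 100;
    Uint32 frame_ms = 1000 / max_fps;
    Uint32 next = SDL_GetTicks();
    Uint32 now;

    while (running) {
        /* ... events, game logic, rendering, SDL_Flip() ... */

        next += frame_ms;
        now = SDL_GetTicks();
        if (now < next)
            SDL_Delay(next - now);
        else
            next = now;    /* running late - don't try to catch up */
    }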

[…]

As usual, all the relevant information for good 2D performance with
glSDL (and with SDL too) is there:
http://icps.u-strasbg.fr/~marchesin/sdl/glsdl.html

Yep, thought about that, but somehow forgot to mention it! :)

//David Olofson - Programmer, Composer, Open Source Advocate

.- Audiality -----------------------------------------------.
| Free/Open Source audio engine for games and multimedia. |
| MIDI, modular synthesis, real time effects, scripting,… |
`-----------------------------------> http://audiality.org -’
http://olofson.net
http://www.reologica.se