A little problem with glSDL 0.5

Hello, I’m very interested in glSDL; the possibility of doing hardware
alpha blending in a current 2D project, without changing any code, is great!
Over the last few days I have seen a lot of posts about it, and I’m happy to
know that its development continues.

I tested 0.3 and it didn’t work with my project, but the new 0.5 works
fine. However, I have a little problem: the file “stderr.txt” is full of
“glSDL: WARNING: On-the-fly conversion performed!”, even though I convert
all surfaces with SDL_DisplayFormat() before using them. The main problem is
that my fps is very low (3-8 fps depending on color depth), so I suppose the
problem is with downloading/uploading surfaces (textures) between video
memory and system memory; I think something is being converted each frame
and transferred over the AGP bus…
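
In case it helps, my loading code does roughly this (a simplified sketch,
not the exact code; the file name is made up), so each surface should only
be converted once, right after loading:

#include "SDL.h"

/* Load a BMP and convert it to the display format once, at load time.
 * Assumes the video mode has already been set; SDL_DisplayFormatAlpha()
 * is what I use instead for surfaces that need a full alpha channel. */
static SDL_Surface *load_converted(const char *path)
{
    SDL_Surface *raw = SDL_LoadBMP(path);   /* "path" is hypothetical    */
    SDL_Surface *conv;

    if (!raw)
        return NULL;
    conv = SDL_DisplayFormat(raw);          /* one-time conversion       */
    SDL_FreeSurface(raw);                   /* keep only the converted copy */
    return conv;
}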

Any ideas about this?

Thanks in advance!

Roberto Prieto
MegaStorm Systems
http://www.megastormsystems.com

On Monday 07 July 2003 16.37, Roberto Prieto wrote:
[…]

I tested 0.3 and it didn’t work with my project, but the new 0.5
works fine. However, I have a little problem: the file “stderr.txt”
is full of “glSDL: WARNING: On-the-fly conversion performed!”, even
though I convert all surfaces with SDL_DisplayFormat() before using
them. The main problem is that my fps is very low (3-8 fps depending
on color depth), so I suppose the problem is with
downloading/uploading surfaces (textures) between video memory and
system memory; I think something is being converted each frame and
transferred over the AGP bus…

Sounds like you’re blitting something that’s not
SDL_DisplayFormat*()ed… The low fps could be because of that or, if
we’re talking about lots of blits per frame, because of the massive
amount of output going to the stderr file.

However, if it’s as low as 3-8 fps, I suspect it might be because
you’re blitting from or within the screen. (It might actually work
in 0.5, although I remember fixing a few bugs in the backend
version…) I’d have to check to be sure, but I think such blits
involve these “on-the-fly conversions” as a side effect, when
blitting from the shadow surface to the actual screen.
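
To illustrate what I mean (a generic SDL sketch, not glSDL code; the
surface names are made up): a blit that uses the screen as the *source*
is what can force a read-back and an on-the-fly conversion, while blitting
a pre-converted surface *to* the screen stays on the fast path.

#include "SDL.h"

void example(SDL_Surface *screen, SDL_Surface *sprite, SDL_Surface *backup)
{
    SDL_Rect area = { 10, 10, 32, 32 };

    /* Slow path: the screen is the blit source, so the backend may have
     * to read pixels back from the shadow surface and convert them. */
    SDL_BlitSurface(screen, &area, backup, NULL);

    /* Fast path: blitting an SDL_DisplayFormat()ed surface to the screen
     * can be handled entirely by the accelerated backend. */
    SDL_BlitSurface(sprite, NULL, screen, &area);
}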

//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`-----------------------------------> http://audiality.org -’
http://olofson.net
http://www.reologica.se

David, thank you for your reply… When I read “I suspect it might be
because you’re blitting from or within the screen…”, I immediately thought
of the one thing that works like that and has given glSDL a lot of trouble:
the cursor! I have an interface (ICursor) for working with cursors
(implemented as arrays of surfaces) that automatically restores the
background. Although my project still prints a few “glSDL: WARNING:
On-the-fly…” messages, they only appear at startup, not at run time.
Without ICursor I get 600-1000 fps with my Athlon XP 3000+ and Radeon
9700 Pro; before, I hardly got 150 fps (the engine isn’t optimized yet and
it redraws everything each frame). And, most important for me, I can do
real alpha blending in hardware without a speed penalty :-) (I sketch
below how I might rework ICursor.)

I’ll wait for the next revision (0.6?) to see whether this warning is
fixed.

I have seen a couple of other things. In windowed mode I get some lag on
the input events (keyboard and mouse); I don’t know whether this comes from
my configuration/drivers or from glSDL. Also, the amount of memory used is
very large: my project needs about 30 MB of RAM when it runs in software
mode, and in hardware mode it needs only 4 MB (the other 26 MB are surfaces
loaded into video memory), but when I use glSDL the memory usage grows to
50-60 MB… Does glSDL need a texture memory manager or something like that?
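
Regarding ICursor: if I rework it, it will probably look roughly like this
(just a sketch, the names are made up). Since the engine already redraws
everything each frame, I can simply blit the cursor surface last instead of
saving and restoring the background with blits from the screen.

#include "SDL.h"

/* Sketch: draw the cursor as the last blit of the frame instead of
 * restoring the old background with a screen-source blit. */
void render_frame(SDL_Surface *screen, SDL_Surface *cursor_frame)
{
    int mx, my;
    SDL_Rect pos;

    /* ... redraw the whole scene here, as the engine already does ... */

    SDL_GetMouseState(&mx, &my);
    pos.x = (Sint16)mx;
    pos.y = (Sint16)my;
    pos.w = pos.h = 0;              /* ignored for a destination rect */
    SDL_BlitSurface(cursor_frame, NULL, screen, &pos);

    SDL_Flip(screen);
}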

If you need help with the development of glSDL, I could help with some
tasks.

bye

Roberto Prieto
MegaStorm Systems
http://www.megastormsystems.com

On Monday 07 July 2003 20.28, Roberto Prieto wrote:
[…]

I get 600-1000 fps with my Athlon XP 3000+ and Radeon 9700 Pro;
before, I hardly got 150 fps (the engine isn’t optimized yet and it
redraws everything each frame). And, most important for me, I can do
real alpha blending in hardware without a speed penalty :-)

That sounds more like it. :-)

I’ll wait for the next revision (0.6?) to see whether this warning
is fixed.

Unfortunately, the warning itself is irrelevant, and removing it wouldn’t
have all that much effect. (Especially not when it’s sent to a file; it’s
worse when it’s sent to a dog-slow terminal emulator, as is the case on
most Unix-like platforms.)

The issue is that there is no way to access the screen directly through
OpenGL, and thus it’s not possible to avoid the dreadfully slow full-screen
copying. You’d have to extend the SDL API to allow locking only a specified
rect of the screen, but even then operations like this would be very slow
compared to normal rendering.
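
In other words, it’s this kind of code that hurts (a generic SDL sketch,
not taken from your project, and 32 bpp is assumed for brevity): locking
the screen for direct pixel access implies the dreadfully slow full-screen
copying, while ordinary blits of prepared surfaces don’t.

#include "SDL.h"

/* The pattern to avoid on accelerated backends: direct pixel access
 * to the screen surface. */
void poke_pixel(SDL_Surface *screen, int x, int y, Uint32 color)
{
    if (SDL_MUSTLOCK(screen))
        SDL_LockSurface(screen);

    /* 32 bpp assumed */
    ((Uint32 *)((Uint8 *)screen->pixels + y * screen->pitch))[x] = color;

    if (SDL_MUSTLOCK(screen))
        SDL_UnlockSurface(screen);
}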

You simply do not access the screen directly when using hardware
acceleration, be it OpenGL, Direct3D or something else. The
combination of the SDL 2D API and OpenGL is an exceptionally bad one,
but the underlying problems are in no way unique to SDL, OpenGL or
glSDL. I’m afraid applications that do these things will just have to
be fixed to take advantage of h/w acceleration.

I have seen a couple of other things. In windowed mode I get some
lag on the input events (keyboard and mouse); I don’t know whether
this comes from my configuration/drivers or from glSDL.

I’ve only seen this once, and that was with a broken OpenGL driver.
(It was not related to glSDL - which didn’t even exist at the time.)
The problem was that there was no synchronization at all around the
accelerator command buffer (between the app and the driver), except
that the app was blocked when the buffer was full. As a result, the
application would queue up tens of frames worth of commands before
the driver decided to run, and that obviously caused lag, insanely
unsmooth animation and other related issues.

It should not happen if OpenGL “kicks” the driver as it should when
you flip. (It should actually sync to the retrace as well before
flipping, but many drivers don’t by default, or can’t do it at
all…)

glSDL doesn’t add any command queues or anything that could cause lag.
All OpenGL calls are done directly inside the glSDL calls.

Have you checked how native OpenGL apps behave on your system?
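
If you want a quick test, something like this minimal native loop (a rough
sketch, not from any real app) can tell the two apart: if the lag goes away
when glFinish() is added after the swap, the driver is queueing up several
frames’ worth of commands on its own.

#include "SDL.h"
#include <GL/gl.h>

int main(int argc, char *argv[])
{
    SDL_Event ev;
    int running = 1;

    SDL_Init(SDL_INIT_VIDEO);
    SDL_SetVideoMode(640, 480, 0, SDL_OPENGL);

    while (running) {
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT)
                running = 0;

        glClear(GL_COLOR_BUFFER_BIT);
        /* ... draw something that follows the mouse here ... */

        SDL_GL_SwapBuffers();
        glFinish();     /* force the driver to catch up every frame */
    }

    SDL_Quit();
    return 0;
}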

Also, the amount of memory used is very large: my project needs
about 30 MB of RAM when it runs in software mode, and in hardware
mode it needs only 4 MB (the other 26 MB are surfaces loaded into
video memory), but when I use glSDL the memory usage grows to
50-60 MB… Does glSDL need a texture memory manager or something
like that?

glSDL is forced to keep the surfaces in system memory even after uploading
them to OpenGL, because there’s no way to read textures back from OpenGL.
That is, once you’re up and running, the graphics data is stored both in
system memory and in texture RAM.
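
If you want to see how much of the growth is just those system-memory
copies, a trivial helper like this (hypothetical, not part of glSDL) adds
up the raw pixel storage of the surfaces you keep around; with ~26 MB of
surfaces, roughly that much ends up duplicated.

#include "SDL.h"

/* Hypothetical helper: rough size of the pixel data kept in system
 * memory for an array of surfaces. */
unsigned long surfaces_bytes(SDL_Surface **s, int count)
{
    unsigned long total = 0;
    int i;

    for (i = 0; i < count; ++i)
        if (s[i])
            total += (unsigned long)s[i]->pitch * s[i]->h;
    return total;
}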

If you need help with the development of glSDL, I could help with
some tasks.

Well, there are various improvements that could, and probably should, be
made after the backend version is released. Stay tuned… :-)

//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`-----------------------------------> http://audiality.org -’
http://olofson.net
http://www.reologica.se