Slow blitting ... is it us or SDL?

Hi,

We’re making a game with SDL, currently only for Windows. It’s a
puzzle/action game, 2D, with some sprites constantly moving on-screen
(on a 500x500 pixel area, screen resolution 800x600).

We need to use alpha blitting to show smooth sprites over the
background, but we found it too slow.
Basically we’re doing something like this (a rough code sketch follows the list below):

screen = SDL_Surface (tried SDL_HWSURFACE, SDL_SWSURFACE, double and single
buffer; results are almost the same)
backbuffer = SDL_Surface (SDL_SWSURFACE; also tried SDL_HWSURFACE and making all
our images hardware surfaces as well)

  1. blit the background to backbuffer
  2. blit all interface elements and sprites to the backbuffer (the
    game is mostly buttons and a big area with some characters running all
    the time)
  3. blit backbuffer to screen
  4. SDL_Flip() - voila!
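Roughly, that pipeline looks like this in SDL 1.2 code (the Sprite struct, the surface variables and numSprites below are made-up placeholders, not the actual game code):

#include "SDL.h"

/* Hypothetical repaint following the four steps above (SDL 1.2). */
typedef struct { SDL_Surface *image; SDL_Rect pos; } Sprite;

extern SDL_Surface *screen, *backbuffer, *background;
extern Sprite sprites[];
extern int numSprites;

void Repaint(void)
{
    SDL_BlitSurface(background, NULL, backbuffer, NULL);   /* 1. background          */

    for (int i = 0; i < numSprites; i++) {                 /* 2. UI elements/sprites */
        SDL_Rect dst = sprites[i].pos;   /* copy: SDL_BlitSurface may modify dst */
        SDL_BlitSurface(sprites[i].image, NULL, backbuffer, &dst);
    }

    SDL_BlitSurface(backbuffer, NULL, screen, NULL);       /* 3. to the screen       */
    SDL_Flip(screen);                                      /* 4. voila!              */
}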

We record the time with something like this:

   lastCalled = GetTickCount();
   Repaint();
   pThis.logFile << "Repaint Time: " << ( GetTickCount() - lastCalled ) << endl;

And we obtain something like this (milliseconds):


Repaint Time: 47
Repaint Time: 46
Repaint Time: 47
Repaint Time: 32
Repaint Time: 46
Repaint Time: 47
Repaint Time: 47
Repaint Time: 47
Repaint Time: 47
Repaint Time: 47

Which means we might get between 20 and 30 FPS, although most of
the time it seems to sit near 20 (and sometimes a repaint takes 60+ ms!).

Our machines are an Athlon 1700+/384 MB RAM/GeForce2 MX400 and an Athlon
2000+/512 MB RAM/GeForce4 MX440. Maybe not next-gen, but certainly not slow.

Some extra info:

  • Game state is maintained using an SDL_Timer at an approx. 20 ms interval.
    The timer function updates the game with the “real” elapsed time
    (according to GetTickCount()).
  • The Repaint snippet posted above resides in another SDL_Timer,
    somewhere else in the program.
  • Interface events are caught in a loop like:

while( !exitApp ) {
    SDL_WaitEvent(&sdlEvent);
    ProcMessage( sdlEvent );
}

This loop runs in the “main” thread of the program.

So, the question is: are these results the best we can expect (20 fps
@ 800x600), or are we doing something really, really wrong? I looked for
benchmark results of alpha blitting with SDL but could not find any. I’ll
try to make some, but any info will be greatly appreciated.

Thanks a lot… and sorry for the lengthy mail!
///
Sebastián Uribe
Inmune Games
suribe at inmune.com.ar
www.inmune.com.ar

Your repaint is on a timer? Déjà vu… I’ve tried it before. It’s a nice
theory, but it simply doesn’t work. I’ll tell you why.

First, you have pretty nice machines, with CPU cycles to spare, but your
timer can only fire about every 10 ms. This limits your game to roughly
100 FPS, even if the game is “what’s behind the black screen” and you
have a Deep Blue or better. :) By the way, 10 ms is usually the
MINIMUM. You can get 13, 16 or even 20+ ms in real life.

Second, does your callback return the same interval it received as a
parameter? Think about it… Your timer waits 10+ ms to start, then
takes… 10 ms? 20 ms? to run. Then you tell SDL to wait another 10 ms
before calling your function again (see the SDL manual). That’s 20-30+ ms
per frame. Even a “black screen of death” game is limited to 50 FPS or less.

Third, did you know that your interval must be a multiple of roughly 10 ms?
Depending on your system, the timer can only fire at intervals of 10, 20,
30… milliseconds (or 8, 16, 24… or 20, 40, 60… etc.).
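For reference, a minimal sketch of the SDL_AddTimer pattern being discussed; Repaint() and the 20 ms interval just mirror the original post, and calling drawing code from the timer thread is exactly the pattern under criticism, not a recommendation:

#include "SDL.h"

extern void Repaint(void);

/* The value the callback returns is the delay until the NEXT call, so
   (as described above) the time spent inside the callback is added on
   top of it. */
Uint32 repaint_callback(Uint32 interval, void *param)
{
    (void)param;
    Repaint();          /* if this takes 30 ms, the next call is even later */
    return interval;    /* schedule the next call after the same interval   */
}

/* during init (after SDL_Init(SDL_INIT_TIMER | ...)): */
/* SDL_AddTimer(20, repaint_callback, NULL);           */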

And that’s with your nice machines. If you had a K6-2 (500 MHz), as
I did some years ago, you would have serious trouble with a backbuffer
surface. Your game has to do a lot of memory copying: to the backbuffer
surface, then to the screen. That’s about 1.44 MB per frame. If it’s a
SW double-buffered screen, we have to multiply that by 3+ (to the SW
backbuffer, to the SW framebuffer, to the HW framebuffer). Yikes… I guess
this is why they created the HW backbuffered screen: you do all the mess
on the screen’s backbuffer, then call SDL_Flip.
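(For reference: that 1.44 MB figure works out if we assume a 24-bit framebuffer, since one 800x600 copy is then 800 × 600 × 3 = 1,440,000 bytes ≈ 1.44 MB; at 16 bpp it would be roughly 0.92 MB per copy.)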

Oh, yes! And you are doing alpha blits without HW acceleration. Poor
CPU… :)

I’m playing with SDL, and I could reach more than 300 FPS copying
640x480/1000x480 surfaces to the screen with a HW backbuffer and all the
HW acceleration I could enable… The machine? An Athlon XP 1.7+/512 MB
RAM/GeForce4 MX440…

I hope this information helps…

Best regards,
Eduardo Costa

PS: Have you tried the SDL examples? They are usually nice “speed meters”.

Sebastian Uribe wrote:

[original message quoted in full; trimmed]

Did you convert the sprites to the display’s format?
(e.g., if the sprites are 24bpp, did you convert them down to 16bpp,
or up to 32bpp, if that’s what your display is?)

If not, then SDL is doing it for you every time you blit a sprite!
(So multiply that by the number of sprites per frame, and it can get
pretty slow!)

Look into SDL_DisplayFormat() and SDL_DisplayFormatAlpha()
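A minimal sketch of doing that conversion once at load time (load_sprite(), the BMP loader and the has_alpha flag are placeholders, assuming the video mode has already been set):

#include "SDL.h"

/* Hypothetical loader: convert each image to the display's pixel format
   once, at load time, instead of letting SDL convert it on every blit. */
SDL_Surface *load_sprite(const char *path, int has_alpha)
{
    SDL_Surface *raw = SDL_LoadBMP(path);   /* or IMG_Load() from SDL_image */
    if (!raw)
        return NULL;

    /* SDL_DisplayFormatAlpha keeps a per-pixel alpha channel;
       SDL_DisplayFormat is enough for opaque images. */
    SDL_Surface *converted = has_alpha ? SDL_DisplayFormatAlpha(raw)
                                       : SDL_DisplayFormat(raw);
    SDL_FreeSurface(raw);
    return converted;   /* NULL if the conversion failed */
}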

-bill!

Bill Kendrick wrote:

Did you convert the sprites to the display’s format?
[…]
Look into SDL_DisplayFormat() and SDL_DisplayFormatAlpha()

Yes, I gave that a try, but it didn’t help. The other suggestion
(about the timers) seems to be the way to go…
///
Sebastián Uribe
Inmune Games
suribe at inmune.com.ar
www.inmune.com.ar

Thanks for your advice, I made a quick test and it seemed to improve
things a lot. Now I have a couple of threads:

  1. Game-updating thread

  2. Rendering thread

  3. Event-processing thread (input)

The rendering thread does something like:

while (!mustExit) {
    SDL_Delay(5);

    before = GetTickCount();
    Repaint();
    log << "Render time: " << (GetTickCount() - before) << endl;
}

And rendering times are almost the same, but the overall speed and
"feeling" of the app is much improved.

As I read in some message, it’s a bad idea to do screen blitting from any
thread but the main one, so I’m fixing some things around that now. But it
really looks much better. :)

Haven’t looked at the examples yet, but will do for sure.

Eduardo Costa wrote:

[previous message quoted in full; trimmed]


///
Sebastián Uribe
Inmune Games
suribe at inmune.com.ar
www.inmune.com.ar

Hi all !

I have a little question. How much time do I need to wait after
SDL_SetVideoMode with SDL_HWSURFACE | SDL_DOUBLEBUF | SDL_FULLSCREEN and
SDL_VIDEODRIVER=dga to be sure that the frame is displayed on the screen?

Is there some way to know? After opening the video and painting a frame,
I play a sound, but I have a synchronization problem: I can hear the
music but the screen is black!
If I put a ‘sleep(1)’ before playing the sound, I can see the frame.

Any comment is welcome,
Thanks

Jorge

Hello.

Just something about that SDL_Delay(5): you may want to add some kind
of dynamic framerate limiting. Look at
http://www.ifm.liu.se/~ulfek/projects/timers/ for details.
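A minimal sketch of the idea (the 30 FPS target and the mustExit/Repaint() names just mirror the snippets earlier in this thread; this is not code from that page):

#include "SDL.h"

extern volatile int mustExit;
extern void Repaint(void);

void render_loop(void)
{
    const Uint32 target = 1000 / 30;          /* ~33 ms per frame for 30 FPS */

    while (!mustExit) {
        Uint32 frameStart = SDL_GetTicks();

        Repaint();                            /* draw and flip as before */

        /* Sleep only for whatever is left of the frame budget,
           instead of a fixed SDL_Delay(5). */
        Uint32 elapsed = SDL_GetTicks() - frameStart;
        if (elapsed < target)
            SDL_Delay(target - elapsed);
    }
}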

Regards,
Bernhard

Sebastian Uribe wrote:

[previous messages quoted in full; trimmed]

That’s interesting, I’ll test it and see how it works in Windows.

Bernhard Bliem wrote:

[suggestion about dynamic framerate limiting quoted above; trimmed]


///
Sebastián Uribe
Inmune Games
suribe at inmune.com.ar
www.inmune.com.ar

jorgefm at cirsa.com wrote:

[question about the delay after SDL_SetVideoMode quoted above; trimmed]

It’s probably not SDL’s fault, but rather your screen switching video
modes (this is normal with DGA). Switching video modes can take some time
(in the 1-second range).
If you open the SDL display and start displaying something right away,
the screen will still be switching during that first second or so and you
won’t be able to see what’s happening. Opening the SDL display should be
the first thing you do; do it even before loading your data, so that data
loading time and video-mode switching time overlap.
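A sketch of that ordering (the resolution, the flags and the load_assets()/run_game() helpers here are placeholders, not anyone's actual code):

#include "SDL.h"

extern void load_assets(void);
extern void run_game(SDL_Surface *screen);

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO) < 0)
        return 1;

    /* Kick off the video-mode switch as early as possible... */
    SDL_Surface *screen = SDL_SetVideoMode(800, 600, 0,
                              SDL_HWSURFACE | SDL_DOUBLEBUF | SDL_FULLSCREEN);
    if (!screen)
        return 1;

    /* ...so that loading overlaps with the monitor re-syncing. */
    load_assets();

    run_game(screen);   /* only now start drawing and playing sound */

    SDL_Quit();
    return 0;
}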

Stephane

How much time do I need to wait after SDL_SetVideoMode with
SDL_HWSURFACE | SDL_DOUBLEBUF | SDL_FULLSCREEN and SDL_VIDEODRIVER=dga
to be sure that the frame is displayed on the screen?

It depends on how long your monitor takes to sync to the new resolution.
Flat panel monitors probably take little if any time, while my Sony
monitor takes up to 3 seconds.

See ya!
-Sam Lantinga, Software Engineer, Blizzard Entertainment

[previous question and answer quoted; trimmed]

Thanks for the replies!

Is there some way to calculate this time at runtime, or must it be
approximated in a platform-dependent way?

Thanks,
Jorge

Due to the nature of the problem, it is not simply impractical to do; it
is actually impossible.

The time it takes a monitor to once again display a /legible/ image
after a video mode switch differs wildly from monitor to monitor, and
sometimes even differs depending on what modes you are switching
between.

It can also change as the monitor gets older in some cases.

To make matters worse, I know of no monitor<->computer interface that
gives a way for the monitor to report ANY of this, nor to report when an
image is actually being displayed.

In short, no, you’re SOL.

On Mon, Jan 12, 2004 at 10:54:35AM +0100, jorgefm at cirsa.com wrote:

[previous question and replies quoted; trimmed]


Zephaniah E. Hull

Is there some way to calculate this time at runtime, or must it be
approximated in a platform-dependent way?

While there’s no way to calculate or get the time value, most people can
assume the time to be less than 15 seconds (the folks up at Microsoft are
in this category), while other, more conservative people assume it to be
somewhat less than 3 hours.

If you don’t mind me asking, to what purpose would you be using this
information?

  • Silicon


John Silicon wrote:

If you don’t mind me asking, to what purpose would you be using this
information?

Well, I’m developing an embedded system with a graphical application in
DGA mode. On an external event I have to release the video and sound
resources, that is, close the DGA video screen and release the sound
hardware. Then I launch a Qt app to do some maintenance tasks. Another
external event makes me restore the graphical app to the point where
access was released.
I have to ensure the synchronization of video and sound, but the sound
could be played while the image is not yet painted! For now I add a delay
to make sure the image is visible.

Thanks,
Jorge

                  "John Silicon"                                                                                                
                  <jsilicon at earthlink      Para:     <sdl at libsdl.org>                                                           
                  .net>                    cc:                                                                                  
                  Enviado por:             Asunto:   Re: [SDL] Video openning time.                                             
                  sdl-admin at libsdl.or                                                                                           
                  g                                                                                                             
                                                                                                                                
                                                                                                                                
                  12/01/2004 14.17                                                                                              
                  Por favor, responda                                                                                           
                  a sdl> ----- Original Message -----

From: @Jorge_Fernandez_Mont (Jorge Fernandez Monteagudo)
To:
Sent: Monday, January 12, 2004 2:54 AM
Subject: Re: [SDL] Video openning time.

Is there someway to calculate this time in runtime or it must be
approximate
in a platform-dependant way ?

Thanks,
Jorge

While theres no way to calculate or get the time value, most people can
assume the time to be less than 15 seconds (the folks up at Microsoft are
in
this category), while other, more conservative people, assume it to be
somewhat less than 3 hours.

If you don’t mind me asking, to what pupose would you be using this
information?

  • Silicon

Outgoing mail is certified Virus Free.
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.552 / Virus Database: 344 - Release Date: 12/15/2003


SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl