Still haven't found my framerate

Hello

I’ve tried tonnes of things now to get my framerate back up and still
haven’t succeeded. My system spec is a P4 2 GHz, 256 MB DDR RAM and a
GeForce4 Ti4600.

I’ve tried creating the surfaces in hardware and software, with double
buffering, without double buffering and so on.

If I draw one large 640x480 image at 32-bit, then fill the screen with 32x32
images over the top (so it’s a tiled map), I’m pretty much drawing the
screen twice, with the extra overhead of lots of smaller blits for the second
layer.

My framerate is about 17-20 fps for this.

Is this roughly correct? When I switch to CDX I get 800+ but I’d prefer to
use SDL.

Any suggestions?

Thanks

Scott

If I draw one large 640x480 image at 32-bit, then fill the screen with 32x32
images over the top (so it’s a tiled map), I’m pretty much drawing the
screen twice, with the extra overhead of lots of smaller blits for the second
layer.

Don’t draw the screen twice. :)
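
In case it helps, a minimal sketch of that idea (assuming SDL 1.2; tileAt()
and tileSurface() are hypothetical map-lookup helpers, not SDL calls): if the
32x32 tile layer covers every pixel, the full-screen background blit
underneath is pure wasted work.

#include "SDL.h"

extern int tileAt(int col, int row);        /* hypothetical */
extern SDL_Surface *tileSurface(int id);    /* hypothetical */

void DrawFrame(SDL_Surface *screen)
{
    SDL_Rect dst;
    int row, col;

    /* No 640x480 background blit: every cell gets overwritten anyway. */
    for (row = 0; row < 480 / 32; row++) {
        for (col = 0; col < 640 / 32; col++) {
            dst.x = (Sint16)(col * 32);
            dst.y = (Sint16)(row * 32);
            SDL_BlitSurface(tileSurface(tileAt(col, row)), NULL, screen, &dst);
        }
    }
    SDL_Flip(screen);
}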

Make sure the tiles you blit are in the same format as the screen with
SDL_ConvertSurface(). Not doing this is a common (and extremely
expensive!) mistake.
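
For example, a sketch of converting once at load time (assuming SDL 1.2 plus
SDL_image, and that SDL_SetVideoMode() has already been called):

#include "SDL.h"
#include "SDL_image.h"

SDL_Surface *LoadTile(const char *filename)
{
    SDL_Surface *raw = IMG_Load(filename);
    SDL_Surface *converted;

    if (!raw)
        return NULL;

    /* SDL_DisplayFormat() is the convenience wrapper around
       SDL_ConvertSurface(); it returns a *copy* in the screen format. */
    converted = SDL_DisplayFormat(raw);
    SDL_FreeSurface(raw);      /* drop the unconverted original */
    return converted;          /* may be NULL on failure */
}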

My framerate is about 17-20 fps for this.

Is your desktop at 32-bits? If it isn’t, you’re really drawing everything
three times, since SDL converts internally to the screen format if
needed.
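
A quick way to check, as a sketch (SDL 1.2; querying the video info before
setting a mode reports the desktop format):

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    const SDL_VideoInfo *info;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    /* Before SDL_SetVideoMode(), vfmt describes the desktop format. */
    info = SDL_GetVideoInfo();
    printf("desktop: %d bits per pixel\n", info->vfmt->BitsPerPixel);

    SDL_Quit();
    return 0;
}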

Is this roughly correct? When I switch to CDX I get 800+ but I’d prefer to
use SDL.

800+ fps? I find that extremely hard to believe.

–ryan.

I’m sure it is some configuration problem… it’s impossible to get such a low
framerate with as powerful a machine as you have. Have you checked which
drivers the apps are using with SDL? There was a way of seeing that.

Sorry if I can’t help you, but I’m sure it’s not a hardware or SDL problem;
maybe some system config, like the kernel config or a driver issue.
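
[The way Eduardo is remembering is probably SDL_VideoDriverName(); a sketch,
assuming SDL 1.2 and that SDL_Init(SDL_INIT_VIDEO) has already run:]

char name[64];

if (SDL_VideoDriverName(name, sizeof(name)))
    printf("SDL video driver: %s\n", name);
else
    printf("video has not been initialised\n");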

Eduardo Garcia Rajo

Visite: http://www.solucion-digital.com.ar
SOLUCION DIGITAL
Redes - Software - Servicios


I’m using…

surface = SDL_DisplayFormat(surface);

For them. I was under the impression this would do it.
My desktop is at 32-bit. Why does it need to convert to the screen format?
Does this mean that if your desktop is in 16-bit and you run a 32-bit game,
it will be slower than running with both in 32-bit?

800+ fps? I find that extremely hard to believe.

Why? I’ve got a 128 meg card and everything is in hardware.

What I’m worried about is that I don’t seem to be able to draw two screens at
full framerate. That isn’t asking very much. I get a lot better framerate
when I drop resolution and colour depth, but this is a new millennium.


“Scott Newby” writes:


Is this roughly correct? When I switch to CDX I get 800+ but I’d prefer to
use SDL.

Any suggestions?

Can you post the program / graphics you’re using for testing, perhaps?
That might help us help you.

I’m using…

surface = SDL_DisplayFormat(surface);

For them. I was under the impression this would do it.
My desktop is at 32-bit. Why does it need to convert to the screen
format?

Because otherwise SDL will have to convert the surfaces on the fly
while blitting.

Does this mean that if your desktop is in 16-bit and you
run a 32-bit game, it will be slower than running with both in
32-bit?

Yes, quite possibly. However, if it’s all software rendering (i.e. no
DMA blit from the shadow surface to the screen), it’s also quite
possible that the 16-bit screen is faster regardless. Touching VRAM
with the CPU is, relatively speaking, very expensive these days.
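
[One hedged way to act on this in SDL 1.2: pass 0 as the depth, which tells
SDL_SetVideoMode() to use whatever the display is already running at, so
neither the blits nor a shadow surface have to convert:]

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *screen;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    /* bpp = 0: take the current display depth instead of forcing one. */
    screen = SDL_SetVideoMode(640, 480, 0, SDL_SWSURFACE);
    if (screen)
        printf("got a %d bpp screen\n", screen->format->BitsPerPixel);

    SDL_Quit();
    return 0;
}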

800+ fps? I find that extremely hard to believe.

Why? I’ve got a 128 meg card and everything is in hardware.

Why, yes - and you should get that kind of framerate with glSDL,
which is currently the only way to get full h/w acceleration with the
SDL API, AFAIK. Few other targets, if any, can accelerate alpha
blending, and on many targets blits cannot be accelerated at all.
OpenGL is pretty much the only portable and complete solution for
fully h/w accelerated rendering.
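
[Not glSDL itself — just a sketch of the plain SDL 1.2 + OpenGL setup David
is pointing at; the 2D drawing would then be textured quads rather than
SDL_BlitSurface():]

#include "SDL.h"
#include <GL/gl.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    if (!SDL_SetVideoMode(640, 480, 0, SDL_OPENGL))
        return 1;

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    /* ...draw the tiles as textured quads here... */
    SDL_GL_SwapBuffers();

    SDL_Delay(2000);
    SDL_Quit();
    return 0;
}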

What I’m worried about is that I don’t seem to be able to draw 2
screens at full framerate. That isn’t asking very much.

It is, unless you rely on hardware acceleration. You will need to use
Direct3D or OpenGL to do that reliably, but SDL does not (yet) have
backends for these.

Still, if you’re on Win32, you should be able to get SDL to use
DirectX h/w accelerated 2D blitting. All surfaces must be hardware
surfaces for that to work, AFAIK. For some reason (probably because
there are too many things DirectDraw can’t accelerate), it doesn’t
seem to be nearly as fast as OpenGL, but it does help.
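
[As a sketch of that request (SDL 1.2 on Win32, after SDL_Init(); fullscreen
+ hardware + double-buffered is the combination DirectDraw can actually
page-flip):]

SDL_Surface *screen = SDL_SetVideoMode(640, 480, 32,
                          SDL_HWSURFACE | SDL_DOUBLEBUF | SDL_FULLSCREEN);

if (screen && (screen->flags & SDL_HWSURFACE)) {
    /* ...blit your (hardware) surfaces here... */
    SDL_Flip(screen);   /* a real page flip only if SDL_DOUBLEBUF was granted */
}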

//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`-----------------------------------> http://audiality.org -'
http://olofson.net | http://www.reologica.se

Hi,

Could you post the program (or a graphics code fragment) that you are using
for testing? I think something is wrong. I’m programming a 2D RTS using a
tilemap, at 1024x768x32 bits with a GeForce4 Ti4200 and an Athlon XP 3000+,
and I get ~180 fps; everything is in video RAM and hardware accelerated, and
I redraw everything each frame. As you can see, this is far from your
~20 fps. You must be careful with alpha blending in hardware mode and with
the different surface formats.
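
[On the alpha-blending point, a hedged sketch (SDL 1.2), assuming `tile` is
a surface loaded with an alpha channel you don’t actually need: per-surface
alpha forces a blended, usually unaccelerated blit, so turn it off for
opaque tiles.]

SDL_Surface *opaque;

SDL_SetAlpha(tile, 0, SDL_ALPHA_OPAQUE);  /* clear SDL_SRCALPHA: plain copy blit */
opaque = SDL_DisplayFormat(tile);         /* and match the screen format */
if (opaque) {
    SDL_FreeSurface(tile);
    tile = opaque;
}
/* For tiles that really need alpha, use SDL_DisplayFormatAlpha() instead. */
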
I don’t think anybody can really help you, because without a code fragment
this can only be speculation…

And… do you really get 800 fps with CDX? I know that CDX and SDL are
wrappers over DirectX (SDL is more than this… but to simplify); both use the
same base… so why is the difference so enormous? Something is not equal, I’m
sure about that.

bye

Roberto Prieto
MegaStorm Systems © 2003
http://www.megastormsystems.com

“Scott Newby” writes:


I’ve tried creating the surfaces in hardware and software, with double
buffering, without double buffering and so on.

Did you actually test to see that you got hardware surfaces? Just
telling SDL to create the surface in hardware does not ensure that it is
created in hardware.

How did you set the bits/pixel? Is it the same as the screen setting on
the machine? Full screen or a window? Do you base the setting on
information you get back from SDL, or do you just set it?

What is the AGP bus speed on your computer? What is the AGP aperture
size set to?

What OS are you using and have you upgraded to the latest drivers?

I’m asking all this because on a machine with about a quarter the speed
of yours doing about half the work you are doing entirely in software I
have to throttle my code to stay under 100 FPS.

You can see that code at:
http://linux.oreillynet.com/pub/a/linux/2003/05/15/sdl_anim.html?page=last&x-showcontent=text#thread
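
[A sketch of some of those checks in code (SDL 1.2):]

#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    const SDL_VideoInfo *info;
    SDL_Surface *screen;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    info = SDL_GetVideoInfo();
    printf("hw surfaces: %d, hw blits: %d, video mem: %u KB\n",
           info->hw_available, info->blit_hw, info->video_mem);

    screen = SDL_SetVideoMode(640, 480, 0,
                              SDL_HWSURFACE | SDL_DOUBLEBUF | SDL_FULLSCREEN);
    if (screen)
        printf("screen is %s, %d bpp\n",
               (screen->flags & SDL_HWSURFACE) ? "in hardware" : "in software",
               screen->format->BitsPerPixel);

    SDL_Quit();
    return 0;
}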


+-----------------------------------+
+ Bob Pendleton: independent writer +
+ and programmer.                   +
+ email: Bob at Pendleton.com       +
+-----------------------------------+

Thanks for the replies, people.

I’m running Win 2000 and have the very latest drivers (a new set was
released a week or so ago).
How do I test whether the surface is in fact in hardware? Even so, from what
you say it should be quick enough in software. My AGP is 4x. In a window
it’s about 8 fps better than fullscreen.

I’ll put some of the bits of code here. Grrr, my tabs all go haywire.

mainGlobs is just a structure to hold any global variables for main.c. It’s
declared at the top of main.c, and no other source file has access to it
except through Get and Set function calls.

Thanks for any help.

typedef struct
{
    SDL_Surface *screen;

    BOOL useHardware;
    BOOL isWindowed;
    U32  width;
    U32  height;
    U32  depth;
} mainGlobsType, *lpMainGlobsType;

mainGlobsType mainGlobs = { 0 };

Setting the video mode…

void Main_SetVideoMode( U32 width, U32 height, U32 depth, BOOL isFullScreen,
                        BOOL useHardware )
{
    U32 flags = 0;

    mainGlobs.useHardware = useHardware;
    mainGlobs.isWindowed = !isFullScreen;
    mainGlobs.width = width;
    mainGlobs.height = height;
    mainGlobs.depth = depth;

    if (useHardware) flags |= SDL_HWSURFACE | SDL_DOUBLEBUF;
    else flags |= SDL_SWSURFACE;

    if (isFullScreen) flags |= SDL_FULLSCREEN;

    mainGlobs.screen = SDL_SetVideoMode( width, height, depth, flags );
}

creating an image…

lpImage Image_Create( lpFILENAME imageFilename )
{
    lpImage newImage = NULL;

    if (imageFilename)
    {
        newImage = (lpImage) Mem_Alloc( sizeof( struct _Image ) );

        if (newImage)
        {
            newImage->sdlSurface = IMG_Load( imageFilename );

            SDL_DisplayFormat( newImage->sdlSurface );

            if (newImage->sdlSurface)
            {
                Image_SetFlags( newImage, 0 );
            }
            else
            {
                Mem_Free( newImage );
                newImage = NULL;
            }
        }
    }

    return newImage;
}

rendering an image…

VOID Image_RenderSurface( lpImage image, S16 x, S16 y )
{
    SDL_Rect destSDL;

    if (image)
    {
        destSDL.x = x;
        destSDL.y = y;
        destSDL.w = 0;   /* w/h of the dest rect are ignored by SDL_BlitSurface */
        destSDL.h = 0;

        SDL_BlitSurface( image->sdlSurface, NULL, Main_GetScreen(), &destSDL );
    }
}


On Thu, 2003-06-05 at 11:43, Scott Newby wrote:

void Main_SetVideoMode( U32 width, U32 height, U32 depth, BOOL isFullScreen,
                        BOOL useHardware )
{
    U32 flags = 0;

    mainGlobs.useHardware = useHardware;
    mainGlobs.isWindowed = !isFullScreen;
    mainGlobs.width = width;
    mainGlobs.height = height;
    mainGlobs.depth = depth;

    if (useHardware) flags |= SDL_HWSURFACE | SDL_DOUBLEBUF;
    else flags |= SDL_SWSURFACE;

    if (isFullScreen) flags |= SDL_FULLSCREEN;

    mainGlobs.screen = SDL_SetVideoMode( width, height, depth, flags );

You don’t even check to see if you got a hardware surface. Just check the
flags field of the screen surface to see if SDL_HWSURFACE is set. If the
depth doesn’t match what is available on your machine, SDL will emulate it
for you. So don’t specify a depth; use what your system gives you.

}

creating an image…

lpImage Image_Create( lpFILENAME imageFilename )
{
    lpImage newImage = NULL;

    if (imageFilename)
    {
        newImage = (lpImage) Mem_Alloc( sizeof( struct _Image ) );

        if (newImage)
        {
            newImage->sdlSurface = IMG_Load( imageFilename );

            SDL_DisplayFormat( newImage->sdlSurface );

Right here you convert to the display format and then ignore the
converted image. You then proceed to use the unconverted image.

The two problems I’ve pointed out so far would explain the slowdown you
are getting. You really should go through your code and compare the way
you use SDL functions to the way they are documented.

	Bob Pendleton


Hello

Thanks for the reply. The converting bit was my mistake when making the
email. You see, if I copy from MSDEV straight to Outlook I get huge spacing
in my email, so I have to copy it to Notepad first; then I selected it
wrong.

Fair point regarding the hardware surface, but my gfx card can do
640x480x32-bit in hardware, so it shouldn’t be responsible for the poor
framerate.

On Thu, 5 Jun 2003 04:01:07 +0100, “Scott Newby”
said:

I’m using…

surface = SDL_DisplayFormat(surface);

Not a solution, but incidentally, SDL_DisplayFormat makes a copy of the
surface in the display format. So you’re actually losing the original
surface with the code above. Better would be:

SDL_Surface *temp = SDL_DisplayFormat(surface);
SDL_FreeSurface(surface);
surface = temp;

Or something similar…
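
[One extra detail worth hedging on: SDL_DisplayFormat() can return NULL
(for instance before SDL_SetVideoMode() has been called, or on
out-of-memory), so it’s safer to keep the original until the copy is known
good:]

SDL_Surface *temp = SDL_DisplayFormat(surface);

if (temp) {
    SDL_FreeSurface(surface);
    surface = temp;
}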

Dave.

Dave Slutzkin
Melbourne, Australia


http://www.fastmail.fm - A fast, anti-spam email service.

It doesn’t matter what your card can do. There are all sorts of reasons
why you won’t get that setting even though you asked for it and your
card can do it. Until you verify that you got a hardware surface at the
setting you asked for, you cannot assume that you got it.

I can’t tell you how many times I wasted days or weeks because I "knew"
that something was true, when in fact it was not true. In programming
nothing is true until you verify it. This goes for the buffers for your
sprites too.

I just spent a couple of weeks testing hardware buffers and I can tell
you that getting them right is a lot harder than most people think.
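
[In the same "verify it" spirit, a sketch of a frame counter with
SDL_GetTicks() (SDL 1.2; DrawFrame() is a hypothetical stand-in for
whatever renders one frame):]

Uint32 last = SDL_GetTicks();
int frames = 0;

for (;;) {
    DrawFrame(screen);   /* hypothetical: render and flip one frame */
    frames++;
    if (SDL_GetTicks() - last >= 1000) {
        printf("%d fps\n", frames);
        frames = 0;
        last = SDL_GetTicks();
    }
}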

	Bob Pendleton


+-----------------------------------+
+ Bob Pendleton: independent writer +
+ and programmer.                   +
+ email: Bob at Pendleton.com       +
+-----------------------------------+