OpenGL speed, Linux vs W2000

The SDL test program testgl.c gives quite different execution speeds on
Linux and W2000 on my system (P4, GF4 Ti). Running flat out in 1024x768
fullscreen, bpp=24, Linux gives 620 fps, while W2000 gives 1550 fps. It
seems that the difference is in SDL_GL_SwapBuffers(). Commenting that
out pushes the Linux version up to 1650 fps. (I think W2000 gives
something similar, but the program doesn’t like to run with buffer swap
suppressed. When run from a
Command Prompt window it returns quickly to the command prompt but the
process continues to execute. This odd behaviour might be of interest
to someone.)

I realize that 620 vs 1550 fps is actually rather academic, since it
represents only about an extra millisecond per frame (1/620 s is roughly
1.6 ms, against 0.65 ms at 1550 fps), but nonetheless I’m curious to know
why swapping buffers with W2000 is faster than with Linux, since I think
this is happening on the video card.
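
For reference, the shape of the timing loop (a sketch in the spirit of
testgl.c, not the actual test code; the file name, mode flags and the
5-second cutoff are illustrative):

    /* swapbench.c - minimal SDL 1.2 OpenGL swap benchmark (sketch).
     * Build on Linux:
     *   gcc swapbench.c -o swapbench `sdl-config --cflags --libs` -lGL
     */
    #include <SDL/SDL.h>
    #include <GL/gl.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        Uint32 start, elapsed;
        long frames = 0;
        SDL_Event event;

        SDL_Init(SDL_INIT_VIDEO);
        SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
        SDL_SetVideoMode(1024, 768, 24, SDL_OPENGL | SDL_FULLSCREEN);

        start = SDL_GetTicks();
        do {
            while (SDL_PollEvent(&event))
                ;                          /* drain and ignore events */
            glClear(GL_COLOR_BUFFER_BIT);  /* color only, no depth clear */
            /* ... scene drawing goes here ... */
            SDL_GL_SwapBuffers();          /* the call under test */
            ++frames;
            elapsed = SDL_GetTicks() - start;
        } while (elapsed < 5000);          /* run flat out for ~5 s */

        printf("%ld frames in %u ms: %.1f fps\n",
               frames, elapsed, frames * 1000.0 / elapsed);
        SDL_Quit();
        return 0;
    }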

Gib

I’ve seen this sort of symptom on machines with different video cards and
drivers, but with the same OS. I think it occurs because one of the
machines (the W2000 machine, in your case) is not waiting for vsync on a
SwapBuffers call - as you say, when it is commented out on the other
machine, you get about the same frame rate (disregarding small
OS-specific performance differences).

I’m not certain exactly why this is - I think there’s an option in the
driver of many cards in Windows to turn synchronization with the vertical
retrace on or off. Some drivers may not even support vsync waiting.
Does that sound about right?
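
One quick way to check which machine is waiting on the retrace is to time
individual swaps: a synced swap blocks for about one refresh period
(roughly 12 ms at 85 Hz), while a free-running one returns in under a
millisecond. A rough sketch, assuming a GL mode has already been set up:

    #include <SDL/SDL.h>
    #include <stdio.h>

    /* Time a few individual buffer swaps.  SDL_GetTicks() is only
     * millisecond-accurate, but that is enough to tell a free-running
     * swap (0-1 ms) from one locked to the retrace (10-17 ms). */
    static void probe_vsync(void)
    {
        int i;
        for (i = 0; i < 8; ++i) {
            Uint32 t0 = SDL_GetTicks();
            SDL_GL_SwapBuffers();
            printf("swap %d: %u ms\n", i, SDL_GetTicks() - t0);
        }
    }

On the NVIDIA Linux drivers the behaviour can also be toggled with the
__GL_SYNC_TO_VBLANK environment variable, if memory serves.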

Dave.

--
David Slutzkin, BCS (Hons)
Melbourne, Australia

http://fastmail.fm - The way an email service should be

See what happens if you use a 1024x768 window in Windows. Linux might
be blitting on flip instead of page flipping (which would be a bug).

I think it occurs because one of the machines (the W2000 machine, in your
case) is not waiting for vsync on a SwapBuffers call.

It’s not a vsync problem - both configurations have vsync entirely
disabled, since he’s testing the speed of the actual flip. (Unless he has
a monitor that can do 620 Hz at 1024x768, which I’m sure he doesn’t. :)


Glenn Maynard

Are you sure that it is running in 24 bpp mode in Windows? Just asking,
because by default the source tries to create a 16-bit display, but under
X it will use 24 bpp if the X server is set to 24 bpp.

So when you changed the width and height, did you also pass the bpp value
to the program in Windows?
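
To take the defaults out of the picture, both builds can request the mode
explicitly rather than relying on the 16-bit default; a sketch (testgl.c
itself takes the value from its -bpp command-line option):

    #include <SDL/SDL.h>
    #include <stdio.h>

    /* Ask for an explicit 24 bpp, double-buffered OpenGL mode. */
    static SDL_Surface *set_mode_24bpp(void)
    {
        SDL_Surface *screen;

        SDL_GL_SetAttribute(SDL_GL_RED_SIZE,   8);
        SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
        SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,  8);
        SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

        screen = SDL_SetVideoMode(1024, 768, 24,
                                  SDL_OPENGL | SDL_FULLSCREEN);
        if (screen == NULL)
            fprintf(stderr, "SDL_SetVideoMode failed: %s\n",
                    SDL_GetError());
        return screen;
    }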

Yes, I made a couple of small changes to the program. In both cases I’m
using 24 bpp and 1024x768 fullscreen (no border). BTW, I also made it
display a cube by changing the glOrtho() call to
glOrtho(-2.0, 2.0, -2.0/aspect, 2.0/aspect, -20.0, 20.0), where
aspect = 1024.0/768.0.
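
In context, that projection setup looks something like this (a sketch;
the function name is illustrative):

    #include <GL/gl.h>

    /* Symmetric orthographic volume corrected for the 1024x768 aspect
     * ratio, as described above. */
    static void set_projection(void)
    {
        const double aspect = 1024.0 / 768.0;

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(-2.0, 2.0, -2.0 / aspect, 2.0 / aspect, -20.0, 20.0);
        glMatrixMode(GL_MODELVIEW);
    }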

Gib

Which Linux driver version are you using, and what distro? I upgraded
from RH 7.1 to 8 and to the latest NVIDIA drivers, and fps in the
planetary gears screensaver is way down - from 42 fps to 32 fps on a
GeForce2 MX 400 64 MB card. I’ve heard that the latest drivers may have
some 2D issues with X, but haven’t heard of any 3D issues, so it’s hard
to tell whether it’s the very young Gnome2 libs eating up resources or
the new driver…

Be well,
Mike

See what happens if you use a 1024x768 window in Windows. Linux might
be blitting on flip instead of page flipping (which would be a bug).

Very interesting. If I remove both SDL_FULLSCREEN and SDL_NOFRAME from
video_flags in the W2000 version I get 690 fps. Removing either one
alone has no effect. Do you think this means there might be a Linux
bug, and do you mean in SDL or X?

Now I have to correct a previous statement I made. I now realize from
looking at the output from the program that the W2000 version always
sets SDL_GL_DEPTH_SIZE to 16, while the Linux version sets it to 24,
regardless of what I try to set it at (16 or 24). The call
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, n) seems to be redundant -
commenting it out has no effect.
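
For experimenting with this, a hypothetical helper that switches between
the two paths being compared - fullscreen/borderless (page flip possible)
versus a plain window (blit) - might look like:

    #include <SDL/SDL.h>

    /* Hypothetical helper; testgl.c drives these flags from its own
     * command-line options. */
    static SDL_Surface *open_screen(int fullscreen)
    {
        Uint32 flags = SDL_OPENGL;

        if (fullscreen)
            flags |= SDL_FULLSCREEN | SDL_NOFRAME;
        return SDL_SetVideoMode(1024, 768, 24, flags);
    }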

Gib

This is because X doesn’t allow on-the-fly color depth changes
currently. The Windows one is because of some driver setting, AFAIK.

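Since SDL_GL_SetAttribute() only expresses a request, the values the
context actually ended up with can be read back with SDL_GL_GetAttribute()
after SDL_SetVideoMode(); a sketch of the check that exposes the 16-vs-24
mismatch above:

    #include <SDL/SDL.h>
    #include <stdio.h>

    /* Report what the GL context actually provides, as opposed to what
     * was requested. */
    static void report_gl_attributes(void)
    {
        int r, g, b, depth, dbuf;

        SDL_GL_GetAttribute(SDL_GL_RED_SIZE,     &r);
        SDL_GL_GetAttribute(SDL_GL_GREEN_SIZE,   &g);
        SDL_GL_GetAttribute(SDL_GL_BLUE_SIZE,    &b);
        SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE,   &depth);
        SDL_GL_GetAttribute(SDL_GL_DOUBLEBUFFER, &dbuf);
        printf("got R%d G%d B%d, depth buffer %d bits, doublebuffer %d\n",
               r, g, b, depth, dbuf);
    }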

The reason 2D performance is down with the new drivers is that they have
removed XAA support. The reasoning was that they could not provide
adequate support for simultaneous 2D and 3D acceleration with the XAA
architecture in use. The Render extension is being worked on, though, and
support for it is supposed to be greatly expanded in the next release of
the driver.

On Thu, 2003-01-23 at 14:02, Mike Vanecek wrote:

I’ve heard that the latest drivers may have some 2D issues with X, but
haven’t heard of any 3D issues…

If nothing else is onscreen - if the entire screen is your app - the back
and front buffers can be switched instantly in hardware. This happens
when you’re fullscreen, and it also happens if you use SDL_NOFRAME and
the window is as large as or larger than the screen. This is mandatory
for acceptable performance at higher resolutions.

(Don’t count on the latter SDL_NOFRAME behavior; I’d expect most
platforms to do hardware page flipping only if SDL_FULLSCREEN is set.)

X may not be page flipping correctly in fullscreen for some reason.

On Fri, Jan 24, 2003 at 09:35:51AM +1300, Gib Bogle wrote:

Very interesting. If I remove both SDL_FULLSCREEN and SDL_NOFRAME from
video_flags in the W2000 version I get 690 fps. Removing either one
alone has no effect. Do you think this means there might be a Linux
bug, and do you mean in SDL or X?


Glenn Maynard

Gib Bogle wrote:

Now I have to correct a previous statement I made. I now realize from
looking at the output from the program that the W2000 version always
sets SDL_GL_DEPTH_SIZE to 16, while the Linux version sets it to 24,
regardless of what I try to set it at (16 or 24).

I think this is almost certainly the reason for the fps difference.
Regardless of why it’s happening, Linux is using a higher and slower bpp
than the W2000 counterpart is. That’s going to have a negative impact
on the speed.

Since the bpp is being inherited from X in Linux, you could try setting
the bpp of your X server to 16 and trying the test again. I would
expect that the speeds would be closer.
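
With XFree86 that means changing DefaultDepth in XF86Config(-4) and
restarting the server; a one-off "startx -- -depth 16" should also work.
A sketch of the config entry (identifiers vary from setup to setup):

    Section "Screen"
        Identifier   "Screen0"
        DefaultDepth 16       # normally 24; set 16 and rerun the test
        ...
    EndSection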

Jimmy
Jimmy’s World.org

SDL_GL_DEPTH_SIZE refers to the number of zbuffer bits, not the X color
depth, so this has nothing to do with X not being able to change the color
depth on the fly. The problem setting the zbuffer depth is a bug
somewhere, which I’ve started sniffing around at, but have not yet tracked
down.

SDL does not appear to be the guilty party, as it basically just does
glXChooseVisual, which screws up in some way that requires delving into Mesa
and XFree code to understand.
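
For anyone who wants to poke at the same spot, the visual selection can
be reproduced outside SDL with a few lines of GLX; a sketch (the
attribute list is illustrative, not SDL’s exact one; build with
gcc ... -lGL -lX11):

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    int main(void)
    {
        static int attribs[] = {
            GLX_RGBA,
            GLX_RED_SIZE, 1, GLX_GREEN_SIZE, 1, GLX_BLUE_SIZE, 1,
            GLX_DEPTH_SIZE, 16,    /* the request that seems to be ignored */
            GLX_DOUBLEBUFFER,
            None
        };
        Display *dpy = XOpenDisplay(NULL);
        XVisualInfo *vi;
        int zbits = 0;

        if (dpy == NULL)
            return 1;
        vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (vi != NULL) {
            glXGetConfig(dpy, vi, GLX_DEPTH_SIZE, &zbits);
            printf("visual 0x%lx: %d-bit depth buffer\n",
                   (unsigned long)vi->visualid, zbits);
        }
        XCloseDisplay(dpy);
        return 0;
    }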

Regards,

Daniel

On Thursday 23 January 2003 21:45, Shawn wrote:

This is because X doesn’t allow on-the-fly color depth changes
currently. The Windows one is because of some driver setting, AFAIK.

I thought so. ;)

That’s why I asked whether you gave the Windows version the "-bpp 24"
parameter, so that it would be made 24 bpp like the X11 version.

PS. I should spend a little more time on phrasing, so that somebody else
can understand what the heck I’m talking about. ;)

On Thursday 23 January 2003 22:35, Gib Bogle wrote:

Now I have to correct a previous statement I made. I now realize from
looking at the output from the program that the W2000 version always
sets SDL_GL_DEPTH_SIZE to 16, while the Linux version sets it to 24,
regardless of what I try to set it at (16 or 24).

SDL_GL_DEPTH_SIZE should be irrelevant here; as someone pointed out, it
sets the depth buffer size, not the color buffer size. Just make sure
you’re not enabling depth testing (no glEnable(GL_DEPTH_TEST)) and that
you’re not clearing the depth buffer (no glClear(GL_DEPTH_BUFFER_BIT))
for this benchmark.

But this almost certainly has nothing to do with the difference in
speed. He’s not rendering anything (one poly, did he say?), and when he
forced Windows to buffer-blit instead of buffer-swap, it slowed down to more
or less the speed he’s getting in X, so it’s most likely a flipping problem
in X, not a bit depth problem.

In fact, the best way to test the speed here is to call only
SDL_GL_SwapBuffers(). Don’t clear the buffers at all, and don’t draw
anything. If you do this, the color buffer depth should also be
irrelevant, at least for determining whether you’re getting buffer blits
or buffer swaps.
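
As a sketch, that swap-only loop could be as small as this (the 5-second
cutoff is arbitrary):

    /* swaponly.c - no clears, no drawing, just swaps (sketch).
     * Build: gcc swaponly.c -o swaponly `sdl-config --cflags --libs`
     * With working page flips the figure should be huge and nearly
     * independent of resolution and color depth; with blits it will
     * track both. */
    #include <SDL/SDL.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        Uint32 start, elapsed;
        long swaps = 0;

        SDL_Init(SDL_INIT_VIDEO);
        SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
        if (SDL_SetVideoMode(1024, 768, 24,
                             SDL_OPENGL | SDL_FULLSCREEN) == NULL) {
            fprintf(stderr, "%s\n", SDL_GetError());
            return 1;
        }
        start = SDL_GetTicks();
        do {
            SDL_GL_SwapBuffers();       /* the only per-frame work */
            ++swaps;
            elapsed = SDL_GetTicks() - start;
        } while (elapsed < 5000);

        printf("%.1f swaps/sec\n", swaps * 1000.0 / elapsed);
        SDL_Quit();
        return 0;
    }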

(Not that it’s not a generally good idea to get matching configurations;
I just don’t think it’s the problem here.)

On Fri, Jan 24, 2003 at 02:48:32AM +0200, Sami Näätänen wrote:

I thought so. ;)

That’s why I asked whether you gave the Windows version the "-bpp 24"
parameter, so that it would be made 24 bpp like the X11 version.


Glenn Maynard

Well, I misread it. I thought he was referring to the framebuffer depth,
which does count if you are clearing the background.

On Friday 24 January 2003 03:08, Glenn Maynard wrote:

SDL_GL_DEPTH_SIZE should be irrelevant here; as someone pointed out, it
sets the depth buffer size, not the color buffer size. Just make sure
you’re not enabling depth testing and not clearing the depth buffer for
this benchmark.

Jimmy wrote:

Since the bpp is being inherited from X in Linux, you could try setting
the bpp of your X server to 16 and trying the test again. I would
expect that the speeds would be closer.

bpp = 24 in both cases.

Gib