Render performance

Hello all,

I’m having some performance issues and I was wondering if any of you guys
can figure out why.
(I know it’s a long shot, but you may spot something that I’ve missed).

I’ve got a very simple mesh renderer and I’m using SDL with OpenGL.
I’m passing a vertex pointer, normal pointer, and texture pointer and then
calling glDrawElements from my app.
I’m batching things up, so I’m only making 2 calls to glDrawElements per
render loop (tbh I don’t think that’s where the problem is).
Anyway, rendering just under 100K triangles, this comes to about 3 FPS ;-(
(This is with no texturing and lighting enabled (with LIGHT0 only)).
gprof says that 75% of the running time goes in my Render() call.
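
For context, the render path described above looks roughly like the
following; a minimal sketch with hypothetical array names and counts,
assuming a valid GL context and <GL/gl.h>:

/* Sketch of the vertex-array render path described above. The arrays and
 * counts are hypothetical placeholders. */
static void RenderMesh( const float *vertices, const float *normals,
                        const float *texcoords,
                        const unsigned int *indices, int numIndices )
{
    glEnableClientState( GL_VERTEX_ARRAY );
    glEnableClientState( GL_NORMAL_ARRAY );
    glEnableClientState( GL_TEXTURE_COORD_ARRAY );

    glVertexPointer( 3, GL_FLOAT, 0, vertices );    /* xyz per vertex */
    glNormalPointer( GL_FLOAT, 0, normals );        /* xyz per vertex */
    glTexCoordPointer( 2, GL_FLOAT, 0, texcoords ); /* uv per vertex  */

    /* One batched draw call, as described above. */
    glDrawElements( GL_TRIANGLES, numIndices, GL_UNSIGNED_INT, indices );

    glDisableClientState( GL_TEXTURE_COORD_ARRAY );
    glDisableClientState( GL_NORMAL_ARRAY );
    glDisableClientState( GL_VERTEX_ARRAY );
}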

This is on an old-ish PC running fedora6 with a GeForce FX 5500 and the
latest nVidia drivers, but still, it should be running loooooads faster than
this. To give you an idea…glxgears comes up with 1200 FPS.

One thing that is still a mystery to me is why SDL_GetError returns this at
start-up:
Failed loading glXGetSwapIntervalMESA: /usr/lib/nvidia/libGL.so.1: undefined
symbol: glXGetSwapIntervalMESA

Does this mean anything to anyone? Is it related to my performance issues?

Here’s how I’m initialising SDL (in case I’m doing something wrong)…

Thanx in advance for any help,
Kos.

SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );

SDL_GL_SetAttribute( SDL_GL_BUFFER_SIZE, 24 );

SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 24 );
SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_RED_SIZE, 16 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_GREEN_SIZE, 16 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_BLUE_SIZE, 16 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_ALPHA_SIZE, 16 );

SDL_GL_SetAttribute( SDL_GL_STEREO, 0 );
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, 0 );
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, 0 );

unsigned int flags = SDL_OPENGL;

if (mRenderSetupData.IsFullScreen)
{
    flags |= SDL_FULLSCREEN;
}
else if (mRenderSetupData.HasNoFrame)
{
    flags |= SDL_NOFRAME;
}

SDL_Surface *screen = SDL_SetVideoMode( 800, 600, 0, flags );

if (screen == 0)
{
    ErrorMsg("Video mode set failed: %s\n", SDL_GetError());
}
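
One way to sanity-check the setup above is to read back what was actually
granted after SDL_SetVideoMode succeeds; a sketch, assuming <stdio.h> and
SDL.h are included:

/* Requested GL attributes are only hints; read back what the driver
 * actually granted. */
int depth = 0, doubleBuf = 0, accumRed = 0;
SDL_GL_GetAttribute( SDL_GL_DEPTH_SIZE, &depth );
SDL_GL_GetAttribute( SDL_GL_DOUBLEBUFFER, &doubleBuf );
SDL_GL_GetAttribute( SDL_GL_ACCUM_RED_SIZE, &accumRed );
printf( "depth=%d doublebuffer=%d accum_red=%d\n",
        depth, doubleBuf, accumRed );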

Hi Kostas,

Tests like you are doing must be done carefully; fps is always a result
of many, many factors, some of which can be overlooked easily.

Standard OpenGL Vertex Arrays are not as fast as they might look.
Every frame the whole bunch of vertex data is transferred to your
graphics card, since Vertex Arrays are stored in system memory. Here
are some things that might speed up your program.

  • If possible, enabling glTexGen can speed up the app, since the
    texture coordinate generation can be done by hardware.
  • Maybe your frame rate is limited by fill rate. 100k triangles
    with every single triangle drawn over half of the screen will surely
    be slower than drawing 100k very small triangles all next to each other.
  • Eliminate duplicate vertices.
  • Render GL_TRIANGLE_STRIP instead of GL_TRIANGLES.
  • Make use of extensions:
    • GL_ARB_vertex_buffer_object to enable hardware vertex buffers
      (core in GL 1.5; see the sketch after this list).
    • GL_EXT_compiled_vertex_array to let the driver know that the
      buffer is static.
    • GL_NV_vertex_array_range and GL_NV_vertex_array_range2 will use
      DMA (but only with nVidia cards).
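
A minimal sketch of the vertex-buffer-object route mentioned in the list
above, using the GL 1.5 core entry points (the ARB-suffixed variants are
equivalent); the buffer handles, arrays and counts are hypothetical, and
the entry points are assumed to be resolved (e.g. via glext.h or an
extension loader):

/* Upload once at load time; GL_STATIC_DRAW hints the data won't change. */
GLuint vbo = 0, ibo = 0;
glGenBuffers( 1, &vbo );
glBindBuffer( GL_ARRAY_BUFFER, vbo );
glBufferData( GL_ARRAY_BUFFER, numVerts * 3 * sizeof(float),
              vertices, GL_STATIC_DRAW );

glGenBuffers( 1, &ibo );
glBindBuffer( GL_ELEMENT_ARRAY_BUFFER, ibo );
glBufferData( GL_ELEMENT_ARRAY_BUFFER, numIndices * sizeof(GLuint),
              indices, GL_STATIC_DRAW );

/* At draw time, with the buffers bound, the pointer arguments to the
 * gl*Pointer calls and glDrawElements become byte offsets. */
glVertexPointer( 3, GL_FLOAT, 0, (const void *)0 );
glDrawElements( GL_TRIANGLES, numIndices, GL_UNSIGNED_INT,
                (const void *)0 );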

Anyway, 1200 FPS is a lot.

Hope that helps!
Matthias

Hi Matthias,

Thank you for your response…I think, although

glxinfo | grep render

returns

direct rendering: Yes
OpenGL renderer string: GeForce FX 5500/AGP/SSE2

there’s something going on with the version of GL that my system is using.
It seems to be looking in /usr/lib/nvidia/ for all the libGL stuff (which
I’m guessing is where my livna rpm nVidia driver stuff goes) rather than
/usr/lib.

If I look for glXGetSwapIntervalMESA in /usr/lib, the symbol is there…but
if I look in /usr/lib/nvidia it’s missing.
I think my performance is very bad because nothing is hardware
accelerated…

Any ideas on how I can tell whether things use hardware acceleration or not?

Cheers,
Kos.

I think that glGetString(GL_RENDERER) will return a string containing “Mesa” if your application is currently using Mesa instead of the nVidia driver.

Other calls to help you identify the current driver may be:
glGetString( GL_VENDOR )
glGetString( GL_VERSION )
glGetString( GL_EXTENSIONS )
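
For example, something along these lines right after the GL context is
created; a sketch, assuming <stdio.h> and <GL/gl.h> are included:

/* A renderer string containing "Mesa" (or "software") indicates that the
 * hardware driver is not being used. */
printf( "GL_VENDOR   : %s\n", (const char *)glGetString( GL_VENDOR ) );
printf( "GL_RENDERER : %s\n", (const char *)glGetString( GL_RENDERER ) );
printf( "GL_VERSION  : %s\n", (const char *)glGetString( GL_VERSION ) );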

HTH

Andreas,

Thank you for your response…
glGetString for vendor, renderer, and version, return the following:

OpenGL Vendor : NVIDIA Corporation
OpenGL Renderer : GeForce FX 5500/AGP/SSE2
OpenGL Version : 2.1.0 NVIDIA 96.29

I’m at a loss with this one…For some reason, SDL causes OpenGL to fall
back to software rendering but I’ve no idea why.
I’m guessing that’s why I’m getting the “undefined symbol:
glXGetSwapIntervalMESA” stuff as well…
The nVidia OGL stuff doesn’t contain anything Mesa related, which is correct.
I just don’t know why SDL_GetError returns this at runtime…

glxgears uses hardware acceleration, and glxinfo returns “direct rendering:
Yes”, so I’m not sure why SDL causes OGL to fall back to software
rendering…Any ideas are always welcome!

Cheers,
Kos.

Hi Kostas,

I had a similar problem. When I was developin’ my library, I had trouble
with fps when rendering in OpenGL. No more than 80 fps on a “classic”
triangle (a rotating one)…640x480x32; that’s all, no more FPS (even in
fullscreen mode).

I have the same video card as you.

Anyway, I didn’t find out how to fix it =S…

C ya

Phantom Lord
Caelis Studios —> From Gods Hands To Yours

Check glxinfo.

If you have direct rendering active, you probably get good performance.

With the 32-bit ATI driver I get 250 fps in the glxgears test, but now I’m
trying to install the 64-bit one and it doesn’t work…

I tried to install the nVidia drivers some time ago, but without success.

Linux user #433535
Linux because we are freedom.

Ruben,

Thank you for your reply…
I think my problem is related to that “undefined symbol:
glXGetSwapIntervalMESA”, but I haven’t managed to track down what’s causing it
and nobody else seems to know unfortunately :(

Cheers,
Kos.

Kos,

Which distro do you use?
You have to install the Mesa drivers. Mesa is the OpenGL implementation for
Linux. It is very important for 3D acceleration.

You can try this too: http://dri.freedesktop.org/wiki/

That enables direct hardware access in Linux.

Rúben

On Tuesday 21 November 2006 13:58, Kostas Kostiadis wrote:

“I think my problem is related to that ‘undefined symbol:
glXGetSwapIntervalMESA’, but I haven’t managed to track down what’s
causing it and nobody else seems to know unfortunately :(”

This could be a linking problem. Have you tried to uninstall Mesa? It
seems you really don’t need it. Is there an OpenGL implementation
switching application in your distro?

I don’t have the original message, so…
How slow is the slow speed you refer to? Is it well below the refresh
rate of the system?

Refresh rate can be the problem too.

Here I had everything set up right, but the refresh rate was limiting the
fps; when I set the correct refresh rate for my monitor, the application
sped up by about 2.5x.
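
If vsync is the limiter, SDL 1.2.10 and later can request it off before
SDL_SetVideoMode; a sketch (driver settings may still override this):

/* 0 disables vsync; 1 waits for vertical retrace (SDL 1.2.10+). */
SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 0 );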

Ruben,

To my knowledge the nVidia stuff doesn’t do DRI.
Also, I don’t want to use Mesa, since that is the software version of the
API.
glxinfo reports “direct rendering: Yes”, so I should be able to use the
hardware accelerated stuff.

Cheers,
Kos.

Sami,

This is on Fedora Core 6.
Not sure what you mean by “OpenGL implementation switching application”.

Slow means reeeeeeeeeeeally slow…i.e. between 3 and 10 fps (depending on
where the camera is looking).
(And this is just trying to render 10K tris.)

Cheers,
Kos.

On Tuesday 21 November 2006 15:12, Kostas Kostiadis wrote:

“This is on Fedora Core 6.
Not sure what you mean by ‘OpenGL implementation switching application’.”

I mean the tools that more modern distros have to control which OpenGL
implementation the admin wants to use.

In Gentoo this is opengl-select. Well, actually that is the old name;
currently it is a script that uses eselect, a tool for selecting between
many alternative packages. For example, it is used to select which gcc,
java or opengl the system and/or user wants to use.

“Slow means reeeeeeeeeeeally slow…i.e. between 3 and 10 fps
(depending on where the camera is looking).
(And this is just trying to render 10K tris.)”

This sounds like software rendering speed,
so most likely the application is linked against Mesa.

What does the following command output?

ldd your_application

I don’t think it’s linked to Mesa…ldd shows the following:

linux-gate.so.1 =>  (0x00649000)
libSDL-1.2.so.0 => /usr/lib/libSDL-1.2.so.0 (0x46392000)
libSDL_image-1.2.so.0 => /usr/lib/libSDL_image-1.2.so.0 (0x48555000)
libGLU.so.1 => /usr/lib/libGLU.so.1 (0x47dc7000)
libGL.so.1 => /usr/lib/nvidia/libGL.so.1 (0x4c9e3000)
libstdc++.so.6 => /usr/lib/libstdc++.so.6 (0x452f4000)
libm.so.6 => /lib/libm.so.6 (0x44878000)
libgcc_s.so.1 => /lib/libgcc_s.so.1 (0x452e6000)
libc.so.6 => /lib/libc.so.6 (0x44739000)
libdl.so.2 => /lib/libdl.so.2 (0x448a1000)
libpthread.so.0 => /lib/libpthread.so.0 (0x448a7000)
libGLcore.so.1 => /usr/lib/nvidia/libGLcore.so.1 (0x4bb0a000)
libnvidia-tls.so.1 => /usr/lib/nvidia/tls/libnvidia-tls.so.1 (0x4bb06000)
libXext.so.6 => /usr/lib/libXext.so.6 (0x4ba7f000)
libX11.so.6 => /usr/lib/libX11.so.6 (0x4c3ee000)
/lib/ld-linux.so.2 (0x43d6a000)
libXau.so.6 => /usr/lib/libXau.so.6 (0x44980000)
libXdmcp.so.6 => /usr/lib/libXdmcp.so.6 (0x44aca000)

Cheers,
Kos.
