Frame rate of 20000 rectangles

Hello,

I’m doing some stress tests on SDL 2 and ran into some unexpected
behavior (at least for me).

I create 20k rectangles (each with its own random position and speed)
and update them continuously as fast as possible.

I’m not using dynamic memory allocation or anything fancy. I created a
small and straightforward test case here (50 lines of code):
http://codepad.org/6GwlXXhD

Compiled with
gcc tst.c -lSDL2
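In case the codepad link goes away: the core of the per-frame update can be sketched like this. This is a reconstruction of the idea, not the original file; the `W`/`H` window size, the `MovingRect` layout, and the `step` helper are my assumptions, and the SDL drawing calls are omitted so the snippet stays self-contained.

```c
#define W 640   /* assumed window size; the real test's values may differ */
#define H 480

/* One moving rectangle: position/size and per-frame velocity, in pixels. */
typedef struct { int x, y, w, h, dx, dy; } MovingRect;

/* Advance one rectangle by one frame, bouncing off the window edges. */
static void step(MovingRect *r) {
    r->x += r->dx;
    r->y += r->dy;
    if (r->x < 0 || r->x + r->w > W) { r->dx = -r->dx; r->x += 2 * r->dx; }
    if (r->y < 0 || r->y + r->h > H) { r->dy = -r->dy; r->y += 2 * r->dy; }
}
```

In the real stress test, each frame would call something like `step()` plus `SDL_RenderFillRect()` for all 20,000 rectangles, followed by a single `SDL_RenderPresent()`.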

After 2 minutes running smoothly, the frame rate “chokes” every 4 seconds.
I tested with 20k and 7.5k rectangles and the pattern is the same…
However, with 5k rectangles it runs smoothly (I tried running for 10
minutes without problems).

I don’t see any change in CPU usage for the Xorg, test program, or any
other process (no page faults, nothing else that I can suspect).

I would expect either a slow frame rate from the beginning (as all
rectangles are already “running”), or smooth execution forever.
Am I missing something important here?

My system is Ubuntu 11.10 on an Intel Core 2 Duo 2.2 GHz.
I’m using the Mercurial (Hg) repo, updated today.

Thanks in advance,
Francisco


I ran this for 10 minutes without any slowdown on Ubuntu 12.04 / 32-bit, with
these drivers:

OpenGL version string: 3.3.0 NVIDIA 313.09

… and your code seems correct.

Bye,
Gabry


Thanks, Gabriele.


This is what I get from glxinfo:
OpenGL vendor string: Tungsten Graphics, Inc
OpenGL renderer string: Mesa DRI Mobile Intel® GM45 Express Chipset x86/MMX/SSE2
OpenGL version string: 2.1 Mesa 7.11
OpenGL shading language version string: 1.20

I also changed the code to assert that it is accelerated:

    SDL_Renderer* ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);
    SDL_RendererInfo info;
    SDL_GetRendererInfo(ren, &info);
    if ((info.flags & SDL_RENDERER_ACCELERATED) == 0)
        printf("ERROR!!\n");

Anyway, even with software rendering, shouldn’t I expect more
deterministic behavior (i.e. either slow or fast from the beginning)?

Thanks,
Francisco


On my system SDL is accelerated too. I’ve added this code to your main loop:

    frames++;
    if (!(frames % 100)) {
        fprintf(stderr, "\rFPS: %5d ", frames * 1000 / (now - start));
    }
    SDL_Event e;
    while (SDL_PollEvent(&e)) {
        if (e.type == SDL_QUIT)
            exit(0);
    }

… to see the FPS, and to be able to quit without ‘kill -9’ :)
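As an aside, `now` and `start` are not declared in the snippet above; they would be `SDL_GetTicks()` values taken at the current frame and at startup. The same average-FPS arithmetic as a self-contained helper (a sketch, with a guard against dividing by zero on the very first frames):

```c
#include <stdint.h>

/* Average FPS over the whole run: frame count divided by elapsed time,
 * computed in integer milliseconds as in the snippet above. */
static int average_fps(uint32_t frames, uint32_t start_ms, uint32_t now_ms) {
    uint32_t elapsed_ms = now_ms - start_ms;
    if (elapsed_ms == 0)
        return 0;   /* no time elapsed yet; avoid division by zero */
    return (int)(frames * 1000u / elapsed_ms);
}
```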

On my system your program gives an almost constant 120 FPS (Ubuntu 12.04,
GNOME 2.x without 3D accel, NVIDIA 8400 and a dual-core P4 at 3 GHz).

I’m quite sure your problem is related to something in your display
chain (Unity, X.org, drivers, kernel…).


Bye,
Gabry

Running smooth here… and I couldn’t see any issues with your code…

That’s really weird, Francisco…

(by the way, my father’s name is Francisco Santana, without the apostrophe
and the extra n; that email on the SDL list surprised me for a moment)

Abraço,
C. de Santana

Sent from my Motorola RAZR
SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org

I’ve seen issues like that before in applications that don’t pump events
and don’t yield the CPU. Often what is happening is a graphics driver is
forced to stop and flush at some point. NVIDIA’s drivers are the only ones
I haven’t seen with this problem at some time or another.
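For what it’s worth, the shape of a loop that does both of those things looks roughly like this. This is a sketch, not Francisco’s actual file; `ren` is assumed to be the renderer created earlier, and the rectangle update/draw code is elided.

```c
/* Sketch: a main loop that pumps events and yields the CPU each frame.
 * `ren` is assumed to be the SDL_Renderer created earlier. */
int running = 1;
while (running) {
    SDL_Event e;
    while (SDL_PollEvent(&e)) {   /* pump events so the driver can flush */
        if (e.type == SDL_QUIT)
            running = 0;
    }
    /* ... update and draw the rectangles here ... */
    SDL_RenderPresent(ren);
    SDL_Delay(1);                 /* yield a slice back to the OS/driver */
}
```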

Cheers!


Thank you to all of you who took the time to try the code.
I still don’t know what is happening, though.
I ended up testing on another computer without any problems.

FYI, I was evaluating the performance of Céu (announced in a previous
e-mail [1]) when running 40,000 “lightweight threads” at the same time.
The results were quite satisfactory, with a 10% decrease in FPS
compared to the simple C application I posted in this thread.
As Céu is a source-to-source compiler that generates single-threaded
C, it is possible to compile the application [2] as if it were C.

I also wrote a blog post about this stress test:

[1] http://forums.libsdl.org/viewtopic.php?p=37067&sid=4c0c63c615766be95e3d9fea964917be
[2] http://codepad.org/qXdVI0Py

Thanks again,
Francisco


Hello !


Did you try SDL_Delay, to give some CPU time back to your OS and GFX
driver and to allow it to process the rectangle data?

CU