Strange performance problem

I’m having a strange performance problem with my SDL applications (Win32). They consume 100% cpu time, even the one which is very gpu bound. Sometimes, cpu usage will drop to a more reasonable level, only to go back to 100% a few minutes later. Sometimes this will have no visible impact at the user level. Other times, fps will drop to <10, and the entire system will slow down to a crawl. Then, a few minutes later, performance will be normal again.

I can’t make heads or tails of this. Here is what profiling with Very Sleepy reveals:

https://www.dropbox.com/s/1jxgd0xlwbdfxeg/Screenshot%202014-08-16%2023.24.39.png

Supposedly, RtlInitializeExceptionChain is dominating cpu usage. Googling this function was unhelpful; one post suggested it was a profiling artifact.

Supposedly, RtlInitializeExceptionChain is calling SDL_LogCritical. Does SDL install some kind of exception handler that this function would call? However, I’m not seeing any logging being output. Where could this logging be directed?
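For reference, SDL2 lets you redirect its log output, which would show where any SDL_LogCritical messages end up (on Windows the default output may go to the debugger rather than the console). A minimal sketch using SDL2's SDL_LogSetOutputFunction; the callback name is mine:

```c
#include <stdio.h>
#include <SDL.h>

/* Custom sink: forward every SDL log message to stderr so that nothing
 * is silently routed elsewhere (e.g. to the debugger on Windows). */
static void log_to_stderr(void *userdata, int category,
                          SDL_LogPriority priority, const char *message)
{
    (void)userdata;
    fprintf(stderr, "SDL log [cat %d, pri %d]: %s\n",
            category, (int)priority, message);
}

int main(int argc, char *argv[])
{
    (void)argc; (void)argv;
    SDL_LogSetOutputFunction(log_to_stderr, NULL);  /* install before SDL_Init */
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    /* ... run the app; any SDL_LogCritical output now lands on stderr ... */
    SDL_Quit();
    return 0;
}
```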

This happens on two different systems, so it is not a hardware issue.

From

“Turned out I had left a VS conditional breakpoint on which was getting
tested constantly but not triggering. Removing the breakpoint fixed the
issue.”

Might be that?

Simon

2014-08-17 7:04 GMT+02:00 tjcbs:

[…]
SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org

No, checked that, and it happens even when I run outside of VS.

Just to rule out the obvious: do you have any frame-limiting code in place that calls SDL_Delay or similar to give the OS some CPU time back? Windows in particular is quite fragile if you don’t do that.

On 17.08.2014 07:04, tjcbs wrote:

[…]

Links:

[1]
https://www.dropbox.com/s/1jxgd0xlwbdfxeg/Screenshot%202014-08-16%2023.24.39.png



Huh? Is that really standard practice? I’ve never had to do this before. When I write apps that genuinely consume 100% cpu, the system will slow down badly, but only if it only has 1 core. Otherwise there are no untoward effects. In the past, with games, when I have v-sync on, cpu use is always very low. When it is off it can be high, but it never seems to adversely affect the rest of the system.

But anyway, I tried inserting a SDL_Delay(5) in the render loop, and it hurt performance, but did not fix the problem.
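For what it's worth, a flat SDL_Delay(5) isn't quite what the frame-limiting suggestion means; the usual pattern is to sleep away only the remainder of a fixed per-frame budget. A minimal sketch of the arithmetic (frame_delay_ms is my name, not an SDL function):

```c
#include <stdint.h>

/* How many milliseconds of a fixed per-frame budget are left to sleep.
 * Unsigned subtraction keeps this correct across SDL_GetTicks() wraparound. */
static uint32_t frame_delay_ms(uint32_t frame_start, uint32_t now,
                               uint32_t budget_ms)
{
    uint32_t elapsed = now - frame_start;
    return elapsed < budget_ms ? budget_ms - elapsed : 0;
}
```

In the main loop, targeting roughly 60 fps, that would be `SDL_Delay(frame_delay_ms(start, SDL_GetTicks(), 16));` after rendering, so the OS is handed back whatever the frame didn't use instead of a fixed 5 ms.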

My system always crept to a halt when I had no delay somewhere in the main loop (although I was using a different framework back when I was on Windows, so maybe that was the culprit :P).
VSync would be fine too, but I’ve heard drivers can override it on Windows.

On 18.08.2014 01:35, tjcbs wrote:

[…]
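On the vsync route: with SDL2 you request it as below (a sketch; as noted, driver control panels can still override the request):

```c
#include <SDL.h>

int main(int argc, char *argv[])
{
    (void)argc; (void)argv;
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("demo",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, 0);

    /* Ask for vsync when the renderer is created ... */
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1,
            SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);

    /* ... or, for a raw OpenGL context, after SDL_GL_CreateContext():
     * if (SDL_GL_SetSwapInterval(1) != 0) the request was refused. */

    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```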

