Poor performance of SDL2 on macOS

I am working through Lazy Foo’s tutorials and am seeing really poor performance with SDL2 on my MacBook Pro. I am writing what I thought would be a pretty simple program: create a surface, create a texture from the surface, then call SDL_RenderCopy to display an image. I have tried setting the render backend to both “opengl” and “metal”, with the same results.

As soon as I run my program, the GPU usage spikes to 80+%, the fans turn on, the system overheats, and the rest of my applications become very sluggish.

I am on macOS Catalina 10.15.7 with 32 GB RAM, a Radeon Pro Vega 20 with 4 GB of VRAM, and a Retina display; Metal is supported. The page I’m using is: https://lazyfoo.net/tutorials/SDL/07_texture_loading_and_rendering/index.php. It displays a simple 14 KB .png file on the screen, and I am trying to run his exact code from that page.

So I am trying to figure out: are there major performance issues with SDL2 and Metal? I tried setting the renderer hint to “opengl”, but I observe the same problem. Interestingly, setting the renderer hint to “software” still results in poor performance, though better than “metal”.

I am using SDL version 2.0.12_1.

Any help would be appreciated; I have spent the last several days trying to figure out what the problem is and have not been able to find a solution. At this point, this is kind of a deal-breaker for using SDL2 unless I can figure out what is going on. Does anyone out there have experience running SDL2 programs on OS X?

Additional note: I tried running the game “Into the Breach” (which I have read uses SDL) on my machine, and it seems to run fine; I don’t observe any of the same performance issues.

The simplest explanation would be that you have not specified the SDL_RENDERER_PRESENTVSYNC flag in your SDL_CreateRenderer() call; without it, your program spins continuously, using large amounts of CPU and GPU time. So I would check that flag first.
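For reference, a creation call with vsync enabled looks roughly like this (a minimal sketch; gWindow stands in for whatever window you created, and the -1 just asks SDL to pick the first suitable driver):

// Ask for a hardware-accelerated renderer that waits for vertical
// sync, so SDL_RenderPresent() blocks until the next refresh instead
// of letting the render loop spin flat out.
SDL_Renderer* gRenderer = SDL_CreateRenderer( gWindow, -1,
        SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC );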

It’s just because the Lazy Foo example runs a tight loop flat out and never lets the CPU take a break:

while( !quit )
{
    // SDL_PollEvent() returns immediately even when the queue is
    // empty, so this loop spins at full speed with nothing to do.
    while( SDL_PollEvent( &e ) != 0 )
    {
    }
}

It’s just a quick tutorial to show the basics of SDL. Later on he shows how to use timers, so your CPU can sleep 99% of the time and just be woken up, say, every 60th of a second.

For my games I use:

int main(int argc, char* args[])
{
    SDL_TimerID timerID;
    timerID = SDL_AddTimer(TIMER_TICK_IN_MILLISECONDS, timerTickCallBack, NULL);

    ........

    // loop
    while (!bQuit)
    {
        // Sleep here until an event arrives; no busy-waiting.
        if (SDL_WaitEvent(&event))
        {
            switch (event.type)
            {
            case SDL_USEREVENT:
                // do an animation frame here
                break;
            }
        }
    }
}

Uint32 timerTickCallBack(Uint32 iIntervalInMilliseconds, void *param)
{
    SDL_Event event;
    SDL_UserEvent userevent;

    // Only queue a new tick if the previous one hasn't been consumed
    // yet, so stale ticks can't pile up in the event queue.
    if (SDL_HasEvent(SDL_USEREVENT) == SDL_FALSE)
    {
        // add an SDL_USEREVENT to the message queue
        userevent.type = SDL_USEREVENT;
        userevent.code = 0;
        userevent.data1 = NULL;
        userevent.data2 = NULL;

        event.type = SDL_USEREVENT;
        event.user = userevent;

        SDL_PushEvent(&event);
    }

    // call back again in 'iIntervalInMilliseconds' milliseconds
    return iIntervalInMilliseconds;
}

Calling SDL_WaitEvent lets the CPU sleep until an event occurs, and then the code continues; Lazy Foo’s example uses SDL_PollEvent, which returns immediately whether or not an event is available, so the CPU never sleeps. Don’t worry about your CPU activity right now (the code is actually telling your computer to blast the CPU!), just work your way through Lazy Foo’s examples to get the hang of SDL for now.
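If you want a steady redraw without wiring up a timer, a middle ground is SDL_WaitEventTimeout, which sleeps until an event arrives or the timeout expires (just a sketch; bQuit and event mirror the variables in the loop above):

SDL_Event event;
bool bQuit = false;

while (!bQuit)
{
    // Sleep until an event arrives, but wake at least every 16 ms
    // (~60 Hz) so a frame can still be drawn.
    if (SDL_WaitEventTimeout(&event, 16))
    {
        if (event.type == SDL_QUIT)
            bQuit = true;
    }

    // draw the next frame here
}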

Thank you so much, Sean, for your help with this. I understand, and everything you said makes perfect sense. Thanks also for the sample code. I’ll keep plugging away on the tutorials.

Thanks so much. Got it.

Just remember that it’s better to use the built-in vsync when creating the SDL_Renderer than to try to roll your own.
