I’ve recently learned that it’s much better practice to enable VSYNC when creating the renderer, letting the driver handle the “sleeps” between frames, than to try to cap the game’s FPS with SDL_Delay.
However, I’m wondering whether I should also add a fallback mode for when the video driver does not support vsync. What happens in that case? I assume SDL_RenderPresent does not wait at all, so I’m afraid the game will keep computing frames at far too high a rate, wasting CPU (and draining the battery on portable machines). In that case, I guess it would be a good idea to add an SDL_Delay to avoid constantly computing new frames?
Which leads me to my main questions: among all the video drivers supported by SDL, are there some that do not support VSYNC? If so, are they documented somewhere? And is it possible to detect this at runtime (when creating the renderer, for example)?
Thanks for your help!
I think it is good practice to put at least an SDL_Delay(1) after SDL_RenderPresent() anyway; in my experience, some platforms won’t work properly if you don’t.
But to answer your question: the SDL_gfx library has a useful set of functions to control the framerate that will probably do exactly what you want, so take a look at it, specifically at its framerate-control functions.
Usually, but not always. On 60 Hz monitors, enabling vsync adds lag that can be noticed if you are, say, drawing a software mouse cursor or implementing a pen-based drawing app. With vsync on, the line you draw with your pen lags more noticeably behind where your pen actually is, which is why the drawing app Krita disables vsync by default.
Try creating a simple SDL app that draws a rectangle at your “hardware” OS mouse cursor’s position, without hiding the hardware cursor. Run the app with vsync enabled and then disabled. You will notice the rectangle stays closer to your hardware cursor with vsync disabled; there is less drawing lag. The downside of disabling vsync is, of course, tearing. So most of the time I prefer it on, but many competitive gamers prefer it off.
I recently noticed that the hit game Hollow Knight disables vsync by default, even on my RTX beast GPU. This leads to noticeably ugly tearing artifacts after acquiring the Super Dash ability, which lets the Knight rapidly fly horizontally across vast scenic backgrounds. But I never noticed vsync was off until I acquired the Super Dash, at which point I went to Options and enabled it. My guess is that the Hollow Knight devs disabled vsync by default because on old integrated GPUs, which need more milliseconds to draw a frame, the input lag was noticeably worse.
The fallback software renderer doesn’t support vsync, even though it lists “PresentVSync” in its flags. (This seems like a bug to me, but perhaps the SDL devs have a Really Good Explanation for why the presence of PresentVSync in the flags does not, and should not, be taken to mean the software renderer actually supports vsync.) For example, here’s what running “testrendercopyex” from SDL2/test shows:
$ cd /c/SDL2/test
$ ./testrendercopyex --info render
INFO: Built-in render drivers:
INFO: Renderer opengl:
INFO: Flags: 0x0000000E (Accelerated | PresentVSync | TargetTexturesSupported)
INFO: Renderer software:
INFO: Flags: 0x0000000D (Software | PresentVSync | TargetTexturesSupported)
$ ./testrendercopyex --renderer opengl
INFO: 1876.44 frames per second
$ ./testrendercopyex --vsync --renderer opengl
INFO: 60.32 frames per second
$ ./testrendercopyex --vsync --renderer software
INFO: 807.64 frames per second
So adding --vsync doesn’t cap the software renderer to 60ish FPS (my monitor’s refresh rate).