Question about SDL_RENDERER_PRESENTVSYNC

When creating a renderer I have successfully used both SDL_RENDERER_PRESENTVSYNC and SDL_RENDERER_ACCELERATED. At the moment I always use just SDL_RENDERER_ACCELERATED because I’m not 100% sure how SDL_RENDERER_PRESENTVSYNC works…

I am currently setting the speed of things by comparing a Uint32 variable to SDL_GetTicks(). So I have something like this in my game loop:

if (SDL_GetTicks() - AnimationCounter > 9)
{
  AnimationCounter = SDL_GetTicks();
  /* everything in here gets done every 10 ms... */
}

Here’s my understanding of what happens when using SDL_RENDERER_PRESENTVSYNC:

  • the program “waits” before executing SDL_RenderPresent so that it syncs up with the refresh rate
  • this means that wherever I have tested my programs I get a constant 60 frames/s

But what happens if someone uses a computer/screen with a higher/lower refresh rate? Wouldn’t the program run faster/slower?

thanks!

RomanH

Generally, monitors with refresh rates below 60 Hz are extremely rare, but you may want to add an option to decrease the FPS in that case.

Thanks for the answer! So my assumption is correct? The speed of the program does depend on the refresh rate of the system it runs on when using SDL_RENDERER_PRESENTVSYNC?

I can’t confirm it, but it probably does.

vsync is mostly a tool to avoid screen tearing (and to reduce unnecessary CPU/GPU load in some cases). It’s essentially independent of whatever timing code you have – you shouldn’t make assumptions about timing while vsync is enabled. For example, I use a 144 Hz monitor, and users can often set a graphics driver setting to disable vsync even if an app requests it.

TVs with a 50 Hz refresh rate are however common, for example in Europe and the UK. So if you use a TV as your monitor (which I do for both my Raspberry Pis) then 50 fps is quite normal.

Yes, but as long as you’re measuring time between frames and operating based on actual time elapsed, rather than assuming 60hz, everything should be fine.

Also, don’t use SDL_GetTicks(). There’s nothing wrong with it per se, but it only has 1 ms resolution, which isn’t enough to measure time between frames without introducing jitter. Instead use SDL_GetPerformanceCounter(), or std::chrono if you’re using C++.
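
For example, a minimal sketch (my own variable names, error handling omitted) of measuring the per-frame delta with SDL_GetPerformanceCounter() might look like this:

#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *window = SDL_CreateWindow("timing example",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    SDL_Renderer *renderer = SDL_CreateRenderer(window, -1,
        SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);

    Uint64 freq = SDL_GetPerformanceFrequency();
    Uint64 last = SDL_GetPerformanceCounter();
    int running = 1;

    while (running) {
        SDL_Event event;
        while (SDL_PollEvent(&event)) {
            if (event.type == SDL_QUIT)
                running = 0;
        }

        Uint64 now = SDL_GetPerformanceCounter();
        /* time since the previous frame, in seconds, with sub-millisecond precision */
        double deltaSeconds = (double)(now - last) / (double)freq;
        last = now;

        /* advance animations by deltaSeconds here, e.g. x += speed * deltaSeconds */

        SDL_RenderClear(renderer);
        SDL_RenderPresent(renderer);
    }

    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}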

Thanks a lot everybody for all the clarifications!

Interesting. I never even knew about SDL_GetPerformanceCounter(). Neat! :slight_smile:

If you’re concerned about jitter at the sub-millisecond level you have more serious problems than SDL_GetTicks() to worry about!

Specifically, you need to render the next frame so that the various objects etc. that it contains are in the correct positions for the instant at which it will actually be presented on the display device, but you have no way of knowing that instant - not least because it’s in the future!

Even if you assume a constant frame rate (e.g. 60 fps), so that you can predict the time when the next frame is presented will be exactly 1/60 second after the previous frame was presented, SDL2 provides no way of discovering that. You can call SDL_GetPerformanceCounter() to get a timestamp now, but that’s not the same thing.

In an ideal world SDL_RenderPresent() would return an accurate timestamp for when the frame was actually displayed, but it doesn’t (and probably can’t, because with some buffering schemes it may return before the frame is presented). In the absence of that, I would suggest that SDL_GetTicks() is accurate enough.

:roll_eyes:

What happens if the user runs your game on a 144, 240, or 300 Hz monitor? At 300 Hz, that’s only ~3.3 ms per frame. Even 144 Hz is ~6.9 ms. What happens if (as somebody else mentioned) they change a driver setting to force vsync off entirely?

You want to calculate time deltas between frames with sub-millisecond accuracy if possible.

Such high frame rates are a gimmick; the human eye/brain system cannot perceive them as significantly different from 60 fps. It’s better to use the available power of the GPU to improve the quality of the rendered image at 60 fps than to waste it on creating more, lower-quality frames.

Why would anybody do that? It has no benefit, and could easily result in a program wastefully spinning, using 100% CPU time, shortening battery life and potentially causing overheating problems.

  1. This is wrong; many people can tell the difference between 60 and 120 Hz at least, especially from how responsive it feels due to reduced input lag
  2. Even if this were a gimmick, displays with 144 Hz, 165 Hz or 240 Hz are commonly used, and ones with even higher refresh rates exist, so you should make sure that those refresh rates work properly

Why would anybody do that [forcing vsync off]?

Some games feel choppy with vsync enabled, for whatever reason.
Though I’d say that if someone forces vsync off for all games and that breaks yours (if it works properly with vsync on) that’s the user’s problem.
Still doesn’t change the fact that “vsync on” doesn’t mean “60Hz”.

And even at 60 Hz, millisecond timer resolution sucks, because a frame should be 16.666 ms, which is different enough from 16 or 17 that using whole milliseconds could feel like dropping a frame every few frames.
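
To make that concrete, here’s a tiny standalone illustration (my own example, nothing SDL-specific) of what whole-millisecond timestamps do to the per-frame delta at 60 Hz:

#include <stdio.h>

int main(void)
{
    double realTime = 0.0; /* ideal elapsed time in milliseconds */
    int prevMs = 0;        /* previous whole-millisecond timestamp */

    for (int frame = 1; frame <= 12; frame++) {
        realTime += 1000.0 / 60.0;      /* one ideal 60 Hz frame: 16.666... ms */
        int nowMs = (int)realTime;      /* what a 1 ms timer would report */
        printf("frame %2d: delta = %d ms\n", frame, nowMs - prevMs);
        prevMs = nowMs;
    }
    return 0;
}

The printed deltas follow a 16, 17, 17 pattern even though every real frame lasts exactly 16.666 ms.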

People use high refresh rate monitors because common media frame rates divide evenly into them, and because it reduces the lag between when a frame is finished and when it’s presented, without disabling vsync. Past 85-ish Hz is indistinguishable.

Users do stuff like that all the time, dude.

And that one’s fairly benign; “power gamers” change settings in the drivers without really understanding what they do because they want to get more performance.

It’s why most games have an FPS limiter unless running in performance testing mode. But to do that you need sub-millisecond time deltas.
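
A limiter along these lines is what I have in mind (only a sketch, with my own names; SDL_Delay has roughly millisecond granularity, so the last fraction of the frame is busy-waited):

#include <SDL.h>

/* Cap the frame rate at targetFps; call once per frame, passing the
   SDL_GetPerformanceCounter() value taken at the start of the frame. */
void limit_fps(Uint64 frameStart, double targetFps)
{
    const double targetSeconds = 1.0 / targetFps;
    const Uint64 freq = SDL_GetPerformanceFrequency();

    for (;;) {
        double elapsed = (double)(SDL_GetPerformanceCounter() - frameStart) / (double)freq;
        double remaining = targetSeconds - elapsed;
        if (remaining <= 0.0)
            break;
        if (remaining > 0.002)
            SDL_Delay((Uint32)((remaining - 0.001) * 1000.0)); /* sleep the bulk of the wait */
        /* otherwise spin for the final fraction of a millisecond */
    }
}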

I didn’t say that 120 fps was a gimmick; there is an argument for this offering a marginal subjective improvement. The frame rates listed started at 144 fps and went up to 300 fps; I’ve not seen any evidence for those having much benefit. I should say that I worked for 33 years in BBC Research & Development, which researched this very subject.

I agree, ideally they should. But that doesn’t mean you need a timer resolution better than 1 millisecond (see below).

No, that doesn’t follow: having a timer resolution of 1 ms doesn’t necessarily mean that there will be any dropped or repeated frames.

As I’ve said before, the key thing is to ensure that, when you render a frame, any moving objects are positioned according to when that frame will eventually be presented on the display device. However fine a timer resolution is available, you cannot determine that accurately; for example SDL_GetPerformanceCounter() only gives you a timestamp for now, not for when the last frame was presented nor for when the next one will be!

So what can you do? One approach is initially to measure the average frame rate (perhaps by timing a few hundred frames, a 1 ms timer is good enough for that) and then to assume that the true frame rate is absolutely constant. Then you don’t use any kind of timer API when rendering frames in real-time, you simply add the measured frame period to an accumulated time for each frame, and render your objects according to that value.

That should result in totally smooth, jitter-free output, except that your animation may run fractionally too slowly or too quickly if your original measurement of the frame rate wasn’t exact. In practice, a problem with this approach arises if you ever drop a frame (through running out of CPU or GPU time, for example). So an improvement is to incorporate a dropped-frame detection mechanism, which does need a timer (but not an accurate one; SDL_GetTicks() is perfectly good enough). Then, if dropped frames are detected, you add two (or more) of your frame periods to the accumulated time instead of one.
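
In code, that might look something like this (a rough sketch; the variable names are illustrative and the dropped-frame threshold is just an assumption):

#include <SDL.h>

double framePeriodMs; /* measured once at startup, e.g. by timing ~100 frames */
double animTimeMs;    /* accumulated time used to position objects when rendering */
Uint32 lastTicks;     /* only used for dropped-frame detection */

void advance_frame(void)
{
    Uint32 now = SDL_GetTicks();
    Uint32 elapsedMs = now - lastTicks;
    lastTicks = now;

    /* Normally add exactly one frame period; if much more time than one
       period has passed, assume frames were dropped and add extra periods
       so the animation catches up. */
    int frames = 1;
    while (elapsedMs > framePeriodMs * (frames + 0.5))
        frames++;

    animTimeMs += framePeriodMs * frames;

    /* render objects at the positions they should occupy at animTimeMs */
}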

Up to a 300 Hz monitor refresh rate. The most common use case is to have low latency while still having vsync turned on. A game could run at any frame rate that divides evenly into 300 Hz.

It’s not dropped or repeated frames per se, but that the timing error introduced eventually produces what looks like a dropped frame, because the delta-time calculation at 1 ms resolution for 60 Hz ping-pongs back and forth between 16 and 17 ms.

I’ve seen this happen in my own apps, and using a more precise timer solved the problem.

Um, what? No.

For anything that needs to happen every frame, AFAIK pretty much every game and game engine just measures the time between the start of the previous frame and the start of the current one (aka right now) and updates things based on that time delta.

Some do that, yes, but it is likely to cause exactly the problem you are concerned about: jitter, for the reasons I have already explained.

The one thing that you can be pretty confident about is that the frame rate is constant: it probably ultimately derives from a crystal oscillator on the graphics card.

Since it’s constant, it makes no sense to measure the duration of individual frames with a sub-millisecond timer when you can time (say) 100 frames using SDL_GetTicks() and divide the result by 100. There will be less jitter as a result.

A career spent in broadcast TV taught me that programmers rarely have a good understanding of video!

You can measure the time between frames and it’ll be slightly different every frame (hopefully a sub-millisecond difference) thanks to things like the fact that your program isn’t the only one running on the system.

You’re assuming the app won’t have any changes in scene complexity that result in corresponding changes in how long it takes to process and render each frame.

And you’re aware that not only are there monitors with high refresh rates, but variable refresh rates as well, yes?

Which is exactly what you don’t want: that difference results in jitter, which may be visible.

Not at all. In fact quite the opposite, the method I have described is ideal for that situation, because the frame period it uses in its rendering calculations is unaffected by any such factors. All you have to be careful of is compensating for dropped frames, if there are any.

For your interest, in 1979 I designed the very first electronically-generated moving graphic ever broadcast by the BBC. All I have tried to do is pass on the knowledge I have acquired over decades of experience in this field.


Pretty much any TV made from the late 1980s on is capable of 60 Hz as well. Are you sure that you are actually outputting 50 Hz from your RPi?

Absolutely certain, yes (it’s a Panasonic TV):

[screenshot]