I am new here, so I want to quickly introduce myself: I'm an IT student at university who loves using SDL and is getting into game development…
Since I don't know much about OS internals, I have no idea if this is possible…
To start off, I should say that I haven't studied any of SDL's internal code, so this solution might not be possible, but I would love to hear your opinions.
Ok, imagine this:
Let's say you are calling SDL_PumpEvents() every ten seconds (oops, what poor design). The user presses a button just after you called SDL_PumpEvents(), so the event only gets pumped, and timestamped, on the next call: its timestamp ends up almost 10 seconds later than the real press…
I have googled and read a bit on this topic, on why the timestamp cannot reflect the real press time (the OS tick counter differs from SDL_GetTicks()).
So let's try to solve this with our own SDL-side workaround.
Prerequisite: the event structure also holds an OS timestamp (privately; oops, no C compatibility, let's believe C users won't touch it).
Now imagine that after every single SDL_PumpEvents() call, you SDL_PushEvent() a dummy keystroke carrying both an SDL_GetTicks() timestamp and the OS timestamp, to anchor the time axis for the upcoming keystrokes. When you call SDL_PumpEvents() again, that dummy keystroke will always be first in the queue. For each real event you can then take the difference between its OS timestamp and the dummy's OS timestamp, and set its SDL timestamp to the dummy's SDL timestamp plus that difference.
Obviously, OS and SDL ticks are not synchronized, but they tick at the same rate, which leaves us with at most one millisecond of imprecision.
We also lose the time between the last pumped event and the pushing of the dummy keystroke (if the PC lags there, you get a huge imprecision), but I think it still approximates the events to real-world time better than the 10-second precision (in this example) of a vanilla SDL_PumpEvents() call.
Would that work across all operating systems SDL supports? On different setups?
Thanks for considering this,