I am currently employing a system of keeping track of the time
between frames: every time I call UpdateScreen() (which essentially
calls SDL_Flip()) I calculate the difference in time between that call
and the last one, which I then use to calculate the number of seconds
per frame. When my application runs in fullscreen mode my FPS log looks
like this:
50,100,100,50,100,…
but in windowed mode it is usually 33.33 (and occasionally a bit lower).
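For reference, a minimal version of that timing scheme, assuming
SDL_GetTicks() as the clock source (the exact code isn't shown in this
post, and the variable names are illustrative), would look like:

    #include <SDL.h>

    static Uint32 lastTicks = 0;
    static double timeChange = 0.0;              /* seconds per frame */

    void UpdateScreen(SDL_Surface *screen)
    {
        Uint32 now;

        SDL_Flip(screen);                        /* present the frame */

        now = SDL_GetTicks();                    /* ms since SDL_Init() */
        timeChange = (now - lastTicks) / 1000.0; /* seconds per frame */
        lastTicks = now;                         /* FPS = 1.0 / timeChange */
    }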
1. Why is there such a large change in framerate between the two modes?
2. Is there any reason why the framerate in fullscreen toggles in a
fairly predictable pattern?
Also, my player moves at a rate of 1/timeChange (where timeChange is
the seconds per frame, i.e. 1/FPS), which works perfectly in windowed
mode but is extremely fast in fullscreen mode.
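A minimal version of that movement update, with illustrative names
(playerX, speed), would be:

    /* Current approach: distance per frame is proportional to
       1/timeChange, which grows with the framerate, so the faster
       fullscreen mode moves the player much further per second. */
    playerX += speed * (1.0 / timeChange);

    /* Frame-rate-independent alternative: scale by the frame time
       itself, so speed is in units per second regardless of FPS. */
    playerX += speed * timeChange;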
3. What can I do to better regulate the timing of the game?
(This is all when using VC++.net on WinXP with SDL 1.2.5a.
DirectX 8.1 is installed.)
As with any other windowed/fullscreen FPS question, it's probably down
to the vsync…
Windowed mode uses blitting, which takes a roughly constant time and
therefore gives you a constant speed, whereas fullscreen mode ends up
waiting for the vsync: sometimes you'll call UpdateScreen() just before
the vsync, meaning you get a short frame, and sometimes just after it,
meaning the frame takes twice as long.
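One common way to cope with that alternation (a sketch, not a complete
solution; the 10-frame window and the names here are arbitrary) is to
average the measured frame time over the last few frames before using
it for movement:

    #define SMOOTH_FRAMES 10

    static double frameTimes[SMOOTH_FRAMES]; /* statics start zeroed */
    static int frameIndex = 0;

    /* Feed in the raw seconds-per-frame, get back a rolling average.
       The first few results read low until the window fills up. */
    double SmoothFrameTime(double timeChange)
    {
        double sum = 0.0;
        int i;

        frameTimes[frameIndex] = timeChange;
        frameIndex = (frameIndex + 1) % SMOOTH_FRAMES;

        for (i = 0; i < SMOOTH_FRAMES; ++i)
            sum += frameTimes[i];

        return sum / SMOOTH_FRAMES;
    }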
--
Kylotan
http://pages.eidosnet.co.uk/kylotan