I’m currently prototyping an arcade-style shooter with SDL2, and I’m trying to strike a balance between tight, low-lag input and a smooth picture without stuttering or tearing. Ideally I’d keep vsync on but drastically reduce the input lag; failing that, I’d be happy with a reasonably smooth timer-based system (for the record, I’m okay with screen tearing but would love to eliminate the stuttering, like how a typical 3D game runs with vsync off).
With vsync on there’s an overwhelming amount of lag: enough to clearly feel it on inputs, and a sprite drawn at the cursor’s coordinates trails heavily behind it. With vsync off and the timer in place, there’s only about one frame of lag, as expected.
Here’s a summary of my game loop, including the SDL_Delay() waiting code I use when I try the non-vsync route:
// Screen constants
const int SCREEN_FPS = 60;
const int SCREEN_TICK_PER_FRAME = 1000 / SCREEN_FPS; // integer division: 16 ms, not 16.67
// Delta Time stuff
Uint64 deltaTimeCountNow = SDL_GetPerformanceCounter();
Uint64 deltaTimeCountLast = 0;
Uint64 freq = 0;
double deltaTime = 0;
// Frame limiter stuff
Uint64 frameCapTimeStart = 0;
Uint64 frameCapTimeNow = 0;
int main( int argc, char* args[] ) {
    SDL_Event event;

    // Get the Performance Frequency
    freq = SDL_GetPerformanceFrequency();

    //===== MAIN LOOP =====
    bool quit = false;
    while( !quit ) {
        // Start cap timer
        frameCapTimeStart = SDL_GetPerformanceCounter();

        // Handle all events on queue
        while( SDL_PollEvent( &event )) {
            // snipped event stuff here
        }
        // Update delta time (in ms; cast before dividing so the integer
        // division doesn't truncate fractions of a millisecond)
        deltaTimeCountLast = deltaTimeCountNow;
        deltaTimeCountNow = SDL_GetPerformanceCounter();
        deltaTime = (double)(deltaTimeCountNow - deltaTimeCountLast) * 1000.0 / (double)freq;
        deltaTime *= 0.060; // normalize so 1.0 == one 60 FPS frame (16.67 ms)
        // Update all game logic
        game->update();

        //======= RENDERING =======
        // Clear screen
        SDL_RenderClear(renderer);
        // Draw all of Game
        game->draw();
        // Update screen
        SDL_RenderPresent(renderer);
        //===== END RENDERING =====
        // IF USING TIMER INSTEAD OF VSYNC
        frameCapTimeNow = SDL_GetPerformanceCounter();
        Uint64 elapsedFrameTime = (frameCapTimeNow - frameCapTimeStart) * 1000 / freq;
        // Only delay if there's time left in the frame; elapsedFrameTime is
        // unsigned, so a long frame would otherwise underflow into a huge delay
        if (!quit && elapsedFrameTime < (Uint64)SCREEN_TICK_PER_FRAME) {
            SDL_Delay((Uint32)(SCREEN_TICK_PER_FRAME - elapsedFrameTime));
        }
    }
    //===== END MAIN LOOP =====

    // Free resources and close SDL
    graphics->freeAndClose();
    return 0;
}
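As a side note on the timer route: from what I understand, SDL_Delay() only guarantees roughly millisecond granularity and the OS scheduler can oversleep it, which I suspect is part of my stutter. A trick I’ve seen suggested is to sleep for most of the remaining frame time and busy-wait on the performance counter for the last millisecond or so. Here’s a minimal sketch of the underflow-safe “time left” math (remaining_frame_ms is my own hypothetical helper, not an SDL function):

```cpp
#include <stdint.h>

// Hypothetical helper (mine, not SDL's): given the performance-counter
// values at the start and end of a frame and the counter frequency, return
// how many whole milliseconds are left in a target_ms-long frame. Returns 0
// when the frame already ran long, avoiding the unsigned underflow a plain
// `target_ms - elapsed` would cause.
static uint32_t remaining_frame_ms(uint64_t start, uint64_t now,
                                   uint64_t freq, uint32_t target_ms)
{
    uint64_t elapsed_ms = (now - start) * 1000 / freq;
    if (elapsed_ms >= target_ms)
        return 0;
    return (uint32_t)(target_ms - elapsed_ms);
}
```

In the loop, the idea would be to call SDL_Delay(ms > 1 ? ms - 1 : 0) with the value this returns, then spin on SDL_GetPerformanceCounter() until the frame’s end time actually passes.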
I’ve definitely seen games made with SDL that don’t stutter or tear but still have tight, lightning-fast input, so there has to be some kind of solution here. One detail that may or may not matter: I draw the game onto an intermediate texture and then draw that texture scaled up to fill the screen (to get a native 320x240 arcade resolution that scales to the display with various filter methods). If that could be related, I can provide whatever extra code is needed.
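On the off chance the scaling path matters: one thing I’ve read is that with nearest-neighbor filtering, a non-integer scale factor makes some pixel columns wider than others, which can read as shimmering or juddering as sprites move, separate from any timing issue. Here’s a sketch of how I might snap to the largest integer scale before the final blit (Rect and integer_scale_dest are my own stand-ins; in real code SDL_Rect would fill the same role):

```cpp
// Stand-in for SDL_Rect so this sketch compiles on its own.
struct Rect { int x, y, w, h; };

// Hypothetical helper (mine, not SDL's): pick the largest integer scale of
// a native_w x native_h canvas that fits the window, and center the result
// (letterboxing the leftover space).
static Rect integer_scale_dest(int native_w, int native_h,
                               int window_w, int window_h)
{
    int sx = window_w / native_w;
    int sy = window_h / native_h;
    int s = (sx < sy) ? sx : sy;
    if (s < 1) s = 1; // window smaller than native: fall back to 1x
    Rect r;
    r.w = native_w * s;
    r.h = native_h * s;
    r.x = (window_w - r.w) / 2;
    r.y = (window_h - r.h) / 2;
    return r;
}
```

The returned rect would then be the dstrect passed to SDL_RenderCopy() when drawing the 320x240 target texture to the window.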
So, are there any tricks I might be missing here for tight, fast vsync, or for smoother timer-based frame updating? Knowing I still have this to sort out is kind of killing my motivation, and I’d really love to find a solution so I have more energy for development.