I seem to be having trouble where SDL is initialized in one
thread, and then I'm calling SDL_GetTicks() both in that thread and in
another thread. In the thread where SDL was not initialized I'm getting
huge numbers from SDL_GetTicks(), while in the other thread I'm getting
realistic numbers that reflect the actual number of milliseconds since
initialization.
Do I have to initialize SDL in both threads if I want to use
SDL_GetTicks() in both threads?
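For reference, a minimal sketch of the situation being described might look like the following (hypothetical reproduction, assuming SDL 1.2's thread API; function and variable names here are illustrative, not from the original post):

```c
/* Hypothetical reproduction: init SDL on the main thread, then read
 * SDL_GetTicks() from both the main thread and a second thread. */
#include "SDL.h"
#include "SDL_thread.h"
#include <stdio.h>

static int worker(void *unused)
{
    (void)unused;
    /* SDL was NOT initialized on this thread */
    printf("worker thread: SDL_GetTicks() = %u\n",
           (unsigned)SDL_GetTicks());
    return 0;
}

int main(int argc, char *argv[])
{
    SDL_Thread *t;

    if (SDL_Init(SDL_INIT_TIMER) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    SDL_Delay(100); /* let some ticks accumulate */
    printf("main thread:   SDL_GetTicks() = %u\n",
           (unsigned)SDL_GetTicks());

    t = SDL_CreateThread(worker, NULL);
    SDL_WaitThread(t, NULL);

    SDL_Quit();
    return 0;
}
```

The reported behavior is that the two printed values disagree wildly, with only the initializing thread's value looking like milliseconds since SDL_Init().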
You’re talking about on Windows, right?
SDL currently uses timeGetTime() by default on Windows, since there are
problems with both GetTickCount() (low resolution) and QueryPerformanceCounter()
(laptop power savings, multi-CPU issues, etc.)
Sooo, it’s very possible that timeGetTime() does something funky with threads.
Try editing src/timer/win32/SDL_systimer.c, and change #if 0 to #if 1
in the SDL_StartTicks() function.
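For anyone following along, the block in question looks roughly like this (paraphrased from memory of SDL 1.2's src/timer/win32/SDL_systimer.c, not verbatim source; variable names may differ slightly):

```c
/* Rough shape of SDL_StartTicks() in SDL 1.2 on Win32.
 * Changing the "#if 0" below to "#if 1" enables the
 * QueryPerformanceCounter path instead of timeGetTime(). */
void SDL_StartTicks(void)
{
    /* Set first ticks value */
#if 0   /* flip to "#if 1" to try the high-resolution (QPC) timer */
    if (QueryPerformanceFrequency(&hires_ticks_per_second) == TRUE) {
        hires_timer_available = TRUE;
        QueryPerformanceCounter(&hires_start_ticks);
    } else
#endif
    {
        hires_timer_available = FALSE;
        timeBeginPeriod(1);      /* request 1 ms timer resolution */
        start = timeGetTime();   /* baseline for SDL_GetTicks() */
    }
}
```

With the QPC path enabled, SDL_GetTicks() is derived from QueryPerformanceCounter() rather than timeGetTime(), which may sidestep any per-thread oddity in timeGetTime() (at the cost of the QPC caveats mentioned above).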
See ya!
-Sam Lantinga, Software Engineer, Blizzard Entertainment