I ran into a strange problem with my project. When I changed the basecode and IDE and introduced VBOs, the identical code suddenly ran more than 50 times slower! After some experimenting, I think it might have something to do with SDL. Has anyone made similar observations?
My project so far has a very simple OpenGL 3D world and a flight simulator that uses physical simulation (which needs a lot of CPU). I first programmed it with DEV-C++ on top of NeHe tutorial 14, which uses the old NeHe basecode. The program works like this: between drawing frames, it measures the time since the previous update and runs the simulation for the correct number of time steps to get the new position of the plane for the next frame. It ran smoothly with a time step of 0.001 seconds.
Then I took the new NeHe Lesson03 basecode, which uses SDL, and switched to the CodeBlocks IDE. I copy-pasted my classes into it and added VBO support. The graphics run smoothly, but when I start the simulation (whose code has not been touched), the framerate quickly drops below 1! I have to increase the simulation time step to 0.1 before it can maintain the framerate. Without the simulation running, the framerate is about 60 (it never exceeds that; a system default?), so the graphics should not be the problem.
Removing all the VBO code and switching back to DEV-C++ doesn't help, so I think the problem is the third thing I changed: SDL. Any ideas what to do?
This is the code I use in the draw function to run the simulation, each time before drawing anything. Neither the plane1 class nor any code it uses has been changed:
deltaT = mTimer.getElapsedTime();
for (float time = 0.000f; time < deltaT; time += timeStep)
So the for loop here suddenly takes more than 50 times longer. What should I do? :?