I’ve noticed that some programmers animate objects based on the elapsed time between frames rather than moving them a fixed amount per frame. I’m not sure why, or even whether it’s logical. Does anyone know the significance?
Below is a snippet of code that explains what I mean:
var timePassed:int = getTimer() - lastTime; // ms elapsed since the last frame
lastTime += timePassed;
var newBallX:Number = ball.x + ballDX * timePassed; // ballDX is speed in px per ms
var newBallY:Number = ball.y + ballDY * timePassed;
When you animate based on time, you make yourself independent of the frame rate: no matter how many frames have elapsed, the ball moves the same distance in a given amount of time. Compare that to moving a fixed amount per frame, where the ball’s speed depends on the frame rate, which itself varies with how much processing power is available for the animation.
This is a common game-physics issue — check out Glenn Fiedler’s excellent “Fix Your Timestep!” article for a more detailed take on this. (Doing it right is slightly more complicated than just multiplying your direction vectors by the timestep.)
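The core idea from that article can be sketched with an accumulator that always advances the physics in fixed-size steps, regardless of how irregular the frame times are. This is a minimal illustration, not Fiedler’s full implementation (it omits the render interpolation he describes); `STEP_MS` and the frame times are assumed values.

```javascript
const STEP_MS = 10;   // physics always advances in fixed 10 ms steps (assumed)
const ballDX = 0.2;   // px per ms, as in the example above
let accumulator = 0;
let ballX = 0;

function update(frameTimeMs) {
  accumulator += frameTimeMs;           // bank however long the frame took
  while (accumulator >= STEP_MS) {
    ballX += ballDX * STEP_MS;          // each physics step is identical
    accumulator -= STEP_MS;
  }
}

// Wildly irregular frame times still yield deterministic movement:
[16, 33, 7, 44].forEach(update);        // 100 ms total elapsed
```

Because every physics step uses the same timestep, the simulation stays deterministic even when frame times spike, which is the main problem with naively multiplying by a variable delta.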