I've noticed that some programmers animate objects based on the time elapsed between frames. I'm not sure why, or even whether it makes sense. Does anyone know the significance?
Below is a snippet of code that explains what I mean:
var timePassed:int = getTimer() - lastTime; // milliseconds since the previous frame
lastTime += timePassed;
var newBallX:Number = ball.x + ballDX * timePassed; // ballDX is pixels per millisecond
var newBallY:Number = ball.y + ballDY * timePassed;
When you animate based on time, you make yourself independent of the framerate: no matter how many frames have rendered, the ball moves the same distance in a given amount of time. Compare that to moving a fixed distance per frame, where the apparent speed depends on many variables, such as how much processing power is available for the animation.
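To make the contrast concrete, here is a minimal sketch of the two approaches (the handler names are mine, and ball and ballDX are assumed to exist as in the question's snippet):

import flash.events.Event;
import flash.utils.getTimer;

// Frame-based: the ball moves 5 pixels every frame, so its speed
// changes whenever the framerate changes.
function moveFrameBased(e:Event):void {
    ball.x += 5;
}

// Time-based: the ball moves ballDX pixels per millisecond of real
// time, so its speed is the same at 30 fps and at 60 fps.
var lastTime:int = getTimer();
function moveTimeBased(e:Event):void {
    var timePassed:int = getTimer() - lastTime;
    lastTime += timePassed;
    ball.x += ballDX * timePassed;
}

addEventListener(Event.ENTER_FRAME, moveTimeBased);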
This is a common game-physics issue -- check out Glenn Fiedler's excellent "Fix Your Timestep!" article for a more detailed take on this. (Doing it right is slightly more complicated than just multiplying your direction vectors by the timestep.)
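The core idea of that article is to advance the physics in fixed-size steps and carry any leftover time over to the next frame. A rough sketch of that pattern, with a step size and names of my own choosing (not the article's code):

import flash.events.Event;
import flash.utils.getTimer;

const FIXED_STEP:int = 16;        // milliseconds per physics step (~60 updates per second)
var accumulator:int = 0;
var lastTime:int = getTimer();

function onEnterFrame(e:Event):void {
    var now:int = getTimer();
    accumulator += now - lastTime;
    lastTime = now;

    // Run as many fixed-size steps as the elapsed time allows;
    // the remainder stays in the accumulator for the next frame.
    while (accumulator >= FIXED_STEP) {
        ball.x += ballDX * FIXED_STEP;
        ball.y += ballDY * FIXED_STEP;
        accumulator -= FIXED_STEP;
    }
}

addEventListener(Event.ENTER_FRAME, onEnterFrame);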
The logic is simple.

ballDX => ball delta X => the distance the ball moves along the x axis in one second
timePassed => the amount of time that has passed

If oldBallX = 0, ballDX = 10, and timePassed = 1 second, then:

newBallX = oldBallX + (ballDX * timePassed)
newBallX = 0 + (10 * 1) = 10 pixels

If instead timePassed = 0.5 seconds (half a second), then:

newBallX = 0 + (10 * 0.5) = 5 pixels

Logical?
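The same arithmetic as a small hypothetical helper (units as above: ballDX in pixels per second, timePassed in seconds):

function computeNewBallX(oldBallX:Number, ballDX:Number, timePassed:Number):Number {
    return oldBallX + ballDX * timePassed;
}

trace(computeNewBallX(0, 10, 1));   // 10 -- a full second moves the ball 10 pixels
trace(computeNewBallX(0, 10, 0.5)); // 5  -- half a second moves it half as far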
Why NOT do it that way? As opposed to doing what? It is simple linear motion, right? Here is one thought: it allows the ball to catch up with its intended position when other programs are slowing the computer down.
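One way to see that "catching up" behaviour: whether the elapsed time arrives as many short frames or one long stalled frame, the ball ends up in the same place. A sketch with made-up numbers (ballDX in pixels per second):

var ballDX:Number = 10;
var x:Number = 0;

// Smooth run: ten frames of 0.1 seconds each.
for (var i:int = 0; i < 10; i++) {
    x += ballDX * 0.1;
}
trace(x); // 10

// Stalled run: one long 1-second frame after the machine was busy.
x = 0;
x += ballDX * 1.0;
trace(x); // 10 -- the ball lands where it should be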