I'm learning game development in Unity. I recently got stuck on Time.deltaTime while following tutorial code. I have searched for a better understanding, but every explanation is written in a very technical way, so I still haven't grasped its main purpose. In short, I'm looking for a simple explanation I can learn from.
Rendering and script execution take time, and that time differs from frame to frame. Even if you aim for ~60 fps, the framerate will not be stable: a different amount of time passes each frame. You can wait if a frame finishes too fast, but you cannot skip rendering when a frame is slower than expected.
To handle these different frame lengths, you get Time.deltaTime. In Update it tells you how many seconds it took to finish the last frame (usually a small fraction of a second, e.g. about 0.0167 at 60 fps).
If you now move an object from A to B by using this code:
obj.transform.position += Vector3.forward * 0.05f;
It will move 0.05 Units per frame. After 100 frames it has moved 5 Units.
Some PCs may run this game at 60 fps, others at 144 fps. On top of that, the framerate is not constant.
So, depending on the machine, your object will take anywhere from roughly 1 to 5 seconds to move this distance.
But if you do this:
obj.transform.position += Vector3.forward * 5f * Time.deltaTime;
it will move 5 units in 1 second, independent of the framerate, because at, say, 10000 fps the deltaTime is very, very small, so the object only moves a tiny bit each frame.
Note: In FixedUpdate() it's technically not 100% the same every step, but you should act like it's always 0.02f (or whatever you set the physics interval to). So, independent of the framerate, Time.deltaTime inside FixedUpdate() will return the fixedDeltaTime.
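A minimal sketch (my example, not from the original post) that logs both values so you can see this behaviour yourself:
using UnityEngine;

public class DeltaTimeLogger : MonoBehaviour
{
    void Update()
    {
        // Varies from frame to frame (around 0.0167 at 60 fps)
        Debug.Log("Update deltaTime: " + Time.deltaTime);
    }

    void FixedUpdate()
    {
        // Inside FixedUpdate, Time.deltaTime returns Time.fixedDeltaTime (0.02 by default)
        Debug.Log("FixedUpdate deltaTime: " + Time.deltaTime);
    }
}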
edit:
Here is a nice comparison video I made:
(comparison video hosted on imgur)
Red Cube: transform.position += Vector3.right * (1f / 60f);
Green Cube: transform.position += Vector3.right * Time.deltaTime;
The framerate is locked to 60 fps (using Application.targetFrameRate = 60;).
You can see the green cube always hits the right wall exactly at 5, 10, 15, 20, 25, ...
At first they look synchronized, but once I jiggle the window, some frames are dropped (may be an editor thing, but there are tons of reasons for frame-drops in a real game). The red cube never catches up to the green one, the dropped frames are never repeated.
That shows that using deltaTime is a lot more consistent. Plus, it gives you a nice bonus: you can adjust the timescale using Time.timeScale = .5f; to get slow motion!
Time.deltaTime is simply the time in seconds between the last frame and the current frame.
Since Update is called once per frame, Time.deltaTime can be used to make something happen at a constant rate regardless of the (possibly wildly fluctuating) framerate.
So if we just write obj.transform.position += new Vector3(offset, 0, 0); in our Update function, then obj will move offset units in the x direction every frame, regardless of FPS.
However, if we instead write obj.transform.position += new Vector3(offset * Time.deltaTime, 0, 0);, then we know obj will move offset units every second, because each frame it moves the fraction of offset corresponding to how much time that frame took.
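As a complete sketch (my example; the class name and default speed are made up):
using UnityEngine;

public class ConstantMover : MonoBehaviour
{
    // Units per second, not units per frame
    public float offset = 5f;

    void Update()
    {
        // Scale the per-frame step by the frame's duration, so the object
        // covers `offset` units each second at any framerate
        transform.position += new Vector3(offset * Time.deltaTime, 0, 0);
    }
}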
As said here:
This property provides the time between the current and previous frame.
You can use this to check your framerate, for example. Also, as said here
If you want to move something at a constant speed, for instance, multiply the speed by Time.deltaTime and you will get exactly the distance moved by the object since the last Update (remember: distance = velocity * time)
Related
Game: In a simple 2D portrait game made in Unity, I have a GameObject (Player) that has a fixed location and moves upwards. The camera follows the Player, and animated obstacles spawn from time to time, moving left to right. The attached screenshot shows the scene.
The Problem:
The movement is not smooth; it looks like the Player is jittering. I think I have already identified one of the causes: big variation in Time.deltaTime. The average value is 0.0167, but there were outliers: the minimum was 0.00177 and the maximum was 0.2249519.
Settings:
The target framerate is 60. I use Unity 2019.4.2f1, and the build target is an iPhone X with iOS 14.2.
Scripts
public class Player : MonoBehaviour
{
    float speed = 5f;

    void Update()
    {
        transform.Translate(0, speed * Time.deltaTime, 0);
    }
}
public class CamFollow : MonoBehaviour
{
    public Transform Player;
    private Vector3 FollowVector;

    void LateUpdate()
    {
        FollowVector = Player.position - new Vector3(0, -4.0f, 10);
        transform.position = Vector3.Lerp(transform.position, FollowVector, Time.deltaTime * 4f);
    }
}
Note: I need to use Lerp because the Player may lower or increase its speed for one second; the camera then gently moves to the new position before changing back. The obstacles don't have a script: they move via the Animation component, which just loops a change of the x value of their position.
My alternative solutions:
1. Changing the value of Time.deltaTime to a constant 0.01666667f:
void Update()
{
    transform.Translate(0, speed * 0.01666667f, 0);
}
This makes the Player jitter a lot in the Unity Editor, but only a little on the device.
2. Using FixedUpdate for both the camera follow and the player movement.
This makes the movement and camera follow perfectly smooth, but the animated objects jitter a lot. I know Unity wants to address the deltaTime issue in one of the next updates, but there should be a solution to my problem in the meantime. Did anybody have a similar problem and manage to solve it? I prefer the 2nd alternative, because the movement looked really smooth and nice, so can I somehow make the animation part of FixedUpdate?
The variation in the 'deltaTime' is to be expected.
The variation is large on the PC because you are running on a complex computer with a complex operating system and lots of other applications running simultaneously, each with a multitude of threads, which every once in a while want to do some work. Thus, the scheduler of the operating system cannot guarantee that you are going to get a time slice at the precise moment that you want it in order to render your next frame.
The variation is smaller on the mobile device because it is a much simpler machine with a lot less going on, so the scheduler is able to give you time slices close to the precise intervals that you are asking.
You are already taking this variation into account when you do
transform.Translate( 0, speed * Time.deltaTime, 0 );
This is a fundamental technique in game development: the frame rate is never constant, so the distance by which you must move an object on each frame depends on the precise amount of time elapsed between this frame and the previous frame.
So, that part is fine.
Your problem lies in
transform.position = Vector3.Lerp( transform.position, FollowVector, Time.deltaTime * 4f );
Here you are passing Time.deltaTime * 4f as the t parameter of Vector3.Lerp(). I am not sure what you are trying to accomplish, but the value you pass there needs to be a gradual transition between 0 and 1, and instead you are passing a randomly varying number of seconds multiplied by the magic constant 4. This does not look correct.
A couple of options that I can think of:
Always use 0.5 for t, so that the camera rushes toward the right position and then slows down as it gets closer to it.
Calculate a separate speed vector for the camera, then move the camera using a translation, just as you do for the player (see the sketch below).
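A sketch of that second option (my code, not the answerer's; maxSpeed is a made-up tuning value), using Unity's Vector3.MoveTowards so the camera moves a distance proportional to the frame time instead of feeding Time.deltaTime into Lerp's t:
public class CamFollow : MonoBehaviour
{
    public Transform Player;
    public float maxSpeed = 8f; // hypothetical: camera speed in units per second

    void LateUpdate()
    {
        Vector3 target = Player.position - new Vector3(0, -4.0f, 10);
        // Translate toward the target at a capped, framerate-independent speed
        transform.position = Vector3.MoveTowards(transform.position, target, maxSpeed * Time.deltaTime);
    }
}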
I've been confusing myself about Update() vs FixedUpdate() and Time.deltaTime vs Time.fixedDeltaTime for this situation.
I want frame-rate independent movement for the camera mouse-look but at the same time I want it to be as smooth as possible for 60 fps. For some reason, 60 fps seems slightly jittery.
My question is, do I put the function below inside of Update() or FixedUpdate()? Should the deltaTime variable be time.deltaTime or time.fixedDeltaTime to achieve this? Or am I doing this all wrong?
Thanks for any input.
BTW, the SmoothDamp function below is a lerp-based smoothing function.
void MouseLook()
{
    if (_isEscaped)
        return;
    if (!_isAllowedToLook)
        return;

    var deltaTime = Time.fixedDeltaTime;
    var adjustedSensitivity = _lookSensitivity * 100f; // Keeps sensitivity between 0.1 and 10f but makes the math actually work

    _lookInput.x = SmoothDamp.Float(_lookInput.x, Input.GetAxis("Mouse X"), _lookSmoothSpeed, deltaTime);
    _lookInput.y = SmoothDamp.Float(_lookInput.y, Input.GetAxis("Mouse Y"), _lookSmoothSpeed, deltaTime);

    // Mouse look
    var newVerticalPitch = _verticalPitch + _lookInput.y * -adjustedSensitivity * deltaTime;
    _verticalPitch = SmoothDamp.Float(_verticalPitch, newVerticalPitch, 15f, deltaTime);
    _verticalPitch = Mathf.Clamp(_verticalPitch, -_maxVerticalLookAngle, _maxVerticalLookAngle); // Clamp vertical look

    // Camera vertical rotation
    var newCameraRotation = Quaternion.Euler(Vector3.right * _verticalPitch);
    _camTransform.localRotation = SmoothDamp.Quaternion(_camTransform.localRotation, newCameraRotation, 15f, deltaTime);

    // Player horizontal rotation
    var newPlayerRotation = _transform.rotation * Quaternion.Euler(_lookInput.x * adjustedSensitivity * deltaTime * Vector3.up);
    _transform.rotation = SmoothDamp.Quaternion(_transform.rotation, newPlayerRotation, 15f, deltaTime); // Player rotation
}
Update and FixedUpdate are both callback functions that Unity invokes on your MonoBehaviours at certain times during their lifetime. More specifically, Update is called every frame (assuming the MonoBehaviour is enabled) and FixedUpdate is called every physics frame.
Physics frames are independent of the graphical frame-rate and are set from within Unity to occur at a specific interval. The default is 0.02 seconds, i.e. 50 times per second (in other words, FixedUpdate runs at 50 fps by default).
FixedUpdate is generally a function you should reserve for your physics-based needs (such as using Rigidbodies or ray-casting) and most certainly not one to use for things such as reading user input.
Now that that is hopefully cleared up, let's address Time.deltaTime and Time.fixedDeltaTime:
Time.fixedDeltaTime returns the amount of time that has passed since the previous FixedUpdate() frame ran. Since we know from above that FixedUpdate() runs at a constant frame-rate (50 fps by default), this value will more or less always be the same between frames.
Time.deltaTime similarly returns the amount of time that has passed since the previous Update() frame ran. Since Update() does not run at a constant frame-rate but depends on the user's machine, this value is almost always going to be a little different each frame.
So with all of this in mind, should this go in Update() or FixedUpdate()? And should it use Time.deltaTime or Time.fixedDeltaTime?
If I've done a decent job at explaining the differences between these things, then I would expect you to now know that it ought to go in Update() and use Time.deltaTime.
I do not know exactly what that SmoothDamp.Float() method does, but by lerping between two values using Time.deltaTime in the t parameter, you ensure that, no matter what the frame-rate of the machine running this is, the result will be smooth and consistent.
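In practice the fix is small; a sketch of the minimal change to the question's code (my edit, reusing its own fields):
// Inside MouseLook(), replace
var deltaTime = Time.fixedDeltaTime;
// with the per-frame time
var deltaTime = Time.deltaTime;
// ...and make sure MouseLook() is called from Update(), not FixedUpdate()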
I made a simple networked XNA game using C# and Lidgren, but the velocity of objects differs on my friend's computer: the window resolution is the same, the fps is the same (both 60), the application is the same, but his character is just slower for some reason. For some time his velocity equals mine, but then it slows down again.
Other friends get the same velocities as I do. What could the problem be? The program adjusts movement according to fps, so the fps is probably not the problem.
Without seeing your code, it's hard to say for sure.
The program adjusts movement according to fps
Well, it's generally better to think of animation as a function of the time elapsed since the last update rather than of frames per second. That way, the animation takes the same amount of real-world time to complete regardless of FPS, whether the update is fixed or variable.
Movement deltas should take into account the time elapsed since the last game update, not the last/current FPS.
e.g.
public Vector3 Velocity;
public Vector3 Position;

public void Update(GameTime gameTime)
{
    float elapsed = (float)gameTime.ElapsedGameTime.TotalSeconds;

    // Apply acceleration (force, Mass and DragFactor are assumed fields of this class)
    Vector3 acceleration = force / Mass;
    Velocity += acceleration * elapsed;

    // Apply pseudo drag
    Velocity *= DragFactor;

    // Apply velocity
    Position += Velocity * elapsed;

    // ...
}
You said you're using Lidgren to make it multiplayer? Maybe it's rubber-banding due to latency. Try checking it on two computers over LAN to see if that resolves the issue.
Possibly there are settings in Lidgren that specify how to handle syncing or prediction.
I've got an empty game object holding all my player components; within it is a model that animates itself when the controls are pressed, which are read via GetButtonDown. Currently everything in the player game object moves instantly to the next point (it basically teleports there with no transition). Is there a way I can get it all to move over, say, 1 second rather than going there instantly? Here's what happens on GetButtonDown:
if (Input.GetButtonDown("VerticalFwd"))
{
    amountToMove.y = 1f;
    transform.Translate(amountToMove);
}
I've tried transform.Translate(amountToMove * Time.deltaTime * randomint); and all sorts of ways of using Time, but that doesn't seem to work, even though it seems the most logical way. I'm guessing that's because GetButtonDown only fires on the frame the button is pressed, so nothing runs every frame to "time" the move? Any ideas?
amountToMove is stored as a Vector3, by the way.
The first formula is wrong. The correct formula would be:
this.transform.position += (Distance / MoveTime) * Time.deltaTime;
Try Vector3.Lerp, which interpolates between a start and an end value: Vector3.Lerp(Vector3 start, Vector3 end, float t). Run it from a coroutine to spread the interpolation over some amount of time.
See documentation here
This should give you a rough idea of what's going on:
Vector3 Distance = End - Start;
// this will return the difference
this.transform.position += Distance / Time.deltaTime * MoveTime;
// this divides the Distance according to Time.deltaTime (use MoveTime to make the movement take more or less than 1 second)
E.g.: if the Distance is (1, 1, 1), Time.deltaTime is 0.1 and MoveTime is 1, you get a movement of 0.1 on each axis per tick, taking 1 second to reach the end point.
FYI: transform.Translate works like a fancy .position +=; for this purpose you can use them interchangeably.
EDIT
Here are a few solutions
Easy ::
Vector3 amountToMove = new Vector3(0, 1, 0); // Vector3.up is shorthand for (0,1,0) if you want to use that instead

if (Input.GetButtonDown("VerticalFwd"))
{
    transform.position = Vector3.Lerp(transform.position, transform.position + amountToMove, 1);
}
Hard and probably unnecessary ::
Write a coroutine that does a linear interpolation tailored to your specific application (a sketch follows below). See documentation here
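A sketch of such a coroutine (my code; the method name and duration parameter are made up). It needs using System.Collections; and must live on a MonoBehaviour:
IEnumerator MoveOverTime(Vector3 amountToMove, float duration)
{
    Vector3 start = transform.position;
    Vector3 end = start + amountToMove;
    float elapsed = 0f;

    while (elapsed < duration)
    {
        elapsed += Time.deltaTime;
        // t runs from 0 to 1 over `duration` seconds
        transform.position = Vector3.Lerp(start, end, Mathf.Clamp01(elapsed / duration));
        yield return null; // wait one frame
    }
}
Inside the GetButtonDown check you would then call StartCoroutine(MoveOverTime(amountToMove, 1f)); instead of transform.Translate(amountToMove);.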
I am trying to plan for a game I started coding (very much in the beginning stages).
My problem is that I want the acceleration / movement portion of all game objects to be based on acceleration force vs drag (thus resulting in terminal velocity as the upper limit of speed available)
While I could go another route, I'd rather not if possible. Additionally, it has been suggested (by a friend) that I could use a physics library, but that seems overkill and besides, I'd like to learn and understand these concepts myself - I always feel like I better understand my own programs when I do.
I am making a 2D game and using Vector2 variables for position, heading and thrust force (the acceleration force applied). It's top-down so gravity is not a part of the equation.
Before I code it, I'm working out test cases in Excel, which is how I check my math before committing it to code. And I'm discovering that my use of the drag equation makes the object in question framerate dependent! Specifically, the better the framerate, the lower the resulting terminal velocity.
I've been trying to modify the equations to account for framerate, but the solution eludes me.
If you want to work with the same spreadsheet I am, you can download the spreadsheet here.
But you don't have to - here are the specifics.
The drag equation as I understand it is:
Drag = 0.5 * FluidDensity * Velocity * Velocity * DragCoefficient * IncidenceArea
Using some numbers picked from thin air for calculations, if Fluid Density is 0.233 and the Drag Coefficient is 0.4 and the Incidental Area is 0.1 and the Acceleration force is 50 pixels per second, then here is what happens:
If I calculate that acceleration is applied every 0.25 seconds (once every quarter second) at 1/4 the Acceleration force (to match the timing) then we reach terminal velocity at about 39.3 pixels per second.
If I calculate acceleration instead at every second, we reach terminal velocity at about 53.6 pixels per second.
Specifically, every time I calculate for a given DeltaTime, the resultant speed is calculated as (code is from my head - not from an IDE - apologies if there's a bug in it):
// In globals / initialization:
Vector2 Position;
Vector2 Speed;
Vector2 ThrustForce;
float Density = 0.233f;
float DragCoefficient = 0.4f;
float IncidentalArea = 0.1f;

// In the update loop.
// DeltaTime is a float: the fraction of a second that has passed.
Vector2 AccelerationToApply = ThrustForce * DeltaTime;
Vector2 NewSpeed = Speed + AccelerationToApply;
Vector2 Drag = Speed * Speed * 0.5f * Density * DragCoefficient * IncidentalArea;
NewSpeed -= Drag;
Speed = NewSpeed;
That's the problem math. Here is the question:
How should this be expressed so that it's framerate independent?
The classic approach is to step the simulated physical time independent from the game loop frame rate, calculating multiple sub-iterations per frame if necessary to advance the physics. This allows you to control your time step (generally making it smaller than the main frame rate), which also helps to keep other potentially unstable calculations under control (such as oscillators.) This of course means that your physics has to compute faster than real time for the fixed time step chosen, as otherwise your world goes into slow motion.
Speaking of instability, I imagine that you'll see some oscillation effects in your current implementation, depending on whether you're overshooting the terminal velocity in a given time step. One way to resolve this is to compute the speed via analytical integration instead of approximating with an incremental step. To do that, express your formula as a differential equation and see if it is of a form that can be solved analytically.
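A sketch of the classic fixed-step loop described above, in the question's C#/XNA style (PhysicsStep and StepPhysics are illustrative names, not from the answer):
const float PhysicsStep = 1f / 120f; // fixed simulation step, smaller than the frame interval
float accumulator = 0f;

public void Update(GameTime gameTime)
{
    accumulator += (float)gameTime.ElapsedGameTime.TotalSeconds;

    // Run as many fixed-size sub-steps as fit into the elapsed frame time;
    // the remainder carries over to the next frame
    while (accumulator >= PhysicsStep)
    {
        StepPhysics(PhysicsStep); // hypothetical: the thrust/drag integration with a constant dt
        accumulator -= PhysicsStep;
    }
}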
There were two parts missing from the code above. While I had experimented with turning one of them on and off to determine whether it was needed, without the other I kept failing to find the right answer.
The two parts are these: the resulting drag does need to be multiplied by the time step to scale its effect on the acceleration, but also, and perhaps more importantly, the drag needs to be subtracted from the acceleration force before it is applied to the speed, not after, as I had above.
The modified (and now framerate independent) code looks like this. I also reduced the four "constant" coefficients to a single coefficient for the sake of simplicity:
// In globals / initialization:
Vector2 Position;
Vector2 Speed;
Vector2 ThrustForce;
float Coefficient = 0.009f;
Vector2 PreviousDrag = Vector2.Zero;

// In the update loop.
// DeltaTime is a float: the fraction of a second that has passed.
// The previous step's drag is subtracted from the thrust before applying.
Vector2 AccelerationToApply = ThrustForce * DeltaTime - PreviousDrag * DeltaTime;
Vector2 NewSpeed = Speed + AccelerationToApply;
// Component-wise v*v; assumes the velocity components stay non-negative
PreviousDrag = Coefficient * NewSpeed * NewSpeed;
Speed = NewSpeed;
Running this logic through Excel, I find that I reach approximately the same terminal velocity at approximately the same times, no matter how often (or not) I calculate a change in velocity.
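As a quick sanity check on this model (my note, not from the spreadsheet): at terminal velocity the applied acceleration nets to zero, so ThrustForce * DeltaTime = PreviousDrag * DeltaTime, i.e. ThrustForce = Coefficient * TerminalVelocity * TerminalVelocity, which gives
TerminalVelocity = sqrt(ThrustForce / Coefficient)
Notice that DeltaTime cancels out, which is exactly the framerate independence the fix is after. If ThrustForce is still 50, this predicts a terminal velocity of roughly sqrt(50 / 0.009) ≈ 74.5 pixels per second.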