I'm making an Augmented Reality project using Unity and the Vuforia extension. I'm new to C#, and I was looking for a method similar to ARToolKit's getFrame(), but I'm really not finding anything.
My question is: how can I measure the frame rate (FPS) in Unity?
To calculate frames per second, you just take the number of rendered frames and divide it by the seconds passed.
Usually, the default frame rate on mobile platforms is 30 fps. To target the platform default, set Application.targetFrameRate to -1. To target a specific frame rate other than the maximum achievable rate or the platform default, set Application.targetFrameRate to that value.
The simplest way to start controlling the frame rate is to explicitly set QualitySettings.vSyncCount, so that rendering occurs at an interval tied to the display's refresh rate (e.g., on a 60 Hz display, setting vSyncCount = 2 makes Unity render at 30 fps, in sync with the display).
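As a minimal sketch, the two settings above can be applied from a startup script. The class name FrameRateCap and the values 0 and 60 are illustrative choices, not part of the original answer:

```csharp
using UnityEngine;

public class FrameRateCap : MonoBehaviour
{
    void Awake()
    {
        // Disable VSync so that targetFrameRate takes effect;
        // when vSyncCount > 0, Application.targetFrameRate is ignored.
        QualitySettings.vSyncCount = 0;

        // Cap rendering at 60 fps (use -1 for the platform default).
        Application.targetFrameRate = 60;
    }
}
```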
That's as simple as:
public float avgFrameRate;

public void Update()
{
    avgFrameRate = Time.frameCount / Time.time;
}
Put this code in any MonoBehaviour and attach it to any GameObject in the scene hierarchy.
Please note: this will only give you an average frame-rate. For a more current frame-rate, other answers have addressed effective ways of accomplishing that.
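For a more responsive reading, one common approach (a sketch of my own, not taken from the other answers) is to smooth Time.deltaTime with an exponential moving average; the class name SmoothedFps and the 0.1 smoothing factor are arbitrary:

```csharp
using UnityEngine;

public class SmoothedFps : MonoBehaviour
{
    public float fps;
    float smoothedDelta;

    void Update()
    {
        // Blend each new frame time into a running average.
        smoothedDelta += (Time.deltaTime - smoothedDelta) * 0.1f;
        fps = 1f / smoothedDelta;
    }
}
```

A larger smoothing factor reacts faster to frame-rate changes but jitters more; a smaller one is steadier but lags.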
None of the answers here consider the fact that the time scale (Time.timeScale) can be modified in Unity, and if it is, all the above approaches will be incorrect. This is because Time.deltaTime is influenced by the time scale.
As such, you need to use Time.unscaledDeltaTime:
int fps = 0;

void Update()
{
    fps = (int)(1f / Time.unscaledDeltaTime);
}