Debug.Log((int)(4.2f * 10.0f));
Result: 41 (in Unity 2017.2.1p4)
Result: 42 (in Unity 5.4.3p4)
Why does the result differ between versions?
The exact result of multiplying the (single-precision) float 4.2f (whose stored value is 4.19999980926513671875) by 10.0f is
41.9999980926513671875
The nearest float to this exact result is 42.0f.
But this exact result also fits in a double. So if the expression is evaluated in single precision, the product rounds to 42.0f and the cast prints 42; if it is evaluated in double precision, the product keeps its fractional part and truncating to int prints 41.
In other words, the expression can be decomposed into:
x = 4.2f;
x = x * 10.0f;
Debug.Log((int)(x));
If the compiler keeps x in single precision, it prints 42; if it promotes the intermediate result to double precision, it prints 41.