Why does PHP say a microsecond is 1/10000th of a second?

Tags:

php

As far as I know, a microsecond is 1/1000000th of a second (one millionth). However, for some reason, my php.exe v5.4.12 x64 (CLI on Windows 7 x64) seems to 'think' it is really 1/10000th (one ten-thousandth).

If I run the following php script:

<?php
// Print the current Unix timestamp (as a float) in a tight loop;
// "\r" returns the cursor so the value overwrites itself on one line.
while (true)
{
    echo microtime(true) . "\r";
}
?>

The counter I see on screen never shows more than four decimal places: it jumps from 1381994204.9999 straight to 1381994205.0.

I was going insane over this last night while writing a script that calculates something to the second. After realising this, I changed the formula to divide the microtime(true) output by 10000 instead of 1000000, and it worked perfectly...

Asked by Alex, Mar 23 '23

1 Answer

By setting the parameter to true, you are getting a float value, and a float has limited precision. 1381994986.3488 has 14 significant digits, and that is what you typically get with a float:

“The size of a float is platform-dependent, although a maximum of ~1.8e308 with a precision of roughly 14 decimal digits is a common value (the 64 bit IEEE format).”
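To make that limit concrete, here is a minimal sketch (the exact digits will of course vary per run and platform):

<?php
// 1381994986.3488 is 14 significant digits: 10 for the whole seconds
// of a current Unix timestamp plus 4 for the fraction - right at the
// documented ~14-digit limit.
$t = microtime(true);
echo $t, "\n"; // e.g. 1381994986.3488 - only ~4 fractional digits appear
?>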

When you do not set the parameter, you can see that the values really are accurate to the microsecond, but you get them in string form ("msec sec").
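For illustration, a minimal sketch of both call forms; the explode()/list() recombination at the end is just one possible way to keep the full precision:

<?php
// With no argument, microtime() returns the string "msec sec",
// where msec holds the full microsecond fraction.
echo microtime(), "\n";     // e.g. "0.34880200 1381994986"
echo microtime(true), "\n"; // e.g. 1381994986.3488 (float, fewer digits)

// One way to build a full-precision timestamp string:
list($usec, $sec) = explode(' ', microtime());
echo $sec . substr($usec, 1), "\n"; // e.g. "1381994986.34880200"
?>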

Answered by CBroe, Mar 29 '23