In Python 2, we have:
>>> datetime.datetime.utcfromtimestamp(1000000000005.0/1000.0)
datetime.datetime(2001, 9, 9, 1, 46, 40, 5000)
But in Python 3, we have:
>>> datetime.datetime.utcfromtimestamp(1000000000005.0/1000.0)
datetime.datetime(2001, 9, 9, 1, 46, 40, 4999)
What is the cause of this strange rounding behavior, and is it intended? Isn't 1000000000005.0 still well within the range of doubles, with a few digits to spare?
Below I have essentially included utcfromtimestamp (modified slightly so that it stands alone).
In Python 2:
import time, datetime

def utcfromtimestamp(t):
    y, m, d, hh, mm, ss, weekday, jday, dst = time.gmtime(t)
    us = int((t % 1.0) * 1000000)   # fractional seconds -> microseconds, truncated by int()
    ss = min(ss, 59)                # clamp leap seconds
    return datetime.datetime(y, m, d, hh, mm, ss, us)
In Python 3:
import time, datetime

def utcfromtimestamp(t):
    t, frac = divmod(t, 1.0)        # split into whole and fractional seconds
    us = int(frac * 1e6)            # fractional seconds -> microseconds, truncated by int()
    if us == 1000000:               # frac * 1e6 can round up to a full second; roll it over
        t += 1
        us = 0
    y, m, d, hh, mm, ss, weekday, jday, dst = time.gmtime(t)
    ss = min(ss, 59)                # clamp leap seconds
    return datetime.datetime(y, m, d, hh, mm, ss, us)
(The input 1000000000005.0/1000.0 is, mathematically, 1000000000.005, but as a double it cannot be stored exactly.)
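To see exactly what double that division produces, decimal.Decimal can print the stored value in full (a quick illustrative check, not part of either implementation):

>>> from decimal import Decimal
>>> Decimal(1000000000005.0 / 1000.0)
Decimal('1000000000.00499999523162841796875')

The stored value is slightly below 1000000000.005, which is why the fractional part keeps showing up as 0.004999995231628418 below.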
In my standalone version:
Python 2 uses the modulus operator % to extract the fractional part of the input. The expression (t % 1.0) * 1000000 then multiplies that fraction (in our case 0.004999995231628418) by 1000000, giving 4999.995231628418, which int truncates down to 4999.
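Step by step, that path looks roughly like this (the intermediate values are what a typical 64-bit build prints, matching the figures above):

>>> t = 1000000000005.0 / 1000.0
>>> t % 1.0                      # fractional part of the input
0.004999995231628418
>>> (t % 1.0) * 1000000          # scale to microseconds
4999.995231628418
>>> int((t % 1.0) * 1000000)     # int() truncates toward zero
4999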
Python 3 uses divmod to split the input into its whole part (t) and its fractional part (frac). Mathematically you would expect 1000000000.0 and 0.005, but because the input cannot be stored exactly as a double, divmod actually returns t as 1000000000.0 and frac as 0.004999995231628418.
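A quick check of that divmod call (an illustrative snippet, not from datetime itself):

>>> t = 1000000000005.0 / 1000.0
>>> divmod(t, 1.0)               # (whole part, fractional part), both as floats
(1000000000.0, 0.004999995231628418)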
It then computes us as frac * 1e6: multiplying 0.004999995231628418 by 1000000 gives 4999.995231628418, which int again truncates down to 4999.
There is no real difference between these two methods: both compute the same fraction and both truncate it to 4999 microseconds. Since the standalone versions agree, the discrepancy must come from the built-in implementations. My conclusion is that Python 2's utcfromtimestamp rounds microseconds up (to the nearest value), while Python 3's rounds them down (truncates).
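If, as the observed outputs suggest, Python 2's built-in utcfromtimestamp rounds to the nearest microsecond while Python 3's truncates, the one-microsecond discrepancy is easy to reproduce (an illustration of the two rounding strategies, not the actual C code):

>>> us = 4999.995231628418
>>> int(us)           # truncate, as in the Python 3 source above
4999
>>> int(round(us))    # round to nearest, as Python 2's output implies
5000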