 

time.time() not working to run while loop for predetermined time in Cython

Tags: python, cython

I had a Python module with a while loop that was supposed to run for a fixed amount of time. I did this by adding a constant to the output of time.time() and looping until time.time() exceeded that value. This worked without issue in pure Python, but the same approach is not working for me in Cython: the timings are now wildly off.

A minimal example demonstrating this:

import time

cdef float wait_time = 3

def slow():
    cdef float end_time = time.time() + wait_time

    while time.time() < end_time:
        pass
    print("Done")

%timeit -r1 -n1 slow()
Done
44.2 µs ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
35.5 µs ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
35.5 µs ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
19.5 µs ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
35.5 µs ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
20.6 µs ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
20 µs ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
56 µs ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
1min 3s ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
32.9 µs ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

%timeit -r1 -n1 slow()
Done
1min 5s ± 0 ns per loop (mean ± std. dev. of 1 run, 1 loop each)

The general pattern is that there is essentially no wait at all, except when I pause for a while before calling the function again, in which case the wait is far too long.

Zachary Matson asked May 05 '19 23:05

1 Answer

Python's float is C's double, but `cdef float` gives you C's float, which typically has only 24 significand bits (one of them implicit). Current Unix timestamps exceed 2^30 (they have since 2004), so a C float near time.time() can only represent multiples of 2^(30−23) = 128 seconds. Storing time.time() + 3 in a float therefore rounds end_time to the nearest 128-second boundary: usually that boundary is already in the past, so the loop exits immediately, but when the addition crosses the rounding midpoint, end_time lands roughly a minute in the future, producing the occasional minute-long waits you observed.
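The truncation is easy to reproduce in pure Python by round-tripping a timestamp through IEEE-754 single precision with the standard struct module (the timestamp below is a made-up modern value, not taken from the question):

```python
import struct

def to_float32(x):
    # Round-trip through IEEE-754 single precision,
    # mimicking what `cdef float` stores in the question's code.
    return struct.unpack('f', struct.pack('f', x))[0]

now = 1_700_000_000.0            # a hypothetical modern Unix timestamp
end_time = to_float32(now + 3)   # what `cdef float end_time` actually holds

# Near 1.7e9 a float's spacing is 2**(30 - 23) = 128 seconds, so the
# 3-second offset is rounded away entirely:
print(end_time - now)            # 0.0 -- the while loop would exit at once
```

Declaring the variables as `cdef double` (or simply leaving them untyped, since a Python float already is a C double) keeps a 53-bit significand, whose spacing near current timestamps is about 2.4e-7 s, and the loop then behaves as it did in pure Python.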

Davis Herring answered Oct 13 '22 01:10