I am trying to compute the time complexity of this function and compare it with the actual running times.
If I am not mistaken, the time complexity should be O(log(n)), but looking at the measured times it looks more like O(n) or even O(n*log(n)).
What could be the reason for this difference?
def pow(n):
    """Return 2**n, where n is a nonnegative integer."""
    if n == 0:
        return 1
    x = pow(n//2)
    if n % 2 == 0:
        return x*x
    return 2*x*x
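For reference, a quick instrumented variant (my own sketch; pow_counted is a hypothetical helper, not in my real code) shows that the number of recursive calls really is logarithmic, so the extra time must be coming from somewhere else:

def pow_counted(n):
    """Like pow(n), but also return how many calls were made."""
    if n == 0:
        return 1, 1
    x, calls = pow_counted(n // 2)
    result = x * x if n % 2 == 0 else 2 * x * x
    return result, calls + 1

for n in (10, 1000, 10**6):
    _, calls = pow_counted(n)
    print(n, calls)  # grows like log2(n): 5, 11, 21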
Theoretical time complexity: (plot not shown)
Actual run times: (plot not shown)
A function with quadratic time complexity has a growth rate of n². If the input is size 2, it will do 4 operations; if the input is size 8, it will do 64, and so on.
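As a toy illustration (my sketch, not from the original answer), counting the iterations of a doubly nested loop reproduces exactly those numbers:

def count_quadratic_ops(n):
    # a doubly nested loop performs n * n basic operations
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

print(count_quadratic_ops(2))  # 4
print(count_quadratic_ops(8))  # 64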
Let's use T(n) for the total running time as a function of the input size n, and t for the time taken by a statement or group of statements: T(n) = t(statement1) + t(statement2) + ... + t(statementN). If each statement performs a basic operation, we can say it takes constant time, O(1).
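For instance, a minimal sketch of that bookkeeping on a simple summing function:

def total(a):
    s = 0          # t1: executed once, O(1)
    for x in a:    # loop body runs len(a) times
        s += x     # t2: O(1) per iteration
    return s       # t3: executed once, O(1)

# T(n) = t1 + n*t2 + t3, which is O(n) for a list of length n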
Time complexity: the time complexity of an algorithm quantifies the amount of time it takes to run as a function of the length of the input. Note that this is a function of the input length, not the actual execution time on the machine the algorithm runs on.
I suspected your time measurements were not accurate, so I redid them using timeit; here are my stats:
import timeit

# input sizes N, sorted in increasing order
sx = [10, 100, 1000, 10**5, 5*10**5, 10**6, 2*10**6, 5*10**6, 10**7]
# total runtime in seconds over 100 calls at each size
sy = [timeit.timeit('pow(%d)' % i, number=100, globals=globals()) for i in sx]
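To eyeball the growth rate (a sketch assuming matplotlib is available; on a log-log plot, O(n**k) shows up as a straight line with slope k):

import matplotlib.pyplot as plt

plt.loglog(sx, sy, 'o-')
plt.xlabel('n')
plt.ylabel('total time for 100 calls (s)')
plt.show()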
Update:
Well, the run time did grow like O(n*log(n))...! A possible explanation is that multiplication / division is not O(1) for large numbers: CPython multiplies big integers with Karatsuba's algorithm, which costs roughly O(b**1.585) for b-bit operands. So this part of the derivation doesn't hold:
T(n) = 1 + T(n//2)
     = 1 + 1 + T(n//4)
#      ^   ^
#      each "1" assumes that the multiplication and the division cost O(1),
#      which no longer holds when n (and hence x) is large
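One way to quantify this (a minimal sketch; growth_exponent is a hypothetical helper, and it assumes pow from the question is in scope): estimate the empirical exponent k from two measurements, since if time ≈ c * n**k then k ≈ log(t2/t1) / log(n2/n1).

import math
import timeit

def growth_exponent(f, n1, n2, number=10):
    # estimate k in time ~ c * n**k from two timings
    t1 = timeit.timeit(lambda: f(n1), number=number)
    t2 = timeit.timeit(lambda: f(n2), number=number)
    return math.log(t2 / t1) / math.log(n2 / n1)

# e.g. growth_exponent(pow, 10**5, 10**6); a value near 1.585 would point
# at Karatsuba multiplication dominating, rather than the O(log n) call count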
Experiment with multiplication and division:
mul = lambda x: x*x
div = lambda y: y//2
s1 = [timeit.timeit('mul(%d)' % i, number=1000, globals=globals()) for i in sx]
s2 = [timeit.timeit('div(%d)' % i, number=1000, globals=globals()) for i in sx]
The plots for mul and div look much the same (not shown): they are not strictly O(1); small integers seem to be more efficient, but there is no big difference for large integers, so I don't know for sure what the cause is. (Though I'll keep the answer here in case it helps.)
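One caveat worth noting (my addition, not part of the original experiment): the inputs above only go up to about 10**7, i.e. operands of roughly 24 bits, whereas pow(n) squares intermediate values with up to n bits. Timing the squaring of genuinely large integers makes the non-constant cost visible:

import timeit

# square integers with a growing number of bits, like the intermediate
# values x inside pow(n)
for bits in (10**3, 10**4, 10**5, 10**6):
    x = 2**bits  # an integer with bits+1 binary digits
    t = timeit.timeit('x*x', number=100, globals={'x': x})
    print(bits, t)
# on CPython the time per squaring grows roughly like bits**1.585
# (Karatsuba), not O(1)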