I'm trying to work out the time complexity (Big-O) of some functions and to provide appropriate reasoning for each.
The first function is:
r = 0
# Assignment is constant time. Executed once. O(1)
for i in range(n):
    for j in range(i + 1, n):
        for k in range(i, j):
            r += 1
            # Assignment and access are O(1). Executed n^3 times?
I see that this is a triple nested loop, so it must be O(n^3), but I think my reasoning here is very weak. I don't really get what is going on inside the triple nested loop.
The second function is:
i = n
# Assignment is constant time. Executed once. O(1)
while i > 0:
    k = 2 + 2
    i = i // 2
    # i is halved by the statement above on each iteration,
    # so the assignments and accesses, which are O(1), are executed
    # log n times??
I found this algorithm to be O(1), but like the first function, I don't really see what is going on in the while loop.
Can someone explain the time complexity of the two functions thoroughly? Thanks!
Let's use T(n) as the total time as a function of the input size n, and t as the time taken by a statement or group of statements: T(n) = t(statement1) + t(statement2) + ... + t(statementN). If each statement executes a basic operation, we can say it takes constant time O(1).
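For example, applied to the first function this decomposition is just T(n) = t(r = 0) + N(n) * t(r += 1) = O(1) + N(n) * O(1), where N(n) is the number of times the innermost statement runs, so the whole analysis reduces to counting N(n).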
When we analyse an algorithm, we use Big O notation to represent its time complexity. For example, the time complexity of linear search can be represented as O(n), and that of binary search as O(log n) (where n and log(n) describe the number of operations performed).
For such a simple case, you could find the number of iterations of the innermost loop as a function of n exactly:
sum_(i=0)^(n-1) ( sum_(j=i+1)^(n-1) ( sum_(k=i)^(j-1) 1 ) ) = 1/6 n (n^2 - 1)
i.e., Θ(n^3) time complexity (see Big Theta). This assumes that r += 1 is O(1) if r has O(log n) digits (the model has words with log n bits).
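If it helps to see this concretely, here is a small Python sketch (my own illustration, not part of the original answer) that counts the innermost increments directly and checks them against the closed form n(n^2 - 1)/6:

def count_inner(n):
    # Run the same triple loop and count how often the innermost statement executes.
    r = 0
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(i, j):
                r += 1
    return r

for n in range(1, 12):
    # The count matches n*(n^2 - 1)/6 exactly, which is Θ(n^3).
    assert count_inner(n) == n * (n * n - 1) // 6

So the triple loop does not run the innermost statement n^3 times, but roughly n^3/6 times, which is still Θ(n^3).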
The second loop is even simpler: i //= 2 is i >>= 1. n has Θ(log n) binary digits, and each iteration drops one of them (shift right), therefore the whole loop is Θ(log n) time complexity, if we assume that the i >> 1 shift of log(n) digits is an O(1) operation (the same model as in the first example).
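Again, just as an illustration (a sketch of my own, assuming nothing beyond the loop shown in the question), you can count the iterations of the halving loop and compare them with the number of binary digits of n:

def count_halvings(n):
    # Count iterations of: while i > 0: i //= 2
    i, steps = n, 0
    while i > 0:
        i //= 2
        steps += 1
    return steps

for n in (1, 2, 10, 1000, 10**6):
    # steps equals the bit length of n, i.e. floor(log2(n)) + 1, which is Θ(log n).
    assert count_halvings(n) == n.bit_length()

So the second function is not O(1): the loop body runs about log2(n) times, which is why it is Θ(log n).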