 

How to calculate the algorithmic complexity of Python functions? [duplicate]

When we need to show how efficient an algorithm is, we state the algorithmic complexity of its functions - Big O and so on. For Python code, how can we determine or calculate the bounds of functions?

asked Oct 24 '13 by coolharsh55


People also ask

How do you find the complexity of a Python algorithm?

Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. When analyzing the time complexity of an algorithm we may find three cases: best-case, average-case and worst-case.
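For example (a minimal illustration, not part of the quoted answer), summing a list performs a fixed amount of work per element, so counting those elementary operations gives O(n):

    def total(values):
        # One addition and one assignment per element: roughly 2*n
        # elementary operations, which is O(n) as n grows.
        result = 0
        for v in values:   # the loop body runs len(values) == n times
            result += v
        return result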

How is algorithmic complexity calculated?

If your algorithm runs in a time proportional to the logarithm of the input data size, that is log(n), then you have O(log(n)) complexity. This type of complexity is usually present in algorithms that somehow divide the input size.

What is the complexity of for loop in Python?

The time complexity of a loop is O(log n) if the loop variable is divided or multiplied by a constant factor on each iteration. The same holds for a recursive function whose input shrinks by a constant factor on each call.
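A minimal sketch of such a loop (an illustration, not part of the quoted answer): the variable is halved each pass, so the body runs about log2(n) times.

    def count_halvings(n):
        # The loop variable is divided by a constant (2) each pass,
        # so the loop body executes about log2(n) times: O(log n).
        steps = 0
        while n > 1:
            n //= 2
            steps += 1
        return steps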

What is algorithmic complexity and how do you measure it?

Algorithmic complexity is a measure of how long an algorithm would take to complete given an input of size n. If an algorithm has to scale, it should compute the result within a finite and practical time bound even for large values of n. For this reason, complexity is calculated asymptotically as n approaches infinity.


1 Answer

In general, there's no way to do this programmatically (you run into the halting problem).

If you have no idea where to start, you can gain some insight into how a function will perform by running some benchmarks (e.g. using the time module) with inputs of various sizes. You can even collect enough data to form a suspicion about what the runtime might be. But this won't give you a rigorous answer - for that, you need to prove mathematically that your suspected bound is in fact true.
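Here's a rough sketch of that idea, using time.perf_counter from the standard library's time module; the benchmark helper and the choice of sorted as the function under test are just illustrations, not a definitive harness:

    import random
    import time

    def benchmark(func, sizes):
        # Time func on a random list of each size and print the result,
        # so you can eyeball how the runtime grows with n.
        for n in sizes:
            data = [random.random() for _ in range(n)]
            start = time.perf_counter()
            func(data)
            elapsed = time.perf_counter() - start
            print(f"n={n:>8}  time={elapsed:.6f}s")

    # e.g. benchmark(sorted, [1000, 10000, 100000, 1000000])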

For instance, if I'm playing with a sorting function and observe that the time is increasing roughly proportionally to the square of the input size, I might suspect that the complexity of this sort is O(n**2). But this does not constitute proof - in particular, some algorithms that perform well under typical inputs have pathological inputs that result in very poor performance.

To prove that the bound is in fact O(n**2), I need to look at what the algorithm is doing in the worst case - in this example, I might be analysing a selection sort, which repeatedly sweeps across the entire unsorted portion of the list and picks the lowest unsorted number. It should be evident that I'm examining something like n*(n-1) == O(n**2) elements. If examining elements is a constant-time operation, and placing the final element in the correct place is also not worse than O(n**2), then it follows that my entire algorithm is O(n**2).
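To make that count concrete, here's a bare-bones selection sort sketch (my own illustration, not code from the question). The inner scan examines (n-1) + (n-2) + ... + 1 elements in total, which is O(n**2):

    def selection_sort(items):
        # Outer loop runs n times; the inner scan sweeps the entire
        # unsorted tail and picks the smallest remaining element,
        # so the total number of comparisons is O(n**2).
        for i in range(len(items)):
            smallest = i
            for j in range(i + 1, len(items)):
                if items[j] < items[smallest]:
                    smallest = j
            items[i], items[smallest] = items[smallest], items[i]
        return items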

answered Oct 14 '22 by atomicinf