I'm studying time complexity in school, and our main focus seems to be on polynomial-time O(n^c) algorithms and quasi-linear-time O(n log n) algorithms, with the occasional exponential-time O(c^n) algorithm thrown in for run-time perspective. However, larger time complexities were never covered.

I would like to see an example problem with an algorithmic solution that runs in factorial time, O(n!). The algorithm may be a naive approach to the problem, but it cannot be artificially bloated to run in factorial time.

Extra street-cred if the factorial-time algorithm is the best known algorithm for the problem.
Linear time complexity O(n) means that an algorithm takes proportionally longer to complete as the input grows. Examples of linear-time algorithms: getting the max/min value in an array, or finding a given element in a collection.
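As a quick illustration of those two linear scans, here is a minimal Python sketch (the helper names `find_max` and `contains` are just illustrative; the built-ins `max` and `in` do the same job):

```python
def find_max(values):
    # Look at each element exactly once: O(n) time.
    best = values[0]
    for v in values[1:]:
        if v > best:
            best = v
    return best

def contains(values, target):
    # Worst case checks every element once: O(n) time.
    for v in values:
        if v == target:
            return True
    return False

print(find_max([3, 7, 2, 9, 4]))      # 9
print(contains([3, 7, 2, 9, 4], 2))   # True
```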
In Big-O notation, T(n) is directly proportional to n, so the time complexity of recursive factorial is O(n). Each recursive call adds a frame to the call stack, so the space complexity is also O(n).
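A small sketch of the recursive factorial being described, with the cost noted in comments (assuming a plain top-down recursion, no memoization):

```python
def factorial(n):
    # One multiplication per level of recursion: O(n) time.
    # Each pending call occupies a stack frame, so O(n) space as well.
    if n <= 1:
        return 1
    return n * factorial(n - 1)

print(factorial(5))  # 120
```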
O(2^n) denotes an algorithm whose running time doubles with each addition to the input size. A classic example of an O(2^n) function is the naive recursive calculation of Fibonacci numbers.
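A minimal sketch of that naive recursion (strictly the growth factor is the golden ratio, roughly 1.618, but O(2^n) is a valid upper bound):

```python
def fib(n):
    # Each call branches into two more calls, so the call tree
    # grows exponentially in n (bounded above by 2^n nodes).
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```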
Generate all the permutations of a list. You have n! lists as output, so you cannot achieve better efficiency than O(n!).
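A minimal recursive sketch of why this is inherently factorial: the output itself has n! entries, so just yielding them all is already Ω(n!) work. (The function name is illustrative; Python's `itertools.permutations` does the same job.)

```python
def permutations(items):
    # Base case: a list of 0 or 1 items has exactly one ordering.
    if len(items) <= 1:
        yield list(items)
        return
    # Choose each element as the head, then permute the rest:
    # n choices times (n-1)! sub-permutations = n! results.
    for i in range(len(items)):
        rest = items[:i] + items[i + 1:]
        for perm in permutations(rest):
            yield [items[i]] + perm

print(list(permutations([1, 2, 3])))
# [[1, 2, 3], [1, 3, 2], [2, 1, 3], [2, 3, 1], [3, 1, 2], [3, 2, 1]]
```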
The Traveling Salesman Problem has a naive brute-force solution that's O(n!), but it also has a dynamic programming solution (Held-Karp) that runs in O(n^2 * 2^n).
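To make the brute-force side concrete, here is a minimal sketch (the function name and the 4-city distance matrix are made up for illustration). It tries all (n-1)! orderings of the remaining cities; the Held-Karp dynamic program mentioned above instead fills a table indexed by (visited subset, last city) pairs.

```python
from itertools import permutations

def tsp_brute_force(dist):
    # Fix city 0 as the start/end and try every ordering of the rest:
    # (n-1)! candidate tours, each costed in O(n) time.
    n = len(dist)
    best = float("inf")
    for order in permutations(range(1, n)):
        cost, prev = 0, 0
        for city in order:
            cost += dist[prev][city]
            prev = city
        cost += dist[prev][0]  # return to the starting city
        best = min(best, cost)
    return best

# Symmetric 4-city example distance matrix (illustrative data).
dist = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]
print(tsp_brute_force(dist))  # 80
```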