I recently got into a debate and am trying to reach a clear verdict on the correct answer.
It is well known that n! grows very quickly — but exactly how quickly? Quickly enough to "hide" any additional polynomial factors multiplied into it?
Let's assume I have this silly & simple program (no particular language):
    for i from 0 to n! do:
        ; // nothing
Given that the input is n, the complexity of this is obviously O(n!) (or even Θ(n!), but that isn't relevant here).
Now let's assume this program:
    for i from 0 to n do:
        for j from 0 to n do:
            for k from 0 to n! do:
                ; // nothing
Bob claims: "This program's complexity is obviously O(n) * O(n) * O(n!) = O(n! * n^2) = O((n+2)!)."
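As a quick sanity check on Bob's last equality (an illustrative sketch, not part of the original exchange), note that (n+2)! = (n+2) * (n+1) * n!, which is a Θ(n^2) multiple of n!, so n! * n^2 is indeed bounded by (n+2)!:

```python
from math import factorial

# Verify (n+2)! = (n+2)*(n+1)*n!, hence n!*n^2 <= (n+2)! for n >= 1,
# which is why O(n! * n^2) is contained in O((n+2)!).
for n in range(1, 12):
    assert factorial(n + 2) == (n + 2) * (n + 1) * factorial(n)
    assert factorial(n) * n**2 <= factorial(n + 2)
print("all checks passed")
```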
Alice responds: "I agree with you, Bob, but actually it would be sufficient to say that the complexity is O(n!), since O(n! * n^k) = O(n!) for any constant k >= 1."
Is Alice right in her remark about Bob's analysis?
Alice is wrong, and Bob is right.
Recall the limit criterion for big-O notation: if lim_{n->infinity} f(n)/g(n) exists and is finite, then f(n) is in O(g(n)); conversely, if the ratio tends to infinity, then f(n) is NOT in O(g(n)).
For any constant k > 0:

lim_{n->infinity} (n! * n^k) / n! = lim_{n->infinity} n^k = infinity

and thus n! * n^k is NOT in O(n!).
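To see the divergence concretely, here is a small numeric sketch (k = 2 is an illustrative choice; any fixed k >= 1 behaves the same). The ratio (n! * n^k) / n! equals n^k exactly, so it grows without bound:

```python
from math import factorial

k = 2  # illustrative; any fixed k >= 1 gives the same conclusion
for n in (10, 100, 1000):
    # exact integer division avoids overflowing a float with huge factorials
    ratio = (factorial(n) * n**k) // factorial(n)
    print(n, ratio)  # ratio == n**k, diverging as n grows
```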