Is time complexity O(n^2) or O(n (log n)^2) better?

I know that when we simplify it, the comparison becomes O(n) vs O((log n)^2), and log n < n, but what about (log n)^2?
So, O(N (log N)^2) is far better than O(N^2); it is much closer to O(N) than to O(N^2). But an O(N^2) algorithm can still be faster in real life for small inputs (say N < 100). There are a lot of reasons why, the usual one being the constant factors that Big-O notation hides.
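To see this concretely, here is a minimal Python sketch. The cost models and the 100x constant factor are made-up assumptions, purely to illustrate how the asymptotically worse algorithm can win for small N:

```python
import math

# Made-up cost models: pretend the O(N^2) algorithm runs exactly N^2 steps,
# while the O(N*(log N)^2) algorithm carries a 100x constant-factor overhead.
def cost_quadratic(n):
    return n * n

def cost_polylog(n):
    return 100 * n * math.log(n) ** 2

for n in (10, 100, 1_000, 10_000, 100_000):
    winner = "N^2" if cost_quadratic(n) < cost_polylog(n) else "N*(log N)^2"
    print(f"n={n:>6}: cheaper = {winner}")
```

With these arbitrary constants the quadratic algorithm stays cheaper up to somewhere between n = 1,000 and n = 10,000; Big-O only tells you which one wins eventually.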
O(log(n^2)) is simply O(2 log(n)) = O(log(n)). It is a logarithmic function, and it grows much more slowly than the linear function O(n).
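A quick sanity check of that identity, as a Python sketch:

```python
import math

# log(n^2) == 2*log(n) for every n > 0 (natural log here; the base only
# changes a constant factor, which Big-O ignores anyway).
for n in (10, 1_000, 1_000_000):
    assert math.isclose(math.log(n ** 2), 2 * math.log(n))
```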
Is it always better to choose n (log n)^2 if the size n is not given? Or can we say that on average n (log n)^2 outperforms n^2? Strictly speaking, no to both questions. The only thing we can say for sure is that the n (log n)^2 algorithm outperforms the n^2 algorithm for sufficiently large n.
O(n) means that the algorithm's maximum running time is at most proportional to the input size. Basically, O(something) is an upper bound on the number of (atomic) instructions the algorithm executes. Therefore O(log n) is a tighter bound than O(n) and is also better in terms of algorithm analysis.
n is only less than (log n)^2 for values of n less than 0.49... (taking log to be the natural logarithm).
So in general (log n)^2 is better (smaller) for large n.
But since these O(something) notations always leave out constant factors, in your case it might not be possible to say for sure which algorithm is faster.
[Graph: the blue line is n and the green line is (log n)^2.]
Notice how the difference for small values of n isn't so big and might easily be dwarfed by the constant factors not included in the Big-O notation.
But for large n, (log n)^2 wins hands down. [Second graph, over a larger range of n.]
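The original images are not reproduced here, but a short matplotlib sketch regenerates both views (assuming natural log, which matches the 0.49... crossover mentioned above):

```python
import numpy as np
import matplotlib.pyplot as plt

# Left: small n, where the two curves stay close together.
# Right: large n, where n dwarfs (log n)^2.
fig, (ax_small, ax_large) = plt.subplots(1, 2, figsize=(10, 4))

small = np.linspace(0.01, 10, 500)
ax_small.plot(small, small, "b", label="n")
ax_small.plot(small, np.log(small) ** 2, "g", label="(log n)^2")
ax_small.legend()

large = np.linspace(1, 1e6, 500)
ax_large.plot(large, large, "b", label="n")
ax_large.plot(large, np.log(large) ** 2, "g", label="(log n)^2")
ax_large.legend()

plt.show()
```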
For each constant k, asymptotically log(n)^k < n.

The proof is simple: take the log of both sides of the inequality (log is monotonically increasing, so the direction is preserved), and you get:

k * log(log(n)) < log(n)

This holds asymptotically, since log(log(n)) grows strictly more slowly than log(n).
Semantic note: log(n)^k here means log(n) * log(n) * ... * log(n) (k times), and NOT the iterated logarithm log(log(log(...log(n))...)) (k times), as the notation is sometimes also used.
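A numeric spot-check of the logged inequality, as a sketch (k = 3 and the test values are arbitrary choices, and log is the natural log):

```python
import math

# Check k*log(log(n)) < log(n) for a few n. Note that it fails at n = 10:
# the inequality is asymptotic, not universal.
k = 3
for n in (10, 10**3, 10**6, 10**9):
    lhs = k * math.log(math.log(n))
    rhs = math.log(n)
    print(f"n={n}: {lhs:.2f} {'<' if lhs < rhs else '>='} {rhs:.2f}")
```

The n = 10 row failing is exactly what "asymptotically" means: the bound only has to hold from some point onward.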