
Is O(n Log n) in polynomial time?

Tags:

big-o

Is O(n Log n) in polynomial time? If so, could you explain why?

I am interested in a mathematical proof, but I would be grateful for any strong intuition as well.

Thanks!

asked Nov 09 '13 by Yahiko Kikikoto

People also ask

What is considered polynomial time?

An algorithm is said to be of polynomial time if its running time is upper bounded by a polynomial expression in the size of the input for the algorithm, that is, T(n) = O(n^k) for some positive constant k.

Is O n2 polynomial time?

Yes, O(n^2) is polynomial time: it matches the definition O(n^k) with k = 2.
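A classic source of O(n^2) running time is comparing every pair of elements with two nested loops. As a minimal sketch (the function name and example are illustrative, not from the original):

```python
def count_pairs_summing_to(values, target):
    """Count unordered pairs in values whose sum equals target."""
    count = 0
    n = len(values)
    for i in range(n):             # n iterations
        for j in range(i + 1, n):  # up to n-1 iterations each, ~n^2/2 total
            if values[i] + values[j] == target:
                count += 1
    return count

print(count_pairs_summing_to([1, 2, 3, 4], 5))  # pairs (1,4) and (2,3) -> 2
```

The total work is about n^2 / 2 comparisons, which is O(n^2) and hence polynomial.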

What does O log n time complexity mean?

Logarithmic time complexity, written O(log n): when an algorithm has O(log n) running time, the number of operations grows very slowly as the input size grows. Example: binary search.

What is O log n equal to?

O(log n) basically means the time goes up linearly while n goes up exponentially. So if it takes 1 second to compute 10 elements, it takes 2 seconds for 100 elements, 3 seconds for 1000 elements, and so on. Running times of O(log n) typically arise from divide-and-conquer algorithms, e.g. binary search.


3 Answers

Yes, O(n log n) is polynomial time.

From http://mathworld.wolfram.com/PolynomialTime.html,

An algorithm is said to be solvable in polynomial time if the number of steps required to complete the algorithm for a given input is O(n^m) for some nonnegative integer m, where n is the complexity of the input.

From http://en.wikipedia.org/wiki/Big_O_notation,

f is O(g) iff

there exist k > 0 and n_0 such that for all n > n_0: |f(n)| <= k * |g(n)|

I will now prove that n log n is O(n^m) for some m, which means that n log n is polynomial time.

Indeed, take m = 2. (This means I will prove that n log n is O(n^2).)

For the proof, take k = 2. (A smaller constant would also work, but it doesn't have to be tight.) We need an n_0 such that for all n > n_0:

f(n) <= k * g(n), that is, n log n <= 2 * n^2

Take n_0 = 1 (this is sufficient). For all n > 1 we have

log n <= n <= 2n

and multiplying both sides of log n <= 2n by n (which is positive) gives

n log n <= 2n * n = 2 * n^2
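As a quick empirical sanity check of the bound n log n <= 2 n^2 (assuming natural log; this only confirms the inequality on a range of inputs, it is not a proof):

```python
import math

# Check n*log(n) <= 2*n*n for every integer 2 <= n < 10000.
for n in range(2, 10_000):
    assert n * math.log(n) <= 2 * n * n, n
print("n*log(n) <= 2*n^2 holds for 2 <= n < 10000")
```

The same check passes with math.log2 in place of math.log, since log2(n) <= n as well.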


This proof could be a lot nicer in LaTeX math mode, but I don't think Stack Overflow supports that.

answered Oct 27 '22 by Syd Kerckhove


It is, because it is upper-bounded by a polynomial (n^2 will do: n log n <= n * n for all n >= 1). You could take a look at the graphs and go from there, but I can't formulate a mathematical proof other than that :P

EDIT: From the wikipedia page, "An algorithm is said to be of polynomial time if its running time is upper bounded by a polynomial expression in the size of the input for the algorithm".

answered Oct 27 '22 by Dylan Meeus


It is at least no worse than polynomial time. And still no better than linear: for n >= 2, n <= n log n <= n * n.
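A small numeric illustration of how n log n sits between n and n^2 (using log base 2; the exact values depend on the log base, the ordering does not):

```python
import math

# Compare n, n*log2(n), and n^2 at a few input sizes.
for n in (10, 100, 1000, 10**6):
    print(f"n={n:>8}  n*log2(n)={n * math.log2(n):>12.0f}  n^2={n * n:>14}")
```

At n = 10^6, n log2 n is only about 20 million, while n^2 is a trillion, which is why n log n algorithms scale so much better than quadratic ones in practice.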

answered Oct 27 '22 by nullptr