Why a programmer would prefer O(N^3) over O(N^2)

I can think of the following three reasons:

  • Ease of initial implementation.
  • Ease of maintenance in the future.
  • The O(N^3) algorithm may have a lower space complexity than the O(N^2) algorithm (i.e., it uses less memory).
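
To make the last point concrete, here is a minimal sketch; the problem (longest palindromic substring) and the function names are my own illustration, not something from the question. The straightforward O(N^3) check needs only O(1) extra space, while the O(N^2) dynamic-programming version allocates an N×N table.

    # Longest palindromic substring, two ways (illustrative sketch only).

    def is_palindrome(s, i, j):
        # Check s[i..j] in place, O(1) extra space.
        while i < j:
            if s[i] != s[j]:
                return False
            i += 1
            j -= 1
        return True

    def longest_pal_cubic(s):
        # O(N^3) time, O(1) extra space: test every substring directly.
        best_i, best_len = 0, 0
        n = len(s)
        for i in range(n):
            for j in range(i, n):
                if j - i + 1 > best_len and is_palindrome(s, i, j):
                    best_i, best_len = i, j - i + 1
        return s[best_i:best_i + best_len]

    def longest_pal_quadratic(s):
        # O(N^2) time, but also an O(N^2) boolean table.
        n = len(s)
        if n == 0:
            return ""
        pal = [[False] * n for _ in range(n)]   # pal[i][j]: s[i..j] is a palindrome
        best_i, best_len = 0, 1
        for i in range(n):
            pal[i][i] = True
        for i in range(n - 1):
            if s[i] == s[i + 1]:
                pal[i][i + 1] = True
                best_i, best_len = i, 2
        for length in range(3, n + 1):
            for i in range(n - length + 1):
                j = i + length - 1
                if s[i] == s[j] and pal[i + 1][j - 1]:
                    pal[i][j] = True
                    if length > best_len:
                        best_i, best_len = i, length
        return s[best_i:best_i + best_len]

    print(longest_pal_cubic("babad"), longest_pal_quadratic("babad"))

For moderate N the cubic version may well be the one you can afford to write and maintain, and it never allocates the N×N table.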

Probably the #1 reason: the O(N²) algorithm has sufficiently large constant factors that, for the size of task being contemplated, the O(N³) version is faster.
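
A concrete, measurable instance of this phenomenon, with a different pair of complexities picked only because it is easy to reproduce: pure-Python insertion sort, O(N²), versus a textbook merge sort, O(N log N). On a typical CPython build the asymptotically worse sort wins for small inputs because its constant factor is tiny; the exact crossover point depends on your machine. This is also why many production sort routines fall back to insertion sort for short runs.

    import random
    import timeit

    def insertion_sort(a):
        # Classic O(N^2) sort: tiny constant factor, no recursion, no allocation
        # beyond one copy of the input.
        a = list(a)
        for i in range(1, len(a)):
            x = a[i]
            j = i - 1
            while j >= 0 and a[j] > x:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = x
        return a

    def merge_sort(a):
        # Textbook O(N log N) sort: asymptotically better, but pays for recursion
        # and temporary lists on every call.
        if len(a) <= 1:
            return list(a)
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i]); i += 1
            else:
                out.append(right[j]); j += 1
        out.extend(left[i:])
        out.extend(right[j:])
        return out

    for n in (16, 64, 1024):
        data = [random.random() for _ in range(n)]
        reps = max(1, 20000 // n)
        t_ins = timeit.timeit(lambda: insertion_sort(data), number=reps)
        t_mrg = timeit.timeit(lambda: merge_sort(data), number=reps)
        print(f"N={n:>5}: insertion sort {t_ins:.4f}s, merge sort {t_mrg:.4f}s ({reps} runs)")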


Here are some examples to convince you that O(N³) can in some cases be better than O(N²):

  1. The O(N²) algorithm may be very complex to code, whereas if the input size is, say, N ≤ 100, the O(N³) algorithm can be fast enough for practical use.

  2. The O(N²) algorithm may come with a large constant factor, for example c = 1000; then for N = 100, c⋅N² = 1000⋅100² = 10⁷, whereas with c = 1 the O(N³) algorithm costs only c⋅N³ = 10⁶ (see the sketch after this list).

  3. The O(N²) algorithm may have much higher space complexity than the O(N³) one.
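
A quick back-of-the-envelope check of point 2, using the hypothetical constants above (c = 1000 and c = 1, not measured values): the cubic cost only overtakes the quadratic one once N exceeds the ratio of the constants, here N = 1000.

    # Hypothetical cost models from point 2: 1000*N^2 versus 1*N^3 "operations".
    def cost_quadratic(n, c=1000):
        return c * n * n

    def cost_cubic(n, c=1):
        return c * n ** 3

    for n in (10, 100, 1000, 10000):
        print(f"N={n:>5}: 1000*N^2 = {cost_quadratic(n):.0e}, N^3 = {cost_cubic(n):.0e}")
    # The cubic model is cheaper for N < 1000, the two are equal at N = 1000,
    # and only beyond that does the quadratic model win.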


Another thing is that some algorithms have a big constant factor. An O(N^2) algorithm might have a constant factor so large that it is not really practical to use (if N is small enough, as kindly noted by Thorban).


The order of growth of the running time is not the only reason for choosing an algorithm. You must also analyze:

  • How easy it is to implement
  • The space required
  • How big the real-life inputs are (sometimes the time difference is only observable for input sizes that never occur in practice)

Adding to the already posted answers, I'd like to mention cache behaviour. A particular memory access pattern might be so much slower due to repeated cache misses that a theoretically slower algorithm with a more cache-friendly memory access pattern performs much better.
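
A minimal sketch of that effect, assuming NumPy is available (the array size is arbitrary): both functions below add up exactly the same numbers, but the column-wise traversal strides across a row-major array and is typically several times slower purely because of its memory access pattern.

    import time
    import numpy as np

    n = 4000
    a = np.random.rand(n, n)   # C-order (row-major): each row is contiguous in memory

    def total_by_rows(m):
        # Row slices are contiguous -> streaming, cache-friendly reads.
        return sum(m[i, :].sum() for i in range(m.shape[0]))

    def total_by_cols(m):
        # Column slices stride across the whole array -> far more cache misses.
        return sum(m[:, j].sum() for j in range(m.shape[1]))

    for f in (total_by_rows, total_by_cols):
        start = time.perf_counter()
        f(a)
        print(f.__name__, f"{time.perf_counter() - start:.3f}s")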