 

Asymptotic complexity of logarithmic functions

I know that in terms of complexity, O(log n) is faster than O(n), which is faster than O(n log n), which is faster than O(n^2). But what about O(n^2) and O(n^2 log n), or O(n^2.001) and O(n^2 log n):

T1(n)=n^2 + n^2logn

What are the big-Oh and big-Omega of this function? And what is little-oh? Versus:

T2(n)=n^2.001 + n^2logn

Is there any difference in big-Oh now? I'm having trouble understanding how to compare log n with powers of n. That is, is log n approximately n^0.000000...1 or n^1.000000...1?

Joseph D asked Sep 26 '22


2 Answers

O(n^k) is faster than O(n^k') whenever k' > k >= 0.

Hence O(n^2) is faster than O(n^2 logn).

Note that you can only ignore constant factors; nothing involving the input size can be dropped.

Thus, the complexity of T1(n) = n^2 + n^2 logn is determined by the larger of the two terms, which gives O(n^2 logn).
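One way to convince yourself that the n^2 logn term dominates is to look at the ratio T1(n) / (n^2 logn) for growing n; a small numeric sketch (the function t1 here is just an illustration of the formula above):

```python
import math

def t1(n):
    """T1(n) = n^2 + n^2 * log(n), as in the question."""
    return n**2 + n**2 * math.log(n)

# The ratio equals 1 + 1/log(n), so it shrinks toward 1 as n grows,
# confirming n^2 * log(n) is the dominant term.
for n in (10, 10**3, 10**6, 10**9):
    ratio = t1(n) / (n**2 * math.log(n))
    print(n, round(ratio, 4))
```

The ratio tends to a constant (here, 1), which is exactly what a tight Θ-bound means.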


Little-oh

Little-oh, in loose terms, is a strict upper bound: f = o(g) means f grows strictly slower than g. Yes, it is called little, yet it is the more restrictive of the two.

n^2 = O(n^k) for k >= 2 but n^2 = o(n^k) for k > 2

Practically, it is Big-Oh which takes most of the limelight.
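The "strictly slower" idea can be seen numerically: f = o(g) means f(n)/g(n) tends to 0. A small sketch with f(n) = n^2 and g(n) = n^2.5 (example exponents chosen to match the k > 2 case above):

```python
# f(n) = n^2, g(n) = n^2.5. The ratio is n^(-0.5), which tends to 0,
# so n^2 = o(n^2.5). By contrast n^2 / n^2 stays at 1 forever:
# n^2 = O(n^2) but NOT o(n^2).
for n in (10, 10**3, 10**6):
    print(n, (n**2) / (n**2.5))
```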


What about T(n)= n^2.001 + n^2logn?

We have n^2.001 = n^2 * n^0.001 versus n^2 * log(n).

To settle the question, we need to figure out which would eventually be bigger: n^0.001 or log(n).

It turns out that a function of the form n^k with k > 0 will eventually overtake log(n) for sufficiently large n.

Same is the case here, and thus T2(n) = O(n^2.001).

Practically, though, log(n) will be larger than n^0.001 for any input size you will ever encounter.

(10^3300)^0.001 < log(10^3300) (roughly 1995.3 < 3300, taking log base 10), and the "sufficiently large" n in this case is only around 10^3550, an astronomical number.

Worth dwelling on that crossover point: there are only about 10^82 atoms in the observable universe.
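The comparison above can be checked without ever materializing these huge numbers, by working with the exponents directly (base-10 logs assumed, as in the figures above):

```python
# (10**3300) ** 0.001 equals 10 ** (3300 * 0.001) = 10 ** 3.3,
# while log10(10**3300) is simply 3300. No big integers needed.
power_term = 10 ** (3300 * 0.001)   # approximately 1995.26
log_term = 3300.0                   # log10(10**3300)
print(power_term, log_term, power_term < log_term)
```

So at n = 10^3300, log(n) is still comfortably ahead of n^0.001.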

axiom answered Oct 20 '22


T(n)=n^2 + n^2logn

What is the big Oh and omega of this function? Also, what's little oh?

Quoting a previous answer:

Don't forget that big-O notation represents a set. O(g(n)) is the set of all functions f such that f does not grow faster than g; formally, there exist C and n0 such that |f(n)| <= C|g(n)| for every n >= n0. The expression f(n) = O(g(n)) is shorthand for saying that f(n) is in the set O(g(n)).

You can also think of big-O as <= and of little-o as < (reference). So you care more about finding a relevant big-O bound than a little-o one. In your case it is even appropriate to use big-Theta, which behaves like =. Since n^2 log n dominates n^2, it is true that

T1(n) = n^2 + n^2 logn = Θ(n^2 logn)

Now the second part. log n grows so slowly that even n^e with e > 0 dominates it. Interestingly, you can even prove that lim_{n→∞} n^e / (log n)^k = ∞ for any e > 0 and any k. From this you have that n^0.001 dominates log n, and therefore

T2(n) = n^2.001 + n^2 logn = Θ(n^2.001).

If f(n) = Ө(g(n)) it's also true that f(n) = O(g(n)) so to answer your question:

  • T1(n)=O(n^2 logn)
  • T2(n)=O(n^2.001)
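The limit claim above, that n^e eventually overwhelms any power of log n, can be sketched numerically. Working with exponents (writing n = 10^x, so n^e = 10^(e*x) and log10(n) = x) keeps the numbers representable; e = 0.001 and k = 1 here are illustrative choices:

```python
# Ratio n**e / (log10(n))**k for n = 10**x, computed via exponents:
# n**e = 10**(e*x) and log10(n) = x, so the ratio is 10**(e*x) / x**k.
e, k = 0.001, 1
for x in (3, 300, 3550, 30000):
    ratio = 10 ** (e * x) / x ** k
    print(f"n = 10^{x}: ratio = {ratio:.4g}")
```

The ratio starts far below 1 but eventually explodes, which is the Θ(n^2.001) conclusion in numeric form.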
sve answered Oct 20 '22