
Order of growth

for

f = n(log(n))^5
g = n^1.01

is

f = O(g)
f = Theta(g)
f = Omega(g)?

I tried dividing both by n and I got

f = log(n)^5
g = n^0.01

But I am still clueless as to which one grows faster. Can someone help me with this and explain the reasoning behind the answer? I really want to know how one can determine (without a calculator) which one grows faster.
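For what it's worth, here is a rough numeric check (just an illustration, with arbitrary sample values) showing why plugging in numbers doesn't settle it:

    import math

    # Compare the reduced functions log(n)^5 and n^0.01 at a few
    # arbitrary sample points (illustration only).
    for n in (10**3, 10**6, 10**9, 10**12):
        print(n, math.log(n) ** 5, n ** 0.01)

    # For every practical n, log(n)^5 is far larger than n^0.01,
    # so plugging in numbers doesn't reveal the asymptotic winner.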

asked Sep 02 '10 by denniss

2 Answers

Probably easiest to compare their logarithmic profiles:

If (for some C1, C2, a, k > 0)

f < C1 n log(n)^a
g < C2 n^(1+k)

Then (for large enough n)

log(f) < log(n) + a log(log(n)) + log(C1)
log(g) < log(n) + k log(n) + log(C2)

Both leading terms are log(n), so the question is which residual is bigger. The k log(n) residual grows faster than a log(log(n)), no matter how small k is or how large a is, so g grows faster than f.

So in terms of big-O notation: g grows faster than f, so f can (asymptotically) be bounded from above by a function like g:

f(n) < C3 g(n)

So f = O(g). Similarly, g can be bounded from below by f, so g = Omega(f). But f cannot be bounded from below by a function like g, since g will eventually outgrow it. So f != Omega(g) and f != Theta(g).

But aaa makes a very good point: g does not begin to dominate over f until n becomes obscenely large.
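To get a rough sense of just how large: setting the reduced functions equal, log(n)^5 = n^0.01, and taking logs gives x = 500 log(x) with x = log(n). A quick fixed-point iteration (a rough sketch, not a rigorous bound) puts the crossover somewhere around n ~ 10^1800:

    import math

    # Solve x = 500 * log(x), where x = log(n) at the crossover between
    # n * log(n)^5 and n^1.01 (rough estimate only).
    x = 10.0
    for _ in range(100):
        x = 500 * math.log(x)

    print(x)                  # about 4.2e3, so the crossover is near n = e^x
    print(x / math.log(10))   # about 1800, i.e. n has roughly 1800 decimal digits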

I don't have a lot of experience with algorithm scaling, so corrections are welcome.

answered Oct 08 '22 by marshall.ward


I would break this up into several easy, reusable lemmas:

Lemma 1: For a positive constant k, f = O(g) if and only if f = O(k g).

Proof: Suppose f = O(g). Then there exist constants c and N such that |f(n)| < c |g(n)| for n > N. Thus |f(n)| < (c/k) (k |g(n)| ) for n > N and constant (c/k), so f = O (k g). The converse is trivially similar.


Lemma 2: If h(x) = x^k for some positive constant k (so h is positive and monotonically increasing on the positive reals), and f and g are positive for sufficiently large n, then f = O(g) if and only if h(f) = O( h(g) ).

Proof: Suppose f = O(g). Then there exist constants c and N such that |f(n)| < c |g(n)| for n > N. Since f and g are positive for sufficiently large n, say for n > M, f(n) < c g(n) for n > max(N, M). Since h is monotonically increasing, h(f(n)) < h(c g(n)) = c^k h(g(n)) for n > max(N, M), and lastly |h(f(n))| < c^k |h(g(n))| for n > max(N, M) since h is positive. Thus h(f) = O(h(g)). The converse follows similarly; the key fact being that if h is monotonically increasing, then h(a) < h(b) implies a < b.


Lemma 3: If h is an invertible monotonically increasing function, then f = O(g) if and only if f(h) = O(g(h)).

Proof: Suppose f = O(g). Then there exist constants c, N such that |f(n)| < c |g(n)| for n > N. Thus |f(h(n))| < c |g(h(n))| whenever h(n) > N. Since h is invertible and monotonically increasing, h(n) > N whenever n > h^-1(N). Thus h^-1(N) is the new constant we need, and f(h(n)) = O(g(h(n))). The converse follows similarly, using h's inverse.


Lemma 4: If h(n) is nonzero for n > M, f = O(g) if and only if f(n)h(n) = O(g(n)h(n)).

Proof: If f = O(g), then for constants c, N, |f(n)| < c |g(n)| for n > N. Since |h(n)| is positive for n > M, |f(n)h(n)| < c |g(n)h(n)| for n > max(N, M) and so f(n)h(n) = O(g(n)h(n)). The converse follows similarly by using 1/h(n).


Lemma 5a: log n = O(n).

Proof: Let f = log n, g = n. Then f' = 1/n and g' = 1, so for n > 1, g increases more quickly than f. Moreover g(1) = 1 > 0 = f(1), so |f(n)| < |g(n)| for n > 1 and f = O(g).

Lemma 5b: n != O(log n).

Proof: Suppose otherwise for contradiction, and let f = n and g = log n. Then for some constants c, N, |n| < c |log n| for n > N.

Let d = max(2, 2c, sqrt(N+1)). By the calculation in Lemma 5a, since d >= 2 > 1, log d < d. Thus |f(2d^2)| = 2d^2 > 2d(log d) >= d log d + d log 2 = d log(2d) >= 2c log(2d) > c log(2d^2) = c g(2d^2) = c |g(2d^2)|, and 2d^2 > N, contradicting |f(n)| < c |g(n)| for n > N. Thus f != O(g).


So now we can assemble the answer to the question you originally asked.

Step 1:

log n = O(n^a)
n^a != O(log n)

For any positive constant a.

Proof: log n = O(n) by Lemma 5a. Thus log n = (1/a) log(n^a) = O((1/a) n^a) = O(n^a) by Lemmas 3 (with h(n) = n^a), 4, and 1. The second fact follows similarly by using Lemma 5b.

Step 2:

log^5 n = O(n^0.01)
n^0.01 != O(log^5 n)

Proof: log n = O(n^0.002) by step 1. Then by Lemma 2 (with h(n) = n^5), log^5 n = O( (n^0.002)^5 ) = O(n^0.01). The second fact follows similarly.

Final answer:

n log^5 n = O(n^1.01)
n^1.01 != O(n log^5 n)

In other words,

f = O(g)
f != Theta(g)
f != Omega(g)

Proof: Apply Lemma 4 (using h(n) = n) to step 2.
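As a sanity check (just an illustration, not part of the proof), the same conclusion can be seen numerically by comparing the two sides in log-space, where log(n log^5 n) - log(n^1.01) = 5 log(log n) - 0.01 log n:

    import math

    # Compare f(n) = n * log(n)^5 and g(n) = n^1.01 in log-space so that
    # astronomically large n never overflows (sample exponents are arbitrary).
    for digits in (10, 100, 1000, 2000, 5000):   # n = 10**digits
        ln_n = digits * math.log(10)
        diff = 5 * math.log(ln_n) - 0.01 * ln_n
        print(f"n = 10^{digits}: log f - log g = {diff:.1f}")

    # The difference is positive for moderate n but turns negative around
    # n ~ 10^1800, consistent with f = O(g) but f != Theta(g).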

With practice these rules become "obvious" and second nature, and unless your test requires that you prove your answer, you'll find yourself whipping through these kinds of big-O problems.

answered Oct 08 '22 by user168715