
Algorithm complexity, solving recursive equation

I'm taking a Data Structures and Algorithms course and I'm stuck on this recurrence:

T(n) = log(n) * T(log(n)) + n

Obviously this can't be handled with the Master Theorem, so I was wondering if anybody has any ideas for solving this recurrence. I'm pretty sure it should be solved with a change of variable, like setting n = 2^m, but I couldn't manage to find a substitution that works.

Ashkan Kazemi asked Nov 01 '22


1 Answer

The answer is Theta(n). To prove that T(n) is Theta(n), you have to show that it is both Omega(n) and O(n). Omega(n) is immediate here because T(n) >= n. To show that T(n) = O(n), first:

  1. Pick a large finite value N such that log(n)^2 < n/100 for all n>N. This is possible because log(n)^2=o(n).
  2. Pick a constant C>100 such that T(n)<Cn for all n<=N. This is possible because N is finite.
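As a quick numerical sanity check on step 1 (a sketch, not part of the proof): with base-2 logarithms, N = 30000 is one threshold that makes log(n)^2 < n/100 hold; the spot checks below just probe a few points above it, they don't prove the inequality for all n > N.

```python
import math

# One threshold that works for base-2 logs (an assumed value,
# checked empirically below, not the smallest possible N).
N = 30_000

def gap(n):
    # n/100 minus log2(n)^2; positive means log(n)^2 < n/100 holds at n
    return n / 100 - math.log2(n) ** 2

# spot-check several values above N
print(all(gap(n) > 0 for n in (N + 1, 10**5, 10**6, 10**9)))
```

Since n/100 grows much faster than log2(n)^2, the gap only widens as n increases, which is exactly the log(n)^2 = o(n) fact the step relies on.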

We will show inductively that T(n)<Cn for all n>N. Since log(n)<n, the induction hypothesis applies to T(log(n)), giving:

T(n) < n + log(n) C log(n) 
     = n + C log(n)^2
     < n + (C/100) n 
     = C * (1/100 + 1/C) * n
     < C/50 * n
     < C*n

In fact, for this function it is even possible to show that T(n) = n + o(n) using a similar argument.
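The T(n) = n + o(n) behaviour is easy to see numerically. Below is a small sketch: the recurrence is evaluated over real arguments with base-2 logarithms and an assumed base case T(n) = n for n <= 2 (the base case only affects lower-order terms, not the asymptotics).

```python
import math

def T(n):
    # assumed base case: T(n) = n for small n
    # (the choice changes only lower-order terms)
    if n <= 2:
        return n
    # the recurrence from the question, with base-2 logs
    return math.log2(n) * T(math.log2(n)) + n

# T(n)/n should approach 1 as n grows, i.e. T(n) = n + o(n)
for k in (10, 20, 40, 80):
    print(k, T(2.0 ** k) / 2.0 ** k)
```

The ratio T(n)/n drops toward 1 very quickly, since the extra term is only on the order of log(n)^2.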

mrip answered Nov 09 '22