For the master theorem, applied to T(n) = a*T(n/b) + f(n), I am using these 3 cases:
1. If a*f(n/b) = c*f(n) for some constant c > 1, then T(n) = Θ(n^(log_b a)).
2. If a*f(n/b) = f(n), then T(n) = Θ(f(n) * log_b n).
3. If a*f(n/b) = c*f(n) for some constant c < 1, then T(n) = Θ(f(n)).
But when f(n) = log n or f(n) = n*log n, the value of c depends on n, as the quick check below illustrates. How do I solve such a recurrence using the master theorem?
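To make this concrete, here is a quick numerical check (a sketch of my own, using the example values a = b = 2) showing how c drifts with n when f(n) = log n:

```python
import math

# Quick check (with assumed example values a = b = 2): for
# f(n) = log n, the "constant" c = a*f(n/b) / f(n) drifts with n.
a, b = 2, 2
f = math.log

for k in (4, 8, 16, 32):
    n = 2.0 ** k
    c = a * f(n / b) / f(n)
    print(f"n = 2^{k}: c = {c:.4f}")
# Output climbs 1.5, 1.75, 1.875, 1.9375... toward a = 2, so there is
# no single constant c that works for every n.
```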
Usually, f(n) must be a polynomial for the master theorem to apply; it doesn't apply to all functions. However, there is a limited "fourth case" of the master theorem, which allows it to apply when f(n) is a polynomial multiplied by a polylogarithmic factor.
If f(n) = O(n^(log_b a) * log^k n) for some k ≥ 0, then T(n) = O(n^(log_b a) * log^(k+1) n).
In other words, suppose you have T(n) = 2T(n/2) + n log n. Here f(n) isn't a polynomial, but f(n) = n log n = n^(log_2 2) * log^1 n, so k = 1. Therefore, T(n) = O(n log^2 n).
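If you want to sanity-check that bound empirically, here is a minimal sketch (my own, assuming the base case T(1) = 1; constant factors don't affect the asymptotics):

```python
import math
from functools import lru_cache

# A rough empirical check (assumed base case T(1) = 1) that
# T(n) = 2T(n/2) + n log n grows like n * log^2 n.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1.0
    return 2 * T(n // 2) + n * math.log2(n)

for k in (10, 14, 18, 22):
    n = 2 ** k
    ratio = T(n) / (n * math.log2(n) ** 2)
    print(f"n = 2**{k}: T(n) / (n log^2 n) = {ratio:.4f}")
# The ratio settles toward a constant (about 0.5 here), which is
# consistent with T(n) = Theta(n log^2 n).
```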
See this handout for more information: http://cse.unl.edu/~choueiry/S06-235/files/MasterTheorem-HandoutNoNotes.pdf
You might find these three cases from the Wikipedia article on the master theorem a bit more useful:

1. If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a) * log^k n) for some k ≥ 0, then T(n) = Θ(n^(log_b a) * log^(k+1) n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and additionally a*f(n/b) ≤ c*f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

Here there is no direct dependence on the choice of n anymore - all that matters is the long-term growth rate of f(n) and how it relates to the constants a and b. Without seeing the specific recurrence you're trying to solve, I can't offer any more specific advice.
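For intuition, here is a minimal sketch (a hypothetical helper, not from the article) for the common special case f(n) = n^d, where the three cases reduce to comparing d against log_b a:

```python
import math

# Hypothetical helper (not from the Wikipedia article): classify
# T(n) = a*T(n/b) + n**d by the three cases above. Only handles a
# plain polynomial f(n) = n^d with no log factor.
def master_case(a, b, d):
    crit = math.log(a, b)                 # critical exponent log_b(a)
    if math.isclose(d, crit):             # case 2 with k = 0
        return f"Theta(n^{d:g} log n)"
    if d < crit:                          # case 1: recursion dominates
        return f"Theta(n^{crit:g})"
    return f"Theta(n^{d:g})"              # case 3: f(n) dominates

print(master_case(8, 2, 2))   # T(n) = 8T(n/2) + n^2  -> Theta(n^3)
print(master_case(2, 2, 1))   # T(n) = 2T(n/2) + n    -> Theta(n^1 log n)
print(master_case(1, 2, 1))   # T(n) = T(n/2) + n     -> Theta(n^1)
```

(Case 3's regularity condition a*f(n/b) ≤ c*f(n) holds automatically for f(n) = n^d with d > log_b a, which is why the sketch doesn't check it.)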
Hope this helps!
When f(n) = log n, the master theorem is not applicable. You should use the more general Akra–Bazzi theorem instead. Applying it gives T(n) = O(n).
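For concreteness, here is a sketch of that Akra–Bazzi calculation, assuming the recurrence in question is T(n) = 2T(n/2) + log n (the answer above doesn't restate it):

```python
import math

# Sketch of the Akra-Bazzi calculation, assuming the recurrence is
# T(n) = 2T(n/2) + log n.
# Step 1: solve a / b**p = 1 for p; with a = b = 2, p = 1.
# Step 2: T(n) = Theta(n**p * (1 + I(n))), where I(n) is the integral
#         of g(u) / u**(p+1) from 1 to n, with g(u) = log u.
# Integration by parts gives the closed form I(n) = 1 - (log n + 1)/n.
def correction(n):
    return 1 - (math.log(n) + 1) / n

for n in (10, 10**3, 10**6):
    print(f"n = {n:>7}: I(n) = {correction(n):.6f}")
# I(n) is bounded (it tends to 1), so
# T(n) = Theta(n**1 * (1 + O(1))) = Theta(n).
```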
Another way to find a more detailed proof of this result is to look at the analysis of the computational complexity of the "Optimal Sorted Matrix Search" algorithm.