Generic form: T(n) = aT(n/b) + f(n)
So I must compare n^(log_b a) with f(n):

if n^(log_b a) > f(n), it is case 1 and T(n) = Θ(n^(log_b a));
if n^(log_b a) < f(n), it is case 2 and T(n) = Θ(n^(log_b a) · log n).

Is that correct? Or have I misunderstood something?
And what about case 3? When does it apply?
The master method is a formula for solving recurrence relations of the form T(n) = aT(n/b) + f(n), where:
n = size of the input
a = number of subproblems in the recursion
n/b = size of each subproblem (all subproblems are assumed to have the same size)
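As a quick sanity check, the comparison between f(n) and n^(log_b a) can be sketched in a few lines of Python for the common case f(n) = n^d. The function name and structure below are illustrative, not from any library, and the regularity condition of case 3 is taken for granted since it holds for simple polynomials:

```python
import math

def master_theorem(a, b, d):
    """Return the asymptotic bound for T(n) = a*T(n/b) + Theta(n^d).

    A hypothetical helper (not from any library); assumes a >= 1, b > 1,
    and that f(n) is a simple polynomial n^d.
    """
    crit = math.log(a, b)           # critical exponent log_b(a)
    if math.isclose(d, crit):       # case 2: all levels cost about the same
        return f"Theta(n^{d:.3g} * log n)"
    elif d < crit:                  # case 1: leaf level dominates
        return f"Theta(n^{crit:.3g})"
    else:                           # case 3: root level dominates
        return f"Theta(n^{d:.3g})"

print(master_theorem(8, 2, 1))  # T(n) = 8T(n/2) + n  ->  Theta(n^3)
print(master_theorem(2, 2, 1))  # merge sort          ->  Theta(n^1 * log n)
print(master_theorem(1, 2, 1))  # T(n) = T(n/2) + n   ->  Theta(n^1)
```

Note the use of `math.isclose` rather than `==`: the critical exponent is computed in floating point, so an exact comparison could misclassify borderline cases.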
The master theorem is a direct method for finding asymptotic bounds: it applies to any recurrence that falls under one of its three cases.
Note, however, that the master theorem does not give an exact solution of a recurrence relation; it only gives a tight asymptotic (Θ) bound.
Recurrences arise when a divide-and-conquer strategy is used to solve complex problems.
What does it solve?
T(n) = aT(n/b) + f(n)
a should be greater than or equal to 1. This means the problem is reduced to a smaller subproblem at least once; at least one recursion is needed.
b should be greater than 1, which means that at every recursion the size of the problem is reduced. If b were not greater than 1, the subproblems would not be of smaller size.
f(n) must be positive for relatively large values of n.
Consider the recursion tree of this recurrence:
Let's say we have a problem of size n to be solved. At each step, the problem can be divided into a subproblems, and each subproblem is of smaller size, where the size is reduced by a factor of b.
In simple words: a problem of size n can be divided into a subproblems of relatively smaller size n/b each.
After we have divided the problem multiple times, each subproblem becomes so small that it can be solved in constant time.
For the derivation below, take all logarithms to the base b.
Let H be the height of the recursion tree; then H = log n (that is, log_b n). The number of leaves = a^(log n) = n^(log a).
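To make the height and leaf count concrete, here is a small sketch; the starting size n = 1024 and the values a = b = 2 are illustrative choices, not part of the derivation:

```python
# Recursion tree of T(n) = 2T(n/2) + f(n) starting from n = 1024
# (illustrative values: a = 2 subproblems, size reduced by b = 2).
a, b, n = 2, 2, 1024

height = 0
size = n
while size > 1:          # divide until the subproblem size is constant
    size //= b
    height += 1

leaves = a ** height     # a^(log_b n), which equals n^(log_b a)
print(height, leaves)    # height = log_2(1024) = 10, leaves = 2^10 = 1024
```

Here n^(log_b a) = n^(log_2 2) = n, so the tree has exactly n = 1024 leaves, matching the formula.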
The cost of the work done at each level of the tree is:
Level 0: f(n)
Level 1: a · f(n/b)
Level 2: a² · f(n/b²)
...
Last level: number of leaves · Θ(1) = Θ(a^(log n)) = Θ(n^(log a))
Case 1: Now let us assume that the cost of the operation increases by a significant factor at each level, so that by the time we reach the leaf level the value of f(n) has become polynomially smaller than the value n^(log a) (that is, f(n) = O(n^(log a − ε)) for some ε > 0). Then the overall running time is heavily dominated by the cost of the last level. Hence T(n) = Θ(n^(log a)).
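The last-level domination can be checked numerically. The recurrence T(n) = 8T(n/2) + n below is an illustrative example of this case, since f(n) = n is polynomially smaller than n^(log_2 8) = n^3:

```python
from functools import lru_cache

# T(n) = 8*T(n/2) + n with T(1) = 1: f(n) = n is polynomially smaller
# than n^(log_2 8) = n^3, so the leaf level should dominate.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 8 * T(n // 2) + n

# If T(n) = Theta(n^3), the ratio T(n) / n^3 settles near a constant
# (about 4/3 for this particular recurrence).
for n in (2**10, 2**12, 2**14):
    print(n, T(n) / n**3)
```

The ratio stabilising as n grows is exactly the "leaf level dominates" behaviour the case describes.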
Case 2: Let us assume that the cost of the operation on each level is roughly equal; in that case f(n) is Θ(n^(log a) · log^k n) for some k ≥ 0. Hence, the total running time is the per-level cost times the total number of levels, where log n (the height of the tree) is the number of levels:
T(n) = Θ(n^(log a) · log^(k+1) n)
Note: here k + 1 is the exponent of the log factor in the bound; in the common case k = 0 this gives T(n) = Θ(n^(log a) · log n).
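Merge sort's recurrence T(n) = 2T(n/2) + n is the standard example of this case with k = 0; a quick numeric check:

```python
import math
from functools import lru_cache

# Merge-sort recurrence T(n) = 2*T(n/2) + n with T(1) = 1:
# f(n) = n = Theta(n^(log_2 2)), so every level costs roughly the
# same and this is case 2 with k = 0.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 2 * T(n // 2) + n

# If T(n) = Theta(n log n), the ratio T(n) / (n * log2 n) settles near
# a constant as n grows.
for n in (2**10, 2**14, 2**18):
    print(n, T(n) / (n * math.log2(n)))
```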
Case 3: Let us assume that the cost of the operation decreases by a significant factor at each level, so that the value of f(n) is polynomially larger than the value n^(log a) (that is, f(n) = Ω(n^(log a + ε)) for some ε > 0), and that the regularity condition a · f(n/b) ≤ c · f(n) holds for some constant c < 1. Then the overall running time is heavily dominated by the cost of the first level. Hence T(n) = Θ(f(n)).
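A numeric sketch of this case, using the illustrative recurrence T(n) = T(n/2) + n. Here f(n) = n is polynomially larger than n^(log_2 1) = n^0 = 1, and the regularity condition holds with c = 1/2, since 1 · f(n/2) = n/2:

```python
from functools import lru_cache

# T(n) = T(n/2) + n with T(1) = 1: a = 1, b = 2, so n^(log_b a) = n^0 = 1
# and f(n) = n is polynomially larger; the first level should dominate.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else T(n // 2) + n

# If T(n) = Theta(f(n)) = Theta(n), the ratio T(n) / n settles near a
# constant (it approaches 2 for this recurrence).
for n in (2**10, 2**14, 2**18):
    print(n, T(n) / n)
```

The level costs n, n/2, n/4, ... form a decreasing geometric series, which is why the root's cost f(n) dominates the total.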
If you are interested in more detailed reading and examples to practice, visit my blog entry Master Method to Solve Recurrences