I am trying to solve a recurrence using the substitution method. The recurrence relation is:
T(n) = 4T(n/2) + n²
My guess is that T(n) is Θ(n² log n) (and I am sure about it because of the master theorem), and to find an upper bound I use induction. I tried to show that T(n) ≤ c·n² log n, but that did not work.
I got T(n) ≤ c·n² log n + n². Then I tried to prove the stronger bound T(n) ≤ c₁·n² log n − c₂·n² (which would still give O(n² log n)), but that also didn't seem to work: I got T(n) ≤ c₁·n² log(n/2) − c₂·n² + n².
How can I solve this recurrence?
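For what it's worth, the bound in the last attempt is closer to working than it looks. Writing out the log(n/2) term with logs taken base 2 (and assuming c₁ ≥ 1), the induction step does close:

```latex
\[
T(n) \;\le\; c_1 n^2 \log\frac{n}{2} - c_2 n^2 + n^2
      \;=\; c_1 n^2 (\log n - 1) - c_2 n^2 + n^2
      \;=\; c_1 n^2 \log n - c_2 n^2 - (c_1 - 1)\,n^2
      \;\le\; c_1 n^2 \log n - c_2 n^2
      \qquad \text{whenever } c_1 \ge 1 .
\]
```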
For the related recurrence T(n) = 4T(n/2) + n: there are a = 4 subproblems, each dividing the input by b = 2, and the work done on each call is f(n) = n. Thus n^(log_b a) = n², f(n) is O(n^(2−ε)) for ε = 1, and Case 1 applies. Thus T(n) is Θ(n²).
Example 1: T(n) = 9T(n/3) + n. Here a = 9, b = 3, f(n) = n, and n^(log_b a) = n^(log₃ 9) = Θ(n²). Since f(n) = O(n^(log₃ 9 − ε)) for ε = 1, Case 1 of the master theorem applies, and the solution is T(n) = Θ(n²).
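To double-check both Case 1 conclusions empirically, here is a minimal sketch that iterates the two recurrences directly (the base case T(1) = 1 is an assumption; it only affects constants, not the growth rate):

```python
from functools import lru_cache

# Both recurrences fall under Case 1, so T(n) should be Theta(n^2):
# the ratio T(n) / n^2 should approach a constant as n grows.

@lru_cache(maxsize=None)
def T_a(n):
    """T(n) = 4*T(n/2) + n, assumed base case T(1) = 1."""
    return 1 if n <= 1 else 4 * T_a(n // 2) + n

@lru_cache(maxsize=None)
def T_b(n):
    """T(n) = 9*T(n/3) + n, assumed base case T(1) = 1."""
    return 1 if n <= 1 else 9 * T_b(n // 3) + n

for k in (4, 8, 12, 16):
    n_a, n_b = 2 ** k, 3 ** k
    print(f"{n_a:>6} {T_a(n_a) / n_a**2:.4f}   {n_b:>9} {T_b(n_b) / n_b**2:.4f}")
```

The first ratio settles near 2 and the second near 1.5, both constants, which is what Θ(n²) predicts.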
Can the master theorem solve the recurrence T(n) = 4T(n/2) + n² log n? It is said that it falls between Case 2 and Case 3, and that no solution is possible with this method.
Based on CLRS Theorem 4.1, the master theorem does not apply to T(n) = 4T(n/2) + n² log n: here f(n) = n² log n grows faster than n^(log₂ 4) = n², but not polynomially faster, so neither Case 2 nor Case 3 fits.
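To see why, compare f(n) = n² log n against n^(log_b a) = n², using the standard fact that log n grows slower than any positive power of n:

```latex
\[
f(n) = n^2 \log n \neq \Theta(n^2)
\quad\text{(Case 2 fails)},
\]
\[
\lim_{n \to \infty} \frac{n^2 \log n}{n^{2+\varepsilon}}
  = \lim_{n \to \infty} \frac{\log n}{n^{\varepsilon}} = 0
  \quad\text{for every } \varepsilon > 0,
\]
\[
\text{so } f(n) \neq \Omega\!\left(n^{2+\varepsilon}\right)
\text{ for any } \varepsilon > 0
\quad\text{(Case 3 fails)}.
\]
```

(The extended form of Case 2, which allows f(n) = Θ(n^(log_b a) · log^k n), does cover this recurrence and gives T(n) = Θ(n² log² n), but that goes beyond the statement of Theorem 4.1.)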
T(n) = 4T(n/2) + n² = n² + 4[4T(n/4) + n²/4] = 2n² + 16T(n/4) = … = k·n² + 4^k·T(n/2^k) = …
The process stops when 2^k reaches n.
⇒ k = log₂ n
⇒ T(n) = O(n² log n)
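For n a power of two, the unrolled form can be checked directly against the recurrence; a minimal sketch, assuming T(1) = 1:

```python
from functools import lru_cache

# Check the unrolled form T(2^k) = k*n^2 + 4^k * T(1) against the
# recurrence T(n) = 4*T(n/2) + n^2, with the assumed base case T(1) = 1.

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 4 * T(n // 2) + n * n

for k in range(1, 16):
    n = 2 ** k
    assert T(n) == k * n * n + 4 ** k * 1, f"mismatch at n = {n}"

print("unrolled form matches the recurrence for n = 2^1 .. 2^15")
```

Since 4^k = n² when 2^k = n, the second term is just n²·T(1), so the whole expression is (log₂ n)·n² + Θ(n²) = Θ(n² log n), matching the O(n² log n) bound above.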
You can rewrite your equation in a non-recursive form
Your recursive equation is pretty simple: every time you expand T(n/2), only one new term appears. So in the non-recursive form you'll have a sum of quadratic terms; you only have to show that A(n), the number of those n² terms, is Θ(log n)
(I leave it to you as an easy exercise).
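Writing the sum out explicitly (reading A(n) as the multiplier of n² in that sum, and assuming n is a power of two with base case T(1)):

```latex
\[
T(n) \;=\; \sum_{i=0}^{\log_2 n - 1} 4^{i}\!\left(\frac{n}{2^{i}}\right)^{2} \;+\; 4^{\log_2 n}\,T(1)
      \;=\; \sum_{i=0}^{\log_2 n - 1} n^{2} \;+\; n^{2}\,T(1)
      \;=\; \underbrace{\log_2 n}_{A(n)} \cdot n^{2} \;+\; n^{2}\,T(1),
\]
```

so A(n) = log₂ n and T(n) = Θ(n² log n).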