I'm given this function and want to know the time complexity:
int f2(int n) {
    if (n < 2) return 1;
    return f2(n/2) + f2(n-2);
}
I calculated its runtime to be O(n^2) using telescoping expansion. Is this correct?
Edit: After reconsidering, I found that this function has a similar structure to mergesort, which has complexity Θ(n log n). Is that correct?
Suppose that it's polynomial in n, i.e. that the running time T(n) is at most a*n^b for some constants a and b. Then, we have:
T(n) = T(n/2) + T(n-2)
= a*(n/2)^b + a*(n-2)^b
= a/2^b * n^b + a*n^b + (lower-order terms)
= a*n^b * (1/2^b + 1) + (lower-order terms)
where the (lower-order terms) consist of sums and differences of n raised to powers strictly less than b. For sufficiently large n, the a*n^b * (1/2^b + 1) term will dominate the lower-order terms, and that will always be larger than a*n^b, so it's not true that T(n) is O(n^b) for any constant b.
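To make this concrete, here is a small experiment of my own (a sketch, not part of the original reasoning), assuming we simply count calls with unit cost, i.e. tabulate T(n) = T(n/2) + T(n-2) + 1 with T(0) = T(1) = 1. If T(n) were O(n^b), the ratio log T(n) / log n would level off near b; instead it keeps climbing as n grows.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* Tabulate T(n) = T(n/2) + T(n-2) + 1 bottom-up, with T(0) = T(1) = 1.
   This counts the calls made by f2(n). Doubles are used because the
   counts get far too large for any integer type. */
int main(void) {
    const int max_n = 1024000;
    double *t = malloc(((size_t)max_n + 1) * sizeof *t);
    if (!t) return 1;
    t[0] = t[1] = 1.0;
    for (int v = 2; v <= max_n; v++)
        t[v] = 1.0 + t[v / 2] + t[v - 2];
    /* For a polynomial bound a*n^b, log T(n) / log n would approach b.
       Here the ratio keeps growing with n. */
    for (int n = 1000; n <= max_n; n *= 4)
        printf("n = %7d   log T(n) / log n = %.2f\n",
               n, log(t[n]) / log((double)n));
    free(t);
    return 0;
}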
Now suppose that it's exponential in n, i.e. that T(n) is at most a*b^n for some constants a and b. Then:
T(n) = T(n/2) + T(n-2)
= a*b^(n/2) + a*b^(n-2)
= a*b^n * (b^(-n/2) + 1/b^2)
Now if b > 1 and n >= -2 log(1 - 1/b^2) / log(b), then:
n >= -2 log(1 - 1/b^2) / log(b)
-n/2 <= log(1 - 1/b^2) / log(b) = log_b(1 - 1/b^2)
b^(-n/2) <= 1 - 1/b^2
b^(-n/2) + 1/b^2 <= 1
T(n) = a*b^n * (b^(-n/2) + 1/b^2) <= a*b^n * 1 = a*b^n
Hence, for any a > 0 and b > 1, there exists an N (equal to -2 log(1 - 1/b^2) / log(b)) such that for all n >= N, T(n) <= a*b^n.
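As a quick illustration of my own (not part of the original argument), one can compute this threshold N for a few arbitrary sample bases b and check that the factor b^(-n/2) + 1/b^2 equals 1 exactly at n = N and drops below 1 beyond it:

#include <stdio.h>
#include <math.h>

/* For several sample bases b > 1, compute N = -2*log(1 - 1/b^2)/log(b)
   and evaluate the factor b^(-n/2) + 1/b^2 at n = N and n = 2N. */
int main(void) {
    const double bases[] = {1.05, 1.1, 1.5, 2.0};
    for (int i = 0; i < 4; i++) {
        double b = bases[i];
        double N = -2.0 * log(1.0 - 1.0 / (b * b)) / log(b);
        double at_N  = pow(b, -N / 2.0) + 1.0 / (b * b);
        double at_2N = pow(b, -N) + 1.0 / (b * b);
        printf("b = %.2f   N = %6.2f   factor at N = %.4f   at 2N = %.4f\n",
               b, N, at_N, at_2N);
    }
    return 0;
}

The closer b is to 1 (the slower the exponential), the larger the threshold N, but a finite N always exists, which is all the argument needs.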
So what does this say? T(n) grows faster than any polynomial (and therefore faster than any polylogarithmic function, since any polylogarithmic function is o(some polynomial)), but more slowly than any exponential. I'm not sure there's a simple closed form for the exact growth rate, but those are useful bounds.
EDIT
As nhahtdh points out, this makes the running time something called quasi-polynomial time: time bounded by 2^(O((log n)^c)) for some constant c, which sits strictly between polynomial and exponential.