I have an algorithm, and I have figured out that its running time is given by the following formula:
[log(1)]^2 + [log(2)]^2 + [log(3)]^2 + ....... + [log(n)]^2
The base of log is 2.
How do I figure out what the Θ/Ο algorithmic complexity is from this formula?
For the upper bound, you can reason similarly to the standard proof that log(n!) = O(n log(n)). Replace every term with the largest one, [log(n)]^2:
[log(1)]^2 + [log(2)]^2 + [log(3)]^2 + ....... + [log(n)]^2 < [log(n)]^2 + ... + [log(n)]^2
= n[log(n)]^2
[log(1)]^2 + [log(2)]^2 + [log(3)]^2 + ....... + [log(n)]^2 = O( n[log(n)]^2 )
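As a quick sanity check (not part of the proof), a short Python snippet can confirm numerically that the sum never exceeds n * [log(n)]^2:

```python
import math

def log_sq_sum(n):
    """Sum of (log2 i)^2 for i = 1..n."""
    return sum(math.log2(i) ** 2 for i in range(1, n + 1))

# The upper bound n * (log2 n)^2 should dominate the sum for every n.
for n in (10, 100, 1000, 10000):
    s = log_sq_sum(n)
    bound = n * math.log2(n) ** 2
    print(n, s <= bound)
```

Every line should print `True`, matching the term-by-term argument above.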
To prove the lower bound, we need to show that the sum is at least a constant multiple of n[log(n)]^2.

Drop the first half of the terms from the sum:

[log(1)]^2 + [log(2)]^2 + [log(3)]^2 + ....... + [log(n)]^2 >= [log(n/2)]^2 + [log(n/2 + 1)]^2 + ....... + [log(n)]^2

Then replace each of the remaining n/2 terms with the smallest of them, [log(n/2)]^2:

[log(1)]^2 + [log(2)]^2 + [log(3)]^2 + ....... + [log(n)]^2 >= (n/2) * [log(n/2)]^2
Since the base is 2, log(n/2) = log(n) - 1, so the lower bound is (n/2) * [log(n) - 1]^2.

For n >= 4 we have log(n) - 1 >= (1/2) * log(n), hence

lower bound >= (n/2) * [(1/2) * log(n)]^2 = (n/8) * [log(n)]^2

and the upper bound is n * [log(n)]^2.

So [log(1)]^2 + [log(2)]^2 + [log(3)]^2 + ....... + [log(n)]^2 = Θ(n * [log(n)]^2)
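To make the sandwich concrete, a small Python check (again just an illustration, not a proof) verifies both bounds at once for a few values of n >= 4, where the log(n) - 1 >= (1/2) * log(n) step applies:

```python
import math

def log_sq_sum(n):
    """Sum of (log2 i)^2 for i = 1..n."""
    return sum(math.log2(i) ** 2 for i in range(1, n + 1))

# Check (n/8)*(log2 n)^2 <= sum <= n*(log2 n)^2 for n >= 4.
for n in (4, 16, 256, 4096):
    s = log_sq_sum(n)
    lo = (n / 8) * math.log2(n) ** 2
    hi = n * math.log2(n) ** 2
    print(n, lo <= s <= hi)
```

All lines printing `True` is consistent with the sum being Θ(n * [log(n)]^2).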