If g(n) = sqrt(n)^sqrt(n), is g(n) = O(2^n)?
Any help is appreciated.
Square-root time complexity means that the algorithm requires O(N^(1/2)) operations, where N is the size of the input. Grover's algorithm is one example of an algorithm that runs in O(sqrt(N)) time.
They are not equivalent: sqrt(N) increases much more quickly than log2(N). There is no constant C such that sqrt(N) < C·log2(N) for all values of N greater than some minimum. You would need to take the logarithm(!) of sqrt(N) to bring it down to the same order of growth as log2(N).
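A quick numeric sketch of this claim: printing the ratio sqrt(N) / log2(N) for growing N shows it increasing without bound, so no fixed C can cap it.

```python
import math

# The ratio sqrt(N) / log2(N) keeps growing as N grows,
# so no constant C satisfies sqrt(N) <= C * log2(N) for all large N.
for n in [10, 10**3, 10**6, 10**9]:
    print(n, math.sqrt(n) / math.log2(n))
```

Each successive ratio is larger than the previous one, which is exactly what "no constant C works" means.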
Any root function grows faster than any power of the natural log function.
n^n grows faster than n!.
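You can sanity-check the n^n versus n! comparison directly with exact integer arithmetic:

```python
import math

# n^n dominates n! because each of the n factors in n^n is n itself,
# while n! multiplies the smaller factors 1, 2, ..., n.
for n in [2, 5, 10, 20]:
    print(n, n**n, math.factorial(n), n**n > math.factorial(n))
```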
A useful technique when comparing two exponential functions is to get them to have the same base:
√n^√n = (2^(lg √n))^√n = 2^(√n lg √n)
Now you're comparing 2^(√n lg √n) against 2^n, and since √n lg √n grows more slowly than n, the former function does not grow as rapidly as the latter, so √n^√n = O(2^n) is indeed true.
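The exponent comparison above can be checked numerically: once both functions share base 2, it suffices to confirm that √n·lg(√n) stays below n.

```python
import math

# Compare the exponents of 2^(sqrt(n) * lg(sqrt(n))) and 2^n.
# If sqrt(n) * lg(sqrt(n)) < n, then sqrt(n)^sqrt(n) < 2^n.
for n in [4, 16, 256, 4096]:
    exponent = math.sqrt(n) * math.log2(math.sqrt(n))
    print(n, exponent, exponent < n)
```

The gap widens as n grows, which is why the big-O bound holds (and is in fact not tight).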