I came across this time complexity function, and it seems to me that it is actually constant. Please correct me if I am wrong.
n^(1/log n) => (2^m)^(1/log(2^m)) => (2^m)^(1/m) => 2
Since any n can be written as a power of 2, I can do the above simplification and prove that it is constant, right?
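As a quick sanity check of that simplification (a minimal sketch, assuming log means log base 2, as the rewriting n = 2^m implies), the value below should print as 2 for every n:

```python
import math

# Numerically check that n^(1/log2(n)) evaluates to 2 for every n > 1,
# assuming "log" means log base 2.
for n in [2, 10, 1_000, 10**6, 10**12]:
    value = n ** (1 / math.log2(n))
    print(f"n = {n:>15,}: n^(1/log2 n) = {value:.10f}")
```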
So no, O(1) and O(log n) are not the same.
O(1) means the running time of an algorithm is independent of the input size and is bounded by some constant 'c'. O(log n), on the other hand, means that when the input size 'n' grows exponentially, the running time grows only linearly.
Logarithmic time complexity, written O(log n): when an algorithm has O(log n) running time, the number of operations grows very slowly as the input size grows.
For example, compare a linear algorithm with a constant-time one that always takes 50 steps: at exactly 50 elements the two algorithms take the same number of steps, but as the data grows the O(N) one keeps taking more. Since Big-O notation describes how an algorithm behaves as the data grows to infinity, O(N) is considered less efficient than O(1).
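As a rough illustration (a sketch with an assumed cost model, not a benchmark), here is how step counts for hypothetical O(1), O(log n), and O(n) algorithms compare as n grows; the constant-time one is assumed to take 50 steps to echo the example above:

```python
import math

# Assumed cost model for illustration only: the O(1) algorithm always takes
# 50 steps, the O(log n) one ~log2(n) steps (e.g. binary search comparisons),
# and the O(n) one n steps (e.g. a full scan).
for n in [10, 50, 1_000, 1_000_000]:
    constant_steps = 50
    log_steps = math.ceil(math.log2(n))
    linear_steps = n
    print(f"n = {n:>9,}: O(1) = {constant_steps:>3}, "
          f"O(log n) = {log_steps:>3}, O(n) = {linear_steps:>9,}")
```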
Assuming log is the natural log, then this is equivalent to e, not 2, but either way it's a constant.
First, let:
k = n^(1 / log n)
Then take the log of both sides:
log k = (1 / log n) * log n
So:
log k = 1
Now exponentiate both sides (take e to the power of each side) to get:
e^(log k) = e^(1)
And thus:
k = e.
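A small numerical check of this derivation (a sketch assuming the natural log; Python's math.log uses base e by default):

```python
import math

# Numerically verify that n^(1/ln n) equals e (~2.718281828) for any n > 1.
for n in [2, 100, 10**6, 10**18]:
    value = n ** (1 / math.log(n))
    print(f"n = {n:>20,}: n^(1/ln n) = {value:.10f}   e = {math.e:.10f}")
```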