I am a complete newbie at Big O and I am a bit stumped by this. I have:
for (int i = 1; i < n*n; i *= 2)
In my mind this would equate to
Am I right, or can it be simplified to O(n), since the bound is being squared with n*n while i is doubled with i *= 2?
Logarithmic time complexity, written O(log n): when an algorithm runs in O(log n) time, the number of operations grows very slowly as the input size grows, because each step cuts the remaining work down by a constant factor.
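For instance, binary search is a classic O(log n) algorithm: every comparison halves the portion of the array left to search. A minimal sketch in Java (the method name binarySearch is just for illustration):

// Binary search over a sorted array: each iteration halves the remaining
// range, so the loop runs about log2(n) times -> O(log n).
static int binarySearch(int[] sorted, int target) {
    int lo = 0, hi = sorted.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  // written this way to avoid (lo + hi) overflow
        if (sorted[mid] == target) return mid;
        if (sorted[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;  // target not present
}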
O(2^n) denotes an algorithm whose running time doubles with each addition to the input data set. An example of an O(2^n) function is the naive recursive calculation of Fibonacci numbers. The growth curve of an O(2^n) function is exponential: it starts off very shallow, then rises meteorically.
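The Fibonacci example, as a rough sketch (the method name fib is assumed for illustration):

// Naive recursive Fibonacci: each call spawns two more calls, so the total
// number of calls roughly doubles as n grows -> exponential time.
static long fib(int n) {
    if (n < 2) return n;
    return fib(n - 1) + fib(n - 2);
}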
O(n log n) is known as log-linear (or linearithmic) complexity. O(n log n) implies that roughly log n operations will occur n times. O(n log n) time is common in recursive sorting algorithms such as merge sort, in binary-tree-based sorts, and in most other efficient comparison sorts; quicksort, for example, runs in O(n log n) time on average while using only O(log n) space.
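To make "log n operations occurring n times" concrete, here is a toy sketch (the method name nLogNWork is made up for illustration):

// The outer loop runs n times; the inner loop doubles j, so it runs about
// log2(n) times per outer iteration. Total work is on the order of n * log2(n).
static long nLogNWork(int n) {
    long operations = 0;
    for (int i = 0; i < n; i++) {
        for (long j = 1; j < n; j *= 2) {
            operations++;
        }
    }
    return operations;
}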
So O(N log N) is far better than O(N^2); it is much closer to O(N) than to O(N^2). That said, an O(N^2) algorithm can still be faster in practice for small inputs (say N < 100), for reasons such as smaller constant factors and better cache behavior.
In this case you have
O(log2(n^2))
which is
O(2 * log2(n))
or just
O(log n), since constant factors and the base of the logarithm do not matter in Big O.
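If you want to convince yourself empirically, you can count the iterations for a few values of n and compare against 2 * log2(n). A minimal sketch (class and method names are made up, and the test values of n are arbitrary):

// Counts how many times the body of the original loop would run for a given n.
// long is used for i and the bound so the counting itself cannot overflow here.
class LoopCount {
    static int countIterations(int n) {
        int count = 0;
        for (long i = 1; i < (long) n * n; i *= 2) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1000, 100000}) {
            System.out.println("n=" + n
                    + " iterations=" + countIterations(n)
                    + " 2*log2(n)=" + 2 * (Math.log(n) / Math.log(2)));
        }
    }
}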
Note that if n * n > (1 << 30), doubling i past 2^30 overflows a 32-bit int and the loop will never terminate.
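One way around that overflow (just a sketch, assuming a 64-bit long is acceptable for the counter and the bound):

// Using long for both the counter and the bound avoids 32-bit wraparound:
// i can grow past 2^31 safely, and (long) n * n is computed without overflow.
long limit = (long) n * n;
for (long i = 1; i < limit; i *= 2) {
    // ... loop body ...
}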