What would be the time complexity of the Babylonian method? Is it log(n), where n is the number whose square root we want to find? If so, why is this the case?
Consider a simple example with two nested 'for' loops: for each of the N elements, the inner loop executes N times, so in total the body executes N^2 times. In big O notation we say that this algorithm has a complexity of O(N^2).
Even if the inner loop runs only i times on the i-th iteration, the total is 1 + 2 + ... + n = n(n+1)/2 iterations. Hence O(n(n+1)/2) = O(n^2).
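For example, a minimal sketch of both cases (the function names are mine, added purely for illustration):

```python
def count_pairs(items):
    """Two nested loops: the inner loop runs N times for each of the
    N outer iterations, so the body executes N * N = N^2 times -> O(N^2)."""
    count = 0
    for a in items:
        for b in items:
            count += 1
    return count

def count_triangle(items):
    """Inner loop runs i + 1 times on the i-th outer iteration:
    1 + 2 + ... + n = n(n+1)/2 iterations in total, which is still O(n^2)."""
    count = 0
    for i in range(len(items)):
        for j in range(i + 1):
            count += 1
    return count
```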
You can estimate big O like this: each nested loop that itself runs about n times multiplies the total by n, adding one more power of n. So three such nested loops give O(n^3), and in general k nested loops over n elements give O(n^k).
For any loop, find the runtime of the block inside it and multiply by the number of times the loop repeats. Any loop whose iteration count grows proportionally to the input size has linear time complexity, O(n). Looping through only half of the array is still O(n), since constant factors are dropped.
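To illustrate both rules (hypothetical functions, not from the original answer):

```python
def count_triples(items):
    """Three nested loops over n items -> n^3 iterations -> O(n^3)."""
    count = 0
    for a in items:
        for b in items:
            for c in items:
                count += 1
    return count

def sum_first_half(arr):
    """Only n/2 iterations, but the constant 1/2 is dropped -> still O(n)."""
    total = 0
    for i in range(len(arr) // 2):
        total += arr[i]
    return total
```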
Looking at the Wikipedia section on the Babylonian method, we can see that the relative error at step k, e_k, satisfies e_k < 2^(-f(k)), where f(k) is defined recursively as follows:
f(1) = 2 and f(k+1) = 2 * f(k) + 1 for k >= 1
By induction, f(k) = 3 * 2^(k-1) - 1
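A quick sanity check of that closed form (a sketch I am adding; not part of the original derivation):

```python
def f_recursive(k):
    """The recurrence from above: f(1) = 2, f(k+1) = 2 * f(k) + 1."""
    if k == 1:
        return 2
    return 2 * f_recursive(k - 1) + 1

def f_closed(k):
    """The closed form: f(k) = 3 * 2^(k-1) - 1."""
    return 3 * 2 ** (k - 1) - 1

# The two agree: f = 2, 5, 11, 23, 47, ... so the error bound 2^(-f(k))
# roughly squares at every step (quadratic convergence).
for k in range(1, 10):
    assert f_recursive(k) == f_closed(k)
```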
Let n be the input to our algorithm, which stops once we are sure that the total error is less than a constant m.
The error at step k, E_k, satisfies E_k = e_k * n.
Therefore the algorithm will terminate once e_k * n < m.
This will occur when 2^(f(k)) > n/m, which is equivalent to f(k) > log2(n/m).
That inequality holds when 2^(k-1) > (log2(n/m) + 1)/3, which is true when k > log2((log2(n/m) + 1)/3) + 1.
Therefore the algorithm will terminate in O(log(log(n/m)+1)) steps.
Using the fact that log(x + c) = O(log(x)) for any constant c, you can show that log(log(x) + c) = O(log(log(x))).
Therefore the Babylonian method takes O(log(log(n/m))) steps.
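To make this concrete, here is a minimal sketch of the method. The seed and the stopping rule are my assumptions: I seed from the binary exponent of n so the first guess is within a factor of about 2 of sqrt(n) (the analysis above assumes a reasonable starting error), and I stop on the relative step size as a proxy for the error:

```python
import math

def babylonian_sqrt(n, m=1e-12):
    """Babylonian (Heron's) method: repeatedly average x and n/x.
    Returns (estimate, number_of_iterations). Assumes n > 0."""
    _, exp = math.frexp(n)          # n = mantissa * 2**exp, mantissa in [0.5, 1)
    x = math.ldexp(1.0, exp // 2)   # seed x = 2**(exp // 2), within ~2x of sqrt(n)
    steps = 0
    while True:
        next_x = (x + n / x) / 2    # one Babylonian iteration
        steps += 1
        # Stop when the relative step size drops below m (an error proxy).
        if abs(next_x - x) <= m * next_x:
            return next_x, steps
        x = next_x

# The step count barely moves as n explodes -- O(log(log(n/m))) behavior.
for n in (10.0, 1e6, 1e12, 1e24, 1e48):
    root, steps = babylonian_sqrt(n)
    print(f"sqrt({n:g}) ~ {root:.6g} in {steps} steps")
```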