 

Time complexity for Babylonian Method

What would be the time complexity of the Babylonian method? Is it O(log n), where n is the number whose square root we want to find? If so, why?

asked Sep 06 '12 by kchoi


1 Answer

Looking at the Wikipedia section on the Babylonian method, we can see that the relative error at step k, e_k, satisfies e_k < 2^(-f(k)), where f(k) is defined recursively as follows:

f(1) = 2 and f(k+1) = 2 * f(k) + 1 for k ≥ 1

By induction, f(k) = 3 * 2^(k-1) - 1
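As a quick sanity check, the closed form can be compared against the recurrence numerically (a sketch; the function names are mine, not from the post):

```python
def f_recursive(k):
    # Unroll the recurrence f(1) = 2, f(k+1) = 2 * f(k) + 1.
    val = 2
    for _ in range(k - 1):
        val = 2 * val + 1
    return val

def f_closed(k):
    # The closed form claimed by induction: f(k) = 3 * 2^(k-1) - 1.
    return 3 * 2 ** (k - 1) - 1

# The two definitions agree on the first 19 values.
assert all(f_recursive(k) == f_closed(k) for k in range(1, 20))
```

The inductive step is easy to see from the code: if val = 3 * 2^(k-1) - 1, then 2 * val + 1 = 3 * 2^k - 1.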

Let n be the input to our algorithm, which stops when we are sure that the total error is less than a constant m

The error at step k, E_k, satisfies E_k = e_k * n

Therefore the algorithm will terminate once e_k * n < m

This will occur when 2^f(k) > n/m, which is equivalent to f(k) > log2(n/m)

That inequality holds when 2^(k-1) > (log2(n/m) + 1)/3, which is true when k > log2((log2(n/m) + 1)/3) + 1

Therefore the algorithm will terminate in O(log(log(n/m)+1)) steps.
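The step bound can be checked numerically: compute the smallest k for which the error bound 2^(-f(k)) * n drops below m, and compare it to the derived expression (a sketch; function names and the test values of n and m are my own choices):

```python
import math

def steps_needed(n, m):
    # Smallest k with f(k) > log2(n/m), i.e. 2^(-f(k)) * n < m,
    # using the closed form f(k) = 3 * 2^(k-1) - 1.
    target = math.log2(n / m)
    k = 1
    while 3 * 2 ** (k - 1) - 1 <= target:
        k += 1
    return k

def bound(n, m):
    # The derived threshold: k > log2((log2(n/m) + 1)/3) + 1.
    return math.log2((math.log2(n / m) + 1) / 3) + 1

# The smallest sufficient k never exceeds the bound by more than
# the integer rounding step.
for n in (10.0, 1e6, 1e18):
    assert steps_needed(n, 1e-9) <= bound(n, 1e-9) + 1
```

Note how slowly steps_needed grows: going from n = 10 to n = 10^18 with m = 10^-9 adds only one extra step, which is the doubly logarithmic behaviour in action.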

Using standard logarithm identities you can show that log(log(x) + c) = O(log(log(x))).

Therefore the Babylonian method takes O(log(log(n/m))) steps

answered Nov 13 '22 by laurie