 

When can an algorithm have square root(n) time complexity?

Can someone give me example of an algorithm that has square root(n) time complexity. What does square root time complexity even mean?

vaibhav9225 asked Oct 18 '15 06:10


1 Answer

  • Square root time complexity means that the algorithm requires O(sqrt(N)) operations, where N is the size of the input.
  • Grover's algorithm is an example of an algorithm that takes O(sqrt(n)) time: it is a quantum algorithm for searching an unsorted database of n entries in O(sqrt(n)) steps.
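A classical (non-quantum) example is primality testing by trial division: any composite n has a divisor no larger than sqrt(n), so it suffices to check candidate divisors up to sqrt(n). Note that here n is the numeric value of the input rather than its length in bits. A minimal sketch in Python:

```python
import math

def is_prime(n):
    """Trial division: performs O(sqrt(n)) divisions for input value n."""
    if n < 2:
        return False
    # Any composite n = a * b satisfies min(a, b) <= sqrt(n),
    # so checking divisors up to sqrt(n) is enough.
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True
```

For n around 10^12 this loop performs at most about a million divisions, which is what makes O(sqrt(n)) a substantial improvement over trying all n possible divisors.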
  • Let us take an example to understand how we can arrive at O(sqrt(N)) runtime complexity for a given problem. This is going to be elaborate, but is interesting to understand. (The following example, in the context of answering this question, is taken from Coding Contest Byte: The Square Root Trick, a very interesting problem with an interesting trick for arriving at O(sqrt(n)) complexity.)

    • Given an array A of n elements, implement a data structure for point updates and range sum queries.

      • update(i, x)-> A[i] := x (Point Updates Query)
      • query(lo, hi)-> returns A[lo] + A[lo+1] + .. + A[hi]. (Range Sum Query)
    • The naive solution uses an array. It takes O(1) time for an update (array-index access) and O(hi - lo) = O(n) for the range sum (iterating from start index to end index and adding up).

    • A more efficient solution splits the array into slices of length k and stores the slice sums in an array S.
    • The update takes constant time, because we have to update the value in A and the value of the corresponding slice sum in S. In update(6, 5) we have to change A[6] to 5, which in turn requires updating S[1] (the sum of the slice containing index 6) to keep S up to date.
    • The range-sum query is more interesting. The elements of the first and last slices (only partially contained in the queried range) have to be traversed one by one, but slices completely contained in the range can be summed using the values in S directly, which gives the performance boost.
    • In query(2, 14) we get:

 query(2, 14) = A[2] + A[3] + (A[4] + A[5] + A[6] + A[7]) + (A[8] + A[9] + A[10] + A[11]) + A[12] + A[13] + A[14]
 query(2, 14) = A[2] + A[3] + S[1] + S[2] + A[12] + A[13] + A[14]
 query(2, 14) = 0 + 7 + 11 + 9 + 5 + 2 + 0
 query(2, 14) = 34
  • The code for update and query is:

  def update(S, A, i, k, x):
      S[i // k] = S[i // k] - A[i] + x
      A[i] = x

  def query(S, A, lo, hi, k):
      s = 0
      i = lo
      # Section 1: leading part, summed from A itself until i reaches a slice boundary
      while i % k != 0 and i <= hi:
          s += A[i]
          i += 1
      # Section 2: intermediary part, whole slices summed directly from S
      while i + k <= hi:
          s += S[i // k]
          i += k
      # Section 3: trailing part, summed from A itself
      while i <= hi:
          s += A[i]
          i += 1
      return s
  • Let us now determine the complexity.
  • On average, each query costs:
    • Section 1 takes k/2 time on average (it iterates over at most k elements of the leading partial slice).
    • Section 2 takes n/k time on average, essentially the number of full slices.
    • Section 3 takes k/2 time on average (it iterates over at most k elements of the trailing partial slice).
    • So, in total, a query takes k/2 + n/k + k/2 = k + n/k time.
  • This is minimized for k = sqrt(n) (by the AM-GM inequality, k + n/k >= 2*sqrt(n), with equality exactly at k = sqrt(n)): sqrt(n) + n/sqrt(n) = 2*sqrt(n).
  • So we get an O(sqrt(n)) time complexity per query.
a3.14_Infinity answered Oct 20 '22 00:10