
how to write a recurrence relation for a given piece of code

In my algorithms and data structures class we were given a few recurrence relations, either to solve or to use to determine the complexity of an algorithm.

At first, I thought that the sole purpose of these relations was to write down the complexity of a recursive divide-and-conquer algorithm. Then I came across a question in the MIT assignments where one is asked to provide a recurrence relation for an iterative algorithm.

How would I actually come up with a recurrence relation myself, given some code? What are the necessary steps?

Is it actually correct that I can express any case, i.e. worst, best, or average case, with such a relation?

Could someone possibly give a simple example of how a piece of code is turned into a recurrence relation?

Cheers, Andrew

Asked May 12 '15 by Andrew Tobey


2 Answers

Okay, so in algorithm analysis, a recurrence relation is a function relating the amount of work needed to solve a problem of size n to that needed to solve smaller problems (this is closely related to its meaning in math).

For example, consider a Fibonacci function below:

Fib(a)
{
  if(a==1 || a==0)              // two comparisons
    return 1;
  return Fib(a-1) + Fib(a-2);   // one addition plus two recursive calls
}

This does three operations (comparison, comparison, addition), and also calls itself recursively, so the recurrence relation is T(n) = 3 + T(n-1) + T(n-2). To solve this, you would use the iterative method: keep expanding the terms until you find the pattern.

T(n) = 3 + T(n-1) + T(n-2)
     = 6 + 2*T(n-2) + T(n-3)      (expanding T(n-1))
     = 12 + 3*T(n-3) + 2*T(n-4)   (expanding T(n-2))
     = 21 + 5*T(n-4) + 3*T(n-5)   (expanding T(n-3))

Notice that the coefficient of the first T term follows the Fibonacci numbers, and the constant term is three times their running sum: looking it up, that is 3*(Fib(n+2)-1). More importantly, the sequence increases exponentially; that is, the complexity of the algorithm is O(2^n).
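If you want to see that growth concretely, here is a minimal sketch (the base values T(0) = T(1) = 2 are my own assumption for illustration; the answer only fixes the general step) that tabulates the recurrence and prints the ratio between successive terms:

#include <stdio.h>

/* Sketch: tabulate T(n) = 3 + T(n-1) + T(n-2).  The base values T(0) = T(1) = 2
 * are assumed; any positive base values give the same growth rate. */
int main(void) {
    double T[41];
    T[0] = T[1] = 2.0;                          /* assumed base cases */
    for (int n = 2; n <= 40; n++)
        T[n] = 3.0 + T[n-1] + T[n-2];           /* the recurrence from above */
    for (int n = 4; n <= 40; n += 4)
        printf("n=%2d  T(n)=%14.0f  T(n)/T(n-1)=%.3f\n", n, T[n], T[n]/T[n-1]);
    return 0;
}

The ratio settles around 1.618 (the golden ratio), so the work multiplies by a roughly constant factor every time n grows by one, which is exactly what exponential growth means here.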

Then consider this function for merge sort:

Merge(ary)
{
  n = length(ary);
  if(n <= 1)
    return ary;                               // base case: nothing left to split

  ary_start = Merge(ary[0:n/2]);
  ary_end   = Merge(ary[n/2:n]);

  return MergeArrays(ary_start, ary_end);     // O(n) work to merge the two halves
}

This function calls itself on half the input twice, then merges the two halves (using O(n) work). That is, T(n) = T(n/2) + T(n/2) + O(n). To solve recurrence relations of this type, you should use the Master Theorem. By this theorem, this expands to T(n) = O(n log n).
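To see where that result comes from: the recurrence has the Master Theorem form T(n) = a*T(n/b) + f(n) with a = 2, b = 2 and f(n) = O(n). Since n^(log_b a) = n^(log_2 2) = n grows at the same rate as f(n), every level of the recursion does about the same total amount of work, and there are about log n levels, so T(n) = O(n log n).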

Finally, consider this function to calculate Fibonacci:

Fib2(n)
{
  two = one = 1;
  for(i from 2 to n)          // runs n-1 times, doing constant work per iteration
  {
    temp = two + one;
    one = two;
    two = temp;
  }
  return two;
}

This function makes no recursive calls, and it iterates O(n) times. Therefore, its recurrence relation is T(n) = O(n). This is the case you asked about: it is a special case of a recurrence relation with no recurrence, and therefore it is very easy to solve.
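If you do want to phrase that loop as a recurrence (the situation in your MIT exercise), let T(n) be the work Fib2 does for input n. The loop for n does everything the loop for n-1 does plus one extra constant-time iteration, so T(n) = T(n-1) + O(1) with T(1) = O(1), which unrolls to T(n) = O(n).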

Answered Oct 23 '22 by Kittsil


To find the running time of an algorithm, we first need to be able to write an expression for the algorithm, where that expression tells us the running time of each step. So you need to walk through each of the steps of the algorithm to find the expression. For example, suppose we defined a predicate, isSorted, which takes as input an array a and the size n of the array and returns true if and only if the array is sorted in increasing order.

bool isSorted(int *a, int n) {
   if (n == 1)
      return true;       // a 1-element array is always sorted
   for (int i = 0; i < n-1; i++) {
      if (a[i] > a[i+1]) // found two adjacent elements out of order
         return false;
   }
   return true;         // everything's in sorted order
}

Clearly, the size of the input here will simply be n, the size of the array. How many steps will be performed in the worst case, for input n?

The first if statement counts as 1 step.

The for loop will execute n−1 times in the worst case (assuming the internal test doesn't kick us out), for a total time of n−1 for the loop test and the increment of the index.

Inside the loop, there's another if statement which will be executed once per iteration for a total of n−1 time, at worst.

The last return will be executed once.

So, in the worst case, we'll have done 1 + (n−1) + (n−1) + 1 computations, for a total run time T(n) ≤ 1 + (n−1) + (n−1) + 1 = 2n, and so we have the timing function T(n) = O(n).
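If you want to check that count, here is a small sketch (the step counter and the driver are my own additions for illustration, not part of the analysis above) that instruments isSorted and compares the count against the 2n bound on a worst-case, fully sorted input:

#include <stdio.h>
#include <stdbool.h>

bool isSortedCounted(int *a, int n, int *steps) {
   *steps = 1;                    // the first if
   if (n == 1)
      return true;
   for (int i = 0; i < n-1; i++) {
      (*steps)++;                 // loop test and index increment, counted as one step
      (*steps)++;                 // the if inside the loop
      if (a[i] > a[i+1])
         return false;
   }
   (*steps)++;                    // the final return
   return true;
}

int main(void) {
   int a[100], steps;
   for (int i = 0; i < 100; i++) a[i] = i;   // sorted input: the worst case here
   isSortedCounted(a, 100, &steps);
   printf("n = 100, steps = %d, bound 2n = 200\n", steps);
   return 0;
}

For n = 100 this prints 200 steps, exactly the 2n bound derived above.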

So, in brief, what we have done is:

1. For a parameter n that gives the size of the input, we assume that each simple statement executed once takes constant time; for simplicity, assume one unit.

2. Iterative statements such as loops, together with their bodies, take variable time depending on the input.

3. So your task is to go through the algorithm step by step and write down the running-time function in terms of n to calculate the time complexity.

For recursive algorithms, you do the same thing, only this time you add the time taken by each recursive call, expressed as a function of the time it takes on its input. For example, let's rewrite isSorted as a recursive algorithm:

bool isSorted(int *a, int n) {
   if (n == 1)
      return true;
   if (a[n-2] > a[n-1])         // are the last two elements out of order?
      return false;
   else
      return isSorted(a, n-1); // is the initial part of the array sorted?
}

In this case we still walk through the algorithm, counting: 1 step for the first if plus 1 step for the second if, plus the time isSorted will take on an input of size n−1, which will be T(n−1), giving a recurrence relation

T(n)≤1+1+T(n−1)=T(n−1)+O(1)

This has solution T(n) = O(n), just as with the non-recursive version, as it happens.
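(You can check that by expanding: taking T(1) = 1 for the first if, T(n) ≤ T(n−1) + 2 ≤ T(n−2) + 4 ≤ … ≤ T(1) + 2(n−1) = 2n − 1, which is O(n).)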

Simple enough! Practice writing the recurrence relations of various algorithms, keeping in mind how many times each step of the algorithm will be executed.

Answered Oct 23 '22 by Gaurav Chandani