
How would I find the time-complexity of a recursive method in Java?

I haven't been able to fully grasp the concept of complexity, and I was wondering how I would calculate it for the method f(n) in this code:

import java.util.Random;

public class Main {
  public static void main(String[] args) {
    Random r = new Random();
    r.setSeed(System.currentTimeMillis());
    int n = r.nextInt(20) + 1;
    f(n);
  }

  private static void f(int n){
    if(n > 0){
      g(n);
      System.out.println();
      f(n-1);
    }
  }

  private static void g(int n){
    if(n > 0){
      System.out.print('X');
      g(n-1);
    }
  }
}

I understand that this is a recursive method, which is what is confusing me. I see that every time the function f() is called, g() is called and runs n times, and then f() calls itself with n-1 until n = 0. I don't know where to start. Any help would be great, thanks.

asked Oct 25 '15 by user5470380



1 Answer

A common technique for determining the runtime of recursive functions is to write out recurrence relations that describe the runtime as a quantity defined in terms of itself. Let's start with g. If we let c_n denote the runtime of g(n), then we have

  • c_0 = 1 (calling g(0) takes a constant amount of work)
  • c_{n+1} = c_n + 1 (calling g(n+1) does a constant amount of its own work, then calls g(n))

We can look over a couple of values of c to see if we spot a pattern:

  • c_0 = 1
  • c_1 = c_0 + 1 = 1 + 1 = 2
  • c_2 = c_1 + 1 = 2 + 1 = 3
  • c_3 = c_2 + 1 = 3 + 1 = 4

Generally speaking, it looks like c_n = n + 1. You can formalize this with a proof by induction if you'd like, but for now we're going to take it on faith. This means that the runtime of g is O(n).
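
One quick way to sanity-check the c_n = n + 1 pattern is to instrument g with a call counter and print it for a few values of n. This is just a minimal sketch for verification; the CountG class and the calls counter are illustrative additions, not part of the original code:

public class CountG {
  // counts how many times the instrumented copy of g is entered
  static long calls = 0;

  private static void g(int n) {
    calls++;               // one unit of work per invocation
    if (n > 0) {
      g(n - 1);
    }
  }

  public static void main(String[] args) {
    for (int n = 0; n <= 5; n++) {
      calls = 0;
      g(n);
      // prints 1, 2, 3, 4, 5, 6 -- i.e. n + 1 calls for input n
      System.out.println("g(" + n + ") made " + calls + " calls");
    }
  }
}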

Now, let's let d_n be the runtime of calling f(n). Notice that

  • d_0 = 1 (calling f(0) does a constant amount of work)
  • d_{n+1} = d_n + c_{n+1} (calling f(n+1) calls g(n+1), requiring c_{n+1} work, then calls f(n))

We can expand this out to see if we see a pattern.

  • d_0 = 1
  • d_1 = c_1 + d_0 = 2 + 1
  • d_2 = c_2 + d_1 = 3 + 2 + 1
  • d_3 = c_3 + d_2 = 4 + 3 + 2 + 1

Generally, it looks like d_n = n + (n-1) + (n-2) + ... + 1. You can formalize this by induction if you'd like. This is a famous sum, and it works out to n(n+1)/2. This quantity happens to be Θ(n²), so the overall runtime is Θ(n²).
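
If you want to see the n(n+1)/2 growth concretely, you can count the 'X' characters that f(n) would print, since each one corresponds to one unit of work inside the g calls. Again, this is only a sketch under that assumption; the CountX class and the xCount counter are illustrative additions that stand in for the actual printing:

public class CountX {
  // counts the 'X' characters that would be printed
  static long xCount = 0;

  private static void f(int n) {
    if (n > 0) {
      g(n);
      f(n - 1);
    }
  }

  private static void g(int n) {
    if (n > 0) {
      xCount++;            // stands in for System.out.print('X')
      g(n - 1);
    }
  }

  public static void main(String[] args) {
    for (int n = 1; n <= 6; n++) {
      xCount = 0;
      f(n);
      // xCount matches n(n+1)/2: 1, 3, 6, 10, 15, 21
      System.out.println("f(" + n + ") prints " + xCount
          + " X's; n(n+1)/2 = " + (n * (n + 1) / 2));
    }
  }
}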

answered Sep 20 '22 by templatetypedef