 

Time complexity: Getting incorrect result


The following code is from the book "Cracking the coding interview". The code prints all permutations of a string.

Question: What is the time complexity of the code below?

void permutation(String str) {
    permutation(str, "");
}

private void permutation(String str, String prefix) {
    if (str.length() == 0) {
        // Base case: every character has been moved into the prefix, so print it.
        System.out.println(prefix);
    } else {
        for (int i = 0; i < str.length(); i++) {
            // Remove the i-th character and recurse with it appended to the prefix.
            String rem = str.substring(0, i) + str.substring(i + 1);
            permutation(rem, prefix + str.charAt(i));
        }
    }
}

My Understanding:

I have been able to derive the time complexity as n * n!.

I am sure I am missing the time complexity of the blue, green and yellow nodes in the recursion tree below, but despite repeated attempts I have not been able to figure out how to account for them.

Can someone share some inputs, preferably with examples?

[Image: recursion tree of the recursive calls, with blue, green and yellow nodes]

asked May 01 '20 by Deep



1 Answer

You're very much on the right track here, and your overall assessment (that the runtime is Θ(n · n!)) is correct! One technique we can use to derive the runtime is to sum up the work done at each layer of the recursion tree; a small counting sketch after the list below confirms these layer counts. We know that

  • in layer 0 (the top layer), there's 1 node processing a string of length n,
  • in layer 1, there are n nodes each processing strings of length n - 1,
  • in layer 2, there are n(n-1) nodes each processing strings of length n - 2,
  • in layer 3, there are n(n-1)(n-2) nodes each processing strings of length n - 3,
  • ...
  • in layer n, there are n! nodes each processing strings of length 0.
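
If it helps to see those counts concretely, here is a small instrumented sketch of my own (not from the book; callsAtDepth is just a name I made up) that tallies how many recursive calls occur at each depth for a 4-character string:

// Instrumented sketch (mine, not the book's): count recursive calls per depth.
public class LayerCounts {
    static long[] callsAtDepth;

    static void permutation(String str, String prefix, int depth) {
        callsAtDepth[depth]++;   // one call observed at this depth
        for (int i = 0; i < str.length(); i++) {
            String rem = str.substring(0, i) + str.substring(i + 1);
            permutation(rem, prefix + str.charAt(i), depth + 1);
        }
    }

    public static void main(String[] args) {
        int n = 4;
        callsAtDepth = new long[n + 1];
        permutation("abcd", "", 0);
        // Expected counts: 1, 4, 12, 24, 24 -- i.e. 1, n, n(n-1), n(n-1)(n-2), n!
        for (int d = 0; d <= n; d++) {
            System.out.println("depth " + d + ": " + callsAtDepth[d] + " calls");
        }
    }
}

(The printing in the base case is omitted here, since we only care about the counts.)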

To account for the total work done here, let's assume that each recursive call does O(1) work at baseline, plus work proportional to the length of the string that it's processing. This means that we need to work out two sums to determine the total work done:

Sum 1: Number of Calls

1 + n + n(n-1) + n(n-1)(n-2) + ... + n!

and

Sum 2: Work Processing Strings

1 · n + n · (n-1) + n(n-1) · (n-2) + ... + n! · 0

But there's one other factor to account for: each recursive call that hits a base case prints out the string produced this way, which takes O(n) time. Since there are n! base cases, that adds in a final term of Θ(n · n!). The total work done, therefore, will be Θ(n · n!), plus the work done by all the intermediate recursive calls building up the answers.

Let's treat each of those sums individually.

Sum 1: Number of Calls

We're dealing with this unusual sum:

1 + n + n(n-1) + n(n-1)(n-2) + ... + n!

The main observation we need is that

  • 1 = n! / n!,
  • n = n! / (n-1)!,
  • n(n-1) = n! / (n-2)!,
  • ...
  • n! = n! / (n-n)!

So, in other words, this sum is

n! / n! + n! / (n-1)! + n! / (n-2)! + ... + n! / 0!

= n!(1 / n! + 1/(n-1)! + 1/(n-2)! + ... + 1/0!)

≤ e · n!

= Θ(n!)

Here, that last step follows from the fact that the sum

1/0! + 1/1! + 1/2! + 1/3! + ...

out to infinity is one of the ways of defining the number e. This means that the number of total recursive calls made here is Θ(n!).
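
As a quick sanity check (a sketch of my own, not part of the original argument), you can evaluate this sum for a few values of n and watch the ratio to n! approach e:

// Sketch: compare the call-count sum n!/n! + n!/(n-1)! + ... + n!/0! with e * n!.
public class SumOfCalls {
    public static void main(String[] args) {
        for (int n = 1; n <= 12; n++) {
            long factN = 1;
            for (int k = 2; k <= n; k++) factN *= k;   // n!
            long sum = 0;
            long factK = 1;                            // running k!
            for (int k = 0; k <= n; k++) {
                if (k > 0) factK *= k;
                sum += factN / factK;                  // n! / k! (exact division)
            }
            System.out.printf("n=%2d   calls=%d   e*n!=%.0f   calls/n!=%.4f%n",
                    n, sum, Math.E * factN, (double) sum / factN);
        }
    }
}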

And, intuitively, that should make sense. Each recursive call, except for those on strings of length zero or one, makes at least two other recursive calls, so the recursion tree is mostly branching. And there's a nice fact about trees that says that a tree in which every internal node has at least two children will not have more internal nodes than leaves. Since there are n! leaves, it's not surprising that the remaining number of nodes comes out to something that's not asymptotically bigger than n!.

Sum 2: Work Processing Strings

This is the sum

1 · n + n · (n-1) + n(n-1) · (n-2) + ... + n! · 0

We can rewrite this as

n + n(n-1) + n(n-1)(n-2) + ...

and hey, we know that sum! It's almost literally the same one we just saw (it's just missing the leading 1 term), so it also works out to Θ(n!).
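
Here, too, a small instrumented sketch of my own (not from the book; charWork is a name I invented) that adds str.length() on every call shows this sum staying within a constant factor of n!:

// Sketch: total string-processing work, measured as the sum of str.length()
// over all recursive calls, compared against n!.
public class StringWork {
    static long charWork = 0;

    static void permutation(String str, String prefix) {
        charWork += str.length();   // model: work proportional to |str| per call
        for (int i = 0; i < str.length(); i++) {
            String rem = str.substring(0, i) + str.substring(i + 1);
            permutation(rem, prefix + str.charAt(i));
        }
    }

    public static void main(String[] args) {
        String s = "abcdefgh";      // n = 8
        long factN = 1;
        for (int k = 2; k <= s.length(); k++) factN *= k;
        permutation(s, "");
        System.out.println("total character work: " + charWork);
        System.out.println("n!                  : " + factN);
        System.out.println("ratio               : " + (double) charWork / factN);
    }
}

The ratio comes out just under e, exactly as the sum above predicts.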

Putting It All Together

To summarize, this recursive algorithm does

  • Θ(n!) work simply due to the number of recursive calls, with each call taking some base amount of time;
  • Θ(n!) work done by the recursive calls as they form and concatenate substrings; and
  • Θ(n · n!) work done by the final recursive calls to print out all the permutations.

Summing this all up gives the Θ(n · n!) runtime that you proposed in your question.
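
If you want to see all three pieces at once, here is one more sketch of my own, using the same operation-counting model as above (one unit per call, one unit per character processed, one unit per character printed; class and variable names are hypothetical). It tallies the total work and divides by n · n!, and the ratio stays bounded by a constant, which is exactly the Θ(n · n!) claim:

// Sketch: total modeled work divided by n * n! stays bounded by a constant.
public class TotalWork {
    static long work = 0;

    static void permutation(String str, String prefix) {
        work += 1 + str.length();            // base cost plus string-length cost
        if (str.length() == 0) {
            work += prefix.length();         // cost of printing an n-character permutation
        } else {
            for (int i = 0; i < str.length(); i++) {
                String rem = str.substring(0, i) + str.substring(i + 1);
                permutation(rem, prefix + str.charAt(i));
            }
        }
    }

    public static void main(String[] args) {
        String alphabet = "abcdefghi";
        for (int n = 1; n <= alphabet.length(); n++) {
            work = 0;
            long factN = 1;
            for (int k = 2; k <= n; k++) factN *= k;
            permutation(alphabet.substring(0, n), "");
            System.out.printf("n=%d   work=%d   work/(n*n!)=%.3f%n",
                    n, work, (double) work / (n * factN));
        }
    }
}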

Hope this helps!

answered Oct 02 '22 by templatetypedef