
Heap's algorithm for permutations


I'm preparing for interviews and I'm trying to memorize Heap's algorithm:

procedure generate(n : integer, A : array of any):
    if n = 1 then
        output(A)
    else
        for i := 0; i < n; i += 1 do
            generate(n - 1, A)
            if n is even then
                swap(A[i], A[n-1])
            else
                swap(A[0], A[n-1])
            end if
        end for
    end if

This algorithm is a pretty famous one to generate permutations. It is concise and fast and goes hand-in-hand with the code to generate combinations.

The problem is: I don't like to memorize things by heart and I always try to keep the concepts to "deduce" the algorithm later.

This algorithm is really not intuitive and I can't find a way to explain how it works to myself.

Can someone please tell me why and how this algorithm works as expected when generating permutations?

asked Jul 15 '15 by Dean


People also ask

How does Heap's algorithm work?

Heap's algorithm is based on swapping elements to generate the permutations. It produces every possible permutation of the elements exactly once. It is a systematic algorithm which, at each step, chooses a pair of elements to swap in order to generate a new permutation.

What is the time complexity of Heap's algorithm?

The algorithm you mentioned generates all permutations, and because there are N! permutations of an N-element array, the complexity is O(N!).


2 Answers

Heap's algorithm is probably not the answer to any reasonable interview question. There is a much more intuitive algorithm which will produce permutations in lexicographical order; although it is amortized O(1) per permutation instead of worst-case O(1), it is not noticeably slower in practice, and it is much easier to derive on the fly.

The lexicographic order algorithm is extremely simple to describe. Given some permutation, find the next one by:

  1. Find the rightmost element which is smaller than the element to its right.

  2. Swap that element with the smallest element to its right which is larger than it.

  3. Reverse the part of the permutation to the right of where that element was.

Both steps (1) and (3) are worst-case O(n), but it is easy to prove that the average time for those steps is O(1).
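The three steps can be sketched as a short Python function (an illustrative sketch, not code from the answer itself; the name `next_permutation` and its in-place contract are my own choices):

```python
def next_permutation(a):
    """Advance list `a` in place to the next permutation in
    lexicographic order. Returns False (and leaves `a` sorted
    ascending again) when `a` was already the last permutation."""
    # Step 1: find the rightmost index i with a[i] < a[i + 1].
    i = len(a) - 2
    while i >= 0 and a[i] >= a[i + 1]:
        i -= 1
    if i < 0:                       # whole array descending: last permutation
        a.reverse()
        return False
    # Step 2: the suffix a[i+1:] is descending, so the rightmost element
    # larger than a[i] is also the smallest element larger than a[i].
    j = len(a) - 1
    while a[j] <= a[i]:
        j -= 1
    a[i], a[j] = a[j], a[i]
    # Step 3: reverse the suffix (descending -> ascending).
    a[i + 1:] = reversed(a[i + 1:])
    return True
```

Starting from a sorted list and calling this repeatedly enumerates all n! permutations in lexicographic order.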


An indication of how tricky Heap's algorithm is (in the details) is that your expression of it is slightly wrong, because it does one extra swap; the extra swap is a no-op if n is even, but it significantly changes the order of the permutations generated when n is odd. In either case, it does unnecessary work. See https://en.wikipedia.org/wiki/Heap%27s_algorithm for the correct algorithm (at least, it's correct today) or see the discussion at Heap's algorithm permutation generator
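For comparison, here is a Python sketch of the corrected recursive version as described on the Wikipedia page (the loop runs n - 1 times, followed by one final recursive call with no trailing swap); packaging it as a generator is my own choice:

```python
def heap_permutations(a):
    """Yield every permutation of list `a` exactly once using the
    corrected recursive form of Heap's algorithm."""
    def generate(k):
        if k == 1:
            yield tuple(a)
            return
        for i in range(k - 1):          # only k - 1 swaps per level
            yield from generate(k - 1)
            if k % 2 == 0:
                a[i], a[k - 1] = a[k - 1], a[i]
            else:
                a[0], a[k - 1] = a[k - 1], a[0]
        yield from generate(k - 1)      # last block: no trailing swap
    yield from generate(len(a))
```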

To see how Heap's algorithm works, you need to look at what a full iteration of the loop does to the vector, in both even and odd cases. Given a vector of even length, a full iteration of Heap's algorithm will rearrange the elements according to the rule

[1,...n] → [(n-2),(n-1),2,3,...,(n-3),n,1]

whereas if the vector is of odd length, it will simply swap the first and last elements:

[1,...n] → [n,2,3,4,...,(n-2),(n-1),1]

You can prove both of these facts by induction, although that doesn't provide much intuition as to why they are true. Looking at the diagram on the Wikipedia page might help.
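You can also check the two rearrangement rules mechanically: run only the swaps of one full call of the corrected recursion (no output) and inspect the net effect. This Python sketch is illustrative and assumes the Wikipedia version of the algorithm:

```python
def heap_run(a, k=None):
    """Perform just the swaps of one full call of the corrected Heap
    recursion on list `a` (in place) and return it, so the net
    rearrangement of the call can be inspected."""
    if k is None:
        k = len(a)
    if k > 1:
        for i in range(k - 1):          # k - 1 swaps per level
            heap_run(a, k - 1)
            if k % 2 == 0:
                a[i], a[k - 1] = a[k - 1], a[i]
            else:
                a[0], a[k - 1] = a[k - 1], a[0]
        heap_run(a, k - 1)              # final call, no trailing swap
    return a
```

For odd lengths the result is the input with first and last elements swapped (e.g. [1,2,3,4,5] becomes [5,2,3,4,1]); for even lengths it matches the rule above (e.g. [1,2,3,4,5,6] becomes [4,5,2,3,6,1]).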

answered Sep 18 '22 by rici


I found an article that tries to explain it here: Why does Heap's algorithm work?

However, I think it is hard to understand, so I came up with an explanation that is hopefully easier to follow:


Please just assume for a moment that these statements are true (I'll show why later):

Each invocation of the "generate" function

(I) where n is odd, leaves the elements in the exact same ordering when it is finished.

(II) where n is even, rotates the elements to the right, for example ABCD becomes DABC.

So in the "for i"-loop

when

  • n is even

    The recursive call "generate(n - 1, A)" does not change the order.

    So the for-loop can iteratively swap the element at index i = 0..(n-1) with the element at index (n - 1), and will have called "generate(n - 1, A)" each time with a different element missing.

  • n is odd

    The recursive call "generate(n - 1, A)" has rotated the elements right.

    So the element at index 0 will always be a different element automatically.

    Just swap the elements at 0 and (n-1) in each iteration to produce a unique set of elements.
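Statements (I) and (II) can be verified directly by performing only the swaps of the question's generate (output suppressed) and looking at the net rearrangement of one full call; this Python sketch is illustrative, with my own function name:

```python
def op_heap_run(a, n=None):
    """Perform just the swaps of the question's generate() (n loop
    iterations per level) on list `a` in place and return it."""
    if n is None:
        n = len(a)
    if n > 1:
        for i in range(n):              # n iterations, as in the question
            op_heap_run(a, n - 1)
            if n % 2 == 0:
                a[i], a[n - 1] = a[n - 1], a[i]
            else:
                a[0], a[n - 1] = a[n - 1], a[0]
    return a
```

Odd-length inputs come back unchanged (statement I); even-length inputs come back rotated right by one (statement II), e.g. [1,2,3,4] becomes [4,1,2,3].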


Finally, let's see why the initial statements are true:

Rotate-right

(III) This series of swaps results in a rotation to the right by one position:

A[0] <-> A[n - 1]
A[1] <-> A[n - 1]
A[2] <-> A[n - 1]
...
A[n - 2] <-> A[n - 1]

For example try it with sequence ABCD:

A[0] <-> A[3]: DBCA
A[1] <-> A[3]: DACB
A[2] <-> A[3]: DABC
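As a quick sanity check, the same swap sequence in Python (a trivial illustrative sketch):

```python
def rotate_right_by_swaps(a):
    """Rotate list `a` right by one position using only the swaps
    A[0]<->A[n-1], A[1]<->A[n-1], ..., A[n-2]<->A[n-1]."""
    n = len(a)
    for i in range(n - 1):
        a[i], a[n - 1] = a[n - 1], a[i]
    return a
```

For example, `rotate_right_by_swaps(list("ABCD"))` yields `['D', 'A', 'B', 'C']`, exactly as in the trace above.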

No-op

(IV) This series of steps leaves the sequence in the exact same ordering as before:

Repeat n times:
    Rotate the sub-sequence a[0...(n-2)] to the right
    Swap: a[0] <-> a[n - 1]

Intuitively, this is true:

If you have a sequence of length 5 and rotate it 5 times, it ends up unchanged.

Taking the element at 0 out before the rotation and then, after the rotation, swapping it with the new element at 0 does not change the outcome (when rotating n times).
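The whole of (IV) can likewise be checked with a few lines of Python (an illustrative sketch; the prefix rotation is written with slicing rather than the swap sequence, which by (III) is equivalent):

```python
def rotate_and_swap_n_times(a):
    """Apply (IV) to list `a` in place: n times, rotate the prefix
    a[0..n-2] right by one, then swap a[0] and a[n-1]. Returns `a`."""
    n = len(a)
    for _ in range(n):
        a[:n - 1] = a[n - 2:n - 1] + a[:n - 2]   # rotate prefix right by one
        a[0], a[n - 1] = a[n - 1], a[0]          # swap first and last
    return a
```

Running it on any sequence returns the sequence in its original order, confirming that (IV) is a no-op overall.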

Induction

Now we can see why (I) and (II) are true:

If n is 1: Trivially, the ordering is unchanged after invoking the function.

If n is 2: The recursive calls "generate(n - 1, A)" leave the ordering unchanged (because they invoke generate with the first argument being 1). So we can just ignore those calls. The swaps that get executed in this invocation result in a right-rotation, see (III).

If n is 3: The recursive calls "generate(n - 1, A)" result in a right-rotation. So the total steps in this invocation equal (IV) => The sequence is unchanged.

Repeat for n = 4, 5, 6, ...

answered Sep 18 '22 by Stephen Friedrich