Given a collection of distinct numbers, return all possible permutations.
For example, [1,2,3] has the following permutations:
[ [1,2,3], [1,3,2], [2,1,3], [2,3,1], [3,1,2], [3,2,1] ]
My Iterative Solution is:
public List<List<Integer>> permute(int[] nums) {
    List<List<Integer>> result = new ArrayList<>();
    result.add(new ArrayList<>());              // start with the single empty permutation
    for (int i = 0; i < nums.length; i++) {
        List<List<Integer>> temp = new ArrayList<>();
        for (List<Integer> a : result) {
            // insert nums[i] into every possible position of each existing permutation
            for (int j = 0; j <= a.size(); j++) {
                a.add(j, nums[i]);
                List<Integer> current = new ArrayList<>(a);
                temp.add(current);
                a.remove(j);
            }
        }
        result = new ArrayList<>(temp);
    }
    return result;
}
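To make the mechanics visible, here is a standalone sketch (class and variable names are my own) that runs the same insertion rounds on [1,2,3] and prints result after each round:

import java.util.ArrayList;
import java.util.List;

public class PermuteTrace {
    public static void main(String[] args) {
        int[] nums = {1, 2, 3};
        List<List<Integer>> result = new ArrayList<>();
        result.add(new ArrayList<>());          // round 0: just the empty list
        for (int i = 0; i < nums.length; i++) {
            List<List<Integer>> temp = new ArrayList<>();
            for (List<Integer> a : result) {
                // put nums[i] into every gap of the partial permutation a
                for (int j = 0; j <= a.size(); j++) {
                    a.add(j, nums[i]);
                    temp.add(new ArrayList<>(a));
                    a.remove(j);
                }
            }
            result = temp;
            System.out.println("after inserting " + nums[i] + ": " + result);
        }
    }
}

The final round contains the same six permutations as the example above, just in a different order.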
My Recursive Solution is:
public List<List<Integer>> permuteRec(int[] nums) {
    List<List<Integer>> result = new ArrayList<List<Integer>>();
    if (nums == null || nums.length == 0) {
        return result;
    }
    makePermutations(nums, result, 0);
    return result;
}

void makePermutations(int[] nums, List<List<Integer>> result, int start) {
    if (start >= nums.length) {
        // a complete permutation is fixed in nums; copy it into the result
        List<Integer> temp = convertArrayToList(nums);
        result.add(temp);
        return;
    }
    for (int i = start; i < nums.length; i++) {
        swap(nums, start, i);                      // choose nums[i] for position start
        makePermutations(nums, result, start + 1);
        swap(nums, start, i);                      // undo the choice (backtrack)
    }
}

private void swap(int[] nums, int i, int j) {
    int t = nums[i];
    nums[i] = nums[j];
    nums[j] = t;
}

private ArrayList<Integer> convertArrayToList(int[] num) {
    ArrayList<Integer> item = new ArrayList<Integer>();
    for (int h = 0; h < num.length; h++) {
        item.add(num[h]);
    }
    return item;
}
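And a comparable sketch for the recursive version (again, all names are mine) that indents by recursion depth, so you can watch the swap / recurse / swap-back pattern:

import java.util.Arrays;

public class RecursionTrace {

    static void makePermutations(int[] nums, int start) {
        StringBuilder indent = new StringBuilder();
        for (int d = 0; d < start; d++) indent.append("  ");
        if (start >= nums.length) {
            System.out.println(indent + "emit " + Arrays.toString(nums));
            return;
        }
        for (int i = start; i < nums.length; i++) {
            swap(nums, start, i);
            System.out.println(indent + "fix position " + start + " = " + nums[start]);
            makePermutations(nums, start + 1);
            swap(nums, start, i);              // backtrack: restore the array
        }
    }

    static void swap(int[] nums, int i, int j) {
        int t = nums[i];
        nums[i] = nums[j];
        nums[j] = t;
    }

    public static void main(String[] args) {
        makePermutations(new int[] {1, 2, 3}, 0);
    }
}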
By my estimate, the time complexity (big-O) of my iterative solution is n * n(n+1)/2 ~ O(n^3).
I am not able to figure out the time complexity of my recursive solution.
Can anyone explain the complexity of both?
The time complexity is equal to the number of combinations there are. In this case it is n choose k. – Nikos M.
The recursive solution makes O(n!) calls, since the call tree is governed by the recurrence T(n) = n * T(n-1) + O(1); each of the n! completed permutations is then copied into the result in O(n), so the total running time is O(n * n!).
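For readers who want to see where the factorial comes from, unrolling the recurrence (standard algebra, written here in LaTeX) gives:

\[
\begin{aligned}
T(n) &= n\,T(n-1) + c \\
     &= n(n-1)\,T(n-2) + c\,(1 + n) \\
     &\;\;\vdots \\
     &= n!\,T(0) + c\,n!\left(\tfrac{1}{n!} + \tfrac{1}{(n-1)!} + \dots + \tfrac{1}{1!}\right) \\
     &\le n!\,T(0) + c\,(e-1)\,n! \;=\; O(n!).
\end{aligned}
\]

Multiplying the n! leaf calls by the O(n) array copy performed at each of them gives the O(n * n!) total.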
The iterative solution looks like three nested loops, but it is not O(n^3): the middle loop runs over result, which after round i already holds i! partial permutations, so its trip count grows factorially rather than linearly. (The insertion approach itself is correct for every n, not only 3; each round inserts the next number into every position of every permutation built so far.) Summing the work over all rounds gives O(n * n!) here as well.
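Round-by-round bookkeeping (my own accounting, assuming an O(k) cost for copying a list of length k) backs this up:

\[
\sum_{i=0}^{n-1} \underbrace{i!}_{\text{lists in result}} \cdot \underbrace{(i+1)}_{\text{insert positions}} \cdot \underbrace{O(i+1)}_{\text{copy cost}}
 \;=\; \sum_{i=0}^{n-1} O\bigl((i+1)!\,(i+1)\bigr) \;=\; O(n \cdot n!),
\]

since the last, factorially largest term dominates the sum.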
For n = 3 the cubic estimate happens to coincide with the factorial: n * (n - 1) * (n - 2) = n! = 6. The LHS is O(n^3) (or rather O(n^n), since n = 3 here) and the RHS is O(n!), which is why the estimate looks right on this particular example.

For a larger list of size n, you could hard-code n nested loops and that would also produce valid permutations; the complexity in that case would be O(n^n), and that is much larger than O(n!), or rather, n! < n^n. There is a rather nice relation called Stirling's approximation which makes this gap precise.
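If you would rather see the growth than take the algebra on faith, here is a small counting sketch (entirely my own; the class and method names are placeholders) that re-runs the insertion loop from the question, tallies the innermost insertions, and prints n^3 next to the tally for comparison:

import java.util.ArrayList;
import java.util.List;

public class InsertionCount {

    // Same insertion loop as the iterative solution above, but it only counts
    // how many times the innermost body runs; the inserted values do not matter.
    static long countInsertions(int n) {
        long count = 0;
        List<List<Integer>> result = new ArrayList<>();
        result.add(new ArrayList<>());
        for (int i = 0; i < n; i++) {
            List<List<Integer>> temp = new ArrayList<>();
            for (List<Integer> a : result) {
                for (int j = 0; j <= a.size(); j++) {
                    count++;                       // one insert + copy + removal
                    a.add(j, i);
                    temp.add(new ArrayList<>(a));
                    a.remove(j);
                }
            }
            result = temp;
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 8; n++) {
            System.out.printf("n=%d  insertions=%d  n^3=%d%n",
                    n, countInsertions(n), (long) n * n * n);
        }
    }
}

The tally equals 1! + 2! + ... + n!, which overtakes n^3 already at n = 5 and keeps pulling away.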
It's the output (which is huge) that matters in this problem, not the routine's implementation. For n distinct items there are n! permutations to be returned as the answer, and thus we have at least O(n!) complexity.
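That bound can even be tightened a little: each of those n! permutations has length n, so merely writing the output down already costs

\[
\Omega(n \cdot n!),
\]

and both solutions above match this up to constant factors.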
With the help of Stirling's approximation,
O(n!) = O(n^(n + 1/2) / exp(n)) = O(sqrt(n) * (n/e)^n),
we can easily see that O(n!) > O(n^c) for any constant c. That's why it doesn't matter if the implementation itself adds another O(n^3), since
O(n!) + O(n^3) = O(n!).