result = False

def permute(a, l, r, b):
    # recursively generates permutations of a in place; sets result if one equals b
    global result
    if l==r:
        if a==b:
            result = True
    else:
        for i in range(l, r+1):
            a[l], a[i] = a[i], a[l]
            permute(a, l+1, r, b)
            a[l], a[i] = a[i], a[l]

string1 = list("abc")
string2 = list("ggg")
permute(string1, 0, len(string1)-1, string2)
So basically I think that finding each permutation takes n^2 steps (times some constant), and finding all permutations should take n! steps. So does this make it O(n^2 * n!)? And if so, does the n! take over, making it just O(n!)?
Thanks
Edit: this algorithm might seem weird for just finding permutations, and that is because I'm also using it to test for anagrams between the two strings. I just haven't renamed the method yet, sorry.
The number of permutations for n elements is n!, so an algorithm to produce all n! permutations would have time complexity O(n!).
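As a quick sanity check of that count (this sketch is not from the original answer; it just uses Python's standard itertools and math modules), you can enumerate the permutations of n distinct elements and compare against n!:

import math
from itertools import permutations

# Count the permutations of n distinct elements and compare with n!
for n in range(1, 7):
    count = sum(1 for _ in permutations(range(n)))
    print(n, count, math.factorial(n))  # count equals n! for every n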
The time complexity of this approach is O(N * N!), where N is the length of the given string. Reason: the recursive function uses an O(N) recursion stack, and storing the permutations in a list additionally occupies O(N * N!) space.
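For reference, here is a minimal sketch of that kind of approach (illustrative only, not the asker's code; all_permutations is a made-up helper): every permutation gets appended to an output list, so the output alone holds N! strings of length N:

def all_permutations(chars, l, out):
    # Collects every permutation of chars into out; recursion depth is O(N).
    if l == len(chars) - 1:
        out.append("".join(chars))  # each stored permutation has length N
        return
    for i in range(l, len(chars)):
        chars[l], chars[i] = chars[i], chars[l]
        all_permutations(chars, l + 1, out)
        chars[l], chars[i] = chars[i], chars[l]

perms = []
all_permutations(list("abc"), 0, perms)
print(perms)  # 3! = 6 strings of length 3, i.e. O(N * N!) characters stored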
Finding each permutation doesn't take O(N^2). Creating each permutation happens in O(1) time. While it is tempting to say that this is O(N) because you assign a new element to each index N times per permutation, each permutation shares assignments with other permutations.
When we do:
a[l], a[i] = a[i], a[l]
permute(a, l+1, r, b)
all subsequent recursive calls of permute further down the line already have this assignment in place.
In reality, assignments only happen each time permute is called, which is

$$\sum_{k=1}^{N}\frac{N!}{k!}$$

times (there are N!/(N-d)! calls at recursion depth d, and each call performs at most a constant number of swap assignments). We can then determine the time complexity to build each permutation using some limit calculus: we take the number of calls (and hence assignments, up to a constant factor) over the total number of permutations as N approaches infinity.

We have:

$$\lim_{N\to\infty}\frac{1}{N!}\sum_{k=1}^{N}\frac{N!}{k!}$$

Expanding the sigma and cancelling the N! terms:

$$\lim_{N\to\infty}\left(\frac{1}{1!}+\frac{1}{2!}+\cdots+\frac{1}{N!}\right)$$

Each successive term shrinks factorially, so the series converges; its limit is e - 1 (about 1.72). Since our result is a constant, we get that our complexity per permutation is O(1).
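You can check this amortized constant empirically. The sketch below is an instrumented copy of the recursion (the call_count counter and the permute_counted name are additions for illustration, not part of the original code); dividing the number of calls by N! settles near e - 1 instead of growing with N:

import math

call_count = 0

def permute_counted(a, l, r):
    # Same swap/recurse/swap-back structure as the original permute,
    # minus the comparison, with a counter tallying the calls.
    global call_count
    call_count += 1
    if l == r:
        return
    for i in range(l, r + 1):
        a[l], a[i] = a[i], a[l]
        permute_counted(a, l + 1, r)
        a[l], a[i] = a[i], a[l]

for n in range(2, 9):
    call_count = 0
    permute_counted(list(range(n)), 0, n - 1)
    print(n, call_count / math.factorial(n))  # approaches e - 1 (~1.718)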
However, we're forgetting about this part:
if l==r:
    if a==b:
        result = True
The comparison a == b (between two lists) takes O(N). Building each permutation takes O(1), but the comparison at the end, which occurs for every permutation, takes O(N). This gives us a time complexity of O(N) per permutation.
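If you want to see that the list comparison really is linear, here is a small sketch (CountedInt is a made-up wrapper used only for this illustration) that counts how many element comparisons a == b triggers:

class CountedInt:
    # Wrapper whose == operator counts how often it is invoked.
    comparisons = 0
    def __init__(self, value):
        self.value = value
    def __eq__(self, other):
        CountedInt.comparisons += 1
        return self.value == other.value

for n in (10, 100, 1000):
    CountedInt.comparisons = 0
    a = [CountedInt(i) for i in range(n)]
    b = [CountedInt(i) for i in range(n)]
    print(n, a == b, CountedInt.comparisons)  # comparison count grows linearly with n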
That gives you N! permutations times O(N) work for each permutation, for a total time complexity of O(N!) * O(N) = O(N * N!).
Your final time complexity doesn't reduce to O(N!), since O(N * N!) grows faster than O(N!) by a factor of N, and only constant factors get dropped (the same reason why O(N log N) != O(N)).
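To see that concretely, here is one more sketch (again an instrumented variant for illustration, not the original function) that adds the call count to the worst-case comparison cost at each leaf and divides by N!; the per-permutation work grows roughly linearly in N, so it cannot be folded into O(N!):

import math

def permute_work(a, l, r, b, counts):
    # counts[0] tallies calls; counts[1] tallies worst-case element
    # comparisons for the a == b check done at each leaf.
    counts[0] += 1
    if l == r:
        counts[1] += len(a)  # a == b inspects up to len(a) elements
        return a == b
    found = False
    for i in range(l, r + 1):
        a[l], a[i] = a[i], a[l]
        found = permute_work(a, l + 1, r, b, counts) or found
        a[l], a[i] = a[i], a[l]
    return found

for n in range(2, 8):
    counts = [0, 0]
    permute_work(list(range(n)), 0, n - 1, list(range(n)), counts)
    print(n, (counts[0] + counts[1]) / math.factorial(n))  # grows with n, not constant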