Question

I have two arrays of integers, A[] and B[]. Array B[] is fixed; I need to find the permutation of A[] which is lexicographically smaller than B[] and nearest to B[]. What I mean is:

for each i in (0 <= i < n), abs(B[i] - A[i]) should be minimized, and A[] should be lexicographically smaller than B[].
For example:

A[] = {1, 3, 5, 6, 7}
B[] = {7, 3, 2, 4, 6}

So a possible nearest permutation of A[] to B[] is

A[] = {7, 3, 1, 6, 5}
My Approach

Try all permutations of A[] and compare each with B[]. But the time complexity would be O(n! * n), so is there any way to optimize this?
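For reference, the brute force can be sketched as follows. Note that, per the example above, "nearest" amounts to picking the lexicographically greatest permutation of A that is still smaller than B; the function name is illustrative:

```python
from itertools import permutations

def nearest_smaller_bruteforce(A, B):
    # Among all permutations of A that are lexicographically smaller than B,
    # return the greatest one (the one "nearest" to B); None if none exists.
    candidates = [list(p) for p in permutations(A) if list(p) < B]
    return max(candidates) if candidates else None

print(nearest_smaller_bruteforce([1, 3, 5, 6, 7], [7, 3, 2, 4, 6]))
# -> [7, 3, 1, 6, 5]
```

This enumerates all n! orderings, so it is only usable for tiny n.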
EDIT

n can be as large as 10^5.
Answer

First, build an ordered map of the counts of the distinct elements of A.

Then, iterate forward through the array indices (0 to n−1), "withdrawing" elements from this map. At each point, there are three possibilities:

- If i < n-1 and it's possible to choose A[i] == B[i], do so and continue iterating forward.
- Otherwise, if it's possible to choose A[i] < B[i], choose the greatest possible value for A[i] < B[i]. Then proceed by choosing the largest available values for all subsequent array indices. (At this point you no longer need to worry about maintaining A[i] <= B[i], because we're already past an index where A[i] < B[i].) Return the result.
- Otherwise, backtrack to the last index where choosing A[i] < B[i] was possible, and use the approach in the previous bullet-point from there.

Despite the backtracking, the worst case is only three passes: one forward pass using the logic in the first bullet-point, one backward pass to find the last index where A[i] < B[i] was possible, and then a final forward pass using the logic in the second bullet-point.

Because of the overhead of maintaining the ordered map, this requires O(n log m) time and O(m) extra space, where n is the total number of elements of A and m is the number of distinct elements. (Since m ≤ n, we can also express this as O(n log n) time and O(n) extra space.)

Note that if there's no solution, then the backtracking step will reach all the way to i == -1. You'll probably want to raise an exception if that happens.
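A minimal Python sketch of this backtracking approach, assuming integer arrays. A Counter plus a sorted list of distinct keys stands in for the ordered map (so removing an emptied key costs O(m) here rather than the O(log m) a balanced tree would give); all names are illustrative:

```python
from bisect import bisect_left, insort
from collections import Counter

def nearest_smaller_permutation(A, B):
    # Ordered "map": counts of unused values, plus a sorted list of distinct keys.
    counts = Counter(A)
    keys = sorted(counts)
    n = len(A)

    def withdraw(v):
        counts[v] -= 1
        if counts[v] == 0:
            keys.remove(v)  # O(m) here; a balanced BST would make this O(log m)

    def restore(v):
        if counts[v] == 0:
            insort(keys, v)
        counts[v] += 1

    result = []
    i = 0
    # Forward pass: keep A[i] == B[i] as long as possible (but not at i == n-1,
    # since matching every element would make A equal to B, not smaller).
    while i < n - 1 and counts.get(B[i], 0) > 0:
        result.append(B[i])
        withdraw(B[i])
        i += 1
    # Backward pass: find the last index where some unused value < B[i] exists.
    while i >= 0:
        if bisect_left(keys, B[i]) > 0:  # keys[:j] are the unused values < B[i]
            break
        if result:
            restore(result.pop())
        i -= 1
    if i < 0:
        raise ValueError("no permutation of A is lexicographically smaller than B")
    # Place the greatest unused value < B[i], then the rest in decreasing order.
    v = keys[bisect_left(keys, B[i]) - 1]
    result.append(v)
    withdraw(v)
    for v in reversed(keys):
        result.extend([v] * counts[v])
    return result
```

For example, nearest_smaller_permutation([1, 3, 5, 6, 7], [7, 3, 2, 4, 6]) returns [7, 3, 1, 6, 5], matching the example in the question.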
Edited to add (2019-02-01):

In a now-deleted answer, גלעד ברקן summarizes the goal this way: to be lexicographically smaller, the array must have an (optional) initial section, from left to right, where A[i] = B[i], ending with an element A[j] < B[j]. To be closest to B, we want to maximise the length of that section, and then maximise the remaining part of the array.

So, with that summary in mind, another approach is to do two separate loops, where the first loop determines the length of the initial section, and the second loop actually populates A. This is equivalent to the above approach, but may make for cleaner code. So:
1. Build an ordered map of the counts of the distinct elements of A.
2. Initialize initial_section_length := -1.
3. Iterate through the array indices 0 to n−1, "withdrawing" elements from the map. For each index:
   - If it's possible to choose an as-yet-unused element of A that's less than the current element of B, set initial_section_length equal to the current array index. (Otherwise, don't.)
   - If it's not possible to choose an as-yet-unused element of A that's equal to the current element of B, break out of this loop. (Otherwise, withdraw that element and continue looping.)
4. If initial_section_length == -1, then there's no solution; raise an exception.
5. Rebuild the map, then iterate through the array indices 0 to initial_section_length-1, "withdrawing" elements from it. For each index, choose an as-yet-unused element of A that's equal to the current element of B. (The existence of such an element is ensured by the first loop.)
6. For array index initial_section_length, choose the greatest as-yet-unused element of A that's less than the current element of B (and "withdraw" it from the map). (The existence of such an element is ensured by the first loop.)
7. Iterate through the array indices initial_section_length+1 to n−1, continuing to "withdraw" elements from the map. For each index, choose the greatest element of A that hasn't been used yet.

This approach has the same time and space complexities as the backtracking-based approach.