Given a set of elements, how can I find the sum of the differences between MAX and MIN over all subsets of this set?
E.g.:
set = {1, 2, 3}
Subset = {1}, max(s)-min(s) = 0.
Subset = {2}, max(s)-min(s) = 0.
Subset = {3}, max(s)-min(s) = 0.
Subset = {1,2}, max(s)-min(s) = 1.
Subset = {2,3}, max(s)-min(s) = 1.
Subset = {1,3}, max(s)-min(s) = 2.
Subset = {1,2,3}, max(s)-min(s) = 2.
So the output will be 0+0+0+1+1+2+2 = 6.
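To make the problem statement concrete, here is a brute-force sketch in Python that enumerates every non-empty subset exactly as in the example above (the function name brute_force is my own choice, used only for illustration):

from itertools import combinations

def brute_force(nums):
    # Sum max(s) - min(s) over every non-empty subset s.
    # Exponential in len(nums); useful only as a correctness check.
    total = 0
    for r in range(1, len(nums) + 1):
        for s in combinations(nums, r):
            total += max(s) - min(s)
    return total

print(brute_force([1, 2, 3]))  # prints 6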
Sort the list.
After sorting, the i-th element is the minimum of exactly those subsets that include it and include none of the first i-1 elements. There are 2^(n-i) such subsets (with i 1-based).
Similarly, the i-th element is the maximum of exactly those subsets that include it and include none of the elements after it, and there are 2^(i-1) such subsets (again, 1-based).
So, after sorting, just iterate over the array and for each i add:
arr[i] * (2^(i-1) - 2^(n-i))
This effectively increases the sum by arr[i] * #times_i_is_max and decreases it by arr[i] * #times_i_is_min.
In your example:
sorted=1,2,3
1* (2^0 - 2^2) + 2*(2^1 - 2^1) + 3*(2^2 - 2^0) =
1*(-3) + 2*0 + 3*(3) = -3 + 0 + 9 = 6
The bottleneck of this algorithm is the sort, which is O(n log n); after that, everything is done in a single linear scan of the array.
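For reference, a short Python sketch of this approach (the function name sum_of_subset_ranges is my own; the formula is the one derived above):

def sum_of_subset_ranges(arr):
    # Sum of max(s) - min(s) over all non-empty subsets s of arr.
    a = sorted(arr)
    n = len(a)
    total = 0
    for i, x in enumerate(a, start=1):  # i is 1-based, matching the formula above
        # x is the maximum of 2^(i-1) subsets and the minimum of 2^(n-i) subsets
        total += x * (2 ** (i - 1) - 2 ** (n - i))
    return total

print(sum_of_subset_ranges([1, 2, 3]))  # prints 6

For large n you may want to take the result modulo some prime (as many judges require), since the powers of two grow quickly.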