I'm writing a program:
For example, the input is 5 numbers (it can be any count, not only 5) read into an array: 1, 2, 3, 4, 5. I can choose some element from this array (but not the first or the last), e.g. 3, then delete that number from the array and add to a sum (which is initially 0) the product of its immediate left and right neighbors (2*4 in this case). The resulting array is 1, 2, 4, 5. Then I do it again and again until the number of elements equals 2 (exactly 1 and 5, since we can never delete those).
For example (where A, B, C, D are the pairs of adjacent numbers: 1 and 2, 2 and 3, etc.):
A B C D
1 2 3 4 5
There are 6 possible orders of deleting the elements (and adding the left-right product to the sum), which correspond to 5 distinct groupings of the pairs:
A (B (C D))
A ((B C) D)
(A B) (C D)
(A (B C)) D
((A B) C) D
The goal is to find the smallest sum! There are 2 ways of solving this: some clever algorithm, or recursion over every combination and then choosing the smallest result. Can anyone give me a tip on how to write such a recursion, or where to start writing (or perhaps some clever algorithm)? Thanks!
The recursive backtracking solution is fairly straightforward (runnable Python rather than pseudocode):

def solve(nums):
    # Base case: only the two edge elements are left.
    if len(nums) == 2:
        return 0
    ans = float('inf')
    for i in range(1, len(nums) - 1):   # interior elements only
        removed = nums.pop(i)           # delete the element...
        # ...after the pop, nums[i - 1] and nums[i] are its old neighbors
        ans = min(ans, nums[i - 1] * nums[i] + solve(nums))
        nums.insert(i, removed)         # place it back in its original position
    return ans
However, this algorithm is not fast enough for non-trivial datasets, as its runtime is factorial, O(n!). The usual way to optimize such a recursive solution is dynamic programming. Let's come up with the substates:
dp[a][b]: minimum cost to reduce array[a .. b] to two elements on the edge
(array[a] and array[b])
The base cases are dp[i][i + 1], i = {0 .. size - 2} (two adjacent elements). Since there are no interior elements to delete, these substates are set to 0.
For all other cases dp[a][b] where b - a >= 2, we can divide array[a .. b] on any interior element indexed in [a + 1, b - 1]. The dividing element i is the one deleted *last*: by the time it is removed, every other interior element is already gone, so its neighbors are array[a] and array[b]. The cost is therefore dp[a][i] + dp[i][b] + array[a] * array[b]. We want to minimize the cost for each substate, so we take the minimum of these values over all possible dividing elements. The final answer is simply dp[0][size - 1].
Since there are O(n^2) substates, each with an average of O(n) dividing elements to consider, the total runtime is cubic (O(n ^ 3)), which should run for small to medium datasets in a reasonable time.
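The recurrence above can be sketched in Python as follows (the function name min_sum_dp is my own; substates are filled in order of increasing subarray length so that dp[a][i] and dp[i][b] are always ready before dp[a][b]):

```python
def min_sum_dp(nums):
    # dp[a][b]: minimum cost to delete every element strictly between a and b.
    n = len(nums)
    dp = [[0] * n for _ in range(n)]
    for length in range(2, n):              # length = b - a
        for a in range(0, n - length):
            b = a + length
            best = float('inf')
            for i in range(a + 1, b):       # i is the element deleted last
                best = min(best, dp[a][i] + dp[i][b] + nums[a] * nums[b])
            dp[a][b] = best
    return dp[0][n - 1]

print(min_sum_dp([1, 2, 3, 4, 5]))   # -> 12
```

This runs in O(n^3) time and O(n^2) space, and agrees with the brute force on the question's example (minimum sum 12).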