Given a list of integers, say l, suppose we are allowed to select 3 integers from this list, say left, middle, and right, where middle > left, middle > right, and left, middle, right appear in that order in the list (i.e. index(left) < index(middle) < index(right)). Does there exist an O(n) solution for finding the maximum of middle - left + middle - right? You may assume that lists which do not satisfy these conditions do not appear (e.g. [5, 0, 5], as pointed out by Eric Duminil).
Currently, I am able to come up with what I believe is (roughly) an O(n^2) solution (correct me if I am wrong). Essentially, my current idea is to do:
maximum = 0
N = len(l)
for idx in range(1, N - 1):
    left = min(l[0:idx])        # O(n) scan over the slice left of idx
    right = min(l[idx + 1:])    # O(n) scan over the slice right of idx
    middle = l[idx]
    if left < middle and right < middle:
        new_max = middle - left + middle - right
        maximum = max(new_max, maximum)
Help/hints would be greatly appreciated.
Thanks!
You can run through your numbers once, keeping a running minimum value, and storing it at each step, so that at the end you know what the minimum value is to the left of each index. That's O(n).
Similarly, you can run through all your numbers once from right to left, and work out what the minimum value is to the right of each index. That is O(n).
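As a rough sketch of those two sweeps (the helper name prefix_suffix_minimums and the use of float('inf') as a sentinel are illustrative choices, not part of the original code):

def prefix_suffix_minimums(l):
    # left_min[i] = minimum of l[0:i] (strictly left of i)
    # right_min[i] = minimum of l[i+1:] (strictly right of i)
    n = len(l)
    left_min = [float('inf')] * n
    right_min = [float('inf')] * n

    running = float('inf')
    for i in range(n):                # left-to-right sweep: O(n)
        left_min[i] = running
        running = min(running, l[i])

    running = float('inf')
    for i in range(n - 1, -1, -1):    # right-to-left sweep: O(n)
        right_min[i] = running
        running = min(running, l[i])

    return left_min, right_min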
Then you can run through each possible middle value, and take the left and right values from your earlier computations. That is O(n).
O(n) + O(n) + O(n) = O(n).
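Putting it together, here is a sketch of the final pass over every candidate middle index, using the prefix_suffix_minimums helper sketched above (again, the names are illustrative):

def max_double_difference(l):
    left_min, right_min = prefix_suffix_minimums(l)
    best = 0
    for idx in range(1, len(l) - 1):              # each candidate middle: O(n)
        middle = l[idx]
        if left_min[idx] < middle and right_min[idx] < middle:
            best = max(best, middle - left_min[idx] + middle - right_min[idx])
    return best

For example, max_double_difference([3, 1, 4, 1, 5]) pairs middle = 4 with left = 1 and right = 1, returning 6.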