Suppose we have an array of intervals [(a1, b1), (a2, b2), ..., (an, bn)], sorted by starting position and then by length. We want to unite all intersecting intervals. Here is a small sample data set that contains at least two isolated groups of intervals:
from random import randint

def gen_interval(lo, hi):
    return sorted((randint(lo, hi), randint(lo, hi)))

sample = sorted([gen_interval(0, 100) for _ in range(5)] +
                [gen_interval(101, 200) for _ in range(5)],
                key=lambda iv: (iv[0], iv[1] - iv[0]))
And here are the two helper functions we need: one to check whether two intervals intersect, and one to extend an interval:
def intersects(interval1, interval2):
    # Assumes interval1 starts no later than interval2 (the input is
    # sorted by start), so checking against interval1's bounds suffices.
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1 <= a2 <= b1) or (a1 <= b2 <= b1)

def extend(interval1, interval2):
    # Merge two overlapping intervals into one that covers both.
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1, b2) if b2 > b1 else (a1, b1)
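A quick sanity check of these helpers on arbitrary intervals (the definitions are repeated so the snippet runs on its own):

```python
def intersects(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1 <= a2 <= b1) or (a1 <= b2 <= b1)

def extend(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1, b2) if b2 > b1 else (a1, b1)

# (0, 5) and (3, 8) overlap, so they should merge to (0, 8).
print(intersects((0, 5), (3, 8)))   # True
print(extend((0, 5), (3, 8)))       # (0, 8)
print(intersects((0, 5), (7, 9)))   # False: disjoint intervals
```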
We can simply accomplish the task using standard imperative programming:
result = []
for interval in sample:
    if result and intersects(result[-1], interval):
        result[-1] = extend(result[-1], interval)
    else:
        result.append(interval)
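On a small hand-made input (arbitrary values, already sorted by start), the loop collapses each run of overlapping intervals into one:

```python
# Helpers repeated from above so the snippet is self-contained.
def intersects(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1 <= a2 <= b1) or (a1 <= b2 <= b1)

def extend(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1, b2) if b2 > b1 else (a1, b1)

sample = [(1, 4), (3, 6), (8, 10)]

result = []
for interval in sample:
    if result and intersects(result[-1], interval):
        result[-1] = extend(result[-1], interval)
    else:
        result.append(interval)

print(result)  # [(1, 6), (8, 10)]
```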
But I want to rewrite this using functional programming. My closest shot is:
from functools import reduce

subsets = []
for interval in sample:
    if subsets and any(intersects(x, interval) for x in subsets[-1]):
        subsets[-1].append(interval)
    else:
        subsets.append([interval])
result = [reduce(extend, subset) for subset in subsets]
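Tracing this version on toy data (arbitrary values) shows the intermediate subsets holding the isolated groups before reduce collapses each of them:

```python
from functools import reduce

# Helpers repeated from above so the snippet is self-contained.
def intersects(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1 <= a2 <= b1) or (a1 <= b2 <= b1)

def extend(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1, b2) if b2 > b1 else (a1, b1)

sample = [(1, 4), (3, 6), (8, 10), (9, 12)]

subsets = []
for interval in sample:
    if subsets and any(intersects(x, interval) for x in subsets[-1]):
        subsets[-1].append(interval)
    else:
        subsets.append([interval])
result = [reduce(extend, subset) for subset in subsets]

print(subsets)  # [[(1, 4), (3, 6)], [(8, 10), (9, 12)]]
print(result)   # [(1, 6), (8, 12)]
```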
Here half of the work is done functionally, but I still have to split the initial array using an imperative approach. How can I get the thing done using pure functional programming? Thank you in advance.
You were getting close with your use of reduce. The solution below accumulates the list of collapsed intervals entirely with reduce:
from functools import reduce

def unite_intervals(intervals):
    def f(acc, element):
        # Merge into the last collapsed interval, or start a new one.
        if acc and intersects(acc[-1], element):
            return acc[:-1] + [extend(acc[-1], element)]
        else:
            return acc + [element]
    return reduce(f, intervals, [])
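Assuming the intersects and extend helpers from the question (repeated here so the snippet runs on its own), the fold can be exercised like this:

```python
from functools import reduce

def intersects(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1 <= a2 <= b1) or (a1 <= b2 <= b1)

def extend(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1, b2) if b2 > b1 else (a1, b1)

def unite_intervals(intervals):
    def f(acc, element):
        if acc and intersects(acc[-1], element):
            return acc[:-1] + [extend(acc[-1], element)]
        else:
            return acc + [element]
    return reduce(f, intervals, [])

print(unite_intervals([(1, 4), (3, 6), (8, 10)]))  # [(1, 6), (8, 10)]
print(unite_intervals([]))  # [] -- the empty case is handled by the initializer
```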
Note that this does a lot of reallocation, since it uses + on list objects to accumulate the result; for very large lists that will be inefficient. You may want to look into a library such as pyrsistent, which provides persistent data structures that are more efficient to accumulate into.
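If you can tolerate mutating the accumulator inside the fold, one middle ground (a sketch, not part of the original answer) keeps the reduce shape but updates the list in place, avoiding the per-step copy that + performs:

```python
from functools import reduce

# Helpers repeated from above so the snippet is self-contained.
def intersects(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1 <= a2 <= b1) or (a1 <= b2 <= b1)

def extend(interval1, interval2):
    a1, b1 = interval1
    a2, b2 = interval2
    return (a1, b2) if b2 > b1 else (a1, b1)

def unite_intervals_inplace(intervals):
    # Same fold as above, but the accumulator list is mutated and
    # returned, so each step is amortized O(1) instead of O(len(acc)).
    def f(acc, element):
        if acc and intersects(acc[-1], element):
            acc[-1] = extend(acc[-1], element)
        else:
            acc.append(element)
        return acc
    return reduce(f, intervals, [])

print(unite_intervals_inplace([(1, 4), (3, 6), (8, 10), (9, 12)]))  # [(1, 6), (8, 12)]
```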