EDIT 2: To answer the OP's new requirement (Python 2 syntax; imports as in the recipe below):
ranges = []
for key, group in groupby(enumerate(data), lambda (index, item): index - item):
    group = map(itemgetter(1), group)
    if len(group) > 1:
        ranges.append(xrange(group[0], group[-1]))
    else:
        ranges.append(group[0])
Output:
[xrange(2, 5), xrange(12, 17), 20]
You can replace xrange with range or any other custom class.
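For instance, here is a minimal Python 3 sketch of the same loop, with range in place of xrange and an ordinary one-argument lambda (Python 3 dropped tuple-parameter unpacking in lambdas):

from itertools import groupby
from operator import itemgetter

data = [2, 3, 4, 5, 12, 13, 14, 15, 16, 17, 20]
ranges = []
for key, group in groupby(enumerate(data), lambda pair: pair[0] - pair[1]):
    group = list(map(itemgetter(1), group))  # map() is lazy in Python 3
    if len(group) > 1:
        ranges.append(range(group[0], group[-1]))
    else:
        ranges.append(group[0])
print(ranges)  # [range(2, 5), range(12, 17), 20]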
The Python docs have a very neat recipe for this:
from operator import itemgetter
from itertools import groupby
data = [2, 3, 4, 5, 12, 13, 14, 15, 16, 17]
for k, g in groupby(enumerate(data), lambda (i, x): i - x):
    print map(itemgetter(1), g)
Output:
[2, 3, 4, 5]
[12, 13, 14, 15, 16, 17]
If you want to get the exact same output, you can do this:
ranges = []
for k, g in groupby(enumerate(data), lambda (i, x): i - x):
    group = map(itemgetter(1), g)
    ranges.append((group[0], group[-1]))
Output:
[(2, 5), (12, 17)]
EDIT: The example is already explained in the documentation but maybe I should explain it more:
The key to the solution is differencing with a range so that consecutive numbers all appear in the same group.
If the data was: [2, 3, 4, 5, 12, 13, 14, 15, 16, 17]
Then groupby(enumerate(data), lambda (i, x): i - x)
is equivalent to the following:
groupby(
    [(0, 2), (1, 3), (2, 4), (3, 5), (4, 12),
     (5, 13), (6, 14), (7, 15), (8, 16), (9, 17)],
    lambda (i, x): i - x
)
The lambda function subtracts each element's value from its index. So when you apply the lambda to each item, you get the following keys for groupby:
[-2, -2, -2, -2, -8, -8, -8, -8, -8, -8]
groupby groups elements by equal key value, so the first 4 elements will be grouped together and so forth.
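As a quick check (not part of the recipe itself), those keys can be reproduced directly:

data = [2, 3, 4, 5, 12, 13, 14, 15, 16, 17]
print([i - x for i, x in enumerate(data)])
# [-2, -2, -2, -2, -8, -8, -8, -8, -8, -8]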
I hope this makes it more readable.
Python 3 version, which may be helpful for beginners.
Import the required libraries first:
from itertools import groupby
from operator import itemgetter

data = [2, 3, 4, 5, 12, 13, 14, 15, 16, 17]  # the same sorted sample list as above
ranges = []
for k, g in groupby(enumerate(data), lambda x: x[0] - x[1]):
    group = list(map(itemgetter(1), g))  # map() is lazy in Python 3, so materialize it
    ranges.append((group[0], group[-1]))
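Printing the result then gives:

print(ranges)
# [(2, 5), (12, 17)]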
more_itertools.consecutive_groups was added in version 4.0.
Demo
import more_itertools as mit
iterable = [2, 3, 4, 5, 12, 13, 14, 15, 16, 17, 20]
[list(group) for group in mit.consecutive_groups(iterable)]
# [[2, 3, 4, 5], [12, 13, 14, 15, 16, 17], [20]]
Code
Applying this tool, we make a generator function that finds ranges of consecutive numbers.
def find_ranges(iterable):
    """Yield range of consecutive numbers."""
    for group in mit.consecutive_groups(iterable):
        group = list(group)
        if len(group) == 1:
            yield group[0]
        else:
            yield group[0], group[-1]
iterable = [2, 3, 4, 5, 12, 13, 14, 15, 16, 17, 20]
list(find_ranges(iterable))
# [(2, 5), (12, 17), 20]
The source implementation emulates a classic recipe (as demonstrated by @Nadia Alramli).
Note: more_itertools is a third-party package, installable via pip install more_itertools.
The "naive" solution which I find somewhat readable atleast.
x = [2, 3, 4, 5, 12, 13, 14, 15, 16, 17, 22, 25, 26, 28, 51, 52, 57]
def group(L):
    first = last = L[0]
    for n in L[1:]:
        if n - 1 == last:  # Part of the group, bump the end
            last = n
        else:  # Not part of the group, yield the current group and start a new one
            yield first, last
            first = last = n
    yield first, last  # Yield the last group

>>> print(list(group(x)))
[(2, 5), (12, 17), (22, 22), (25, 26), (28, 28), (51, 52), (57, 57)]
Assuming your list is sorted:
>>> from itertools import groupby
>>> def ranges(lst):
...     pos = (j - i for i, j in enumerate(lst))
...     t = 0
...     for i, els in groupby(pos):
...         l = len(list(els))
...         el = lst[t]
...         t += l
...         yield range(el, el + l)
>>> lst = [2, 3, 4, 5, 12, 13, 14, 15, 16, 17]
>>> list(ranges(lst))
[range(2, 6), range(12, 18)]
Here is something that should work, without any imports needed:
def myfunc(lst):
    ret = []
    a = b = lst[0]  # a and b are the range's bounds
    for el in lst[1:]:
        if el == b + 1:
            b = el  # range grows
        else:  # range ended
            ret.append(a if a == b else (a, b))  # is it a single value or a range?
            a = b = el  # let's start again with a single value
    ret.append(a if a == b else (a, b))  # corner case for the last single/range
    return ret
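For example, calling it on the extended sample list used earlier:

print(myfunc([2, 3, 4, 5, 12, 13, 14, 15, 16, 17, 20]))
# [(2, 5), (12, 17), 20]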