How do you iterate over a pandas Series generated from a .groupby('...').size() command and get both the group name and the count?
As an example, if I have:

foo
-1     7
 0    85
 1    14
 2     5

how can I loop over them so that in each iteration I would have -1 & 7, 0 & 85, 1 & 14 and 2 & 5 in variables?
I tried the enumerate option but it doesn't quite work. Example:
for i, row in enumerate(df.groupby(['foo']).size()):
    print(i, row)

It doesn't return -1, 0, 1, and 2 for i, but rather 0, 1, 2, 3.
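For context, here is a minimal sketch of a setup that reproduces the Series above (the raw values in df are made up; only the group sizes matter):

import pandas as pd

# Hypothetical raw data whose 'foo' column has the frequencies shown above
df = pd.DataFrame({'foo': [-1] * 7 + [0] * 85 + [1] * 14 + [2] * 5})

counts = df.groupby('foo').size()
print(counts)
# foo
# -1     7
#  0    85
#  1    14
#  2     5
# dtype: int64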
Update:
Given a pandas Series:
import pandas as pd

s = pd.Series([1, 2, 3, 4], index=['a', 'b', 'c', 'd'])
s
#a    1
#b    2
#c    3
#d    4
#dtype: int64
You can directly loop through it, which yields one value from the Series in each iteration:
for i in s:
    print(i)
#1
#2
#3
#4
If you want to access the index at the same time, you can use either the items or the iteritems method, both of which produce a generator yielding (index, value) pairs:
for i, v in s.items():
    print('index: ', i, 'value: ', v)
#index: a value: 1
#index: b value: 2
#index: c value: 3
#index: d value: 4

for i, v in s.iteritems():
    print('index: ', i, 'value: ', v)
#index: a value: 1
#index: b value: 2
#index: c value: 3
#index: d value: 4
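This also explains the enumerate behaviour from the question: enumerate numbers the values positionally (0, 1, 2, ...) and ignores the Series index entirely, while items yields the actual index labels. A quick sketch with the same s:

# enumerate: positional counter, not the index
for pos, v in enumerate(s):
    print(pos, v)
#0 1
#1 2
#2 3
#3 4

# items: the real index labels
for label, v in s.items():
    print(label, v)
#a 1
#b 2
#c 3
#d 4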
Old Answer:
You can call the iteritems() method on the Series:
for i, row in df.groupby('a').size().iteritems():
    print(i, row)
# 12 4
# 14 2
According to the docs:
Series.iteritems()
Lazily iterate over (index, value) tuples
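Note that Series.iteritems() was deprecated in pandas 1.5 and removed in pandas 2.0, so on current versions the same loop should use items() instead. A sketch, assuming the same demo DataFrame df as in the snippet above:

# pandas >= 2.0: iteritems() is gone, items() yields the same (index, value) pairs
for i, row in df.groupby('a').size().items():
    print(i, row)
# 12 4
# 14 2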
Note: This is not the same data as in the question, just a demo.