Using more than one function in a groupby-aggregate results in a multi-index, which I then want to flatten.
Example:
import numpy as np
import pandas as pd

df = pd.DataFrame(
    {'A': [1, 1, 1, 2, 2, 2, 3, 3, 3],
     'B': np.random.random(9),
     'C': np.random.random(9)}
)
out = df.groupby('A').agg({'B': [np.mean, np.std], 'C': np.median})
# example output
B C
mean std median
A
1 0.791846 0.091657 0.394167
2 0.156290 0.202142 0.453871
3 0.482282 0.382391 0.892514
Currently, I do it manually, like this:
out.columns = ['B_mean', 'B_std', 'C_median']
which gives me the result I want:
B_mean B_std C_median
A
1 0.791846 0.091657 0.394167
2 0.156290 0.202142 0.453871
3 0.482282 0.382391 0.892514
but I'm looking for a way to automate this process, as it is monotonous, time-consuming, and prone to typos as I rename the columns.
Is there a way to return a flattened index instead of a multi-index when doing a groupby-aggregate?
I need to flatten the columns to save to a text file, which will then be read by a different program that doesn't handle multi-indexed columns.
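For context, the downstream step is just a plain text dump, something like the line below (the filename is only illustrative):
out.to_csv('aggregated.txt')  # flat column names become a single header row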
Since pandas 0.24.0, you can just use to_flat_index:
out.columns = [f"{x}_{y}" for x, y in out.columns.to_flat_index()]
B_mean B_std C_median
A
1 0.779592 0.137168 0.583211
2 0.158010 0.229234 0.550383
3 0.186771 0.150575 0.313409
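For reference, to_flat_index on its own just collapses the MultiIndex into a flat Index of tuples, which is why the comprehension above joins each pair:
out.columns.to_flat_index()
# Index([('B', 'mean'), ('B', 'std'), ('C', 'median')], dtype='object')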
You can map '_'.join over the columns:
out.columns = out.columns.map('_'.join)
out
Out[23]:
B_mean B_std C_median
A
1 0.204825 0.169408 0.926347
2 0.362184 0.404272 0.224119
3 0.533502 0.380614 0.218105
When a column level contains non-string labels (ints, for example), '_'.join raises a TypeError, so in that case I like this way better:
out.columns.map('{0[0]}_{0[1]}'.format)
Out[27]: Index(['B_mean', 'B_std', 'C_median'], dtype='object')
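As with the other approaches, you can assign the mapped result back to flatten the columns in place:
out.columns = out.columns.map('{0[0]}_{0[1]}'.format)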