I am trying to tidy up an existing pandas DataFrame.
I have a CSV file that I import; it looks something like the following (a simplified version):
trial_num trial_name unit_1_act unit_2_act unit_3_act unit_4_act
0 face 0.0 0.000000 0.272244 0.006428
1 face 0.0 0.000000 0.898450 0.000000
2 face 0.0 0.893845 0.000000 0.000000
3 scene 0.0 0.879367 0.000000 0.006312
4 scene 0.0 0.000000 0.000000 0.000000
In this form, each row holds multiple observations (each "unit_X_act" column is a separate observation). I want to separate these out so that there is one observation per row.
In other words, rather than having columns titled "unit_1_act," "unit_2_act," and so on, I would like one column titled "unit number," whose value can be "unit_1," "unit_2," etc., and one column titled "activity," holding the value that was previously under each of the "unit_X_act" columns.
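The wide-to-long reshape described above maps directly onto pd.melt. A minimal runnable sketch, using the sample data from the question (the column names unit_number and activity are the ones the question asks for):

```python
import pandas as pd

# Rebuild the sample frame from the question.
df = pd.DataFrame({
    'trial_num': [0, 1, 2, 3, 4],
    'trial_name': ['face', 'face', 'face', 'scene', 'scene'],
    'unit_1_act': [0.0, 0.0, 0.0, 0.0, 0.0],
    'unit_2_act': [0.0, 0.0, 0.893845, 0.879367, 0.0],
    'unit_3_act': [0.272244, 0.898450, 0.0, 0.0, 0.0],
    'unit_4_act': [0.006428, 0.0, 0.0, 0.006312, 0.0],
})

# melt keeps id_vars as-is and turns every other column into
# (variable, value) pairs -- one observation per row.
long = df.melt(id_vars=['trial_num', 'trial_name'],
               var_name='unit_number', value_name='activity')

# Strip the '_act' suffix so unit_number reads 'unit_1', 'unit_2', ...
long['unit_number'] = long['unit_number'].str.replace('_act', '', regex=False)
```

This yields 20 rows (5 trials × 4 units) with the columns trial_num, trial_name, unit_number, and activity.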
You could also use stack after renaming your columns.
df.set_index(['trial_num', 'trial_name'], inplace=True)
df.stack().reset_index()
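Spelling the stack approach out as a runnable sketch (the renaming step drops the '_act' suffix so the stacked level holds clean unit labels; the final column names are an assumption matching what the question asked for):

```python
import pandas as pd

df = pd.DataFrame({
    'trial_num': [0, 1, 2],
    'trial_name': ['face', 'face', 'face'],
    'unit_1_act': [0.0, 0.0, 0.0],
    'unit_2_act': [0.0, 0.0, 0.893845],
})

# Rename first: 'unit_1_act' -> 'unit_1', etc.
df.columns = df.columns.str.replace('_act', '', regex=False)
df = df.set_index(['trial_num', 'trial_name'])

# stack() moves the column labels into the row index, producing one
# observation per row; reset_index() turns the index levels back
# into ordinary columns.
out = df.stack().reset_index()
out.columns = ['trial_num', 'trial_name', 'unit_number', 'activity']
```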
We can also use the pd.lreshape() method:
In [74]: x = np.repeat(df.columns[df.columns.str.contains(r'^unit_')].str.replace('_act','').values,
...: len(df))
...:
...: pd.lreshape(df, {'activity': df.columns[df.columns.str.contains(r'^unit_')]}) \
...: .assign(unit_number=x)
...:
Out[74]:
trial_name trial_num activity unit_number
0 face 0 0.000000 unit_1
1 face 1 0.000000 unit_1
2 face 2 0.000000 unit_1
3 scene 3 0.000000 unit_1
4 scene 4 0.000000 unit_1
5 face 0 0.000000 unit_2
6 face 1 0.000000 unit_2
7 face 2 0.893845 unit_2
8 scene 3 0.879367 unit_2
9 scene 4 0.000000 unit_2
10 face 0 0.272244 unit_3
11 face 1 0.898450 unit_3
12 face 2 0.000000 unit_3
13 scene 3 0.000000 unit_3
14 scene 4 0.000000 unit_3
15 face 0 0.006428 unit_4
16 face 1 0.000000 unit_4
17 face 2 0.000000 unit_4
18 scene 3 0.006312 unit_4
19 scene 4 0.000000 unit_4
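Note that pd.lreshape is undocumented; pd.wide_to_long does the same stubname-based reshape through a public API. A sketch under the assumption that we first rename unit_1_act to unit_act_1 and so on, since wide_to_long expects columns shaped like stub + separator + suffix:

```python
import pandas as pd

df = pd.DataFrame({
    'trial_num': [0, 1],
    'trial_name': ['face', 'face'],
    'unit_1_act': [0.0, 0.272244],
    'unit_2_act': [0.893845, 0.0],
})

# Reorder name parts: 'unit_1_act' -> 'unit_act_1', so the numeric
# suffix comes last as wide_to_long requires.
df = df.rename(columns=lambda c: 'unit_act_' + c.split('_')[1]
               if c.startswith('unit_') else c)

# i = identifier columns, j = name of the new suffix column.
long = pd.wide_to_long(df, stubnames='unit_act', sep='_',
                       i=['trial_num', 'trial_name'], j='unit_number')
long = long.reset_index().rename(columns={'unit_act': 'activity'})
```

Here unit_number comes out as the integer suffix (1, 2, ...) rather than the 'unit_1' strings produced above.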