I'd like to calculate a rolling max of a pandas column, where the window size varies: it is the distance between the current row and the previous row where a certain condition was met.
So, as an example, I have:
import pandas as pd
import numpy as np
df = pd.DataFrame({'a': [0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0],
                   'b': [5, 4, 3, 6, 1, 2, 3, 4, 2, 1, 7, 8]})
I want the rolling max of df.b since the previous time df.a == 1. I.e. I want to get this:
    a  b   rm
0   0  5  NaN   <- no previous a==1
1   1  4    4   <- a==1
2   0  3    4
3   0  6    6
4   0  1    6
5   1  2    2   <- a==1
6   0  3    3
7   0  4    4
8   0  2    4
9   0  1    4
10  1  7    7   <- a==1
11  0  8    8
My df has an integer index without gaps, so I tried to do this:
df['last_a'] = np.where(df.a == 1, df.index, np.nan)
df['last_a'].fillna(method='ffill', inplace=True)
df['rm'] = pd.rolling_max(df['b'], window = df.index - df['last_a'] + 1)
but I'm getting a TypeError: an integer is required.
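As an aside, pd.rolling_max has since been removed from pandas; its replacement, the .rolling() method, in its basic form likewise expects a single fixed window (an integer or a time offset) rather than a per-row array, which is why a window like df.index - df['last_a'] + 1 is rejected. A minimal illustration with an assumed fixed window of 3:
df['b'].rolling(window=3, min_periods=1).max()  # fixed window; illustrative only
So a variable, per-row window needs a different approach.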
This is part of a long script operating on quite a big data frame, so I need the fastest solution possible. I have managed to do this with a loop instead of rolling_max, but it's very slow. Could you please help?
Just for reference: the ugly and long loop that I have now, which, regardless of its ugliness, seems to be quite fast on my data frame (50,000 x 25 for a test), is as follows:
df['rm2'] = df.b
df['rm1'] = np.where((df['a'] == 1) | (df['rm2'].diff() > 0), df['rm2'], np.nan)
df['rm1'].fillna(method='ffill', inplace=True)
df['Dif'] = (df['rm1'] - df['rm2']).abs()
# keep propagating the running max forward until nothing changes any more
while df['Dif'].sum() != 0:
    df['rm2'] = df['rm1']
    df['rm1'] = np.where((df['a'] == 1) | (df['rm2'].diff() > 0), df['rm2'], np.nan)
    df['rm1'].fillna(method='ffill', inplace=True)
    df['Dif'] = (df['rm1'] - df['rm2']).abs()
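On the 12-row example above this converges after a couple of passes, and rm1 matches the rm column shown earlier. A quick sanity check, just a sketch using pandas' testing helper with the expected values typed out by hand:
expected = pd.Series([np.nan, 4, 4, 6, 6, 2, 3, 4, 4, 4, 7, 8], name='rm1')
pd.testing.assert_series_equal(df['rm1'], expected)  # passes silently if they match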
I would create an index and group by this index to use cummax:
import numpy as np

# each a == 1 starts a new group; rows before the first a == 1 fall in group 0
df['index'] = df['a'].cumsum()
# running max of b within each group, i.e. since the previous a == 1
df['rm'] = df.groupby('index')['b'].cummax()
# no previous a == 1 yet, so there is no rolling max for group 0
df.loc[df['index'] == 0, 'rm'] = np.nan
In [104]: df
Out[104]:
    a  b  index   rm
0   0  5      0  NaN
1   1  4      1    4
2   0  3      1    4
3   0  6      1    6
4   0  1      1    6
5   1  2      2    2
6   0  3      2    3
7   0  4      2    4
8   0  2      2    4
9   0  1      2    4
10  1  7      3    7
11  0  8      3    8
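The same idea also works without keeping the helper column in the frame, in case the extra column is unwanted; a sketch of an equivalent formulation:
grp = df['a'].cumsum()
df['rm'] = df.groupby(grp)['b'].cummax().where(grp > 0)  # NaN before the first a == 1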