I have two time-series data frames, both containing only the values 0 and 1. The first is called init_signal and the second end_signal. The idea is to create a new data frame: whenever init_signal has a 1, find the NEXT 1 in end_signal.
The examples below show both signals merged into one data frame; the first column is init_signal and the second is end_signal.
Example 1:
2016-06-13 1 0
2016-06-14 0 0
2016-06-15 0 1
2016-06-16 0 0
Will become:
2016-06-13 1
2016-06-14 1
2016-06-15 1
2016-06-16 0
Example 2:
2016-06-13 1 1
2016-06-14 0 0
2016-06-15 0 1
2016-06-16 0 0
Will become:
2016-06-13 1
2016-06-14 1
2016-06-15 1
2016-06-16 0
In the second example, a 1 in end_signal that falls on the same row as the init_signal 1 is ignored.
I've been using a complex loop to do this. Is there a non-looping way?
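For concreteness, a loop-based version of the rule might look like the sketch below (illustrative only, not the original loop; the name signal_spans_loop is mine). It reproduces both examples above:
import pandas as pd

# Walk the rows once, tracking whether a span opened by init_signal is active.
def signal_spans_loop(df):
    result = []
    active = False
    for init, end in zip(df['init_signal'], df['end_signal']):
        if init == 1:
            active = True          # an end signal on the same row is ignored
            result.append(1)
        elif active and end == 1:
            result.append(1)       # the row carrying the end signal is included
            active = False
        else:
            result.append(int(active))
    return pd.Series(result, index=df.index)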
Here is a vectorized approach built on cumsum and groupby; I hope it covers all the boundary conditions.
import pandas as pd
import datetime as dt

# Dummy data
df = pd.DataFrame()
df['init_signal'] = [0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0]
df['end_signal'] = [0,1,0,0,0,0,1,0,1,0,0,1,0,1,0,0,0,1,0]
df.index = [dt.date.today()+dt.timedelta(-i) for i in reversed(range(len(df)))]

# Cumulative sums for the calculation: init_c labels the stretch of rows
# belonging to each init signal, and end_c counts end signals, so "the next
# end" is wherever end_c first rises above its value at the stretch's start.
df['init_c'] = df['init_signal'].cumsum()
df['end_c'] = df['end_signal'].cumsum()

# Within each stretch, keep rows while end_c is still at the group minimum,
# shifted down one so the row carrying the end signal itself is included.
# An end signal on the same row as the init is already counted into that
# minimum, which is exactly why it gets ignored.
def until_next_end(x):
    return (x == x.min()).shift().fillna(True)

df['result'] = df.groupby('init_c')['end_c'].transform(until_next_end)
# Zero out the rows before the first init signal (init_c == 0)
df['result'] = df['result'] & df['init_c'].astype(bool)
df['result'] = df['result'].astype(int)

# Dropping temp columns
df.drop(['init_c','end_c'], axis=1, inplace=True)
print(df)
Output:
init_signal end_signal result
2018-12-12 0 0 0
2018-12-13 0 1 0
2018-12-14 0 0 0
2018-12-15 1 0 1
2018-12-16 0 0 1
2018-12-17 0 0 1
2018-12-18 0 1 1
2018-12-19 0 0 0
2018-12-20 1 1 1
2018-12-21 0 0 1
2018-12-22 0 0 1
2018-12-23 0 1 1
2018-12-24 0 0 0
2018-12-25 0 1 0
2018-12-26 0 0 0
2018-12-27 0 0 0
2018-12-28 1 0 1
2018-12-29 0 1 1
2018-12-30 0 0 0
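For comparison, a fully vectorized alternative is to pair each init position with the next end position using np.searchsorted, then paint the spans with a +1/-1 difference array. This is a sketch (the column name result_alt is mine) assuming the same df as above; it matches the result column for this data, though I have not checked every overlapping-span edge case:
import numpy as np

init_pos = np.flatnonzero(df['init_signal'].to_numpy())
end_pos = np.flatnonzero(df['end_signal'].to_numpy())

# side='right' returns the first end strictly AFTER each init, which is
# exactly what makes an end signal on the same row get skipped
nxt = np.searchsorted(end_pos, init_pos, side='right')
# an init with no later end signal runs to the end of the frame
ends = np.where(nxt < len(end_pos), end_pos[np.minimum(nxt, len(end_pos) - 1)], len(df) - 1)

# +1 at each span start, -1 just past each span end, then cumsum > 0
delta = np.zeros(len(df) + 1, dtype=int)
np.add.at(delta, init_pos, 1)
np.add.at(delta, ends + 1, -1)
df['result_alt'] = (delta[:-1].cumsum() > 0).astype(int)
The difference-array trick avoids any per-row Python work; printing df afterwards should show result and result_alt agreeing.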