After getting SPY data from Yahoo I create a channel of closing prices using rolling-window max and min, as you can see below. The columns are HC and LC.
I need to create a column (which I have called Flag) that shows 1 when the close price equals HC, and that value persists until the close equals LC, at which point Flag becomes -1. As you can see it is very simple: Flag can take just two values, 1 or -1.
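The desired latching behaviour can be illustrated on a toy series (hypothetical prices, not the actual SPY data), using a mask plus a forward fill:

```python
import numpy as np
import pandas as pd

# Hypothetical closing prices with a 3-period rolling channel
close = pd.Series([10, 12, 15, 14, 13, 11, 10, 12, 16, 17], dtype=float)
hc = close.rolling(3).max()
lc = close.rolling(3).min()

# 1 where Close touches the channel top, -1 where it touches the bottom,
# NaN elsewhere; forward-filling then carries the last value in between.
flag = pd.Series(np.where(close == hc, 1.0,
                 np.where(close == lc, -1.0, np.nan)))
flag = flag.ffill()
# latched values from index 2 on: 1, 1, -1, -1, -1, 1, 1, 1
print(flag.tolist())
```

The first two entries stay NaN because the rolling window is not yet full, so there is nothing to latch from.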
The simple formula would be something like:
I have tried several things, including the code below, with no luck. The problem with this code is that it also produces 0, and I don't know how to make that value disappear through a conditional:
import pandas as pd
import pandas_datareader as dr
import numpy as np
from datetime import date
df = dr.data.get_data_yahoo('SPY',start='01-01-2019',end=date.today())
df['HC'] = df['Close'].rolling(20).max()
df['LC'] = df['Close'].rolling(20).min()
df['Flag'] = [1 if (df.loc[ei, 'Close'] == df.loc[ei, 'HC']) else
              -1 if (df.loc[ei, 'Close'] == df.loc[ei, 'LC']) else
              0 for ei in df.index]
Below you can see in blue the results of my code and in red the result that I need.
Is there a simple way to do this? I would appreciate if someone could help me with it. Thank you!
Although this has already been answered, the fastest way to compute such a result is usually np.where, as follows:
import pandas as pd
import pandas_datareader as dr
import numpy as np
from datetime import date
df = dr.data.get_data_yahoo('SPY',start='01-01-2019',end=date.today())
df['HC'] = df['Close'].rolling(20).max()
df['LC'] = df['Close'].rolling(20).min()
The nested logic is expressed in the following:
df['Flag'] = np.where(df.Close == df.HC, 1,
                      np.where(df.Close == df.LC, -1,
                               np.full(df.Close.count(), np.nan)))
df.Flag.fillna(method='ffill', inplace=True)
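The nested np.where pattern itself can be sanity-checked on a small array (made-up values). Note that passing np.nan directly as the final fallback works just as well as allocating a full array with np.full:

```python
import numpy as np

close = np.array([5.0, 9.0, 3.0, 7.0, 9.0])
hc = np.array([9.0, 9.0, 9.0, 9.0, 9.0])
lc = np.array([3.0, 3.0, 3.0, 3.0, 3.0])

# 1 at the channel top, -1 at the bottom, NaN elsewhere
flag = np.where(close == hc, 1.0,
                np.where(close == lc, -1.0, np.nan))
# flag holds the values nan, 1, -1, nan, 1
print(flag)
```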
In terms of performance:
%%timeit
df['Flag'] = np.where(df.Close == df.HC, 1,
                      np.where(df.Close == df.LC, -1,
                               np.full(df.Close.count(), np.nan)))
df.Flag.fillna(method='ffill', inplace=True)
912 µs ± 49.2 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
This is definitely better than loops or nested if conditions.
For example, compare with @Tim Mironov's answer:
%%timeit
pos_indexes = (df.Close == df.HC)
neg_indexes = (df.Close == df.LC)
df.loc[pos_indexes, 'Good_Flag'] = 1
df.loc[neg_indexes, 'Good_Flag'] = -1
df.fillna(method='ffill', inplace=True)
4.43 ms ± 92 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
You can use more of Pandas' built-in functionality, specifically the fillna method and logical (boolean) indexing.
I've added a portion of code to your original one to create an additional Good_Flag column:
import pandas_datareader as dr
from datetime import date
df = dr.data.get_data_yahoo('SPY',start='01-01-2019',end=date.today())
df['HC'] = df['Close'].rolling(20).max()
df['LC'] = df['Close'].rolling(20).min()
df['Flag'] = [1 if (df.loc[ei, 'Close'] == df.loc[ei, 'HC']) else
              -1 if (df.loc[ei, 'Close'] == df.loc[ei, 'LC']) else
              0 for ei in df.index]
pos_indexes = df.Close == df.HC
neg_indexes = df.Close == df.LC
df.loc[pos_indexes, 'Good_Flag'] = 1
df.loc[neg_indexes, 'Good_Flag'] = -1
df = df.fillna(method='ffill')
Note that I used the fillna method with the ffill argument to specify a forward fill.
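To illustrate what the forward fill does on its own (toy values): every NaN is replaced by the most recent non-NaN value before it, which is exactly what latches the flag between channel touches.

```python
import numpy as np
import pandas as pd

s = pd.Series([1.0, np.nan, np.nan, -1.0, np.nan])
# ffill() propagates the last non-NaN value forward; it is the modern
# spelling of the older fillna(method='ffill') used in the answers above.
print(s.ffill().tolist())  # [1.0, 1.0, 1.0, -1.0, -1.0]
```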
Edit:
To make things clear, the original Flag column is left here on purpose; the calculation of the new Good_Flag column doesn't rely on Flag.
To demonstrate the desired behaviour:
import matplotlib.pyplot as plt
plt.plot(df.Flag, color='blue')
plt.plot(df.Good_Flag, color='red')
plt.legend(['Flag', 'Good_Flag'])
plt.show()