I have a DataFrame like:
               timestamp  Order     Price  Quantity
0    2019-10-09 09:15:42      0  27850.00      2040
1    2019-10-09 09:15:42      0  27850.00      1980
2    2019-10-09 09:15:53      0  27860.85      1800
3    2019-10-09 09:16:54      0  27860.85      2340
4    2019-10-09 09:18:48      0  27860.85      1500
5    2019-10-09 09:21:08      0  27979.00      1840
6    2019-10-09 09:21:08      0  27979.00      2020
7    2019-10-09 09:21:12      0  27850.00      1800
8    2019-10-09 09:21:15      0  27850.00      1580
9    2019-10-09 09:21:21     35  28000.00      1840
10   2019-10-09 09:21:23     34  28000.00      1800
11   2019-10-09 09:28:17      0  28035.00      2020
12   2019-10-09 09:28:18      0  28035.00      1960
13   2019-10-09 09:28:18      0  28035.00      1920
14   2019-10-09 09:28:24      0  28035.00      1940
15   2019-10-09 09:28:24      0  28035.00      1960
16   2019-10-09 09:28:25      0  28000.00      2140
17   2019-10-09 09:28:25      0  28000.00      2020
18   2019-10-09 09:28:26      0  28000.00      2120
When successive Price values are the same, I want to return the row with the maximum Quantity.
My expected result DataFrame:
               timestamp  Order     Price  Quantity
0    2019-10-09 09:15:42      0  27850.00      2040
3    2019-10-09 09:16:54      0  27860.85      2340
6    2019-10-09 09:21:08      0  27979.00      2020
7    2019-10-09 09:21:12      0  27850.00      1800
9    2019-10-09 09:21:21     35  28000.00      1840
11   2019-10-09 09:28:17      0  28035.00      2020
16   2019-10-09 09:28:25      0  28000.00      2140
PS: In the result table, Price 27850.00 appears again at row 7 and is treated as an independent run. The same applies to 28000.00.
First create a price_group column to identify consecutive rows with the same price (as in this answer).
price_group = (df.Price != df.Price.shift()).cumsum()
Then group the rows by this column and find the rows with max quantity for each group (as in these answers).
result = df.loc[df.Quantity.groupby(price_group).idxmax()]
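Putting the two lines together, here is a minimal self-contained sketch using toy data (hypothetical values, not the full table from the question):

```python
import pandas as pd

# Toy data: two consecutive rows at 27850.0, one at 27860.85,
# then 27850.0 again as a separate run.
df = pd.DataFrame({
    "Price":    [27850.0, 27850.0, 27860.85, 27850.0],
    "Quantity": [2040,    1980,    1800,     1800],
})

# Runs of equal consecutive prices share a group number.
price_group = (df.Price != df.Price.shift()).cumsum()

# Within each run, keep the row holding the largest Quantity.
result = df.loc[df.Quantity.groupby(price_group).idxmax()]
print(result)
```

Note that the second 27850.0 run keeps its own row, matching the "considered independently" requirement in the question.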
Something like this:
from itertools import groupby

# Split the Price column into runs of consecutive equal values
runs = [list(g) for _, g in groupby(df['Price'])]
# Label each row with the index of the run it belongs to
df['label'] = [i for i, run in enumerate(runs) for _ in run]
# This gives you a df like this:
#    Unnamed: 0  Unnamed: 1 timestamp  Order     Price  Quantity  label
# 0            0  09.10.2019   9:15:42      0  27850.00      2040      0
# 1            1  09.10.2019   9:15:42      0  27850.00      1980      0
# 2            2  09.10.2019   9:15:53      0  27860.85      1800      1
# 3            3  09.10.2019   9:16:54      0  27860.85      2340      1
# 4            4  09.10.2019   9:18:48      0  27860.85      1500      1
# 5            5  09.10.2019   9:21:08      0  27979.00      1840      2
# 6            6  09.10.2019   9:21:08      0  27979.00      2020      2
# 7            7  09.10.2019   9:21:12      0  27850.00      1800      3
# 8            8  09.10.2019   9:21:15      0  27850.00      1580      3
# 9            9  09.10.2019   9:21:21     35  28000.00      1840      4
# 10          10  09.10.2019   9:21:23     34  28000.00      1800      4
# 11          11  09.10.2019   9:28:17      0  28035.00      2020      5
# 12          12  09.10.2019   9:28:18      0  28035.00      1960      5
# 13          13  09.10.2019   9:28:18      0  28035.00      1920      5
# 14          14  09.10.2019   9:28:24      0  28035.00      1940      5
# 15          15  09.10.2019   9:28:24      0  28035.00      1960      5
# 16          16  09.10.2019   9:28:25      0  28000.00      2140      6
# 17          17  09.10.2019   9:28:25      0  28000.00      2020      6
# 18          18  09.10.2019   9:28:26      0  28000.00      2120      6
# then you are able to use groupby
df.groupby('label').max()
Out[27]: 
       Unnamed: 0  Unnamed: 1 timestamp  Order     Price  Quantity
label                                                             
0               1  09.10.2019   9:15:42      0  27850.00      2040
1               4  09.10.2019   9:18:48      0  27860.85      2340
2               6  09.10.2019   9:21:08      0  27979.00      2020
3               8  09.10.2019   9:21:15      0  27850.00      1800
4              10  09.10.2019   9:21:23     35  28000.00      1840
5              15  09.10.2019   9:28:24      0  28035.00      2020
6              18  09.10.2019   9:28:26      0  28000.00      2140
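One caveat: `groupby('label').max()` takes the maximum of each column independently, so a result row can mix values from different source rows (e.g. label 4 above shows Order 35 next to timestamp 9:21:23, which come from different rows). To select the intact row holding the largest Quantity in each run, `idxmax` can be used instead. A sketch with toy data (hypothetical values):

```python
import pandas as pd

# Toy data where the Order value belongs to the row with the larger Quantity.
df = pd.DataFrame({
    "Price":    [100.0, 100.0, 101.0, 101.0, 100.0],
    "Quantity": [5,     9,     3,     4,     7],
    "Order":    [0,     35,    0,     0,     0],
})
# Label runs of consecutive equal prices.
df["label"] = (df.Price != df.Price.shift()).cumsum()

# idxmax on Quantity returns the index of the actual best row per label,
# so all columns in the result come from the same source row.
result = df.loc[df.groupby("label").Quantity.idxmax()]
print(result)
```

Here the label-4-style mixing cannot happen: each output row is an unmodified row of the input.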