I'm trying to merge one large dataframe with a small dataframe based on the first occurrences of a pair of criteria (month and num).
I've cobbled together viable code (actual / desired output at bottom), but it seems like it could be far more efficient.
My question is - am I missing a significantly easier way to do this?
Setup:
import pandas as pd

# Large frame: 27 rows of repeating (month, num) pairs plus a dummy column x
m = [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,2,2,2,2,2,2,2,2,2,2,2]
n = [1,1,1,20,20,300,300,20,20,1,1,1,20,300,20,1,1,1,20,20,300,300,300,20,20,1,1]
df = pd.DataFrame({'month':m, 'num':n, 'x':0})

# Small frame: one precomputed sum per (month, num) pair
m2 = [1,1,1,2,2,2]
n2 = [1,20,300,1,20,300]
s2 = [11,222,3333,44,555,6666]
df2 = pd.DataFrame({'month':m2, 'num':n2, 'sum':s2})
Current code:
# x is all zeros, so idxmax returns the first index within each (month, num) group
dfx = pd.DataFrame(df.groupby(['month','num'])['x'].idxmax())
dfx.rename(columns = {'x':'find'}, inplace = True)
# Attach the first-occurrence indices to df2, then left-merge back into df
# so only first occurrences pick up a 'sum'; every other row gets NaN
df2.set_index(['month','num'], inplace = True)
df2 = pd.merge(df2, dfx, left_index = True, right_index = True)
df = df.merge(df2, left_index = True, right_on = 'find', how = 'left')
df = df.drop(['find','x'], axis = 1).reset_index(drop = True).fillna(0)
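For reference, the intermediate dfx built above maps each (month, num) pair to the index of its first occurrence in df; with the setup data it comes out as:
print(dfx)
           find
month num
1     1       0
      20      3
      300     5
2     1      16
      20     18
      300    20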
Output:
month num sum
0 1 1 11.0
1 1 1 0.0
2 1 1 0.0
3 1 20 222.0
4 1 20 0.0
5 1 300 3333.0
6 1 300 0.0
7 1 20 0.0
8 1 20 0.0
9 1 1 0.0
10 1 1 0.0
11 1 1 0.0
12 1 20 0.0
13 1 300 0.0
14 1 20 0.0
15 1 1 0.0
16 2 1 44.0
17 2 1 0.0
18 2 20 555.0
19 2 20 0.0
20 2 300 6666.0
21 2 300 0.0
22 2 300 0.0
23 2 20 0.0
24 2 20 0.0
25 2 1 0.0
26 2 1 0.0
If I understand correctly, you can do a regular merge between your two DataFrames, then combine loc and duplicated to zero out non-first occurrences:
# A plain left merge fills 'sum' for every matching row
df3 = df.merge(df2, how='left', on=['month', 'num'])
# duplicated() marks every row after the first (month, num) occurrence
df3.loc[df3.duplicated(subset=['month', 'num']), 'sum'] = 0
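If you prefer a single assignment, the same zeroing step can also be written with Series.where (a minimal equivalent sketch, relying on duplicated's default keep='first'):
df3['sum'] = df3['sum'].where(~df3.duplicated(subset=['month', 'num']), 0)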
The resulting output:
month num sum
0 1 1 11
1 1 1 0
2 1 1 0
3 1 20 222
4 1 20 0
5 1 300 3333
6 1 300 0
7 1 20 0
8 1 20 0
9 1 1 0
10 1 1 0
11 1 1 0
12 1 20 0
13 1 300 0
14 1 20 0
15 1 1 0
16 2 1 44
17 2 1 0
18 2 20 555
19 2 20 0
20 2 300 6666
21 2 300 0
22 2 300 0
23 2 20 0
24 2 20 0
25 2 1 0
26 2 1 0
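A side note on dtypes: since every (month, num) pair in df also exists in df2, the left merge here introduces no NaN, so sum keeps its integer dtype (11 rather than 11.0). The original approach upcasts to float because all non-first rows are NaN before the fillna(0).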