Merging with more than one level overlap not allowed

So, I'm using pandas and essentially trying to calculate a normalized weight. For each day in my dataframe, I want the 'SECTOR' weight grouped by 'CAP', but the weights don't sum to 1, so I want to normalize them as well. I thought I could accomplish this by dividing two groupbys, but my code raises an error that I don't quite understand. The code runs if I remove 'CAP' from the second groupby.

Can anyone explain this to me?

df.groupby(['EFFECTIVE DATE','CAP','SECTOR'])['INDEX WEIGHT'].sum() / df.groupby(['EFFECTIVE DATE','CAP'])['INDEX WEIGHT'].sum()

NotImplementedError: merging with more than one level overlap on a multi-index is not implemented

Does anybody know what I need to change? As always, thank you!

asked Sep 12 '16 by WhitneyChia




1 Answer
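First, the reason for the error: the numerator carries a three-level MultiIndex (EFFECTIVE DATE, CAP, SECTOR) while the denominator carries only two of those levels, so dividing the two Series asks pandas to align and broadcast across two shared levels at once. Pandas only implements that broadcast for a single overlapping level (e.g. via Series.div(other, level=...)), hence the NotImplementedError. A minimal sketch of the mismatch:

num = df.groupby(['EFFECTIVE DATE', 'CAP', 'SECTOR'])['INDEX WEIGHT'].sum()  # 3-level index
den = df.groupby(['EFFECTIVE DATE', 'CAP'])['INDEX WEIGHT'].sum()            # 2-level index
# num / den must align on both 'EFFECTIVE DATE' and 'CAP', which pandas
# does not implement for more than one overlapping level
num / den  # raises NotImplementedError

Both options below sidestep the alignment instead of relying on it.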

Option 1
Very close to what you had

cols = ['EFFECTIVE DATE', 'CAP', 'SECTOR', 'INDEX WEIGHT']
# sum of INDEX WEIGHT per (date, cap, sector)
sector_sum = df.groupby(cols[:3])[cols[-1]].sum()
# transform broadcasts each (date, cap) total back to every row of df;
# .values drops the index so the division below is positional, not aligned
cap_sum = df.groupby(cols[:2])[cols[-1]].transform(pd.Series.sum).values
sector_sum / cap_sum

Note that the positional division only lines up because each (EFFECTIVE DATE, CAP, SECTOR) combination appears exactly once in df, as in the sample below; with repeated combinations the two sides would have different lengths.

Option 2
Use a single transform

cols = ['EFFECTIVE DATE', 'CAP', 'SECTOR', 'INDEX WEIGHT']
# divide each value by the sum of its group, so each group sums to 1
sumto = lambda x: x / x.sum()
# sum per (date, cap, sector), then normalize within each (date, cap)
df.groupby(cols[:3])[cols[-1]].sum().groupby(level=cols[:2]).transform(sumto)
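The same idea can also be written without the lambda, by dividing by a broadcast sub-group total (a sketch of the same approach; the name normalized is just for illustration):

s = df.groupby(['EFFECTIVE DATE', 'CAP', 'SECTOR'])['INDEX WEIGHT'].sum()
# transform('sum') broadcasts each (date, cap) total across its members
normalized = s / s.groupby(level=['EFFECTIVE DATE', 'CAP']).transform('sum')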

Consider the df

df = pd.DataFrame([
        [0, 'Large', 'A', .1, 'a'],
        [0, 'Large', 'B', .2, 'b'],
        [0, 'Large', 'C', .1, 'c'],
        [0, 'Large', 'D', .3, 'd'],
        [0, 'Large', 'E', .1, 'e'],
        [0, 'Large', 'F', .4, 'f'],
        [0, 'Large', 'G', .1, 'g'],
        [0, 'Small', 'A', .2, 'h'],
        [0, 'Small', 'B', .3, 'i'],
        [0, 'Small', 'C', .4, 'j'],
        [0, 'Small', 'D', .5, 'k'],
        [0, 'Small', 'E', .1, 'l'],
        [0, 'Small', 'F', .2, 'm'],
        [0, 'Small', 'G', .1, 'n'],
        [1, 'Large', 'A', .1, 'a'],
        [1, 'Large', 'B', .2, 'b'],
        [1, 'Large', 'C', .1, 'c'],
        [1, 'Large', 'D', .3, 'd'],
        [1, 'Large', 'E', .1, 'e'],
        [1, 'Large', 'F', .4, 'f'],
        [1, 'Large', 'G', .1, 'g'],
        [1, 'Small', 'A', .2, 'h'],
        [1, 'Small', 'B', .3, 'i'],
        [1, 'Small', 'C', .4, 'j'],
        [1, 'Small', 'D', .5, 'k'],
        [1, 'Small', 'E', .1, 'l'],
        [1, 'Small', 'F', .2, 'm'],
        [1, 'Small', 'G', .1, 'n'],
    ], columns=['EFFECTIVE DATE', 'CAP', 'SECTOR', 'INDEX WEIGHT', 'ID'])

Both options produce

EFFECTIVE DATE  CAP    SECTOR
0               Large  A         0.076923
                       B         0.153846
                       C         0.076923
                       D         0.230769
                       E         0.076923
                       F         0.307692
                       G         0.076923
                Small  A         0.111111
                       B         0.166667
                       C         0.222222
                       D         0.277778
                       E         0.055556
                       F         0.111111
                       G         0.055556
1               Large  A         0.076923
                       B         0.153846
                       C         0.076923
                       D         0.230769
                       E         0.076923
                       F         0.307692
                       G         0.076923
                Small  A         0.111111
                       B         0.166667
                       C         0.222222
                       D         0.277778
                       E         0.055556
                       F         0.111111
                       G         0.055556
Name: INDEX WEIGHT, dtype: float64

If you assign the result of either option to df1 and then sum the sub-groups

df1.groupby(level=['EFFECTIVE DATE', 'CAP']).sum()


EFFECTIVE DATE  CAP  
0               Large    1.0
                Small    1.0
1               Large    1.0
                Small    1.0
Name: INDEX WEIGHT, dtype: float64
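As a quick programmatic check of the same fact (a minimal sketch):

import numpy as np
# every (EFFECTIVE DATE, CAP) sub-group of normalized weights should sum to 1
assert np.allclose(df1.groupby(level=['EFFECTIVE DATE', 'CAP']).sum(), 1.0)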

Timing

(timing comparison plot omitted)

answered Sep 29 '22 by piRSquared