I have a dataframe that looks like this:
Date AAPL TSLA NESN FB ROCH TOT VISA JPM
2/1/2019 157.92 310.12 80.17 135.68 30.79 52.79 132.92 99.31
3/1/2019 142.19 300.36 82.21 131.74 31.48 52.91 128.13 97.11
4/1/2019 148.26 317.69 83.59 137.95 31.80 54.46 133.65 100.69
7/1/2019 147.93 334.96 82.71 138.05 31.52 54.36 136.06 100.76
8/1/2019 150.75 335.35 82.97 142.53 32.35 54.19 136.80 100.57
9/1/2019 153.31 338.53 83.74 144.23 32.38 55.38 138.41 100.40
10/1/2019 153.80 344.97 84.39 144.20 32.60 54.96 138.67 100.39
11/1/2019 152.29 347.26 84.51 143.80 32.41 53.89 138.06 99.91
14/1/2019 150.00 334.40 84.43 145.39 32.33 54.30 137.10 100.94
15/1/2019 153.07 344.43 84.69 148.95 32.49 53.81 137.34 101.68
16/1/2019 154.94 346.05 84.01 147.54 32.44 53.60 137.34 102.50
17/1/2019 155.86 347.31 84.52 148.30 32.42 53.76 137.28 102.92
18/1/2019 156.82 302.26 84.63 150.04 32.61 54.56 138.50 104.59
22/1/2019 153.30 298.92 84.17 147.57 32.21 53.48 138.05 102.94
23/1/2019 153.92 287.59 84.65 144.30 32.48 52.95 137.01 102.68
24/1/2019 152.70 291.51 85.13 145.83 32.07 53.34 137.70 102.74
25/1/2019 157.76 297.04 84.54 149.01 32.06 54.21 138.67 103.39
28/1/2019 156.30 296.38 85.30 147.47 32.13 53.59 135.99 103.88
29/1/2019 154.68 297.46 86.75 144.19 32.27 54.07 135.00 104.17
30/1/2019 165.25 308.77 87.06 150.42 32.28 54.17 137.60 104.41
and I would like to obtain another dataframe with the log returns [ln(price(t)/price(t-1))] that should look like this:
Date Ln_AAPL Ln_TSLA Ln_NESN Ln_FB Ln_ROCH Ln_TOT Ln_VISA Ln_JPM
2/1/2019 NaN NaN NaN NaN NaN NaN NaN NaN
3/1/2019 -10.49% -3.20% 2.51% -2.95% 2.22% 0.23% -3.67% -2.24%
4/1/2019 4.18% 5.61% 1.66% 4.61% 1.01% 2.89% 4.22% 3.62%
7/1/2019 -0.22% 5.29% -1.06% 0.07% -0.88% -0.18% 1.79% 0.07%
8/1/2019 1.89% 0.12% 0.31% 3.19% 2.60% -0.31% 0.54% -0.19%
9/1/2019 1.68% 0.94% 0.92% 1.19% 0.09% 2.17% 1.17% -0.17%
10/1/2019 0.32% 1.88% 0.77% -0.02% 0.68% -0.76% 0.19% -0.01%
11/1/2019 -0.99% 0.66% 0.14% -0.28% -0.58% -1.97% -0.44% -0.48%
14/1/2019 -1.52% -3.77% -0.09% 1.10% -0.25% 0.76% -0.70% 1.03%
15/1/2019 2.03% 2.96% 0.31% 2.42% 0.49% -0.91% 0.17% 0.73%
16/1/2019 1.21% 0.47% -0.81% -0.95% -0.15% -0.39% 0.00% 0.80%
17/1/2019 0.59% 0.36% 0.61% 0.51% -0.06% 0.30% -0.04% 0.41%
18/1/2019 0.61% -13.89% 0.13% 1.17% 0.58% 1.48% 0.88% 1.61%
22/1/2019 -2.27% -1.11% -0.55% -1.66% -1.23% -2.00% -0.33% -1.59%
23/1/2019 0.40% -3.86% 0.57% -2.24% 0.83% -1.00% -0.76% -0.25%
24/1/2019 -0.80% 1.35% 0.57% 1.05% -1.27% 0.73% 0.50% 0.06%
25/1/2019 3.26% 1.88% -0.70% 2.16% -0.03% 1.62% 0.70% 0.63%
28/1/2019 -0.93% -0.22% 0.89% -1.04% 0.22% -1.15% -1.95% 0.47%
29/1/2019 -1.04% 0.36% 1.69% -2.25% 0.43% 0.89% -0.73% 0.28%
30/1/2019 6.61% 3.73% 0.36% 4.23% 0.03% 0.18% 1.91% 0.23%
I was able to do it only for one column at a time, appending each result. I was wondering if there is a way to apply it to the whole df and directly create a new one.
df['AAPL_log'] = np.log(df['AAPL'] / df['AAPL'].shift(1))
Thanks for your help.
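(For a reproducible setup, the data can be loaded roughly along these lines; 'prices.csv' is a placeholder file name, and dayfirst=True handles the day/month/year dates so that Date becomes the index, as in the outputs below:)
import numpy as np
import pandas as pd

# Placeholder file name; adjust to the actual data source.
df = pd.read_csv('prices.csv', parse_dates=['Date'], dayfirst=True, index_col='Date')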
Another way, using diff:
new_df = np.log(df).diff()
print(new_df)
Output
AAPL TSLA NESN FB ROCH TOT \
Date
2/1/2019 NaN NaN NaN NaN NaN NaN
3/1/2019 -0.104924 -0.031978 0.025128 -0.029469 0.022162 0.002271
4/1/2019 0.041803 0.056094 0.016647 0.046061 0.010114 0.028874
7/1/2019 -0.002228 0.052935 -0.010583 0.000725 -0.008844 -0.001838
8/1/2019 0.018884 0.001164 0.003139 0.031937 0.025992 -0.003132
9/1/2019 0.016839 0.009438 0.009238 0.011857 0.000927 0.021722
10/1/2019 0.003191 0.018845 0.007732 -0.000208 0.006771 -0.007613
11/1/2019 -0.009866 0.006616 0.001421 -0.002778 -0.005845 -0.019661
14/1/2019 -0.015151 -0.037736 -0.000947 0.010996 -0.002471 0.007579
15/1/2019 0.020260 0.029553 0.003075 0.024191 0.004937 -0.009065
16/1/2019 0.012143 0.004692 -0.008062 -0.009511 -0.001540 -0.003910
17/1/2019 0.005920 0.003634 0.006052 0.005138 -0.000617 0.002981
18/1/2019 0.006140 -0.138930 0.001301 0.011665 0.005843 0.014771
22/1/2019 -0.022702 -0.011112 -0.005450 -0.016599 -0.012342 -0.019993
23/1/2019 0.004036 -0.038640 0.005687 -0.022408 0.008348 -0.009960
24/1/2019 -0.007958 0.013538 0.005654 0.010547 -0.012704 0.007338
25/1/2019 0.032600 0.018792 -0.006955 0.021572 -0.000312 0.016179
28/1/2019 -0.009298 -0.002224 0.008950 -0.010389 0.002181 -0.011503
29/1/2019 -0.010419 0.003637 0.016856 -0.022493 0.004348 0.008917
30/1/2019 0.066101 0.037317 0.003567 0.042300 0.000310 0.001848
VISA JPM
Date
2/1/2019 NaN NaN
3/1/2019 -0.036702 -0.022402
4/1/2019 0.042179 0.036202
7/1/2019 0.017872 0.000695
8/1/2019 0.005424 -0.001887
9/1/2019 0.011700 -0.001692
10/1/2019 0.001877 -0.000100
11/1/2019 -0.004409 -0.004793
14/1/2019 -0.006978 0.010257
15/1/2019 0.001749 0.007304
16/1/2019 0.000000 0.008032
17/1/2019 -0.000437 0.004089
18/1/2019 0.008848 0.016096
22/1/2019 -0.003254 -0.015902
23/1/2019 -0.007562 -0.002529
24/1/2019 0.005023 0.000584
25/1/2019 0.007020 0.006307
28/1/2019 -0.019516 0.004728
29/1/2019 -0.007307 0.002788
30/1/2019 0.019076 0.002301
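If you also want the Ln_ column prefixes and the percentage formatting from the desired output, one possible sketch (the formatted frame holds strings, so keep new_df itself for any further calculations; pd refers to pandas):
formatted = new_df.add_prefix('Ln_').applymap(
    lambda x: f'{x:.2%}' if pd.notna(x) else 'NaN'
)
print(formatted)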
Sure, just remove the column selection and apply it to the whole DataFrame:
df1 = np.log(df/df.shift(1))
#alternative for lower pandas versions
#df1 = pd.DataFrame(np.log(df/df.shift(1)), index=df.index, columns=df.columns)
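This is algebraically the same as the diff-based version, since ln(p_t / p_(t-1)) = ln(p_t) - ln(p_(t-1)); a quick sanity check:
assert np.allclose(np.log(df / df.shift(1)), np.log(df).diff(), equal_nan=True)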
Another idea with DataFrame.pct_change:
df1 = np.log(df.pct_change().add(1))
print(df1)
The output is identical to the diff-based version shown above.
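Since pct_change already returns p_t / p_(t-1) - 1, np.log1p gives the same result in one step (an equivalent variant of the same idea):
df1 = np.log1p(df.pct_change())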