Working with the numpy.diff function, suppose this simple case:
>>> x = np.array([1, 2, 4, 7, 0])
>>> x_diff = np.diff(x)
>>> x_diff
array([ 1,  2,  3, -7])
How can I easily get x back to its original, un-differenced form? I suppose it involves numpy.cumsum().
Concatenate with the first element and then use cumsum -
np.r_[x[0], x_diff].cumsum()
For the concatenation, we can also use np.hstack, like so -
np.hstack((x[0], x_diff)).cumsum()
Or with np.concatenate for the concatenation -
np.concatenate(([x[0]], x_diff)).cumsum()
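As a quick sanity check (a minimal sketch, using the example array from the question), any of these variants reconstructs the original x:
>>> import numpy as np
>>> x = np.array([1, 2, 4, 7, 0])
>>> x_diff = np.diff(x)
>>> np.concatenate(([x[0]], x_diff)).cumsum()
array([1, 2, 4, 7, 0])
>>> np.array_equal(np.r_[x[0], x_diff].cumsum(), x)
True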
Since Divakar proposed a few solutions and I was wondering which one to take, here is a performance benchmark; that is why I added this answer.
Long story short - just use: np.concatenate(([x[0]], x_diff)).cumsum().
Benchmark plot: x axis = problem size, y axis = computing time for 1000 runs (log-log scale).
import timeit

import numpy as np
import matplotlib.pyplot as plt

# Candidate commands to benchmark.
cmds = [
    'np.r_[x[0], x_diff].cumsum()',
    'np.hstack((x[0], x_diff)).cumsum()',
    'np.concatenate(([x[0]], x_diff)).cumsum()',
    'csp0 = np.zeros(shape=(len(x) + 1,)); np.cumsum(x, out=csp0[1:])',
]

test_range = [1e0, 1e1, 1e2, 1e3, 1e4, 1e5, 1e6]
# test_range = [1e0, 1e1, 1e2]

# Timing results: one row per command, one column per problem size.
ts = np.empty((len(cmds), len(test_range)), dtype=float)
for tt, size_float in enumerate(test_range):
    size = round(size_float)
    print('array size:', size)
    x = np.random.randint(low=0, high=100, size=size)
    x_diff = np.diff(x)
    n_trials = 1000
    for cc, cmd in enumerate(cmds):
        t = timeit.Timer(cmd, globals={**globals(), **locals()})
        t = t.timeit(n_trials)
        ts[cc, tt] = t
        print('time for {:d}x "{:}": {:.6f}'.format(n_trials, cmd, t))

fig, ax = plt.subplots(1, 1, figsize=(15, 10))
for cc, cmd in enumerate(cmds):
    ax.plot(test_range, ts[cc, :], label=cmd)
    print(cmd)
ax.legend()
ax.set_xscale('log')
ax.set_yscale('log')
plt.show()
array size: 1
time for 1000x "np.r_[x[0], x_diff].cumsum()": 0.011935
time for 1000x "np.hstack((x[0], x_diff)).cumsum()": 0.006159
time for 1000x "np.concatenate(([x[0]], x_diff)).cumsum()": 0.003221
time for 1000x "csp0 = np.zeros(shape=(len(x) + 1,)); np.cumsum(x, out=csp0[1:])": 0.003482
array size: 10
time for 1000x "np.r_[x[0], x_diff].cumsum()": 0.009031
time for 1000x "np.hstack((x[0], x_diff)).cumsum()": 0.006170
time for 1000x "np.concatenate(([x[0]], x_diff)).cumsum()": 0.003082
time for 1000x "csp0 = np.zeros(shape=(len(x) + 1,)); np.cumsum(x, out=csp0[1:])": 0.003467
array size: 100
time for 1000x "np.r_[x[0], x_diff].cumsum()": 0.009754
time for 1000x "np.hstack((x[0], x_diff)).cumsum()": 0.006332
time for 1000x "np.concatenate(([x[0]], x_diff)).cumsum()": 0.003296
time for 1000x "csp0 = np.zeros(shape=(len(x) + 1,)); np.cumsum(x, out=csp0[1:])": 0.004249
array size: 1000
time for 1000x "np.r_[x[0], x_diff].cumsum()": 0.010550
time for 1000x "np.hstack((x[0], x_diff)).cumsum()": 0.008595
time for 1000x "np.concatenate(([x[0]], x_diff)).cumsum()": 0.005414
time for 1000x "csp0 = np.zeros(shape=(len(x) + 1,)); np.cumsum(x, out=csp0[1:])": 0.006916
array size: 10000
time for 1000x "np.r_[x[0], x_diff].cumsum()": 0.029658
time for 1000x "np.hstack((x[0], x_diff)).cumsum()": 0.028389
time for 1000x "np.concatenate(([x[0]], x_diff)).cumsum()": 0.024410
time for 1000x "csp0 = np.zeros(shape=(len(x) + 1,)); np.cumsum(x, out=csp0[1:])": 0.034652
array size: 100000
time for 1000x "np.r_[x[0], x_diff].cumsum()": 0.221405
time for 1000x "np.hstack((x[0], x_diff)).cumsum()": 0.219564
time for 1000x "np.concatenate(([x[0]], x_diff)).cumsum()": 0.215796
time for 1000x "csp0 = np.zeros(shape=(len(x) + 1,)); np.cumsum(x, out=csp0[1:])": 0.310225
array size: 1000000
time for 1000x "np.r_[x[0], x_diff].cumsum()": 2.660822
time for 1000x "np.hstack((x[0], x_diff)).cumsum()": 2.664244
time for 1000x "np.concatenate(([x[0]], x_diff)).cumsum()": 2.636382
time for 1000x "csp0 = np.zeros(shape=(len(x) + 1,)); np.cumsum(x, out=csp0[1:])": 3.770557
np.r_[x[0], x_diff].cumsum()
np.hstack((x[0], x_diff)).cumsum()
np.concatenate(([x[0]], x_diff)).cumsum()
csp0 = np.zeros(shape=(len(x) + 1,)); np.cumsum(x, out=csp0[1:])
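If this inversion comes up repeatedly, the recommended one-liner can be wrapped in a tiny helper. This is just a sketch; the name undiff is an illustrative choice, not a NumPy API:
import numpy as np

def undiff(x_diff, x0):
    """Invert np.diff: rebuild the original array from its first value
    and its first differences."""
    return np.concatenate(([x0], x_diff)).cumsum()

x = np.array([1, 2, 4, 7, 0])
print(undiff(np.diff(x), x[0]))  # -> [1 2 4 7 0]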