With NumPy arrays, I want to perform this operation: move x[1], ..., x[n-1] to x[0], ..., x[n-2] (a left shift), then set x[n-1] = newvalue. This is similar to a pop(), push(newvalue) on a first-in last-out queue (only inverted).

A naive implementation is: x[:-1] = x[1:]; x[-1] = newvalue.

Another implementation, using np.concatenate, is slower: np.concatenate((x[1:], np.array(newvalue).reshape(1,)), axis=0).

Is there a faster way to do it?
After some experiments, it is clear that the fastest approach for a NumPy array is slicing and in-place copying. So the solution is: x[:-1] = x[1:]; x[-1] = newvalue.
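To make the recommended operation concrete, here is a minimal sketch of the in-place shift on a small array (the values are illustrative):

```python
import numpy as np

x = np.arange(5)   # array([0, 1, 2, 3, 4])
newvalue = 99

# In-place left shift: copy x[1:] into x[:-1], then write the new value
# into the freed last slot. No new array is allocated.
x[:-1] = x[1:]
x[-1] = newvalue
print(x)           # [ 1  2  3  4 99]
```

Because the assignment writes into the existing buffer, it avoids the allocation and copy that np.concatenate performs on every call.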
Here is a small benchmark:
>>> x = np.random.randint(0, 1e6, 10**8); newvalue = -100
>>> %timeit x[:-1] = x[1:]; x[-1] = newvalue
1000 loops, best of 3: 73.6 ms per loop
>>> %timeit np.concatenate((x[1:], np.array(newvalue).reshape(1,)), axis=0)
1 loop, best of 3: 339 ms per loop
But if you don't need fast access to all values in the array, only to the first or last ones, using a collections.deque is smarter.
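For completeness, a bounded deque from the standard library gives the same pop/push behavior without any copying; appending to a full deque drops the leftmost element automatically:

```python
from collections import deque

# A deque with maxlen behaves like the shifting array: appending on the
# right discards the leftmost element once the deque is full, in O(1).
d = deque([0, 1, 2, 3, 4], maxlen=5)
d.append(99)
print(list(d))     # [1, 2, 3, 4, 99]
```

The trade-off is that random access into a deque is O(n) in the worst case, so this only pays off when you mostly touch the ends.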