 

numpy gradient function and numerical derivatives

The array that numpy.gradient returns depends on the number and spacing of the data points. Is this expected behaviour? For example:

import numpy as np
import matplotlib.pyplot as plt

y = lambda x: x

x1 = np.arange(0, 10, 1)
x2 = np.arange(0, 10, 0.1)
x3 = np.arange(0, 10, 0.01)

plt.plot(x1, np.gradient(y(x1)), 'r--o')
plt.plot(x2, np.gradient(y(x2)), 'b--o')
plt.plot(x3, np.gradient(y(x3)), 'g--o')

produces the attached plot.

Only the gradient of y(x1) returns the correct result. What is going on here? Is there a better way to compute the numerical derivative using numpy?

Cheers

asked Aug 28 '13 by user1654183



1 Answer

You need to pass the sample spacing to np.gradient. To get consistent results, use:

plt.plot(x1,np.gradient(y(x1),1),'r--o')
plt.plot(x2,np.gradient(y(x2),0.1),'b--o')
plt.plot(x3,np.gradient(y(x3),0.01),'g--o')

The default sample spacing is 1, which is why only the result for x1 is correct.
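To confirm this, here is a small check (using the x2 array from the question): once the spacing is passed in, the numerical derivative of y = x is 1 everywhere.

```python
import numpy as np

y = lambda x: x
x2 = np.arange(0, 10, 0.1)

# With the spacing argument, the derivative of y = x is 1 at every point,
# including the boundaries (np.gradient uses one-sided differences there).
d = np.gradient(y(x2), 0.1)
print(np.allclose(d, 1.0))  # True
```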

If the spacing is not uniform, you have to compute the derivative manually. Using forward differences:

d = np.diff(y(x))/np.diff(x) 
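Note that np.diff returns one fewer element than its input, so the forward-difference derivative has len(x) - 1 points, aligned with the left endpoints of each interval. A small illustration (the x values here are just an example, not from the question):

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0])
y = x**2  # values [1, 4, 16, 49]

# Forward difference: (y[i+1] - y[i]) / (x[i+1] - x[i]),
# one fewer point than x.
d = np.diff(y) / np.diff(x)
print(d)  # [ 3.  6. 11.]
```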

If you want central differences at the interior points, as np.gradient computes, you can do something like this:

x = np.array([1, 2, 4, 7, 11, 16], dtype=float)  # np.float is removed in modern NumPy
y = lambda x: x**2

# Pad with the endpoint values so the boundaries reduce to one-sided differences.
z1 = np.hstack((y(x[0]), y(x[:-1])))
z2 = np.hstack((y(x[1:]), y(x[-1])))

dx1 = np.hstack((0, np.diff(x)))
dx2 = np.hstack((np.diff(x), 0))

# Central difference (y[i+1] - y[i-1]) / (x[i+1] - x[i-1]) in the interior,
# one-sided differences at the two boundaries.
d = (z2 - z1) / (dx2 + dx1)
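Since NumPy 1.13, np.gradient also accepts a coordinate array directly, so unevenly spaced samples no longer need the manual construction above. In the interior it uses a second-order accurate scheme, which is exact for a quadratic like y = x**2; the boundaries default to first-order one-sided differences (edge_order=1).

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0, 11.0, 16.0])
y = x**2

# Pass the coordinate array itself; spacing is inferred per interval.
d = np.gradient(y, x)

# Interior points match the exact derivative 2*x;
# endpoints are first-order one-sided differences.
print(d)
```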
answered Sep 22 '22 by pabaldonedo