I am using SymPy to generate some functions for numerical calculations. To do so, I lambdify an expression and vectorize it to use it with NumPy arrays. Here is an example:
import numpy as np
import sympy as sp
def numpy_function():
    x, y, z = np.mgrid[0:1:40*1j, 0:1:40*1j, 0:1:40*1j]
    T = (1 - np.cos(2*np.pi*x))*(1 - np.cos(2*np.pi*y))*np.sin(np.pi*z)*0.1
    return T

def sympy_function():
    x, y, z = sp.Symbol("x"), sp.Symbol("y"), sp.Symbol("z")
    T = (1 - sp.cos(2*sp.pi*x))*(1 - sp.cos(2*sp.pi*y))*sp.sin(sp.pi*z)*0.1
    lambda_function = np.vectorize(sp.lambdify((x, y, z), T, "numpy"))
    x, y, z = np.mgrid[0:1:40*1j, 0:1:40*1j, 0:1:40*1j]
    T = lambda_function(x, y, z)
    return T
The problem with the SymPy version compared to the pure NumPy version is its speed, i.e.:
In [3]: timeit test.numpy_function()
100 loops, best of 3: 11.9 ms per loop
vs.
In [4]: timeit test.sympy_function()
1 loops, best of 3: 634 ms per loop
So is there any way to get closer to the speed of the NumPy version? I think np.vectorize is pretty slow, but some part of my code does not work without it. Thank you for any suggestions.
EDIT: I found the reason why the vectorize function is necessary, i.e.:
In [35]: y = np.arange(10)
In [36]: f = sp.lambdify(x, sp.sin(x), "numpy")
In [37]: f(y)
Out[37]:
array([ 0. , 0.84147098, 0.90929743, 0.14112001, -0.7568025 ,
-0.95892427, -0.2794155 , 0.6569866 , 0.98935825, 0.41211849])
This seems to work fine; however:
In [38]: y = np.arange(10)
In [39]: f = sp.lambdify(x,1,"numpy")
In [40]: f(y)
Out[40]: 1
So for a simple expression like 1, this function doesn't return an array.
Is there a way to fix this, and isn't this some kind of bug, or at least an inconsistent design?
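A possible workaround (a sketch only, not anything built into SymPy; the helper name lambdify_to_array is made up here) is to wrap the lambdified function and broadcast a scalar result to the shape of the input array:

import numpy as np
import sympy as sp

x = sp.Symbol("x")

# Hypothetical helper, not part of SymPy: broadcast a scalar result to the
# shape of the first argument so constant expressions also return arrays.
def lambdify_to_array(args, expr):
    f = sp.lambdify(args, expr, "numpy")
    def wrapped(*values):
        result = f(*values)
        return np.broadcast_to(result, np.shape(values[0])).copy()
    return wrapped

f = lambdify_to_array(x, sp.Integer(1))
f(np.arange(10))   # array([1, 1, 1, 1, 1, 1, 1, 1, 1, 1])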
Vectorized (NumPy) implementations are much faster and more efficient than explicit for-loops. Vectorized operations on NumPy arrays dispatch to optimized, pre-compiled routines, so mathematical operations on whole arrays and data sequences run considerably faster than the equivalent non-vectorized, element-by-element code.
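As a rough illustration of the gap (a sketch only; the exact numbers depend on the machine), compare an element-by-element Python loop with the equivalent single vectorized call:

import time
import numpy as np

a = np.random.rand(1_000_000)

t0 = time.perf_counter()
loop_result = np.array([np.sin(v) for v in a])   # element-by-element Python loop
t1 = time.perf_counter()
vec_result = np.sin(a)                           # one vectorized NumPy call
t2 = time.perf_counter()

print("loop:      ", t1 - t0)
print("vectorized:", t2 - t1)
print(np.allclose(loop_result, vec_result))      # True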
The lambdify function translates SymPy expressions into Python functions. If an expression is to be evaluated over a large range of values, the evalf() function is not efficient. lambdify acts like a lambda function, except it converts the SymPy names to the names of the given numerical library, usually NumPy.
In general, SymPy functions do not work with objects from other libraries, such as NumPy arrays, and functions from numeric libraries like NumPy or mpmath do not work on SymPy expressions.
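A minimal sketch of what lambdify produces (the expression here is just an illustration, not taken from the question): it builds an ordinary Python function whose body calls NumPy's sin and exp, so it can be applied to whole arrays at once.

import numpy as np
import sympy as sp

x = sp.Symbol("x")
expr = sp.sin(x) * sp.exp(-x)

# The generated function maps sin -> np.sin and exp -> np.exp,
# so it evaluates element-wise over arrays.
f = sp.lambdify(x, expr, "numpy")
f(np.linspace(0, 1, 5))   # array of five element-wise values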
lambdify returns a single value for constants because no NumPy functions are involved. This is because of the way lambdify works (see https://stackoverflow.com/a/25514007/161801).
But this is typically not a problem because a constant will automatically broadcast to the correct shape in any operation that you use it in with an array. On the other hand, if you explicitly worked with an array of the same constant, it would be much less efficient because you would compute the same operations multiple times.
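A short sketch of that broadcasting behaviour, reusing the constant expression from the question's EDIT:

import numpy as np
import sympy as sp

x = sp.Symbol("x")
c = sp.lambdify(x, sp.Integer(1), "numpy")   # constant expression
arr = np.arange(5)

c(arr)                       # 1 -- a plain scalar, not an array
arr + c(arr)                 # array([1, 2, 3, 4, 5]) -- the scalar broadcasts automatically
np.full_like(arr, c(arr))    # explicit array of the constant, if one is really needed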
Using np.vectorize() in this case is like looping over the first dimension of x, y and z, and that's why it becomes slower. You don't need np.vectorize() if you tell lambdify() to use NumPy's functions, which is exactly what you are doing. Then, using:
def sympy_function():
    x, y, z = sp.Symbol("x"), sp.Symbol("y"), sp.Symbol("z")
    T = (1 - sp.cos(2*sp.pi*x))*(1 - sp.cos(2*sp.pi*y))*sp.sin(sp.pi*z)*0.1
    lambda_function = sp.lambdify((x, y, z), T, "numpy")
    x, y, z = np.mgrid[0:1:40*1j, 0:1:40*1j, 0:1:40*1j]
    T = lambda_function(x, y, z)
    return T
makes the performance comparable:
In [26]: np.allclose(numpy_function(), sympy_function())
Out[26]: True
In [27]: timeit numpy_function()
100 loops, best of 3: 4.08 ms per loop
In [28]: timeit sympy_function()
100 loops, best of 3: 5.52 ms per loop