I have a simple function

def square(x, a=1):
    return [x**2 + a, 2*x]

which returns both the value and the gradient (for use with jac=True). I want to minimize it over x, for several values of the parameter a. I currently have loops that, in spirit, do something like this:
In [89]: from scipy import optimize
In [90]: res = optimize.minimize(square, 25, method='BFGS', jac=True)
In [91]: [res.x, res.fun]
Out[91]: [array([ 0.]), 1.0]
In [92]: l = lambda x: square(x, 2)
In [93]: res = optimize.minimize(l, 25, method='BFGS', jac=True)
In [94]: [res.x, res.fun]
Out[94]: [array([ 0.]), 2.0]
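As an aside, the lambda isn't strictly necessary: minimize takes an args keyword that forwards extra positional arguments to the objective. A minimal sketch, equivalent to the call above:

res = optimize.minimize(square, 25, method='BFGS', jac=True, args=(2,))
# res.x should converge to array([ 0.]) and res.fun to 2.0, matching Out[94]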
Now, the function is already vectorized (array here is numpy.array):
In [98]: square(array([2,3]))
Out[98]: [array([ 5, 10]), array([4, 6])]
In [99]: square(array([2,3]), array([2,3]))
Out[99]: [array([ 6, 12]), array([4, 6])]
This means it would probably be much faster to run all the optimizations in parallel rather than looping. Is that something that's easily doable with SciPy, or with any other third-party tool?
Here's another try, based on my original answer and the discussion that followed.
As far as I know, the scipy.optimize module is for functions with scalar or vector inputs and a scalar output, or "cost".
Since you're treating each equation as independent of the others, my best idea is to use the multiprocessing module to do the work in parallel. If the functions you're minimizing are as simple as the ones in your question, I'd say it's not worth the effort.
If the functions are more complex, and you'd like to divide the work up, try something like:
import numpy as np
from scipy import optimize
from multiprocessing import Pool

def square(x, a=1):
    # return the cost and its gradient, for use with jac=True
    return [np.sum(x**2 + a), 2*x]

def minimize(args):
    f, x0, a = args
    res = optimize.minimize(f, x0, method='BFGS', jac=True, args=(a,))
    return res.x

if __name__ == '__main__':
    # your a values
    a = np.arange(1, 11)
    # initial guess for all the x values
    x = np.full(len(a), 25.0)
    # note the tuple order matches the unpacking in minimize()
    args = [(square, x[i], a[i]) for i in range(len(a))]
    with Pool(4) as p:
        print(p.map(minimize, args))
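That said, because the problems are independent, the summed cost decouples, so a single vectorized minimize call over the stacked vector of unknowns may be all you need: component i of the result minimizes x**2 + a[i]. A sketch under that assumption, reusing the summed-cost square from above:

import numpy as np
from scipy import optimize

def square(x, a=1):
    # total cost over all the independent problems, plus its gradient
    return [np.sum(x**2 + a), 2*x]

a = np.arange(1, 11)
x0 = np.full(len(a), 25.0)
# one BFGS run over the stacked vector; since the cost is a sum of
# independent terms, res.x[i] minimizes x**2 + a[i]
res = optimize.minimize(square, x0, method='BFGS', jac=True, args=(a,))
print(res.x)  # expect values close to zero for every a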