scipy.optimize.minimize takes obj and jac functions as input, and I believe it calls them separately as and when needed. But more often than not we come across objective functions whose gradient computation shares a lot of work with the objective itself. So ideally I would like to compute the obj and grad simultaneously. This doesn't seem to be the case with this library, though. What is the way to deal with it, if one still wants to use scipy.optimize.minimize, if there is one at all?
You totally can. Just use jac=True:
In [1]: import numpy as np
In [2]: from scipy.optimize import minimize
In [3]: def f_and_grad(x):
...: return x**2, 2*x
...:
In [4]: minimize(f_and_grad, [1], jac=True)
Out[4]:
fun: 1.8367099231598242e-40
hess_inv: array([[ 0.5]])
jac: array([ 2.71050543e-20])
message: 'Optimization terminated successfully.'
nfev: 4
nit: 2
njev: 4
status: 0
success: True
x: array([ 1.35525272e-20])
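To make the benefit concrete, here is a minimal sketch (the least-squares setup and every name in it are my own illustration, not from the post above) where the shared work is explicit: for the objective 0.5 * ||A x - b||**2, the residual r = A @ x - b is the expensive intermediate needed by both the objective value and the gradient A.T @ r, so computing it once per call avoids a duplicated matrix-vector product:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))   # hypothetical data, for illustration only
b = rng.standard_normal(50)

def obj_and_grad(x):
    r = A @ x - b                  # shared computation, done once per call
    return 0.5 * r @ r, A.T @ r    # (objective, gradient) as a tuple

res = minimize(obj_and_grad, np.zeros(3), jac=True, method="L-BFGS-B")
print(res.x)                       # least-squares solution of A x = b

Any of the methods listed in the docstring below (CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg) accepts this combined form.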
It's actually documented:
jac : bool or callable, optional Jacobian (gradient) of objective function. Only for CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg. *If jac is a Boolean and is True, fun is assumed to return the gradient along with the objective function.* If False, the gradient will be estimated numerically. jac can also be a callable returning the gradient of the objective. In this case, it must accept the same arguments as fun.
(emphasis mine)
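Alternatively, if you want to keep fun and jac as two separate callables, you can still avoid recomputing the shared work by caching it yourself. Here is a minimal sketch of that memoization pattern, reusing the least-squares setup from above (the FunAndGrad class is my own illustration, not part of SciPy's API):

import numpy as np
from scipy.optimize import minimize

class FunAndGrad:
    # Remember the last x seen and recompute the shared residual only
    # when the optimizer asks about a new point.
    def __init__(self, A, b):
        self.A, self.b = A, b
        self._x = None
        self._r = None

    def _residual(self, x):
        if self._x is None or not np.array_equal(x, self._x):
            self._x = np.array(x, copy=True)
            self._r = self.A @ self._x - self.b   # expensive shared step
        return self._r

    def fun(self, x):
        r = self._residual(x)
        return 0.5 * r @ r

    def jac(self, x):
        return self.A.T @ self._residual(x)

rng = np.random.default_rng(0)
p = FunAndGrad(rng.standard_normal((50, 3)), rng.standard_normal(50))
res = minimize(p.fun, np.zeros(3), jac=p.jac, method="BFGS")

This helps because the solvers typically evaluate fun and jac at the same point back to back, so the cache is usually hit. The jac=True form above is simpler, though, and is the documented way to express a combined computation.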