I'm doing an optimization with scipy.optimize.minimize and attempting to use the following methods: 'Newton-CG', 'dogleg', and 'trust-ncg'. As I understand it, these methods require a Jacobian of the objective function. However, the documentation suggests that if jac is set to False, the gradient will be computed numerically.
So I'm trying to call the function like so:
scipy.optimize.minimize(fun, x0, method='Newton-CG', jac=False, options={'disp': True})
When I call this, I get the following error message:
File "/usr/lib/python2.7/dist-packages/scipy/optimize/optimize.py", line 1351, in _minimize_newtoncg
raise ValueError('Jacobian is required for Newton-CG method')
That's surprising, since I thought I just set it to False (and this exception is only raised in */optimize.py if jac is None). So I go into /usr/lib/python2.7/dist-packages/scipy/optimize/optimize.py and look at the function:
def _minimize_newtoncg(fun, x0, args=(), jac=None, hess=None, hessp=None,
                       callback=None, xtol=1e-5, eps=_epsilon, maxiter=None,
                       disp=False, return_all=False,
                       **unknown_options):
At the beginning of this function I write the following print statements:
print(jac)
_check_unknown_options(unknown_options)
print(jac)
if jac is None:
    raise ValueError('Jacobian is required for Newton-CG method')
Surprisingly, 'None' is printed and not False! So I look at the calling function, which is in /usr/lib/python2.7/dist-packages/scipy/optimize/_minimize.py, and I find the code snippet that is setting this to None:
if not callable(jac):
    if bool(jac):
        fun = MemoizeJac(fun)
        jac = fun.derivative
    else:
        jac = None
So that explains why jac is being set to None (though it seems inconsistent with the documentation, which suggests that setting jac to False in the original function call will give me a numerical approximation to the Jacobian).
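For completeness, the jac=True branch of that dispatch does work: if fun returns a (value, gradient) tuple and jac=True, minimize wraps it in MemoizeJac and Newton-CG receives a callable Jacobian. A minimal sketch, using a simple quadratic as a stand-in objective:

```python
import numpy as np
from scipy import optimize

# When jac=True, fun must return (objective value, gradient).
# minimize wraps it in MemoizeJac and hands Newton-CG a callable Jacobian.
def fun_and_grad(x):
    return np.sum(x ** 2), 2.0 * x

res = optimize.minimize(fun_and_grad, x0=np.array([1.0, -2.0]),
                        method='Newton-CG', jac=True)
print(res.x)  # close to [0., 0.]
```

This sidesteps the jac=False problem entirely, at the cost of having to supply the analytic gradient yourself.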
What am I missing? Is it possible to call the 'Newton-CG' method as I'm doing above, with SciPy computing the numerical approximation to the Jacobian for me?
Apparently the bug is still there, three years later.
For Newton-CG, the minimizer only accepts a callable Jacobian. A quick way to obtain one is to wrap scipy.optimize.approx_fprime, as follows:
# x0 is your initial guess.
# epsilon=0.01 is the finite-difference step; a much smaller step
# (around sqrt(machine epsilon), ~1.5e-8) is usually more accurate.
fprime = lambda x: optimize.approx_fprime(x, f, 0.01)
result = optimize.minimize(f, x0, method='Newton-CG', jac=fprime)
As I understand it, this is essentially how the '2-point' finite-difference method is implemented.
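A complete, runnable version of the workaround, with a simple quadratic standing in for f (any smooth objective works the same way):

```python
import numpy as np
from scipy import optimize

# Stand-in objective with a known minimum at x = [3, 3].
def f(x):
    return np.sum((x - 3.0) ** 2)

# Finite-difference gradient; sqrt(machine epsilon) is a common step size.
eps = np.sqrt(np.finfo(float).eps)
fprime = lambda x: optimize.approx_fprime(x, f, eps)

x0 = np.zeros(2)
result = optimize.minimize(f, x0, method='Newton-CG', jac=fprime)
print(result.x)  # close to [3., 3.]
```

Since the Hessian is not supplied, Newton-CG also approximates Hessian-vector products by finite-differencing the (already approximate) gradient, so a small epsilon matters for accuracy.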