Is there a way to autogenerate the list of constraint dictionaries for scipy.optimize.minimize? When I use the following code (where constraint is a list of Sage multivariate polynomials over the same ring):
cons = [{'type': 'eq', 'fun': lambda s: ((constraint[0])(*s))},
{'type': 'eq', 'fun': lambda s: ((constraint[1])(*s))},
{'type': 'eq', 'fun': lambda s: ((constraint[2])(*s))},
{'type': 'eq', 'fun': lambda s: ((constraint[3])(*s))}]
y0 = [.5 for xx in x]
bnds = tuple([(0.0, 1.0) for xx in x])
ssoln = scipy.optimize.minimize(HH, y0, jac=dHH, method='SLSQP', bounds=bnds, constraints=cons)
print ssoln
My output is
status: 0
success: True
njev: 14
nfev: 22
fun: -2.2669026273652237
x: array([ 0.034829615490635, 0.933405952554424, 0.93340765416238 ,
0.093323548109654, 0.335713397575351, 0.413107862378296])
message: 'Optimization terminated successfully.'
jac: array([-3.321836605297572, 2.640225014918886, 2.640252390205999,
-2.273713195767229, -0.682455873949375, -0.351132324172705, 0. ])
nit: 14
However if I try to create cons by
cons=[]
for ii in range(len(constraint)):
    cons.append({'type': 'eq', 'fun': lambda s: ((constraint[ii])(*s))})
minimize fails with
status: 6
success: False
njev: 1
nfev: 1
fun: -4.1588830833596715
x: array([ 0.5, 0.5, 0.5, 0.5, 0.5, 0.5])
message: 'Singular matrix C in LSQ subproblem'
jac: array([ 0., 0., 0., 0., 0., 0., 0.])
nit: 1
My list constraint of Sage polynomials may change in length (in the number of polynomials) from problem to problem, and I don't want to hard-code the cons dictionary list as in the first snippet for each problem. Is there a way to automate this?
The following works, but I understand it is not best practice to eval strings:
str1='{\'type\': \'eq\', \'fun\': lambda s: ((constraint['
str2='])(*s))},'
mystr='['
for ii in range(len(constraint)):
    mystr=mystr+str1+str(ii)+str2
mystr=mystr+']'
cons = eval(mystr)
From the SciPy documentation: SciPy optimize provides functions for minimizing (or maximizing) objective functions, possibly subject to constraints. It includes solvers for nonlinear problems (with support for both local and global optimization algorithms), linear programming, constrained and nonlinear least-squares, root finding, and curve fitting.
And for the jac argument of minimize:
jac : bool or callable, optional
Jacobian (gradient) of objective function. Only for CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg. If jac is a Boolean and is True, fun is assumed to return the gradient along with the objective function. If False, the gradient will be estimated numerically.
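For reference, here is a minimal, self-contained sketch of how these pieces fit together with SLSQP (a toy objective f/df of my own, not the poster's HH/dHH problem, which isn't shown):
import numpy as np
from scipy.optimize import minimize

def f(x):          # toy objective: sum of squares
    return np.sum(x**2)

def df(x):         # its analytic gradient, supplied via jac=
    return 2.0 * x

cons = [{'type': 'eq', 'fun': lambda s: s[0] + s[1] - 1.0}]   # equality constraint x0 + x1 = 1
bnds = [(0.0, 1.0), (0.0, 1.0)]
res = minimize(f, [0.2, 0.9], jac=df, method='SLSQP', bounds=bnds, constraints=cons)
print(res.x)       # roughly [0.5, 0.5]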
The problem is in your loop. Python lambdas use late binding: the body looks up ii when the lambda is called, not when it is defined, so by the end of your loop every entry in cons evaluates the constraint at the last value of ii instead of at each index.
To bind the index at definition time, you can use partial from the Python functools module (available in both Python 2 and Python 3).
To illustrate the problem with lambda:
constraint = (lambda x: x, lambda x: x**2, lambda x: x**3, lambda x: x**4)
cons=[]
for ii in range(len(constraint)):
    # lambda s looks up ii at call time, so it sees the last value of ii
    cons.append({'type': 'eq', 'fun': lambda s: ((constraint[ii])(s))})
print([i['fun'](2) for i in cons])
# ii finished the loop at 3, so every entry calls lambda x: x**4
>> [16, 16, 16, 16]
from functools import partial
def f_constraint(s, index):
    return constraint[index](s)

cons = []
for ii in range(len(constraint)):
    # the value of ii is bound in each iteration by partial
    cons.append({'type': 'eq', 'fun': partial(f_constraint, index=ii)})
print([i['fun'](2) for i in cons])
>> [2, 4, 8, 16]
Replacing s by *s to match your definition:
from functools import partial
def f_constraint(s, index):
    return constraint[index](*s)

cons = []
for ii in range(len(constraint)):
    cons.append({'type': 'eq', 'fun': partial(f_constraint, index=ii)})
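As an aside (an equivalent alternative, not part of the solution above): you can get the same early binding without functools by capturing the current polynomial as a default argument of the lambda, since default values are evaluated when the lambda is defined:
cons = []
for ii in range(len(constraint)):
    # f=constraint[ii] is evaluated now, so each lambda keeps its own polynomial
    cons.append({'type': 'eq', 'fun': lambda s, f=constraint[ii]: f(*s)})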
Hope it helps!