 

SciPy: optimize a function with argument-dependent constraints

I am trying to maximize a function f(a, b, c, d) by applying scipy.optimize.minimize to its negative. d is a numpy.array of guess variables.

I am trying to put bounds on each element of d, and also a constraint on d such that (d1 * a1 + d2 * a2 + ... + dn * an) < some_value, a being the other argument to the function f.

My problem is how to define this constraint as an argument to minimize.

I could not find any maximize function in the library, so I am using the negative of f with minimize; the minimize documentation is here.
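
For concreteness, here is a minimal sketch of what I am attempting (the body of f, the values of a, b and c, and the initial guess d0 are placeholders for my actual setup):

import numpy as np
from scipy.optimize import minimize

# placeholder objective: my real f(a, b, c, d) goes here
def f(a, b, c, d):
    return -np.sum((d - b) ** 2) + c

a = np.array([1.0, 2.0, 3.0])   # fixed argument
b, c = 0.5, 1.0                 # other fixed arguments
d0 = np.zeros_like(a)           # initial guess for d

# maximize f by minimizing its negative
res = minimize(lambda d: -f(a, b, c, d), d0)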

Please consider asking for clarifications if the question is not clear enough.

Asked Jul 08 '15 by Nikhil Girraj

People also ask

Does SciPy have a maximize function?

SciPy optimize provides functions for minimizing (or maximizing) objective functions, possibly subject to constraints. It includes solvers for nonlinear problems (with support for both local and global optimization algorithms), linear programming, constrained and nonlinear least-squares, root finding, and curve fitting.

Is SciPy optimize multithreaded?

NumPy/SciPy's functions are usually optimized for multithreading. Did you look at your CPU utilization to confirm that only one core is being used while the simulation runs? Otherwise you have nothing to gain from running multiple instances.


1 Answer

It's not totally clear from your description which of the parameters of f you are optimizing over. For the purposes of this example I'm going to use x to refer to the vector of parameters you are optimizing over, and a to refer to another parameter vector of the same length which is held constant.

Now let's suppose you wanted to enforce the following inequality constraint:

10 <= x[0] * a[0] + x[1] * a[1] + ... + x[n] * a[n]

First you must define a function that accepts x and a and returns a value that is non-negative when the constraint is met. In this case we could use:

lambda x, a: (x * a).sum() - 10

or equivalently:

lambda x, a: x.dot(a) - 10

Constraints are passed to minimize in a dict (or a sequence of dicts if you have multiple constraints to apply, as sketched below):

con = {'type': 'ineq',
       'fun': lambda x, a: a.dot(x) - 10,
       'jac': lambda x, a: a,
       'args': (a,)}
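
For example, if you had a second (hypothetical) constraint requiring the elements of x to sum to at least 1, you would collect the dicts in a list and pass that instead:

con2 = {'type': 'ineq',
        'fun': lambda x: x.sum() - 1}  # hypothetical: sum(x) >= 1

cons = [con, con2]  # pass this list via the constraints= keyword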

For greater efficiency I've also defined a function that returns the Jacobian (the sequence of partial derivatives of the constraint function w.r.t. each parameter in x), although this is not essential; if unspecified, it will be estimated via first-order finite differences.

Your call to minimize would then look something like:

res = minimize(f, x0, args=(a,), method='SLSQP', constraints=con)
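
Putting it all together, a complete runnable sketch might look like the following. The body of f, the value of a, the bounds and the starting point x0 are all placeholder assumptions, and -f is minimized because the question asks for a maximum:

import numpy as np
from scipy.optimize import minimize

# placeholder objective: replace with the real f
def f(x, a):
    return -np.sum((x - 1.0) ** 2) + a.sum()

a = np.array([2.0, 3.0, 5.0])      # fixed parameter vector
x0 = np.array([1.0, 1.0, 1.0])     # initial guess

# inequality constraint: a.dot(x) - 10 >= 0, i.e. 10 <= a.dot(x)
con = {'type': 'ineq',
       'fun': lambda x, a: a.dot(x) - 10,
       'jac': lambda x, a: a,
       'args': (a,)}

bnds = [(0, 5)] * len(x0)          # placeholder bounds on each element of x

# minimize -f to maximize f
res = minimize(lambda x, a: -f(x, a), x0, args=(a,),
               method='SLSQP', bounds=bnds, constraints=con)
print(res.x, res.fun)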

You can find another complete example of constrained optimization using SLSQP in the official documentation here.

Answered Sep 17 '22 by ali_m