LinearConstraint in scipy.optimize

Tags:

python

scipy

I would like to use scipy.optimize to minimize a function (eventually non-linear) over a large set of linear inequalities. As a warm-up, I'm trying to minimize x+y over the box 0<=x<=1, 0<=y<=1. Following Johnny Drama's suggestion below, I'm currently using a dict comprehension to produce the dictionary of inequalities, but I'm not getting the expected answer (min value = 0, attained at (0,0)).

New section of code (currently relevant):

import numpy as np
from scipy.optimize import minimize

# Create initial point.
x0 = [.1, .1]

# Create function to be minimized.
def obj(x):
    return x[0] + x[1]

# Create linear constraints: lbnd <= A*(x,y)^T <= upbnd
A = np.array([[1, 0], [0, 1]])
b1 = np.array([0, 0])
b2 = np.array([1, 1])

cons = [{"type": "ineq", "fun": lambda x: np.matmul(A[i, :], x) - b1[i]} for i in range(A.shape[0])]
cons2 = [{"type": "ineq", "fun": lambda x: b2[i] - np.matmul(A[i, :], x)} for i in range(A.shape[0])]
cons.extend(cons2)

sol = minimize(obj, x0, constraints=cons)
print(sol)

Original version of question:

I would like to use the LinearConstraint object in scipy.optimize, as described in the tutorial here: "Defining linear constraints"

I've tried a simpler example, where it's obvious what the answer should be: minimize x+y over the square 0<=x<=1, 0<=y<=1. Below is my code, which returns the error "'LinearConstraint' object is not iterable", but I don't see how I'm trying to iterate.

EDIT 1: The example is deliberately oversimplified. Ultimately, I want to minimize a non-linear function over a large number of linear constraints. I know that I can use a dictionary comprehension to turn my matrix of constraints into a list of dicts, but I'd like to know whether LinearConstraint can be used as an off-the-shelf way to turn matrices into constraints.

EDIT 2: As pointed out by Johnny Drama, LinearConstraint is intended for a particular solver method. So above I've tried instead his suggested dict comprehension to produce the linear constraints, but I'm still not getting the expected answer.

Original section of code (now irrelevant):

from scipy.optimize import minimize
from scipy.optimize import LinearConstraint

# Create initial point.
x0 = [.1, .1]

# Create function to be minimized.
def obj(x):
    return x[0] + x[1]

# Create linear constraints: lbnd <= A*(x,y)^T <= upbnd
A = [[1, 0], [0, 1]]
lbnd = [0, 0]
upbnd = [1, 1]

lin_cons = LinearConstraint(A, lbnd, upbnd)
sol = minimize(obj, x0, constraints=lin_cons)
print(sol)
TBRS asked Aug 24 '18


1 Answer

As newbie already said, use scipy.optimize.linprog if you want to solve an LP (linear program), i.e. if your objective function and all your constraints are linear. If the objective or one of the constraints is nonlinear, you are facing an NLP (nonlinear programming problem), which can be solved by scipy.optimize.minimize:

minimize(obj_fun, x0=xinit, bounds=bnds, constraints=cons)

where obj_fun is your objective function, xinit an initial point, bnds a list of tuples for the bounds of your variables, and cons a list of constraint dicts.
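As an aside, the warm-up problem from the question (minimize x+y over the unit box) is itself an LP, so linprog solves it directly; a minimal sketch:

from scipy.optimize import linprog

# Minimize x + y subject to 0 <= x <= 1, 0 <= y <= 1.
c = [1, 1]  # objective coefficients
res = linprog(c, bounds=[(0, 1), (0, 1)])
print(res.fun, res.x)  # expected: 0.0 at [0. 0.]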


Here's an example. Suppose we want to solve the following NLP:

min  (x1 - 1)^2 + (x2 - 2.5)^2
s.t.  x1 - 2*x2 + 2 >= 0
     -x1 - 2*x2 + 6 >= 0
     -x1 + 2*x2 + 2 >= 0
      x1 >= 0,  x2 >= 0

Since all constraints are linear, we can express them by an affine-linear function A*x - b such that we have the inequality A*x >= b. Here, A is a 3x2 matrix and b is the 3x1 right-hand-side vector:

import numpy as np
from scipy.optimize import minimize

obj_fun = lambda x: (x[0] - 1)**2 + (x[1] - 2.5)**2
A = np.array([[1, -2], [-1, -2], [-1, 2]])
b = np.array([-2, -6, -2])
bnds = [(0, None) for i in range(A.shape[1])]  # x_1 >= 0, x_2 >= 0
xinit = [0, 0] 

Now the only thing left to do is to define the constraints. Each one has to be a dict of the form

{"type": "ineq", "fun": constr_fun}

where constr_fun is a callable such that constr_fun(x) >= 0 holds at feasible points. Thus, we could define each constraint separately:

cons = [{'type': 'ineq', 'fun': lambda x:  x[0] - 2 * x[1] + 2},
        {'type': 'ineq', 'fun': lambda x: -x[0] - 2 * x[1] + 6},
        {'type': 'ineq', 'fun': lambda x: -x[0] + 2 * x[1] + 2}]

and we'd be done. However, this becomes quite cumbersome when there are many constraints. Instead, we can pass all of them at once:

cons = [{"type": "ineq", "fun": lambda x: A @ x - b}]

where @ denotes the matrix multiplication operator; minimize accepts vector-valued constraint functions and treats each component as a separate inequality. Putting it all together,

res = minimize(obj_fun, x0=xinit, bounds=bnds, constraints=cons)
print(res)

yields

     fun: 0.799999999999998
     jac: array([ 0.79999999, -1.59999999])
 message: 'Optimization terminated successfully.'
    nfev: 16
     nit: 4
    njev: 4
  status: 0
 success: True
       x: array([1.39999999, 1.69999999])

Likewise, you could use a LinearConstraint object:

from scipy.optimize import LinearConstraint

# lb <= A @ x <= ub. In our case: lb = b, ub = inf
lincon = LinearConstraint(A, b, np.inf*np.ones(3))

# rest as above
res = minimize(obj_fun, x0=xinit, bounds=bnds, constraints=(lincon,))
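Note that passing a LinearConstraint this way needs a reasonably recent SciPy: at the time the question was asked, only method='trust-constr' accepted the new-style constraint objects, which is exactly why the original code raised "'LinearConstraint' object is not iterable" under the default solver; newer releases convert them automatically. A sketch for older versions:

# Older SciPy: only trust-constr understands LinearConstraint objects.
res = minimize(obj_fun, x0=xinit, bounds=bnds, method='trust-constr',
               constraints=[lincon])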

Edit: To answer your new question:

# b1 <= A*x   <=>   A*x - b1 >= 0
# A*x <= b2   <=>  -A*x + b2 >= 0
cons = [{"type": "ineq", "fun": lambda x: A @ x - b1},
        {"type": "ineq", "fun": lambda x: -A @ x + b2}]
sol = minimize(obj, x0, constraints=cons)
print(sol)
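Incidentally, the reason your per-row dict comprehension returned a wrong answer is Python's late-binding closures: each lambda captures the variable i itself, not its current value, so after the comprehension finishes, every constraint evaluates with the last row index. Binding i as a default argument freezes it at definition time; a minimal sketch:

# i=i freezes the current row index in each lambda at definition time.
cons = [{"type": "ineq", "fun": lambda x, i=i: A[i, :] @ x - b1[i]}
        for i in range(A.shape[0])]
cons += [{"type": "ineq", "fun": lambda x, i=i: b2[i] - A[i, :] @ x}
         for i in range(A.shape[0])]

With this fix, minimize(obj, x0, constraints=cons) returns the expected minimum of 0 at (0, 0).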
joni answered Sep 23 '22