 

Scipy optimize.minimize exits successfully when constraints aren't satisfied

Tags:

python

scipy

I've been using scipy.optimize.minimize (docs) and noticed some strange behavior when I define a problem with impossible-to-satisfy constraints. Here's an example:

from scipy import optimize

# minimize f(x) = x^2 - 4x
def f(x):
    return x**2 - 4*x

def x_constraint(x, sign, value):
    return sign*(x - value)

# subject to x >= 5 and x<=0 (not possible)
constraints = []
constraints.append({'type': 'ineq', 'fun': x_constraint, 'args': [1, 5]})
constraints.append({'type': 'ineq', 'fun': x_constraint, 'args': [-1, 0]})

optimize.minimize(f, x0=3, constraints=constraints)

Resulting output:

fun: -3.0
     jac: array([ 2.])
 message: 'Optimization terminated successfully.'
    nfev: 3
     nit: 5
    njev: 1
  status: 0
 success: True
       x: array([ 3.])

There is no solution to this problem that satisfies the constraints; however, minimize() returns successfully, using the initial condition as the optimal solution.

Is this behavior intended? If so, is there a way to force failure if the optimal solution doesn't satisfy the constraints?

Asked Jun 29 '18 by LoLa


1 Answer

This appears to be a bug. I added a comment with a variation of your example to the issue on GitHub.

If you use a different method, such as COBYLA, the solver correctly reports that it failed to find a feasible solution:

In [10]: optimize.minimize(f, x0=3, constraints=constraints, method='COBYLA')
Out[10]: 
     fun: -3.75
   maxcv: 2.5
 message: 'Did not converge to a solution satisfying the constraints. See `maxcv` for magnitude of violation.'
    nfev: 7
  status: 4
 success: False
       x: array(2.5)
Answered Sep 18 '22 by Warren Weckesser