scipy.optimize.minimize(method='trust-constr') doesn't terminate on xtol condition

I have set up an optimization problem with linear equality constraints as follows

from scipy.optimize import minimize

# objective, x0, mock_df, bnds and cons are defined elsewhere
sol0 = minimize(objective, x0, args=mock_df, method='trust-constr',
                bounds=bnds, constraints=cons,
                options={'maxiter': 250, 'verbose': 3})

The objective is a weighted sum of functions, whose coefficients/weights are to be optimized so that the sum is minimized. Since I have bounds on the coefficients as well as constraints, I use the trust-constr method of scipy.optimize.minimize.
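For concreteness, here is a minimal self-contained sketch of this kind of setup; the objective, data vector v and constraint are hypothetical stand-ins, not my actual problem:

import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in: minimize a weighted sum f(x) = sum_i x_i * v_i
# over the weights x, subject to sum(x) == 1 and 0 <= x_i <= 1.
v = np.array([3.0, -1.0, 2.0])

def objective(x, v):
    return np.dot(x, v)

con0 = {'type': 'eq', 'fun': lambda x: np.sum(x) - 1.0}
cons = [con0]
bnds = [(0.0, 1.0)] * len(v)
x0 = np.full(len(v), 1.0 / len(v))

sol = minimize(objective, x0, args=(v,), method='trust-constr',
               bounds=bnds, constraints=cons,
               options={'maxiter': 250, 'verbose': 3})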

The minimization works, but I do not understand the termination criterion. According to the trust-constr documentation, it should terminate on xtol:

The algorithm will terminate when tr_radius < xtol, where tr_radius is the radius of the trust region used in the algorithm. Default is 1e-8.
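Both xtol and barrier_tol are exposed as trust-constr options, and both default to 1e-8; in the hypothetical sketch above they could be set explicitly like this:

sol = minimize(objective, x0, args=(v,), method='trust-constr',
               bounds=bnds, constraints=cons,
               options={'xtol': 1e-8, 'barrier_tol': 1e-8,
                        'maxiter': 250, 'verbose': 3})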

However, the verbose output shows that the termination is actually triggered by the barrier_tol parameter, as you can see in the listing below:

| niter |f evals|CG iter|  obj func   |tr radius |   opt    |  c viol  | penalty  |barrier param|CG stop|
|-------|-------|-------|-------------|----------|----------|----------|----------|-------------|-------|
C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\projections.py:182: UserWarning: Singular Jacobian matrix. Using SVD decomposition to perform the factorizations.
  warn('Singular Jacobian matrix. Using SVD decomposition to ' +
|   1   |  31   |   0   | -4.4450e+02 | 1.00e+00 | 7.61e+02 | 5.00e-01 | 1.00e+00 |  1.00e-01   |   0   |
C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_hessian_update_strategy.py:187: UserWarning: delta_grad == 0.0. Check if the approximated function is linear. If the function is linear better results can be obtained by defining the Hessian as zero instead of using quasi-Newton approximations.
  'approximations.', UserWarning)
|   2   |  62   |   1   | -2.2830e+03 | 6.99e+00 | 3.64e+02 | 7.28e-01 | 1.00e+00 |  1.00e-01   |   2   |
|   3   |  93   |   2   | -9.7651e+03 | 3.42e+01 | 5.52e+01 | 5.33e+00 | 1.00e+00 |  1.00e-01   |   2   |
|   4   |  124  |  26   | -4.9999e+03 | 3.42e+01 | 8.23e+01 | 9.29e-01 | 3.48e+16 |  1.00e-01   |   1   |
|   5   |  155  |  50   | -4.1486e+03 | 3.42e+01 | 5.04e+01 | 2.08e-01 | 3.48e+16 |  1.00e-01   |   1   |
...
|  56   | 1674  | 1127  | -1.6146e+03 | 1.77e-08 | 4.49e+00 | 3.55e-15 | 3.66e+33 |  1.00e-01   |   1   |
|  57   | 1705  | 1151  | -1.6146e+03 | 1.77e-09 | 4.49e+00 | 3.55e-15 | 3.66e+33 |  1.00e-01   |   1   |
|  58   | 1736  | 1151  | -1.6146e+03 | 1.00e+00 | 4.42e+00 | 3.55e-15 | 1.00e+00 |  2.00e-02   |   0   |
|  59   | 1767  | 1175  | -1.6146e+03 | 1.00e-01 | 4.42e+00 | 3.55e-15 | 1.00e+00 |  2.00e-02   |   1   |
|  60   | 1798  | 1199  | -1.6146e+03 | 1.00e-02 | 4.42e+00 | 3.55e-15 | 1.00e+00 |  2.00e-02   |   1   |
...
|  66   | 1984  | 1343  | -1.6146e+03 | 1.00e-08 | 4.42e+00 | 3.55e-15 | 1.00e+00 |  2.00e-02   |   1   |
|  67   | 2015  | 1367  | -1.6146e+03 | 1.00e-09 | 4.42e+00 | 3.55e-15 | 1.00e+00 |  2.00e-02   |   1   |
|  68   | 2046  | 1367  | -1.6146e+03 | 1.00e+00 | 4.36e+00 | 3.55e-15 | 1.00e+00 |  4.00e-03   |   0   |
|  69   | 2077  | 1391  | -1.6146e+03 | 1.00e-01 | 4.36e+00 | 3.55e-15 | 1.00e+00 |  4.00e-03   |   1   |
...
|  77   | 2325  | 1583  | -1.6146e+03 | 1.00e-09 | 4.36e+00 | 3.55e-15 | 1.00e+00 |  4.00e-03   |   1   |
|  78   | 2356  | 1583  | -1.6146e+03 | 1.00e+00 | 4.35e+00 | 3.55e-15 | 1.00e+00 |  8.00e-04   |   0   |
|  79   | 2387  | 1607  | -1.6146e+03 | 1.00e-01 | 4.35e+00 | 3.55e-15 | 1.00e+00 |  8.00e-04   |   1   |
...
|  87   | 2635  | 1799  | -1.6146e+03 | 1.00e-09 | 4.35e+00 | 3.55e-15 | 1.00e+00 |  8.00e-04   |   1   |
|  88   | 2666  | 1799  | -1.6146e+03 | 1.00e+00 | 4.34e+00 | 3.55e-15 | 1.00e+00 |  1.60e-04   |   0   |
|  89   | 2697  | 1823  | -1.6146e+03 | 1.00e-01 | 4.34e+00 | 3.55e-15 | 1.00e+00 |  1.60e-04   |   1   |
...
|  97   | 2945  | 2015  | -1.6146e+03 | 1.00e-09 | 4.34e+00 | 3.55e-15 | 1.00e+00 |  1.60e-04   |   1   |
|  98   | 2976  | 2015  | -1.6146e+03 | 1.00e+00 | 4.34e+00 | 3.55e-15 | 1.00e+00 |  3.20e-05   |   0   |
|  99   | 3007  | 2039  | -1.6146e+03 | 1.00e-01 | 4.34e+00 | 3.55e-15 | 1.00e+00 |  3.20e-05   |   1   |
...
|  167  | 5053  | 3527  | -1.6146e+03 | 1.00e-07 | 1.35e+01 | 2.12e-11 | 1.00e+00 |  2.05e-09   |   1   |
|  168  | 5084  | 3551  | -1.6146e+03 | 1.00e-08 | 1.35e+01 | 2.12e-11 | 1.00e+00 |  2.05e-09   |   1   |
|  169  | 5115  | 3575  | -1.6146e+03 | 1.00e-09 | 1.35e+01 | 2.12e-11 | 1.00e+00 |  2.05e-09   |   1   |
`xtol` termination condition is satisfied.
Number of iterations: 169, function evaluations: 5115, CG iterations: 3575, optimality: 1.35e+01, constraint violation: 2.12e-11, execution time: 3.8e+02 s.

Evidently, once tr_radius < xtol, the trust radius is reset to its default value of 1 and the barrier parameter is reduced. Only when both the barrier parameter < barrier_tol (i.e. 1e-8) and tr_radius < xtol does the optimization terminate successfully. Regarding barrier_tol, the documentation says

When inequality constraints are present the algorithm will terminate only when the barrier parameter is less than barrier_tol.

which would explain the behaviour in the case of inequality constraints, but all my constraints are equality constraints, defined as dictionaries:

con0 = {'type': 'eq', 'fun': constraint0}
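Since constraint0 is linear (as in my problem), the same equality could alternatively be expressed as a scipy.optimize.LinearConstraint with coinciding lower and upper bounds; A and b below are hypothetical placeholders:

import numpy as np
from scipy.optimize import LinearConstraint

# A linear equality A @ x == b is a LinearConstraint whose lower and
# upper bounds are both b (hypothetical A and b for illustration).
A = np.ones((1, 3))
b = np.array([1.0])
con0 = LinearConstraint(A, b, b)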

Is anyone deep enough into trust-constr to explain this to me?

asked Jul 19 '19 by gehbiszumeis


1 Answer

Do you have variables with upper bounds? Perhaps the solver is implementing these bounds internally as inequality constraints like var <= UPPER_BOUND, which would bring the barrier machinery into play.

(I would put this as a comment if I had the reputation score to do so)
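One quick way to test this hypothesis (a sketch of my own, not part of the original answer): run trust-constr on a small problem with only an equality constraint, then again with bounds added. If the bounds are handled by the interior-point (barrier) machinery, barrier_tol should only enter the termination behaviour in the second run:

import numpy as np
from scipy.optimize import minimize

def f(x):
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

con = {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 1.0}
x0 = np.array([0.0, 0.0])

# Equality constraint only: no barrier should be needed.
minimize(f, x0, method='trust-constr', constraints=[con],
         options={'verbose': 3})

# Same problem with (non-binding) bounds added: compare the verbose
# output to see whether a barrier parameter now enters the run.
minimize(f, x0, method='trust-constr', constraints=[con],
         bounds=[(-5.0, 5.0), (-5.0, 5.0)],
         options={'verbose': 3})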

answered Sep 19 '22 by Dylan Black