I'm trying to use the scipy.optimize package to find the maximum of my cost function.
In this particular case, I have a list of prices that varies over the day. To keep it simple, let's assume the day has 8 hours and the prices in each hour are as follows:
price_list = np.array([1,2,6,8,8,5,2,1])
In this simplified case I want to select the 4 highest prices from that price_list. For various reasons I don't want to simply sort and select the best four prices, but use an optimization algorithm instead. I have several constraints, so I decided to use the sequential least squares programming (SLSQP) algorithm from SciPy, scipy.optimize.fmin_slsqp.
I first create a schedule for the hours I select:
schedule_list = np.zeros( len(price_list), dtype=float)
Next I need to define my negated profit function: for all selected hours I want to sum up the profit. While I optimize over the schedule, the price_list is fixed, so I pass it via *args:
def price_func(schedule_list, *price_list):
    return -1. * np.sum(np.dot(schedule_list, price_list))
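As a quick sanity check (a minimal sketch reproducing the numbers above), the objective can be evaluated by hand before handing it to the optimizer:

```python
import numpy as np

price_list = np.array([1, 2, 6, 8, 8, 5, 2, 1])

def price_func(schedule_list, *price_list):
    # Negated profit: fmin_slsqp minimizes, so maximizing profit
    # means minimizing the negative of the revenue.
    return -1. * np.sum(np.dot(schedule_list, price_list))

# Running in every hour yields the full (negated) revenue of the day.
print(price_func(np.ones(8), *price_list))  # -33.0
```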
Once I understand how things work in principle, I will move some pieces around. For now, I simply avoid using more *args and define my constraint with a hard-coded number of hours to run. I want exactly 4 selected hours, so I use an equality constraint:
def eqcon(x, *price_list):
    return sum(schedule_list) - 4
Furthermore, I want to restrict my schedule values to be either 0 or 1. I'm not sure how to implement this yet, so for now I simply use the bounds keyword.
The optimization with bounds but without the equality constraint works fine. I simply pass my schedule_list as the initial guess.
scipy.optimize.fmin_slsqp( price_func, schedule_list, args=price_list, bounds=[[0,1]]*len(schedule_list) )
and the output is as good as it can be:
Optimization terminated successfully. (Exit mode 0)
Current function value: -33.0
Iterations: 2
Function evaluations: 20
Gradient evaluations: 2
Out[8]: array([ 1., 1., 1., 1., 1., 1., 1., 1.])
Without any further constraints, this is the optimal solution!
Using the constrained optimization with the following command:
scipy.optimize.fmin_slsqp( price_func, schedule_list, args=price_list, bounds=[[0,1]]*len(schedule_list), eqcons=[eqcon, ] )
gives me the error:
Singular matrix C in LSQ subproblem (Exit mode 6)
Current function value: -0.0
Iterations: 1
Function evaluations: 10
Gradient evaluations: 1
Out[9]: array([ 0., 0., 0., 0., 0., 0., 0., 0.])
From gnuplot I know that this error is often related to ill-posed problems or bad initial values. I tried near-optimal initial values, but that does not help. Does anybody have an idea or even a solution for me?
As a next step, I have already formulated inequality constraints. Do I understand correctly that in the scipy.optimize.minimize wrapper the inequalities are assumed to be greater than or equal to 0, while in fmin_slsqp it is the other way around, i.e. solutions are constrained to negative constraint functions?
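For reference on the sign convention: according to the SciPy documentation, both the minimize wrapper (with method='SLSQP') and fmin_slsqp expect inequality constraint functions to be non-negative at a feasible point, i.e. f(x) >= 0. A minimal sketch on a toy problem (not the scheduling one):

```python
import numpy as np
from scipy.optimize import minimize

# Minimize x^2 subject to x >= 1, written as the
# non-negative constraint x - 1 >= 0.
res = minimize(lambda x: x[0] ** 2,
               x0=np.array([3.0]),
               method='SLSQP',
               constraints=[{'type': 'ineq', 'fun': lambda x: x[0] - 1.0}])
print(res.x)  # close to [1.]
```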
You have a plain linear program, is that right?

min: -prices . x
constraints: 0 <= x <= 1, sum(x) = 4

so the second-derivative matrix, a.k.a. the Hessian, is exactly 0. slsqp is trying to invert this, which is not possible. Agreed, the error message could be better.
(The same will happen with other quadratic methods, in any package: they'll converge much faster on smooth functions, but crash on rough cliffs.)
See also
why-cant-i-rig-scipys-constrained-optimization-for-integer-programming --
but an LP should do the job here (select the 4 best hours); integer programming is harder.
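The LP route can be written with scipy.optimize.linprog. This is a sketch under the assumption that the 0/1 relaxation suffices; for this particular constraint (one equality row plus box bounds) the LP optimum happens to land on integral values, so no rounding is needed here:

```python
import numpy as np
from scipy.optimize import linprog

price_list = np.array([1, 2, 6, 8, 8, 5, 2, 1])

# linprog minimizes, so negate the prices; sum(x) == 4 is the
# equality constraint, and each hour is bounded to [0, 1].
res = linprog(c=-price_list,
              A_eq=np.ones((1, len(price_list))),
              b_eq=[4],
              bounds=[(0, 1)] * len(price_list))
print(res.x)     # selects the four highest-priced hours
print(-res.fun)  # 27.0 = 6 + 8 + 8 + 5
```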
The SLSQP algorithm is a gradient-based optimizer, meaning it expects the derivatives of the objective and constraints to be continuous. From my understanding, it seems like you're trying to solve an integer programming problem (continuous values in the schedule list are not acceptable). You need an algorithm that selects appropriate values (0 or 1) for the independent variables, rather than trying to find the minimum of a continuous space of values. Unfortunately, I'm not sure that there are any in scipy that do that.
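If the 0/1 requirement must be enforced exactly, newer SciPy versions (1.9 and later, i.e. after this answer was written) ship a mixed-integer linear solver, scipy.optimize.milp. A hedged sketch for the same scheduling problem:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

price_list = np.array([1, 2, 6, 8, 8, 5, 2, 1])

res = milp(c=-price_list,                                # milp minimizes, so negate
           constraints=LinearConstraint(np.ones((1, 8)), lb=4, ub=4),  # exactly 4 hours
           integrality=np.ones(8),                       # every variable is integer
           bounds=Bounds(0, 1))                          # and restricted to {0, 1}
print(res.x)  # the four highest-priced hours
```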