Google's results on this seem to require more advanced maths than I'm familiar with (and I might not be smarter than a fifth grader, but I don't intend to find out).
I'm looking for a general way to solve multivariate optimization problems, preferably in C#, without having to dig into matrices, eigenvectors, and normal distributions.
Say I have numeric variables x, y, z, and w, and a function f such that w = f(x, y, z). I want to maximize w, where:

- f is unknown
- the relationship between x, y, and/or z, if any, is unknown
- I can set x, y, and z and resample w on-demand
- I want the maximum w found with the fewest trial permutations of x, y, and z, choosing the next value for each after every round of sampling

I have rough minimum and maximum bounds for the independent variables. I of course don't want to sample any more of the permutation space than necessary. I'd like the algorithm to have at least a crude ability to detect the most glaring co-dependencies, e.g., diminishing returns when x > 2y, or actual deterioration in w when the sum of x, y, and z exceeds some ceiling, etc.
Most of the math libraries I've looked at assume I know how to perform quantum nergenflip projections over the Boigenfoodle Continuum, and I'm just not there. How would a non-mathematician coder accomplish this?
In a multivariate optimization problem, multiple variables act as the decision variables.
Gradient descent is the most basic and most widely used optimization algorithm. It's used heavily in linear regression and classification algorithms, and backpropagation in neural networks is also built on gradient descent.
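For intuition, here's a minimal sketch of gradient descent (not part of the original answer): since f may have no analytic derivatives, the gradient is estimated with central finite differences. The toy objective, learning rate lr, and step count are all invented for illustration, and this version minimizes; to maximize, step along the gradient instead of against it.

    def grad_descent(f, x, lr=0.1, eps=1e-6, steps=500):
        # Minimize f by repeatedly stepping against a finite-difference
        # estimate of the gradient (no analytic derivatives needed).
        x = list(x)
        for _ in range(steps):
            grad = []
            for i in range(len(x)):
                hi = list(x); hi[i] += eps
                lo = list(x); lo[i] -= eps
                grad.append((f(hi) - f(lo)) / (2 * eps))
            x = [xi - lr * gi for xi, gi in zip(x, grad)]
        return x

    # Toy objective with its minimum at (1, -2).
    f = lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2
    print(grad_descent(f, [0.0, 0.0]))  # ~[1.0, -2.0]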
Try four or five algorithms, both single- and multi-objective, and compare their results to find the best one, or at least one that outperforms the others in some respects. Think about the problem you'd like to solve, then build a model with appropriate objective function(s) and constraints.
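A hedged sketch of that comparison loop: everything here is illustrative, random_search is just a baseline to compare against, and the bounds are placeholders.

    import random

    def random_search(w, seed_x, numSteps=10000, lo=-10.0, hi=10.0):
        # Baseline optimizer: sample uniformly within rough bounds, keep the best.
        best_x, best_w = list(seed_x), w(seed_x)
        for _ in range(numSteps):
            x = [random.uniform(lo, hi) for _ in seed_x]
            if w(x) > best_w:
                best_x, best_w = x, w(x)
        return best_x

    def compare(w, seed_x, optimizers):
        # Run each optimizer on the same objective and report the best result.
        results = {name: opt(w, seed_x) for name, opt in optimizers.items()}
        return max(results.items(), key=lambda kv: w(kv[1]))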
Multivariable optimization, basic concepts and properties: an absolute maximum/minimum (also called a global max/min) is defined over a region R contained in the domain of the function f. If the value of f at (a, b) is greater than or equal to its value at any other point in R, then f(a, b) is called the global maximum.
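To make the definition concrete, a small sketch (the function and the region R = [0, 1] x [0, 1] are made up): brute-force sampling of R finds the point whose value beats every other sampled point, i.e., the global maximum over the sample grid.

    # Hypothetical f with its global maximum in R at (0.3, 0.7).
    f = lambda a, b: -(a - 0.3) ** 2 - (b - 0.7) ** 2

    n = 100  # grid resolution per axis
    best_val, best_pt = max(
        (f(i / n, j / n), (i / n, j / n))
        for i in range(n + 1)
        for j in range(n + 1)
    )
    print(best_pt, best_val)  # ~(0.3, 0.7), ~0.0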
You can try simulated annealing if you don't want to get stuck in a local optimum.
Basically, start with some x, y, z. Generate dx, dy, and dz randomly from a zero-mean distribution (e.g., Gaussian). If w(x+dx, y+dy, z+dz) > w(x, y, z), accept the new solution. Otherwise, accept it with probability w(x+dx, y+dy, z+dz) / w(x, y, z).
Python code
import random

def simAnneal(w, seed_x, numSteps=100000, sigma=0.01):
    # Best solution seen so far.
    optimal_x = list(seed_x)
    optimal_w = w(optimal_x)

    # Current solution in the random walk.
    cur_x = list(seed_x)
    cur_w = optimal_w
    for _ in range(numSteps):
        # Perturb the current solution with zero-mean Gaussian noise.
        new_x = [xi + random.gauss(0, sigma) for xi in cur_x]
        new_w = w(new_x)
        # Always accept an improvement; accept a worse solution
        # with probability new_w / cur_w (assumes w is positive).
        if (new_w > cur_w) or (random.random() < new_w / cur_w):
            cur_x = new_x
            cur_w = new_w
            if cur_w > optimal_w:
                optimal_w = cur_w
                optimal_x = cur_x
    return optimal_x
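A quick usage sketch (the objective below is invented; note that the acceptance rule treats new_w / cur_w as a probability, so this version assumes w returns positive values):

    import random

    # Hypothetical objective: a smooth bump peaking at x=1, y=2, z=3.
    def w(v):
        x, y, z = v
        return 1.0 / (1.0 + (x - 1) ** 2 + (y - 2) ** 2 + (z - 3) ** 2)

    random.seed(0)
    best = simAnneal(w, [0.0, 0.0, 0.0], numSteps=200000, sigma=0.05)
    print(best, w(best))  # should end up near [1, 2, 3]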