I am facing a numerical optimization problem subject to constraints, both equalities and inequalities. Everything for this task seems to be in TensorFlow, judging from documentation such as https://www.tensorflow.org/api_docs/python/tf/contrib/constrained_optimization .
However, I am missing a minimal working example, and extensive googling has turned up nothing. Can anyone share a useful resource with me? Preferably one running in eager mode.
edit:
I have now found https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/constrained_optimization
Any additional resources are still welcome.
You can use TFCO, which is available for TF 1.14 and later (including TF 2).
Here is a concrete example where we want to minimize

(x - 2) ^ 2 + y

subject to

x + y = 1, x >= 0, y >= 0.

TFCO expects each constraint as a tensor whose value is <= 0 when the constraint is satisfied, so the equality x + y = 1 is encoded as the pair of inequalities x + y - 1 <= 0 and 1 - (x + y) <= 0, giving four constraints in total.
import tensorflow as tf
# Use the GitHub version of TFCO:
# !pip install git+https://github.com/google-research/tensorflow_constrained_optimization
import tensorflow_constrained_optimization as tfco

class SampleProblem(tfco.ConstrainedMinimizationProblem):
    def __init__(self, loss_fn, weights):
        self._loss_fn = loss_fn
        self._weights = weights

    @property
    def num_constraints(self):
        return 4

    def objective(self):
        return self._loss_fn()

    def constraints(self):
        x, y = self._weights
        sum_weights = x + y
        # Each entry is <= 0 exactly when its constraint is satisfied.
        lt_or_eq_one = sum_weights - 1  # x + y <= 1
        gt_or_eq_one = 1 - sum_weights  # x + y >= 1
        constraints = tf.stack([lt_or_eq_one, gt_or_eq_one, -x, -y])  # -x <= 0, -y <= 0
        return constraints

x = tf.Variable(0.0, dtype=tf.float32, name='x')
y = tf.Variable(0.0, dtype=tf.float32, name='y')

def loss_fn():
    return (x - 2) ** 2 + y

problem = SampleProblem(loss_fn, [x, y])

optimizer = tfco.LagrangianOptimizer(
    optimizer=tf.optimizers.Adagrad(learning_rate=0.1),
    num_constraints=problem.num_constraints
)

# The Lagrange multipliers live inside the optimizer, so its trainable
# variables must be included in the list of variables to update.
var_list = [x, y] + problem.trainable_variables + optimizer.trainable_variables()

for i in range(10000):
    optimizer.minimize(problem, var_list=var_list)
    if i % 1000 == 0:
        print(f'step = {i}')
        print(f'loss = {loss_fn()}')
        print(f'constraint = {(x + y).numpy()}')
        print(f'x = {x.numpy()}, y = {y.numpy()}')
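For this particular problem you can verify the answer by hand: substituting y = 1 - x turns the objective into (x - 2) ^ 2 + 1 - x, which is decreasing on [0, 1], so the constrained optimum is x = 1, y = 0 with a loss of 1. As a sanity check, here is a minimal sketch (not part of TFCO or the example above; it just re-solves the same problem with SciPy's general-purpose solver) that should land close to that optimum:

# Hypothetical cross-check with SciPy, assuming scipy is installed.
import numpy as np
from scipy.optimize import minimize

result = minimize(
    lambda w: (w[0] - 2) ** 2 + w[1],       # same objective as loss_fn
    x0=np.array([0.5, 0.5]),                # feasible starting point
    bounds=[(0, None), (0, None)],          # x >= 0, y >= 0
    constraints={'type': 'eq', 'fun': lambda w: w[0] + w[1] - 1},  # x + y = 1
)
print(result.x)  # expected to be close to [1.0, 0.0]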