
Tensorflow: Using Adam optimizer

I am experimenting with some simple models in tensorflow, including one that looks very similar to the first MNIST for ML Beginners example, but with a somewhat larger dimensionality. I am able to use the gradient descent optimizer with no problems, getting good enough convergence. When I try to use the ADAM optimizer, I get errors like this:

tensorflow.python.framework.errors.FailedPreconditionError: Attempting to use uninitialized value Variable_21/Adam
     [[Node: Adam_2/update_Variable_21/ApplyAdam = ApplyAdam[T=DT_FLOAT, use_locking=false, _device="/job:localhost/replica:0/task:0/cpu:0"](Variable_21, Variable_21/Adam, Variable_21/Adam_1, beta1_power_2, beta2_power_2, Adam_2/learning_rate, Adam_2/beta1, Adam_2/beta2, Adam_2/epsilon, gradients_11/add_10_grad/tuple/control_dependency_1)]]

where the specific variable that complains about being uninitialized changes depending on the run. What does this error mean? And what does it suggest is wrong? It seems to occur regardless of the learning rate I use.

asked Nov 18 '15 by pythonic metaphor


2 Answers

The AdamOptimizer class creates additional variables, called "slots", to hold values for the "m" and "v" accumulators.

If you're curious, the source is actually quite readable: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/training/adam.py#L39 . Other optimizers, such as Momentum and Adagrad, use slots too.
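For a rough idea of what those slots look like, here is a minimal sketch (TF 1.x-style API; the variable name w and the printed values are illustrative, not taken from the question):

import tensorflow as tf

w = tf.Variable(tf.zeros([10, 10]), name="w")
loss = tf.reduce_sum(tf.square(w))

opt = tf.train.AdamOptimizer(1e-4)
train_op = opt.minimize(loss)

# The optimizer has now created "m" and "v" slot variables for each trainable variable.
print(opt.get_slot_names())    # e.g. ['m', 'v']
m = opt.get_slot(w, "m")       # first-moment accumulator for w
v = opt.get_slot(w, "v")       # second-moment accumulator for w
print(m.name)                  # something like 'w/Adam:0'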

These variables must be initialized before you can train a model.

The normal way to initialize variables is to call tf.initialize_all_variables(), which adds ops to initialize the variables that are present in the graph at the moment it is called.

(Aside: despite what its name suggests, initialize_all_variables() does not initialize anything itself; it only adds ops that will initialize the variables when they are run.)
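To make the ordering issue concrete, here is a sketch of the failure mode (cross_entropy stands in for whatever loss the model uses): if the init op is created before the optimizer, the Adam slot variables are never covered by it, which reproduces the error from the question.

# WRONG ORDER (sketch): init_op is created before the optimizer,
# so it does not know about the slot variables Adam will add.
init_op = tf.initialize_all_variables()
train_op = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)  # creates .../Adam slot variables

sess = tf.Session()
sess.run(init_op)      # initializes everything *except* the Adam slots
sess.run(train_op)     # FailedPreconditionError: Attempting to use uninitialized value .../Adam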

What you must do is call initialize_all_variables() after you have added the optimizer:

...build your model...

# Add the optimizer
train_op = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)

# Add the ops to initialize variables.  These will include
# the optimizer slots added by AdamOptimizer().
init_op = tf.initialize_all_variables()

# Launch the graph in a session
sess = tf.Session()

# Actually initialize the variables
sess.run(init_op)

# Now train your model
for ...:
    sess.run(train_op)
answered Oct 14 '22 by Touts


FailedPreconditionError: Attempting to use uninitialized value is one of the most frequent TensorFlow errors. From the official documentation of FailedPreconditionError:

This exception is most commonly raised when running an operation that reads a tf.Variable before it has been initialized.

In your case the error even tells you which variable was not initialized: Attempting to use uninitialized value Variable_21/Adam. One of the TF tutorials explains a lot about variables and their creation/initialization/saving/loading.

Basically, to initialize a variable you have 3 options (the last two are sketched after this list):

  • initialize all global variables with tf.global_variables_initializer()
  • initialize the variables you care about with tf.variables_initializer(list_of_vars). Notice that you can use this function to mimic global_variables_initializer: tf.variables_initializer(tf.global_variables())
  • initialize only one variable with var_name.initializer
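
A minimal sketch of options 2 and 3 (the variables w and b are placeholders, not from the question):

w = tf.Variable(tf.zeros([784, 10]), name="w")
b = tf.Variable(tf.zeros([10]), name="b")

with tf.Session() as sess:
    # Option 2: initialize only the variables you list
    sess.run(tf.variables_initializer([w, b]))
    # Option 3: initialize a single variable via its own initializer op
    sess.run(b.initializer)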

I almost always use the first approach. Remember that you need to run it inside a session. So you will get something like this:

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

If you are curious about more information about variables, read this documentation to learn how to report_uninitialized_variables and check is_variable_initialized.
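For example, a quick way to see which variables still need initialization (a sketch using the TF 1.x API; some_variable is a placeholder):

with tf.Session() as sess:
    # Returns the names (as byte strings) of variables that are not yet initialized.
    print(sess.run(tf.report_uninitialized_variables()))
    # Or check a single variable:
    print(sess.run(tf.is_variable_initialized(some_variable)))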

answered Oct 14 '22 by Salvador Dali