
Running Adam Optimizer

I am attempting to run an AdamOptimizer for one step of training, unsuccessfully.

optimizer = tf.train.AdamOptimizer(learning_rate)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    sess.run(optimizer.minimize(cost), feed_dict={X:X_data, Y: Y_data})

The console spits out an ugly-looking error:

FailedPreconditionError (see above for traceback): Attempting to use uninitialized value beta1_power
 [[Node: beta1_power/read = Identity[T=DT_FLOAT, _class=["loc:@W1"], _device="/job:localhost/replica:0/task:0/cpu:0"](beta1_power)]]

In the code, cost is a well-defined function implementing a convolutional NN plus a logistic loss, taking two parameters X and Y (the input of the NN and the training labels, respectively).
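(For concreteness, a cost of that shape might be built roughly like the hypothetical sketch below; the placeholder shapes, layer sizes, and loss variant are illustrative assumptions, not taken from the question.)

import tensorflow as tf

X = tf.placeholder(tf.float32, [None, 64, 64, 3])   # hypothetical NN input
Y = tf.placeholder(tf.float32, [None, 6])           # hypothetical labels

conv = tf.layers.conv2d(X, filters=8, kernel_size=4, activation=tf.nn.relu)
pool = tf.layers.max_pooling2d(conv, pool_size=8, strides=8)
logits = tf.layers.dense(tf.layers.flatten(pool), 6)

# "logistic loss" here meaning a cross-entropy over the logits
cost = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=Y, logits=logits))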

Any ideas on what could possibly be wrong?

Asked Nov 01 '17 by Jsevillamol

People also ask

What is Optimizer =' Adam?

What is the Adam optimization algorithm? Adam is an optimization algorithm that can be used instead of the classical stochastic gradient descent procedure to update network weights iteratively, based on the training data.

How does Adam Optimiser work?

The Adam optimizer combines two gradient descent methodologies. Momentum: accelerates gradient descent by taking into account the 'exponentially weighted average' of the gradients; using averages makes the algorithm converge towards the minimum at a faster pace. RMSProp: divides the learning rate for each parameter by an exponentially weighted average of the squared gradients, adapting the step size per parameter. A sketch of the resulting update rule follows below.
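To make the two averages concrete, here is a minimal plain-Python sketch of a single Adam update (variable names follow the Adam paper; this is an illustration, not TensorFlow's actual implementation). Note the bias-correction terms beta1 ** t and beta2 ** t: TensorFlow stores these running powers as the variables beta1_power and beta2_power, the very variable the error above complains about.

import numpy as np

def adam_step(param, grad, m, v, t,
              lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter array; t starts at 1."""
    # First moment: exponentially weighted average of gradients (momentum part)
    m = beta1 * m + (1 - beta1) * grad
    # Second moment: exponentially weighted average of squared gradients (RMSProp part)
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction; beta1 ** t is what TF keeps in beta1_power
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v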

Why we use Adam Optimizer?

The results of the Adam optimizer are generally better than those of other optimization algorithms; it has faster computation time and requires fewer parameters for tuning. Because of all that, Adam is recommended as the default optimizer for most applications.

Is Adam the best optimizer?

Adam is the best among the adaptive optimizers in most cases. It is also good with sparse data: the adaptive learning rate is well suited to this type of dataset.


1 Answer

optimizer.minimize(cost) creates new operations and variables in your graph (Adam's internal bookkeeping variables, among them beta1_power, the very one your error names).

When you call sess.run(init), the variables that the .minimize method creates do not exist yet, so init never initializes them: hence your error.

You just have to declare your minimization operation before invoking tf.global_variables_initializer():

optimizer = tf.train.AdamOptimizer(learning_rate)
minimize = optimizer.minimize(cost)        # Adam's variables are created here
init = tf.global_variables_initializer()  # ...so this op now covers them too

with tf.Session() as sess:
    sess.run(init)
    sess.run(minimize, feed_dict={X: X_data, Y: Y_data})
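For completeness, here is a self-contained sketch of the corrected ordering around a toy model (the placeholder shapes, the one-layer model, and the random data are assumptions for illustration, not from the question):

import numpy as np
import tensorflow as tf

X = tf.placeholder(tf.float32, [None, 4])
Y = tf.placeholder(tf.float32, [None, 1])

logits = tf.layers.dense(X, 1)  # toy one-layer model
cost = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=Y, logits=logits))

optimizer = tf.train.AdamOptimizer(learning_rate=0.001)
minimize = optimizer.minimize(cost)       # Adam's variables exist from here on
init = tf.global_variables_initializer()  # ...so init initializes them as well

X_data = np.random.rand(8, 4).astype(np.float32)
Y_data = np.random.randint(0, 2, size=(8, 1)).astype(np.float32)

with tf.Session() as sess:
    sess.run(init)
    _, c = sess.run([minimize, cost], feed_dict={X: X_data, Y: Y_data})
    print("cost after one Adam step:", c)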
Answered Oct 23 '22 by nessuno