Optimizer returning None

I am attempting to run an AdamOptimizer for one step of training, unsuccessfully.

optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    _, cost_value = sess.run(optimizer, feed_dict={X:X_data, Y: Y_data})

In the code, cost is a well-defined tensor implementing a convolutional NN plus a logistic loss function, built from two placeholders X and Y (the input of the network and the training labels, respectively).

When I run this, the session returns None as the output, which leaves me baffled, since I expected it to return the cost.

What am I doing wrong?

Jsevillamol asked Nov 01 '17 20:11
1 Answer

I don't think the optimizer is going to return anything. optimizer.minimize (i.e. the train_op) returns an op which will be used to update the trainable weights and increment the global step; an op has no output value, so fetching it yields None. If you want the loss (or cost) returned, then you must list it explicitly among the fetches: sess.run([..., loss, ...], ...)

This is what your code may look like:

optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    _, cost_value = sess.run([optimizer, cost], feed_dict={X:X_data, Y: Y_data})
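To see the difference between fetching an op and fetching a tensor in a self-contained way, here is a minimal sketch using a toy linear-regression cost in place of the question's conv NN (the data, shapes, and learning rate are made up for illustration; it uses the tf.compat.v1 API so it also runs under TensorFlow 2):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Hypothetical toy data: fit y = 2x with a single weight.
X_data = np.array([[1.0], [2.0], [3.0]], dtype=np.float32)
Y_data = np.array([[2.0], [4.0], [6.0]], dtype=np.float32)

X = tf.placeholder(tf.float32, shape=[None, 1])
Y = tf.placeholder(tf.float32, shape=[None, 1])
w = tf.Variable(tf.ones([1, 1]))

# Stand-in for the question's conv-NN-plus-logistic cost.
cost = tf.reduce_mean(tf.square(tf.matmul(X, w) - Y))

optimizer = tf.train.AdamOptimizer(0.01).minimize(cost)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    # Fetch the train op and the cost tensor in one call:
    # the op's slot comes back as None, the tensor's as a float.
    op_result, cost_value = sess.run([optimizer, cost],
                                     feed_dict={X: X_data, Y: Y_data})
    print(op_result)   # None -- the train op has no output value
    print(cost_value)  # the scalar loss at this step
```

Fetching only the op, as in the question, is therefore guaranteed to return None regardless of how the cost is defined.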
armundle answered Sep 29 '22 23:09