I am attempting to run an AdamOptimizer for one step of training, unsuccessfully.
optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    _, cost_value = sess.run(optimizer, feed_dict={X: X_data, Y: Y_data})
In this code, cost is a well-defined tensor implementing a convolutional NN plus a logistic loss, built from two placeholders X and Y (the input of the network and the training labels, respectively).
When I run this, the console tells me that the run returned None, which leaves me baffled, since I expected it to return the cost.
What am I doing wrong?
I don't think the optimizer is going to return anything. optimizer.minimize (or the train_op) returns an op that updates the trainable weights and increments the global step; running that op on its own yields None.
If you want the loss (or cost) returned, then you must fetch it explicitly, e.g. sess.run([..., cost, ...], ...).
Here is what your code might look like:
optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    _, cost_value = sess.run([optimizer, cost], feed_dict={X: X_data, Y: Y_data})
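For completeness, here is a minimal self-contained sketch of the same pattern, assuming TF 1.x. The model, placeholder shapes, and random data are made up for illustration (a toy logistic-regression layer standing in for your conv net); the point is only to show that fetching the train op alone yields None, while fetching [train_op, cost] returns the loss value:

import numpy as np
import tensorflow as tf

# Hypothetical toy model: a single dense layer with a sigmoid
# cross-entropy loss, standing in for the asker's conv net.
X = tf.placeholder(tf.float32, [None, 4], name="X")
Y = tf.placeholder(tf.float32, [None, 1], name="Y")
logits = tf.layers.dense(X, 1)
cost = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=Y, logits=logits))

train_op = tf.train.AdamOptimizer(0.01).minimize(cost)
init = tf.global_variables_initializer()

# Made-up data matching the placeholder shapes above.
X_data = np.random.rand(8, 4).astype(np.float32)
Y_data = np.random.randint(0, 2, size=(8, 1)).astype(np.float32)

with tf.Session() as sess:
    sess.run(init)
    # Fetching only the train op runs the update but returns None.
    print(sess.run(train_op, feed_dict={X: X_data, Y: Y_data}))  # None
    # Fetching the op and the cost together returns [None, cost_value].
    _, cost_value = sess.run([train_op, cost],
                             feed_dict={X: X_data, Y: Y_data})
    print(cost_value)  # a float, e.g. around 0.69 at initialization

The fetches argument of sess.run mirrors its structure in the return value, so any tensor whose value you want back has to appear in that fetch list.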