I want to compute predictions using the following code:
import tensorflow as tf

x = tf.placeholder("float", [None, n_input])
y = tf.placeholder("float", [None, n_classes])

pred = multilayer_perceptron(x, weights, biases)
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=pred, labels=y))
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)

# Initializing the variables
init = tf.global_variables_initializer()

# (code that loads trn.txt / tst.txt into bat_x, bat_y, tst_x, tst_y omitted)

with tf.Session() as sess:
    sess.run(init)
    # Training cycle
    for epoch in range(training_epochs):
        avg_cost = 0.
        total_batch = int(num_lines_trn / batch_size)
        # Loop over all batches
        for i in range(total_batch):
            batch_x = bat_x[i * batch_size:(i + 1) * batch_size]
            batch_y = bat_y[i * batch_size:(i + 1) * batch_size]
            # Run optimization op (backprop) and cost op (to get loss value)
            _, c = sess.run([optimizer, cost],
                            feed_dict={x: batch_x, y: batch_y})
            # Compute average loss
            avg_cost += c / total_batch

    correct_prediction = tf.equal(tf.argmax(pred, 1), tf.argmax(y, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
    print(sess.run(accuracy, feed_dict={x: tst_x, y: tst_y}))
    print(sess.run(accuracy, feed_dict={x: tst_x}))  # raises the error described below
The line
print(sess.run(accuracy, feed_dict={x: tst_x, y: tst_y}))
returns 0.80353
which is the accuracy for that batch.
However, I want to get the actual prediction results, so I added:
print(sess.run(accuracy, feed_dict={x: tst_x}))
But this line raises an error:
You must feed a value for placeholder tensor 'Placeholder_7' with dtype float
How can I solve this problem?
If you want to get the predictions of your model, you should do:
sess.run(pred, feed_dict={x: tst_x})
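Note that pred holds the raw logits of the network, one score per class. If you want hard class labels instead, take the argmax over the class dimension; a minimal sketch, assuming the same x, pred and tst_x as in the question:

# the predicted class is the one with the largest logit
predicted_class = tf.argmax(pred, 1)

# only x has to be fed: inference does not need the labels
labels = sess.run(predicted_class, feed_dict={x: tst_x})
print(labels[:10])  # class indices for the first 10 test rows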
You get the error because you try to run sess.run(accuracy, feed_dict={x: tst_x}), but computing the accuracy on a given batch needs the true labels contained in placeholder y. Since y is not fed, you get:
You must feed a value for placeholder tensor '<name of the placeholder y>'
(here Placeholder_7 is the autogenerated name of your y placeholder).
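If you want the test predictions and the test accuracy in one pass, you can also fetch both tensors in a single sess.run call; a minimal sketch reusing the x, y, pred and accuracy tensors defined in the question:

# one forward pass serves both fetches; y must be fed because accuracy needs it
test_pred, test_acc = sess.run([pred, accuracy],
                               feed_dict={x: tst_x, y: tst_y})
print(test_acc)         # accuracy, as printed before
print(test_pred.shape)  # (num_test_examples, n_classes) array of logits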