I'm getting the following error trying to eval my model.
tensorflow.python.framework.errors.InvalidArgumentError: Minimum tensor rank: 1 but got: 1 [[Node: ArgMax_1 = ArgMax[T=DT_INT64, _device="/job:localhost/replica:0/task:0/cpu:0"](_recv_Placeholder_1_0, ArgMax_1/dimension/_40)]]
Here is the relevant code:
# Predictions for the current training minibatch.
train_prediction = tf.nn.softmax(logits)
correct_prediction = tf.equal(tf.argmax(train_prediction, 1), tf.argmax(train_labels, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
sess.run(tf.initialize_all_variables())
for i in range(1000000):
    start_time = time()
    images, labels = get_batch(fifo_queue, FLAGS.batch_size)
    feed_dict = {
        train_images: images,
        train_labels: labels
    }
    _, loss_value, learn_rate, predictions = sess.run(
        [train_step, cross_entropy, learning_rate, train_prediction],
        feed_dict=feed_dict)
    duration = time() - start_time
    if i % 1 == 0:
        # Print status to stdout.
        print('Step %d: loss = %.3f (%.3f sec)' % (i, loss_value, duration))
        train_accuracy = accuracy.eval(feed_dict={
            train_images: images, train_labels: labels, keep_prob: 1.0})
        print("step %d, training accuracy %g" % (i, train_accuracy))
    train_step.run(feed_dict={train_images: images[0], train_labels: labels[1], keep_prob: 0.5})
I haven't been able to try much yet because I'm just getting my first model to evaluate, and this error (which apparently expected 1 and got 1) is not overly helpful.
The error message isn't great, but looking at the code might explain what's going on.
The issue arises because train_labels is (presumably) a one-dimensional vector. Dimensions are numbered from 0, so a vector only has a 0th dimension, but your invocation of tf.argmax(train_labels, 1) attempts to take the argmax in the 1st dimension, which doesn't exist.
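To make the dimension numbering concrete, here is a minimal sketch that reproduces the failure. It uses the same pre-1.0 TensorFlow API as your code; the placeholder and the fed values are made up for illustration.

import numpy as np
import tensorflow as tf

labels = tf.placeholder(tf.int64)     # intended to be fed a 1-D batch of label ids
label_argmax = tf.argmax(labels, 1)   # asks for dimension 1, which a 1-D tensor lacks

with tf.Session() as sess:
    # Raises an InvalidArgumentError from the ArgMax op at run time, because
    # the fed value has rank 1 and therefore only has a dimension 0.
    sess.run(label_argmax, feed_dict={labels: np.array([3, 1, 4, 1])})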
In fact, there's no need to take the argmax of the labels at all. Instead, you can simply write:
correct_prediction = tf.equal(tf.argmax(train_prediction, 1), train_labels)
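For context, the surrounding accuracy computation could then look like the sketch below. The shapes are assumed from your code, and note that tf.argmax returns an int64 tensor, so train_labels should hold int64 class ids (or be cast accordingly).

# train_prediction: [batch_size, num_classes] softmax output
# train_labels:     [batch_size] vector of integer class ids (int64)
correct_prediction = tf.equal(tf.argmax(train_prediction, 1), train_labels)
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))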