Can you guide me on how to fix this?
with tf.name_scope('loss'):
    #cross_entropy = None
    val = tf.nn.softmax_cross_entropy_with_logits(y_conv, y_)
    cross_entropy = tf.reduce_mean(val)

with tf.name_scope('adam_optimizer'):
    #train_step = None
    train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)
I get this error:
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-40-f67d0aecc114> in <module>()
      1 with tf.name_scope('loss'):
      2     #cross_entropy = None
----> 3     val = tf.nn.softmax_cross_entropy_with_logits(y_conv, y_)
      4     cross_entropy = tf.reduce_mean(val)
      5

~/anaconda/lib/python3.6/site-packages/tensorflow/python/ops/nn_ops.py in softmax_cross_entropy_with_logits(_sentinel, labels, logits, dim, name)
   1576   """
   1577   _ensure_xent_args("softmax_cross_entropy_with_logits", _sentinel,
-> 1578                     labels, logits)
   1579
   1580   # TODO(pcmurray) Raise an error when the labels do not sum to 1. Note: This

~/anaconda/lib/python3.6/site-packages/tensorflow/python/ops/nn_ops.py in _ensure_xent_args(name, sentinel, labels, logits)
   1531   if sentinel is not None:
   1532     raise ValueError("Only call `%s` with "
-> 1533                      "named arguments (labels=..., logits=..., ...)" % name)
   1534   if labels is None or logits is None:
   1535     raise ValueError("Both labels and logits must be provided.")

ValueError: Only call `softmax_cross_entropy_with_logits` with named arguments (labels=..., logits=..., ...)
Also, tf.__version__ returns '1.0.0', and I have Anaconda Python 3.6.2 on OSX Sierra.
This is an easy fix: softmax_cross_entropy_with_logits() has three key arguments: _sentinel, labels, and logits. The _sentinel parameter exists only to catch positional calls (any positional first argument lands in it and triggers the error), so labels and logits must be passed as named arguments.
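The same keyword-only enforcement can be sketched in plain Python (the function and argument names here are illustrative stand-ins, not TensorFlow's actual internals):

```python
def softmax_xent(_sentinel=None, labels=None, logits=None):
    # A positional first argument lands in _sentinel, letting us
    # detect and reject positional calls.
    if _sentinel is not None:
        raise ValueError(
            "Only call `softmax_xent` with named arguments "
            "(labels=..., logits=...)")
    if labels is None or logits is None:
        raise ValueError("Both labels and logits must be provided.")
    return labels, logits  # stand-in for the real loss computation

# Keyword call works:
print(softmax_xent(labels=[1, 0], logits=[0.2, 0.8]))

# Positional call is rejected:
try:
    softmax_xent([1, 0], [0.2, 0.8])
except ValueError as e:
    print("rejected:", e)
```

(Modern Python would use a bare `*` in the signature for the same effect; TensorFlow 1.x used the sentinel so it could emit a helpful error message.)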
Fixed (in the standard TensorFlow MNIST tutorial naming, y_ is the ground-truth labels placeholder and y_conv is the network's logits, so they map as below; if your variables follow a different convention, swap them):

with tf.name_scope('loss'):
    #cross_entropy = None
    val = tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y_conv)
    cross_entropy = tf.reduce_mean(val)

with tf.name_scope('adam_optimizer'):
    #train_step = None
    train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)
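If it helps to see what the op actually computes, here is a NumPy sketch of softmax cross-entropy per example (assuming one-hot labels), with the mean taken afterward just as tf.reduce_mean(val) does. The example values are made up:

```python
import numpy as np

def softmax_xent_np(labels, logits):
    # Numerically stable log-softmax: shift by the row max, then
    # subtract the log of the summed exponentials.
    z = logits - logits.max(axis=1, keepdims=True)
    log_softmax = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    # Cross-entropy per example: -sum(labels * log_softmax) over classes.
    return -(labels * log_softmax).sum(axis=1)

labels = np.array([[1.0, 0.0], [0.0, 1.0]])   # one-hot ground truth
logits = np.array([[2.0, 0.5], [0.1, 1.2]])   # raw network outputs
per_example = softmax_xent_np(labels, logits)
cross_entropy = per_example.mean()            # mirrors tf.reduce_mean(val)
print(per_example, cross_entropy)
```

This also shows why the two arguments are not interchangeable: labels are probabilities, logits are unnormalized scores, so swapping them silently produces a wrong loss rather than an error.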