I am a novice programmer trying to follow this guide. However, I ran across an issue. The guide says to define the loss function as:
def loss(labels, logits):
    return tf.keras.losses.sparse_categorical_crossentropy(labels, logits, from_logits=True)
This gives me the following error:
sparse_categorical_crossentropy() got an unexpected keyword argument 'from_logits'
which I take to mean that from_logits is not an argument the function accepts. This seems to be supported by the documentation, which states that tf.keras.losses.sparse_categorical_crossentropy() takes only two arguments.
Is there a way to specify that logits are being used, or is that even necessary?
Categorical cross-entropy is used when the true labels are one-hot encoded; for example, for a 3-class classification problem the true values are [1,0,0], [0,1,0] and [0,0,1]. In sparse categorical cross-entropy, the true labels are integer encoded, for example [0], [1] and [2] for the same 3-class problem.
Binary cross-entropy is for binary classification and categorical cross-entropy is for multi-class classification, but both work for binary classification; to use categorical cross-entropy you need to convert the labels to one-hot encoding.
Put simply: categorical_crossentropy (cce) expects the target for each sample as a one-hot array over the categories, while sparse_categorical_crossentropy (scce) expects the target as a single integer index of the matching category.
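A minimal sketch of the difference (the predicted probabilities and labels here are made up) for a 3-class problem:

import tensorflow as tf

# Predicted class probabilities for two samples (already normalized).
probs = tf.constant([[0.7, 0.2, 0.1],
                     [0.1, 0.8, 0.1]])

# categorical_crossentropy: targets are one-hot vectors.
onehot_labels = tf.constant([[1.0, 0.0, 0.0],
                             [0.0, 1.0, 0.0]])
cce = tf.keras.losses.categorical_crossentropy(onehot_labels, probs)

# sparse_categorical_crossentropy: targets are integer class indices (0, 1, 2).
int_labels = tf.constant([0, 1])
scce = tf.keras.losses.sparse_categorical_crossentropy(int_labels, probs)

print(cce.numpy(), scce.numpy())  # both give the same per-sample losses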
from_logits=True signifies that the values the model outputs are not normalized probabilities (they are raw logits); it is used when the model has no softmax layer on its output.
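For instance, here is a small sketch (the logits are made up, and it assumes a TensorFlow version where the from_logits keyword exists, e.g. TF 2.x) showing that passing raw logits with from_logits=True matches applying softmax yourself and using the default:

import tensorflow as tf

labels = tf.constant([0, 1])
logits = tf.constant([[2.0, 0.5, 0.1],   # raw, unnormalized scores from the model
                      [0.3, 3.0, 0.2]])

# If the model has no softmax layer, pass the raw logits with from_logits=True.
loss_from_logits = tf.keras.losses.sparse_categorical_crossentropy(
    labels, logits, from_logits=True)

# Equivalent: apply softmax yourself and use the default from_logits=False.
probs = tf.nn.softmax(logits)
loss_from_probs = tf.keras.losses.sparse_categorical_crossentropy(labels, probs)

print(loss_from_logits.numpy(), loss_from_probs.numpy())  # (near-)identical values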
I had the same problem while working through the tutorial. I changed the code from
def loss(labels, logits):
    return tf.keras.losses.sparse_categorical_crossentropy(labels, logits, from_logits=True)
to
def loss(labels, logits):
    return tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
and this resolved the issue without having to install tf-nightly.
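For context, the tutorial passes this loss function to model.compile. Below is a rough sketch of how the replacement slots in; the model here is a simplified hypothetical stand-in (layer sizes are illustrative, not the tutorial's exact values), with no softmax on the final Dense layer so its outputs are raw logits:

import tensorflow as tf

# Hypothetical model: no softmax on the last layer, so it outputs logits.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=65, output_dim=256),
    tf.keras.layers.GRU(1024, return_sequences=True),
    tf.keras.layers.Dense(65),
])

def loss(labels, logits):
    # This op always treats its input as logits, so no from_logits keyword is needed.
    return tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)

model.compile(optimizer='adam', loss=loss)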