
tensorflow sparse categorical cross entropy with logits

I am a novice programmer trying to follow this guide. However, I ran across an issue. The guide says to define the loss function as:

def loss(labels, logits):
    return tf.keras.losses.sparse_categorical_crossentropy(labels, logits, from_logits=True)

This gives me the following error:

sparse_categorical_crossentropy() got an unexpected keyword argument 'from_logits'

which I take to mean that from_logits is not an argument the function accepts. This is supported by the documentation, which shows that tf.keras.losses.sparse_categorical_crossentropy() takes only two inputs.

Is there a way to specify that logits are being used, or is that even necessary?

asked Dec 25 '18 by Pseudo Nym

People also ask

How do you use sparse categorical cross entropy?

Categorical cross-entropy is used when the true labels are one-hot encoded; for example, for a 3-class classification problem the true values are [1,0,0], [0,1,0] and [0,0,1]. In sparse categorical cross-entropy, the true labels are integer encoded, for example, 0, 1 and 2 for a 3-class problem.
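To see that the two encodings describe the same loss, here is a minimal pure-Python sketch (no TensorFlow required; the function names are just illustrative):

```python
import math

def categorical_ce(one_hot, probs):
    # Cross-entropy with a one-hot label: -sum(y_i * log(p_i))
    return -sum(y * math.log(p) for y, p in zip(one_hot, probs))

def sparse_categorical_ce(label, probs):
    # Same loss, but the true class is given as an integer index
    return -math.log(probs[label])

probs = [0.1, 0.7, 0.2]                 # predicted class probabilities
print(categorical_ce([0, 1, 0], probs)) # one-hot encoding of class 1
print(sparse_categorical_ce(1, probs))  # integer encoding of class 1
# both print -log(0.7) ≈ 0.3567
```

The "sparse" variant simply skips the one-hot step and indexes the predicted probabilities directly with the integer label.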

Can I use sparse categorical cross entropy for binary classification?

Binary cross-entropy is for binary classification and categorical cross-entropy is for multi-class classification, but both work for binary classification; for categorical cross-entropy you need to convert the labels to categorical (one-hot) encoding.
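As a quick sanity check, a two-class sparse categorical cross-entropy gives the same value as binary cross-entropy when the predictions describe the same distribution (a hand-rolled sketch, not the TensorFlow API):

```python
import math

def binary_ce(y, p):
    # Binary cross-entropy: p is the probability of the positive class
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def sparse_categorical_ce(label, probs):
    # Sparse CE: one probability per class, integer label
    return -math.log(probs[label])

p = 0.8                    # predicted probability of class 1
probs = [1 - p, p]         # the same prediction as a 2-class distribution
print(binary_ce(1, p))                  # -log(0.8) ≈ 0.2231
print(sparse_categorical_ce(1, probs))  # same value
```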

What is sparse categorical cross entropy and Categorical_crossentropy?

Simply: categorical_crossentropy ( cce ) expects the true labels as one-hot arrays, one entry per category, while sparse_categorical_crossentropy ( scce ) expects them as integer indices of the matching category.

What is from Logits true in Tensorflow?

from_logits = True signifies that the values the model produces are raw, unnormalized scores; it is used when the model has no softmax function on its output layer.


1 Answer

I had the same problem while working through the tutorial. I changed the code from

def loss(labels, logits):
    return tf.keras.losses.sparse_categorical_crossentropy(labels, logits, from_logits=True)

to

def loss(labels, logits):
    return tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)

and this resolved the issue without having to install tf-nightly.

answered Sep 28 '22 by Seth