 

How to create a custom loss function by adding negative entropy to the cross-entropy?

I recently read the paper "Regularizing Neural Networks by Penalizing Confident Output Distributions" (https://arxiv.org/abs/1701.06548). The authors regularize neural networks by penalizing low-entropy output distributions: they add a negative entropy term to the negative log-likelihood, giving a custom loss function for model training.

L(θ) = −Σ log pθ(y|x) − β·H(pθ(y|x))

The value β controls the strength of the confidence penalty. I have written a custom function for categorical cross-entropy, shown below, but the negative entropy term still needs to be added to the loss function.

import tensorflow as tf

def custom_loss(y_true, y_pred):
    # standard categorical cross-entropy; the -beta * H(y_pred) term still has to be added
    cce = tf.keras.losses.CategoricalCrossentropy()
    return cce(y_true, y_pred)
shiva asked Oct 14 '22


2 Answers

You do not need a custom loss, as it can be implemented as an activity regularizer (one applied to the output of a layer):

from tensorflow.keras import backend as K

def regularizer(beta):
    def entropy_reg(inp):
        # beta * mean(p * log p) = -beta * H(p) up to a constant factor;
        # K.epsilon() avoids log(0) for saturated softmax outputs
        return beta * K.mean(inp * K.log(inp + K.epsilon()))
    return entropy_reg

Then this can be applied to your output layer:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
# add hidden layers here
model.add(Dense(num_classes, activation="softmax",
                activity_regularizer=regularizer(0.01)))
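
With the penalty attached as an activity regularizer, you then compile with the plain cross-entropy loss and Keras adds the entropy term automatically. A minimal sketch, assuming one-hot labels and hypothetical x_train/y_train arrays:

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=10, batch_size=32)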
Dr. Snoopy answered Oct 20 '22


The entropy of y_pred is simply the categorical cross-entropy of y_pred with itself:

H(y_pred) = −Σᵢ pᵢ log pᵢ = CE(y_pred, y_pred)

import tensorflow as tf

def custom_loss(y_true, y_pred, beta):
    cce = tf.keras.losses.CategoricalCrossentropy()
    # cce(y_pred, y_pred) = H(y_pred), so this is CE - beta * H
    return cce(y_true, y_pred) - beta * cce(y_pred, y_pred)
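
Since Keras expects losses with the signature (y_true, y_pred), beta has to be bound before compiling. A minimal sketch using a closure (the name confidence_penalty_loss and the value beta=0.1 are just illustrative choices):

def confidence_penalty_loss(beta):
    def loss(y_true, y_pred):
        return custom_loss(y_true, y_pred, beta)
    return loss

model.compile(optimizer="adam", loss=confidence_penalty_loss(0.1))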
Ivan answered Oct 20 '22