
How to get results from custom loss function in Keras?

I want to implement a custom loss function in Python, and it should behave like this pseudocode:

aux = abs(real - prediction) / prediction
errors = []
if aux <= 0.1:
    errors.append(0)
elif 0.1 < aux <= 0.15:
    errors.append(5/3)
elif 0.15 < aux <= 0.2:
    errors.append(5)
else:
    errors.append(2000)
return sum(errors)

I started to define the metric like this:

def custom_metric(y_true, y_pred):
    # per-element relative error
    res = K.abs((y_true - y_pred) / y_pred)
    ....

But I do not know how to get the value of res for the if/else branches. I also want to know what the function has to return.

Thanks

Aceconhielo asked Apr 27 '18



2 Answers

I also want to know what the function has to return.

Custom metrics can be passed at the compilation step.

The function would need to take (y_true, y_pred) as arguments and return a single tensor value.
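As a concrete illustration, here is the same reduction written with NumPy so it can be checked in isolation; in Keras you would replace np with the backend K and pass the function via model.compile(metrics=[custom_metric]):

```python
import numpy as np

def custom_metric(y_true, y_pred):
    # Per-sample relative error, reduced to a single scalar value,
    # which is the shape a Keras metric is expected to return.
    res = np.abs((y_true - y_pred) / y_pred)
    return np.mean(res)

# Keras usage (sketch):
# model.compile(optimizer='adam', loss='mse', metrics=[custom_metric])
```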

But I do not know how to get the value of res for the if/else branches.

You can return the result from the custom_metric function.

from keras import backend as K

def custom_metric(y_true, y_pred):
    # K.abs takes no axis argument; it is a plain element-wise absolute value
    result = K.abs((y_true - y_pred) / y_pred)
    return result

The second step is to use a Keras callback in order to accumulate the sum of the errors.

The callback can be defined and passed to the fit method.

history = CustomLossHistory()
model.fit(callbacks=[history])

The last step is to create the CustomLossHistory class in order to find the sum of your expected errors list.

CustomLossHistory will inherit some default methods from keras.callbacks.Callback.

  • on_epoch_begin: called at the beginning of every epoch.
  • on_epoch_end: called at the end of every epoch.
  • on_batch_begin: called at the beginning of every batch.
  • on_batch_end: called at the end of every batch.
  • on_train_begin: called at the beginning of model training.
  • on_train_end: called at the end of model training.

You can read more in the Keras documentation.

But for this example we only need on_train_begin and on_batch_end methods.

Implementation

import keras

class CustomLossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self.errors = []

    def on_batch_end(self, batch, logs={}):
        loss = logs.get('loss')
        self.errors.append(self.loss_mapper(loss))

    def loss_mapper(self, loss):
        # Note: '&' is Python's bitwise operator; use chained
        # comparisons for the interval checks instead.
        if loss <= 0.1:
            return 0
        elif 0.1 < loss <= 0.15:
            return 5 / 3
        elif 0.15 < loss <= 0.2:
            return 5
        else:
            return 2000

After your model is trained, you can access your errors using the following statement.

errors = history.errors
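Since the pseudocode in the question sums the mapped penalties, the total then follows directly. The bucket logic can be sanity-checked standalone in pure Python, with no training loop (the sample loss values below are made up for illustration):

```python
def loss_mapper(loss):
    # Same bucket logic as the callback's loss_mapper above.
    if loss <= 0.1:
        return 0
    elif 0.1 < loss <= 0.15:
        return 5 / 3
    elif 0.15 < loss <= 0.2:
        return 5
    else:
        return 2000

# Pretend these were the per-batch losses collected by the callback:
errors = [loss_mapper(v) for v in [0.05, 0.12, 0.18, 0.5]]
total = sum(errors)  # 0 + 5/3 + 5 + 2000
```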
Mihai Alexandru-Ionut answered Sep 29 '22

I'll take a leap here and say this won't work because it is not differentiable. The loss needs to be continuously differentiable so you can propagate a gradient through it.

If you want to make this work, you need to find a way to do it without a discontinuity. For example, you could try a weighted average over your four discrete values, where the weights strongly prefer the closest value.
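One way to sketch such a smooth surrogate (the bucket centres, temperature, and function name here are illustrative choices, not from the original answer): weight the four penalty values by a softmax over the negative distance between the error and each bucket's centre, so the nearest bucket dominates while the function stays differentiable. Shown with NumPy; the same arithmetic works with Keras backend ops.

```python
import numpy as np

# Midpoints of the intervals from the question, and their penalties.
centres = np.array([0.05, 0.125, 0.175, 0.3])
penalties = np.array([0.0, 5 / 3, 5.0, 2000.0])

def soft_penalty(aux, temperature=0.01):
    # Softmax over negative distances to each bucket centre:
    # a small temperature makes the nearest bucket dominate.
    logits = -np.abs(aux - centres) / temperature
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    # Smooth weighted average of the discrete penalties.
    return float(np.dot(weights, penalties))
```

Lowering the temperature moves this closer to the original step function (sharper, but with steeper gradients); raising it gives smoother gradients at the cost of blurring the buckets.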

Alexander Harnisch answered Sep 28 '22