How to see the loss of the best epoch from early stopping in Keras?

I have managed to implement early stopping into my Keras model, but I am not sure how I can view the loss of the best epoch.

es = EarlyStopping(monitor='val_out_soft_loss',
                   mode='min',
                   restore_best_weights=True,
                   verbose=2,
                   patience=10)

model.fit(tr_x,
          tr_y,
          batch_size=batch_size,
          epochs=epochs,
          verbose=1,
          callbacks=[es],
          validation_data=(val_x, val_y))
loss = model.history.history["val_out_soft_loss"][-1]
return model, loss

The way I have defined the loss score means that the returned score comes from the final epoch, not from the best epoch.

Example:

from sklearn.model_selection import train_test_split, KFold
losses = []
models = []
for k in range(2):
    kfold = KFold(5, random_state=42 + k, shuffle=True)
    for k_fold, (tr_inds, val_inds) in enumerate(kfold.split(train_y)):
        print("-----------")
        print("-----------")
        model, loss = get_model(64, 100)
        models.append(model)
        print(k_fold, loss)
        losses.append(loss)
print("-------")
print(losses)
print(np.mean(losses))

Epoch 23/100
18536/18536 [==============================] - 7s 362us/step - loss: 0.0116 - out_soft_loss: 0.0112 - out_reg_loss: 0.0393 - val_loss: 0.0131 - val_out_soft_loss: 0.0127 - val_out_reg_loss: 0.0381

Epoch 24/100
18536/18536 [==============================] - 7s 356us/step - loss: 0.0116 - out_soft_loss: 0.0112 - out_reg_loss: 0.0388 - val_loss: 0.0132 - val_out_soft_loss: 0.0127 - val_out_reg_loss: 0.0403

Restoring model weights from the end of the best epoch
Epoch 00024: early stopping
0 0.012735568918287754

So in this example, I would like to see the loss at Epoch 00014 (which is 0.0124).

I also have a separate question: How can I set the decimal places for the val_out_soft_loss score?

asked Nov 20 '19 by MRHarv

People also ask

What is EarlyStopping in Keras?

The EarlyStopping class stops training when a monitored metric has stopped improving. Assuming the goal of training is to minimize the loss, the metric to be monitored would be 'loss' and the mode would be 'min'.

Does early stopping reduce overfitting?

In machine learning, early stopping is a form of regularization used to avoid overfitting when training a learner with an iterative method, such as gradient descent.

How do you choose the best number of epochs?

The right number of epochs depends on the inherent perplexity (or complexity) of your dataset. A good rule of thumb is to start with a value that is 3 times the number of columns in your data. If you find that the model is still improving after all epochs complete, try again with a higher value.

What is EarlyStopping patience?

Patience. Set the Patience that you want early stopping to use. This is the number of epochs without improvement after which training will be early stopped. A larger patience means that an experiment will wait longer before stopping an experiment.
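The patience logic described above can be sketched in plain Python. This is a framework-free illustration, not the actual Keras implementation (which also handles min_delta, baselines, and weight restoration):

```python
def early_stopping_trace(losses, patience):
    """Return (best_loss, best_epoch, stopped_epoch) for a list of
    per-epoch validation losses, mimicking patience-based stopping."""
    best_loss = float("inf")
    best_epoch = 0
    wait = 0  # epochs elapsed since the last improvement
    for epoch, loss in enumerate(losses):
        if loss < best_loss:
            best_loss, best_epoch, wait = loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                return best_loss, best_epoch, epoch  # stop early
    return best_loss, best_epoch, len(losses) - 1

# The best loss (0.1 at epoch 2) is kept even though training runs
# for `patience` more epochs before stopping at epoch 5.
print(early_stopping_trace([0.5, 0.3, 0.1, 0.2, 0.2, 0.3], patience=3))
```

A larger patience simply lets `wait` climb higher before the early return fires, which is why it delays stopping.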

How do I use early stopping in keras?

The Keras module contains a built-in callback designed for early stopping [2]. First, import the EarlyStopping callback and create an early stopping object early_stopping. Passing monitor='val_loss' uses validation loss as the performance measure that decides when to terminate training.

How do I save a model in keras?

Use EarlyStopping, which is available in Keras. Early stopping is basically stopping the training once your loss starts to increase (or, in other words, once validation accuracy starts to decrease). Use ModelCheckpoint to save the model wherever you want.
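A minimal configuration sketch combining the two callbacks as described above (assumes tf.keras; the filepath, metric name, and the commented-out fit() arguments are placeholders to adapt to your model):

```python
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint

# Stop when val_loss has not improved for 10 epochs,
# and roll the model back to its best weights in memory.
es = EarlyStopping(monitor='val_loss', mode='min',
                   patience=10, restore_best_weights=True, verbose=1)

# Independently persist the best model seen so far to disk.
ckpt = ModelCheckpoint('best_model.keras', monitor='val_loss',
                       mode='min', save_best_only=True, verbose=1)

# model.fit(tr_x, tr_y, validation_data=(val_x, val_y),
#           epochs=100, callbacks=[es, ckpt])
```

With save_best_only=True, the checkpoint on disk always corresponds to the best epoch, regardless of when training stops.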

What do you learn in a keras training course?

Specifically, you learned: How to monitor the performance of a model during training using the Keras API. How to create and configure early stopping and model checkpoint callbacks using the Keras API. How to reduce overfitting by adding an early stopping callback to an existing model.

What happens if you have too many epochs in machine learning?

Too many epochs can lead to overfitting of the training dataset, whereas too few may result in an underfit model. Early stopping is a method that allows you to specify an arbitrary large number of training epochs and stop training once the model performance stops improving on a hold out validation dataset.


1 Answer

Assign the fit() call in Keras to a variable so you can track the metrics through the epochs.

history = model.fit(tr_x, ...

fit() returns a History object; its history attribute is a dictionary of per-epoch metrics, which you can access like this:

loss_hist = history.history['loss']

Then use min() to get the minimum loss, and argmin() to get the best epoch (zero-based):

np.min(loss_hist)
np.argmin(loss_hist)
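Putting it together on a mock history (a plain dict with hypothetical values standing in for history.history, so no training is needed), including one way to control the decimal places of the reported score, which was the asker's second question:

```python
import numpy as np

# Stand-in for history.history after model.fit() (hypothetical values).
history = {'val_out_soft_loss': [0.0190, 0.0151, 0.0124, 0.0127, 0.0132]}

loss_hist = history['val_out_soft_loss']
best_loss = np.min(loss_hist)       # loss of the best epoch
best_epoch = np.argmin(loss_hist)   # zero-based index of that epoch

# f-string format spec controls the decimal places of the printed score
print(f"best epoch: {best_epoch + 1}, loss: {best_loss:.4f}")
```

This prints the loss of the best epoch rather than the last one, and the `:.4f` format specifier rounds the displayed value to four decimal places.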
answered Oct 01 '22 by Nicolas Gervais