How to save best model in Keras based on AUC metric?

I would like to save the best model in Keras based on auc and I have this code:

def MyMetric(yTrue, yPred):
    auc = tf.metrics.auc(yTrue, yPred)
    return auc

best_model = [ModelCheckpoint(filepath='best_model.h5', monitor='MyMetric', save_best_only=True)]

train_history = model.fit([train_x], 
          [train_y], batch_size=batch_size, epochs=epochs, validation_split=0.05, 
                          callbacks=best_model, verbose = 2)

So my model runs, but I get this warning:

RuntimeWarning: Can save best model only with MyMetric available, skipping.
  'skipping.' % (self.monitor), RuntimeWarning)

It would be great if anyone could tell me whether this is the right way to do it, and if not, what I should do.

asked Nov 25 '25 by MRM

2 Answers

You have to pass the metric you want to monitor to model.compile:

https://keras.io/metrics/#custom-metrics

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=[MyMetric])

Also, tf.metrics.auc returns a tuple containing the tensor and update_op. Keras expects the custom metric function to return only a tensor.

def MyMetric(yTrue, yPred):
    import tensorflow as tf
    auc = tf.metrics.auc(yTrue, yPred)
    return auc[0]  # return only the value tensor, not the update_op

After this step, you will get errors about uninitialized values. Please see these threads:

https://github.com/keras-team/keras/issues/3230

How to compute Receiving Operating Characteristic (ROC) and AUC in keras?
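As a side note: in more recent TensorFlow versions (assuming TF 1.13+ or 2.x with the bundled `tf.keras`), the built-in `tf.keras.metrics.AUC` handles its own variable initialization and returns a plain tensor, which sidesteps both issues. A minimal sketch with toy data (the model and data here are illustrative, not from the question):

```python
import numpy as np
import tensorflow as tf

# toy data, purely for illustration
x = np.random.rand(32, 4).astype("float32")
y = (np.random.rand(32) > 0.5).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, activation="sigmoid", input_shape=(4,))
])

# the built-in AUC metric needs no manual initialization
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=[tf.keras.metrics.AUC(name="auc")])

# monitor the validation AUC under the name 'val_auc';
# mode='max' because a higher AUC is better
ckpt = tf.keras.callbacks.ModelCheckpoint(
    "best_model.h5", monitor="val_auc", mode="max", save_best_only=True)

model.fit(x, y, epochs=2, validation_split=0.25,
          callbacks=[ckpt], verbose=0)
```

Note the explicit `mode="max"`: with `mode="auto"`, ModelCheckpoint infers the direction from the metric name, so naming the metric clearly (or setting the mode yourself) avoids saving the worst model instead of the best.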

answered Nov 28 '25 by Manoj Mohan


You can define a custom metric that calls tensorflow to compute AUROC in the following way:

def as_keras_metric(method):
    import functools
    from keras import backend as K
    import tensorflow as tf
    @functools.wraps(method)
    def wrapper(*args, **kwargs):
        """Wrapper for turning tensorflow metrics into keras metrics."""
        value, update_op = method(*args, **kwargs)
        # tf.metrics.* create local variables that must be initialized
        K.get_session().run(tf.local_variables_initializer())
        # make sure the value is only read after the update op has run
        with tf.control_dependencies([update_op]):
            value = tf.identity(value)
        return value
    return wrapper

@as_keras_metric
def AUROC(y_true, y_pred, curve='ROC'):
    return tf.metrics.auc(y_true, y_pred, curve=curve)

You then need to compile your model with this metric:

model.compile(loss=train_loss, optimizer='adam', metrics=['accuracy', AUROC])

Finally, checkpoint the model in the following way:

model_checkpoint = keras.callbacks.ModelCheckpoint(path_to_save_model, monitor='val_AUROC', 
                                                   verbose=0, save_best_only=True, 
                                                   save_weights_only=False, mode='auto', period=1)

Be careful though: I believe the validation AUROC is calculated batch-wise and then averaged, so it may differ slightly from the true AUROC over the whole validation set, which can affect which epoch gets checkpointed. A good idea is to verify, after training finishes, that the AUROC of the trained model's predictions (computed with sklearn.metrics) matches what TensorFlow reports during training and checkpointing.
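That post-training check could look like the following minimal sketch; the toy arrays stand in for your held-out validation labels and the model's predicted scores (in practice you would use something like `model.predict(val_x).ravel()`):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# placeholders: replace with your validation labels and predicted scores
val_y = np.array([0, 0, 1, 1])
val_pred = np.array([0.1, 0.4, 0.35, 0.8])

# exact (non-batched) AUROC over the whole validation set
auc = roc_auc_score(val_y, val_pred)
print(auc)  # 0.75 for this toy example
```

If this number differs noticeably from the `val_AUROC` logged at the checkpointed epoch, the batch-wise averaging is the likely cause.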

answered Nov 28 '25 by Abhimanyu


