Has anyone had any luck with writing a custom AUC loss function for Keras using Theano?
The documentation is here: http://keras.io/objectives/
Sample code is here: https://github.com/fchollet/keras/blob/master/keras/objectives.py
I saw there is an implementation in pylearn2 (which is really a wrapper around sklearn), but I was unable to port it for use in Keras:
https://github.com/lisa-lab/pylearn2/blob/master/pylearn2/train_extensions/roc_auc.py
So I guess my question is: has anybody been able to write this function, and would you be willing to share it?
AUC is not differentiable, so you can't use it as a loss function without some modification. There's been some work on algorithms to maximize AUC, but I'd recommend just using the regular cross-entropy / log likelihood loss.
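For reference, one line of that work replaces the AUC's pairwise ranking indicator with a smooth sigmoid, which gives a differentiable surrogate you can hand to Keras as a loss. A minimal sketch, assuming binary labels in {0, 1} and sigmoid model outputs (this is not the pylearn2 code, just an illustration of the idea):

```python
import keras.backend as K

def soft_auc_loss(y_true, y_pred):
    """Smooth pairwise surrogate for 1 - AUC (an approximation, not exact AUC)."""
    y_true = K.cast(K.flatten(y_true), 'float32')
    y_pred = K.flatten(y_pred)
    # diff[i, j] = score of sample i minus score of sample j
    diff = K.expand_dims(y_pred, 1) - K.expand_dims(y_pred, 0)
    # mask[i, j] = 1 when i is a positive example and j is a negative example
    mask = K.expand_dims(y_true, 1) * (1.0 - K.expand_dims(y_true, 0))
    # Penalise (positive, negative) pairs where the positive is not ranked above the negative
    penalty = K.sigmoid(-diff)
    return K.sum(penalty * mask) / (K.sum(mask) + K.epsilon())

# model.compile(optimizer='adam', loss=soft_auc_loss)
```

Note that this only compares pairs within a mini-batch, so every batch needs to contain both classes; in practice plain binary cross-entropy is usually the simpler and more stable choice, as recommended above.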
The AUC (area under the curve) of the ROC (receiver operating characteristic) or PR (precision-recall) curve is a quality measure for binary classifiers. Unlike accuracy, and like the cross-entropy loss, ROC-AUC and PR-AUC evaluate all the operating points of a model.
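As a concrete illustration (outside of Keras), scikit-learn computes both areas directly from labels and scores; the numbers here are just a toy example, and average precision is the usual stand-in for PR-AUC:

```python
from sklearn.metrics import roc_auc_score, average_precision_score

y_true = [0, 0, 1, 1, 0, 1]                 # toy binary labels
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7]   # model scores / probabilities

print('ROC-AUC:', roc_auc_score(y_true, y_score))           # area under the ROC curve
print('PR-AUC :', average_precision_score(y_true, y_score)) # area under the PR curve
```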
Creating custom loss functions in Keras: a custom loss function can be created by defining a function that takes the true values and the predicted values as parameters. The function should return an array of per-sample losses, and it can then be passed to the model at the compile stage.
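A minimal sketch of that pattern (the architecture and the loss body here are only placeholders):

```python
import keras.backend as K
from keras.models import Sequential
from keras.layers import Dense

def my_loss(y_true, y_pred):
    # Any symbolic expression of y_true / y_pred works; return per-sample losses.
    return K.mean(K.square(y_pred - y_true), axis=-1)

model = Sequential()
model.add(Dense(1, activation='sigmoid', input_dim=20))  # placeholder architecture
model.compile(optimizer='adam', loss=my_loss)            # pass the function itself, not a string
```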
The loss function is used to optimize your model. This is the function that will get minimized by the optimizer. A metric is used to judge the performance of your model. This is only for you to look at and has nothing to do with the optimization process.
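That split suggests a practical workaround for AUC in the old Theano-backed Keras, which had no built-in AUC metric: keep cross-entropy as the loss and track AUC separately, for example with scikit-learn from a callback. A rough sketch, assuming you hold out your own validation data (the class name is just illustrative):

```python
from keras.callbacks import Callback
from sklearn.metrics import roc_auc_score

class RocAucEvaluation(Callback):
    """Log validation ROC AUC after each epoch (monitoring only, not optimized)."""
    def __init__(self, validation_data):
        super(RocAucEvaluation, self).__init__()
        self.x_val, self.y_val = validation_data

    def on_epoch_end(self, epoch, logs=None):
        y_prob = self.model.predict(self.x_val, verbose=0)
        print(' - val_roc_auc: %.4f' % roc_auc_score(self.y_val, y_prob))

# model.fit(x_train, y_train, callbacks=[RocAucEvaluation((x_val, y_val))])
```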