Add AUC as loss function for Keras

Tags:

keras

theano

Has anyone had any luck with writing a custom AUC loss function for Keras using Theano?

The documentation is here: http://keras.io/objectives/

Sample code is here: https://github.com/fchollet/keras/blob/master/keras/objectives.py

I saw there is an implementation in pylearn2 (which is really a wrapper around sklearn), but I was unable to port it for use in Keras:

https://github.com/lisa-lab/pylearn2/blob/master/pylearn2/train_extensions/roc_auc.py

So I guess my question is: has anybody been able to write this function, and would you be willing to share it?

asked Sep 11 '15 by Mike Pearmain

People also ask

What is AUC in keras?

The AUC (area under the curve) of the ROC (receiver operating characteristic; the default) or PR (precision-recall) curve is a quality measure for binary classifiers. Unlike accuracy, and like the cross-entropy losses, ROC-AUC and PR-AUC evaluate all the operating points of a model.
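
For illustration, a minimal sketch using the AUC metric that ships with modern tf.keras (this metric did not exist in the Theano-era Keras the question is about); the curve argument switches between ROC-AUC and PR-AUC:

    import tensorflow as tf

    y_true = [0, 0, 1, 1]
    y_pred = [0.1, 0.4, 0.35, 0.8]

    # ROC-AUC (the default curve)
    roc_auc = tf.keras.metrics.AUC(curve="ROC")
    roc_auc.update_state(y_true, y_pred)
    print(float(roc_auc.result()))  # ~0.75 (thresholded approximation)

    # PR-AUC over the same predictions
    pr_auc = tf.keras.metrics.AUC(curve="PR")
    pr_auc.update_state(y_true, y_pred)
    print(float(pr_auc.result()))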

What is loss function in keras?

Creating custom loss functions in Keras: a custom loss function can be created by defining a function that takes the true values and predicted values as required parameters. The function should return an array of losses and can then be passed to the model at the compile stage, as in the sketch below.
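
A minimal sketch of that pattern, assuming the Keras 1.x+ backend API (the Theano-era Keras in the question predates the unified keras.backend module); the function name and the positive-class weight are made up for illustration:

    from keras import backend as K

    def weighted_binary_crossentropy(y_true, y_pred, pos_weight=2.0):
        """Hypothetical custom loss: binary cross-entropy with extra weight on positives."""
        # clip predictions away from 0/1 to keep the logs finite
        y_pred = K.clip(y_pred, K.epsilon(), 1.0 - K.epsilon())
        loss = -(pos_weight * y_true * K.log(y_pred)
                 + (1.0 - y_true) * K.log(1.0 - y_pred))
        # return per-sample losses; Keras averages them over the batch
        return K.mean(loss, axis=-1)

    # Passed at the compile stage like any built-in objective:
    # model.compile(optimizer="adam", loss=weighted_binary_crossentropy)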

What is loss and metric in keras?

The loss function is used to optimize your model. This is the function that will get minimized by the optimizer. A metric is used to judge the performance of your model. This is only for you to look at and has nothing to do with the optimization process.
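
A short sketch of that split, assuming the current Keras Sequential/compile API (the 2015 Theano-era API in the question differs slightly): the loss is what gradient descent minimizes, while the metric is only reported.

    from tensorflow import keras

    model = keras.Sequential([
        keras.Input(shape=(20,)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])

    model.compile(
        optimizer="adam",
        loss="binary_crossentropy",   # minimized by the optimizer via gradients
        metrics=["accuracy"],         # only reported during training/evaluation
    )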


1 Answer

AUC is not differentiable, so you can't use it as a loss function without some modification. There's been some work on algorithms to maximize AUC, but I'd recommend just using the regular cross-entropy / log likelihood loss.
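
For completeness, a rough sketch of one smooth surrogate from that line of work, assuming the Keras 1.x+ backend API: it penalizes every (positive, negative) pair in a batch whose scores are in the wrong order, so it only approximates AUC per mini-batch rather than implementing an exact AUC loss.

    from keras import backend as K

    def pairwise_auc_surrogate(y_true, y_pred):
        """Illustrative smooth surrogate for 1 - AUC, computed per batch."""
        y_true = K.flatten(y_true)
        y_pred = K.flatten(y_pred)
        pos = y_true               # 1.0 for positive samples
        neg = 1.0 - y_true         # 1.0 for negative samples
        # diff[i, j] = score_i - score_j for every pair in the batch
        diff = K.expand_dims(y_pred, 1) - K.expand_dims(y_pred, 0)
        # pair[i, j] = 1 only where i is a positive and j is a negative
        pair = K.expand_dims(pos, 1) * K.expand_dims(neg, 0)
        # sigmoid(score_j - score_i) -> 0 when the positive outranks the negative;
        # epsilon guards batches that contain no (positive, negative) pairs
        return K.sum(pair * K.sigmoid(-diff)) / (K.sum(pair) + K.epsilon())

    # model.compile(optimizer="adam", loss=pairwise_auc_surrogate)

In practice, though, training with cross-entropy and simply monitoring AUC as an evaluation metric is usually the simpler and more robust choice.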

answered Oct 31 '22 by 1''