I'd like to make a custom scoring function involving classification probabilities as follows:
def custom_score(y_true, y_pred_proba):
    error = ...
    return error

my_scorer = make_scorer(custom_score, needs_proba=True)
gs = GridSearchCV(estimator=KNeighborsClassifier(),
                  param_grid=[{'n_neighbors': [6]}],
                  cv=5,
                  scoring=my_scorer)
Is there any way to pass the estimator, as fitted by GridSearchCV with the given data and parameters, into my custom scoring function? Then I could interpret the probability columns using estimator.classes_.
For example:
def custom_score(y_true, y_pred_proba, clf):
    class_labels = clf.classes_
    error = ...
    return error
The documentation mentions an alternative way to make a scorer: pass any callable with the signature scorer(estimator, X, y). Using this method I can do the following:
def my_scorer(clf, X, y_true):
    class_labels = clf.classes_
    y_pred_proba = clf.predict_proba(X)
    error = ...
    return error
gs = GridSearchCV(estimator=KNeighborsClassifier(),
                  param_grid=[{'n_neighbors': [6]}],
                  cv=5,
                  scoring=my_scorer)
This avoids the use of sklearn.metrics.make_scorer.
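To make this concrete, here is a minimal runnable sketch of the callable-scorer approach. The error metric (a hand-computed mean log-probability, aligned to the true labels via clf.classes_) is a hypothetical stand-in for whatever the real custom error is, and the iris data and parameter grid are placeholders:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

def my_scorer(clf, X, y_true):
    # clf is the estimator already fitted on this CV split, so
    # clf.classes_ gives the column order of predict_proba's output.
    class_labels = clf.classes_
    y_pred_proba = clf.predict_proba(X)
    # Hypothetical error: map each true label to its probability
    # column, then return the mean log-probability (higher is better,
    # which matches GridSearchCV's convention of maximizing the score).
    cols = np.searchsorted(class_labels, y_true)
    p = np.clip(y_pred_proba[np.arange(len(y_true)), cols], 1e-15, 1.0)
    return np.mean(np.log(p))

X, y = load_iris(return_X_y=True)
gs = GridSearchCV(estimator=KNeighborsClassifier(),
                  param_grid=[{'n_neighbors': [3, 6]}],
                  cv=5,
                  scoring=my_scorer)
gs.fit(X, y)
print(gs.best_params_, gs.best_score_)
```

Note that in recent scikit-learn versions the needs_proba argument of make_scorer has been deprecated in favor of response_method="predict_proba", but even then the scorer still receives only (y_true, y_pred_proba), so the callable form above remains the straightforward way to get at the fitted estimator itself.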