import sklearn.model_selection

model = sklearn.model_selection.GridSearchCV(
    estimator=est,
    param_grid=param_grid,
    scoring='precision',
    verbose=1,
    n_jobs=1,
    iid=True,  # note: this parameter was removed in scikit-learn 0.24
    cv=3)
With sklearn.metrics.precision_score(y, y_pred, pos_label=0) I can specify which label counts as the positive class. How can I specify this in GridSearchCV as well? If there is no direct way, how can I define a custom scorer that does it?
I have tried this:
custom_score = make_scorer(precision_score(y, y_pred,pos_label=[0]),
greater_is_better=True)
but I got this error:
NameError: name 'y_pred' is not defined
The NameError happens because precision_score(y, y_pred, pos_label=[0]) calls the metric immediately instead of handing the function itself to make_scorer. Reading the docs, make_scorer takes the metric as a callable, and any extra keyword arguments you pass are forwarded to that score_func when the scorer is evaluated:
from sklearn.metrics import precision_score, make_scorer
custom_scorer = make_scorer(precision_score, greater_is_better=True, pos_label=0)
Then you pass this custom_scorer to GridSearchCV:
gs = GridSearchCV(est, ..., scoring=custom_scorer)
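For context, here is a minimal end-to-end sketch. The dataset, estimator, and parameter grid (breast-cancer data, LogisticRegression, and a small C grid) are stand-ins I am assuming for illustration, not your original est and param_grid:

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, make_scorer
from sklearn.model_selection import GridSearchCV

# Toy data and estimator standing in for the original est / param_grid
X, y = load_breast_cancer(return_X_y=True)
est = LogisticRegression(max_iter=5000)
param_grid = {'C': [0.1, 1.0, 10.0]}

# Precision with class 0 treated as the positive class
custom_scorer = make_scorer(precision_score, greater_is_better=True, pos_label=0)

gs = GridSearchCV(est, param_grid, scoring=custom_scorer, cv=3)
gs.fit(X, y)
print(gs.best_params_, gs.best_score_)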