 

passing supplementary parameters to hyperopt objective function

I am using Python's hyperopt library to perform hyperparameter optimization for ML models. In particular, I am trying to find optimal LightGBM hyperparameters using this function to minimize:

def lgb_objective_map(params):
    """
    Objective function for LightGBM using MAP as the success metric.
    """
    # hyperopt passes floats, so cast the integer parameters
    params['num_boost_round'] = int(params['num_boost_round'])
    params['num_leaves'] = int(params['num_leaves'])
    params['min_data_in_leaf'] = int(params['min_data_in_leaf'])

    # need to be passed as parameters
    params['verbose'] = -1
    params['seed'] = 1

    # cross validation
    cv_result = lgb.cv(
        params,
        lgtrain,
        nfold=3,
        metrics='binary_logloss',
        num_boost_round=params['num_boost_round'],
        early_stopping_rounds=20,
        stratified=False,
    )

    # update the number of trees based on the early stopping results
    early_stop_dict[lgb_objective_map.i] = len(cv_result['binary_logloss-mean'])
    params['num_boost_round'] = len(cv_result['binary_logloss-mean'])

    # fit and predict
    model = lgb.train(params=params, train_set=lgtrain)
    preds = model.predict(X_test)

    result = log_loss(y_test, preds)
    actual_predicted = np.sum(y_test) / np.sum(preds)

    print("INFO: iteration {} logloss {:.3f} actual on predicted ratio {:.3f}".format(
        lgb_objective_map.i, result, actual_predicted))

    lgb_objective_map.i += 1

    return result

The hyperopt call is:

best = fmin(fn=lgb_objective_map,
            space=lgb_parameter_space,
            algo=tpe.suggest,
            max_evals=200,
            trials=trials)

Is it possible to modify the fmin call in order to pass supplementary parameters to lgb_objective_map, such as lgtrain, X_test and y_test? This would allow generalizing the call to hyperopt.

Giorgio Spedicato asked Feb 01 '19



1 Answer

The partial function from functools provides an elegant solution.

Just wrap your function and add the desired arguments:

partial(yourFunction, arg_1, arg_2, ..., arg_n)

Then pass that wrapped function to hyperopt's fmin.

Here's a toy example:

from functools import partial
from hyperopt import hp, fmin, tpe, STATUS_OK

def objective(params, data):
    # f is your model-evaluation function; keyword arguments must
    # come after positional ones, so pass data first
    output = f(data, **params)
    return {'loss': output, 'status': STATUS_OK}

fmin_objective = partial(objective, data=data)

bestParams = fmin(fn=fmin_objective, space=params,
                  algo=tpe.suggest, max_evals=100)
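Applied to the question's setup, the same pattern works for lgb_objective_map: declare lgtrain, X_test and y_test as explicit arguments and bind them once with partial. The sketch below uses a placeholder loss instead of the real LightGBM cross-validation so it stays self-contained; the data values are stand-ins for the question's actual objects.

```python
from functools import partial

# Sketch of the question's objective rewritten to take its data explicitly.
# The body is a placeholder; in practice it would run lgb.cv / lgb.train
# on the passed-in lgtrain and score predictions against X_test / y_test.
def lgb_objective_map(params, lgtrain, X_test, y_test):
    params['num_leaves'] = int(params['num_leaves'])  # hyperopt passes floats
    return params['num_leaves'] * 0.1  # placeholder loss

# Bind the data once; the result takes only `params`, as fmin expects.
fmin_objective = partial(lgb_objective_map,
                         lgtrain='lgtrain', X_test='X_test', y_test='y_test')

print(fmin_objective({'num_leaves': 31}))
```

fmin_objective can then be passed as fn=fmin_objective in the existing fmin call, with no other changes needed.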
Moocember answered Sep 27 '22