I am using Python's hyperopt library to perform hyperparameter optimization for an ML model. In particular, I am trying to find the optimal lightgbm hyperparameters using this objective function to minimize:
def lgb_objective_map(params):
    """
    Objective function for lightgbm using MAP as success metric.
    """
    # hyperopt casts these as float, so cast back to int
    params['num_boost_round'] = int(params['num_boost_round'])
    params['num_leaves'] = int(params['num_leaves'])
    params['min_data_in_leaf'] = int(params['min_data_in_leaf'])

    # need to be passed as parameters
    params['verbose'] = -1
    params['seed'] = 1

    # cross validation
    cv_result = lgb.cv(
        params,
        lgtrain,
        nfold=3,
        metrics='binary_logloss',
        num_boost_round=params['num_boost_round'],
        early_stopping_rounds=20,
        stratified=False,
    )

    # update the number of trees based on the early stopping results
    early_stop_dict[lgb_objective_map.i] = len(cv_result['binary_logloss-mean'])
    params['num_boost_round'] = len(cv_result['binary_logloss-mean'])

    # fit and predict
    model = lgb.train(params=params, train_set=lgtrain)
    preds = model.predict(X_test)

    # compute the loss on the held-out test set
    result = log_loss(y_test, preds)

    # ratio of actual to predicted positives
    actual_predicted = np.sum(y_test) / np.sum(preds)

    print("INFO: iteration {} logloss {:.3f} actual on predicted ratio {:.3f}".format(
        lgb_objective_map.i, result, actual_predicted))
    lgb_objective_map.i += 1

    return result
The hyperopt call is:
best = fmin(fn=lgb_objective_map,
            space=lgb_parameter_space,
            algo=tpe.suggest,
            max_evals=200,
            trials=trials)
Is it possible to modify the fmin call so that supplementary parameters, such as lgtrain, X_test, and y_test, are passed to lgb_objective_map? This would make the call to hyperopt easier to generalize.
The partial function from functools provides an elegant solution. Just wrap your function and add the desired arguments:

partial(yourFunction, arg_1, arg_2, ..., arg_n)

Then pass that to hyperopt's fmin function.
Here's a toy example:
from functools import partial
from hyperopt import hp, fmin, tpe, STATUS_OK

def objective(params, data):
    # f stands in for whatever computes your loss from the data and hyperparameters
    output = f(data, **params)
    return {'loss': output, 'status': STATUS_OK}

fmin_objective = partial(objective, data=data)
# algo and max_evals are required; the values here are illustrative
bestParams = fmin(fn=fmin_objective, space=params, algo=tpe.suggest, max_evals=100)
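Applied to your setup, you can bind lgtrain, X_test, and y_test the same way. Here is a sketch, assuming lgtrain, X_test, y_test, and lgb_parameter_space are defined as in the question, and keeping the question's lgb.cv signature (older lightgbm versions; newer ones moved early stopping into callbacks):

from functools import partial

import lightgbm as lgb
from hyperopt import Trials, fmin, tpe
from sklearn.metrics import log_loss

def lgb_objective_map(params, lgtrain, X_test, y_test):
    # hyperopt casts these as float, so cast back to int
    params['num_boost_round'] = int(params['num_boost_round'])
    params['num_leaves'] = int(params['num_leaves'])
    params['min_data_in_leaf'] = int(params['min_data_in_leaf'])
    params['verbose'] = -1
    params['seed'] = 1
    # cross validation to pick the number of trees via early stopping
    cv_result = lgb.cv(params, lgtrain, nfold=3, metrics='binary_logloss',
                       num_boost_round=params['num_boost_round'],
                       early_stopping_rounds=20, stratified=False)
    params['num_boost_round'] = len(cv_result['binary_logloss-mean'])
    # fit on the full training set and score on the held-out data
    model = lgb.train(params=params, train_set=lgtrain)
    preds = model.predict(X_test)
    return log_loss(y_test, preds)

# bind the data once; fmin still sees a one-argument function of params
fmin_objective = partial(lgb_objective_map, lgtrain=lgtrain, X_test=X_test, y_test=y_test)

trials = Trials()
best = fmin(fn=fmin_objective,
            space=lgb_parameter_space,
            algo=tpe.suggest,
            max_evals=200,
            trials=trials)

Because hyperopt passes the sampled hyperparameters as the first positional argument, binding the remaining arguments by keyword leaves a one-argument function that fmin can call.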