
XGB via Scikit learn API doesn't seem to be running in GPU although compiled to run for GPU

It appears that although XGBoost is compiled with GPU support, it does not run on the GPU when invoked through the scikit-learn API.

Please advise whether this is expected behaviour.

asked May 13 '17 by ksasi


People also ask

How do I run XGBoost with GPU?

Enabling GPU training of an XGBoost model is straightforward: just set the hyperparameter tree_method to "gpu_hist".

Does XGBoost automatically use GPU?

Most of the objective functions implemented in XGBoost can run on the GPU. An objective runs on the GPU if the GPU updater (gpu_hist) is selected; otherwise it runs on the CPU by default.

Does Sklearn use GPU?

scikit-learn is designed to be easy to install on a wide variety of platforms. Outside of neural networks, GPUs do not play a large role in machine learning today, and much larger gains in speed can often be achieved by a careful choice of algorithms. NVIDIA has released its own GPU-accelerated version of scikit-learn.

How much faster is XGBoost on GPU?

In one benchmark, XGBoost on the GPU was about 4.4 times faster than on the CPU.


1 Answer

As far as I can tell, the scikit-learn API does not currently support GPU training. You need to use the learning API instead (e.g. xgboost.train(...)), which also requires you to first convert your data into an xgboost DMatrix.

Example:

import xgboost

# "grow_gpu" was the GPU updater at the time of writing; newer releases
# use tree_method="gpu_hist" instead.
params = {"updater": "grow_gpu"}
dtrain = xgboost.DMatrix(x_train, label=y_train)
clf = xgboost.train(params, dtrain, num_boost_round=10)

UPDATE:

The scikit-learn API now supports GPU training via the **kwargs argument: http://xgboost.readthedocs.io/en/latest/python/python_api.html#id1

answered Sep 28 '22 by gaw89