Get out-of-fold predictions from xgboost.cv in python

In the R xgboost package, I can specify prediction=TRUE to save the out-of-fold predictions during cross-validation, e.g.:

library(xgboost)
data(mtcars)
xgb_params = list(
  max_depth = 1,
  eta = 0.01
)
x = model.matrix(mpg~0+., mtcars)
train = xgb.DMatrix(x, label=mtcars$mpg)
res = xgb.cv(xgb_params, train, 100, prediction=TRUE, nfold=5)
print(head(res$pred))

How would I do the equivalent in the Python package? I can't find a prediction argument for xgboost.cv in Python.

Asked by Zach
1 Answer

I'm not sure if this is exactly what you want, but you can accomplish it with the sklearn wrapper for xgboost. (I'm treating the iris dataset as a regression problem here, which it isn't, purely for illustration.)

import xgboost as xgb
# sklearn's old cross_validation module was renamed model_selection
from sklearn.model_selection import cross_val_predict as cvp
from sklearn import datasets

# Use the first two iris features and treat the target as a regression label
X = datasets.load_iris().data[:, :2]
y = datasets.load_iris().target

xgb_model = xgb.XGBRegressor()
# cross_val_predict returns the out-of-fold prediction for every sample
y_pred = cvp(xgb_model, X, y, cv=3, n_jobs=1)
y_pred


array([  9.07209516e-01,   1.84738374e+00,   1.78878939e+00,
         1.83672094e+00,   9.07209516e-01,   9.07209516e-01,
         1.77482617e+00,   9.07209516e-01,   1.75681138e+00,
         1.83672094e+00,   9.07209516e-01,   1.77482617e+00,
         1.84738374e+00,   1.84738374e+00,   1.12216723e+00,
         9.96944368e-01,   9.07209516e-01,   9.07209516e-01,
         9.96944368e-01,   9.07209516e-01,   9.07209516e-01,
         9.07209516e-01,   1.77482617e+00,   8.35850239e-01,
         1.77482617e+00,   9.87186074e-01,   9.07209516e-01,
         9.07209516e-01,   9.07209516e-01,   1.78878939e+00,
         1.83672094e+00,   9.07209516e-01,   9.07209516e-01,
         8.91427517e-01,   1.83672094e+00,   9.09049034e-01,
         8.91427517e-01,   1.83672094e+00,   1.84738374e+00,
         9.07209516e-01,   9.07209516e-01,   1.01038718e+00,
         1.78878939e+00,   9.07209516e-01,   9.07209516e-01,
         1.84738374e+00,   9.07209516e-01,   1.78878939e+00,
         9.07209516e-01,   8.35850239e-01,   1.99947178e+00,
         1.99947178e+00,   1.99947178e+00,   1.94922602e+00,
         1.99975276e+00,   1.91500926e+00,   1.99947178e+00,
         1.97454870e+00,   1.99947178e+00,   1.56287444e+00,
         1.96453893e+00,   1.99947178e+00,   1.99715066e+00,
         1.99947178e+00,   2.84575284e-01,   1.99947178e+00,
         2.84575284e-01,   2.00303388e+00,   1.99715066e+00,
         2.04597521e+00,   1.99947178e+00,   1.99975276e+00,
         2.00527954e+00,   1.99975276e+00,   1.99947178e+00,
         1.99947178e+00,   1.99975276e+00,   1.99947178e+00,
         1.99947178e+00,   1.91500926e+00,   1.95735490e+00,
         1.95735490e+00,   2.00303388e+00,   1.99975276e+00,
         5.92201948e-04,   1.99947178e+00,   1.99947178e+00,
         1.99715066e+00,   2.84575284e-01,   1.95735490e+00,
         1.89267385e+00,   1.99947178e+00,   2.00303388e+00,
         1.96453893e+00,   1.98232651e+00,   2.39597082e-01,
         2.39597082e-01,   1.99947178e+00,   1.97454870e+00,
         1.91500926e+00,   9.99531507e-01,   1.00023842e+00,
         1.00023842e+00,   1.00023842e+00,   1.00023842e+00,
         1.00023842e+00,   9.22234297e-01,   1.00023842e+00,
         1.00100708e+00,   1.16144836e-01,   1.00077248e+00,
         1.00023842e+00,   1.00023842e+00,   1.00100708e+00,
         1.00023842e+00,   1.00077248e+00,   1.00023842e+00,
         1.13711983e-01,   1.00023842e+00,   1.00135887e+00,
         1.00077248e+00,   1.00023842e+00,   1.00023842e+00,
         1.00023842e+00,   9.99531507e-01,   1.00077248e+00,
         1.00023842e+00,   1.00023842e+00,   1.00023842e+00,
         1.00023842e+00,   1.00023842e+00,   1.13711983e-01,
         1.00023842e+00,   1.00023842e+00,   1.00023842e+00,
         1.00023842e+00,   9.78098869e-01,   1.00023842e+00,
         1.00023842e+00,   1.00023842e+00,   1.00023842e+00,
         1.00023842e+00,   1.00023842e+00,   1.00077248e+00,
         9.99531507e-01,   1.00023842e+00,   1.00100708e+00,
         1.00023842e+00,   9.78098869e-01,   1.00023842e+00], dtype=float32)
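
If you want to stay closer to the native xgboost API (as in the R snippet above), a minimal sketch is to run the folds yourself with sklearn's KFold and xgb.train. The parameter values and the reg:squarederror objective below are illustrative assumptions on my part, not something the wrapper approach requires:

import numpy as np
import xgboost as xgb
from sklearn import datasets
from sklearn.model_selection import KFold

X = datasets.load_iris().data[:, :2]
y = datasets.load_iris().target

# Illustrative parameters; reg:squarederror assumes a reasonably recent xgboost
params = {"max_depth": 1, "eta": 0.01, "objective": "reg:squarederror"}
oof_pred = np.zeros(len(y))

for train_idx, valid_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    dtrain = xgb.DMatrix(X[train_idx], label=y[train_idx])
    dvalid = xgb.DMatrix(X[valid_idx])
    booster = xgb.train(params, dtrain, num_boost_round=100)
    # Each row is predicted by a model that never saw it during training
    oof_pred[valid_idx] = booster.predict(dvalid)

print(oof_pred[:6])

Like cross_val_predict, this gives every sample a prediction from a model that did not train on it, which is what res$pred holds in the R example.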
Answered by hamel