Does scikit-learn provide a facility to perform regression using a Gaussian or polynomial kernel? I looked at the APIs and I don't see any. Has anyone built a package on top of scikit-learn that does this?
It doesn't seem to provide anything for gradient descent either.
Theory
Polynomial regression is a special case of linear regression; the main idea is how you select your features. Consider multivariate regression with two variables, x1 and x2. Plain linear regression looks like this: y = a1 * x1 + a2 * x2.
Now suppose you want polynomial regression (say, a degree-2 polynomial). You create a few additional features: x1*x2, x1^2 and x2^2. You then end up with a 'linear regression' over those features:
y = a1 * x1 + a2 * x2 + a3 * x1*x2 + a4 * x1^2 + a5 * x2^2
This nicely illustrates an important concept, the curse of dimensionality: the number of new features grows much faster than linearly as the degree of the polynomial grows. You can read more about this concept here.
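To make the feature construction concrete, here is a minimal sketch (using NumPy; all data values are made up for illustration) that builds the degree-2 feature columns x1, x2, x1*x2, x1^2 and x2^2 by hand and fits them with ordinary least squares:
import numpy as np

# Toy data: two input variables and a target (values are made up for illustration)
x1 = np.array([0.44, 0.99, 0.35, 0.72, 0.12, 0.58])
x2 = np.array([0.68, 0.23, 0.81, 0.50, 0.94, 0.37])
y = np.array([109.85, 155.72, 98.10, 130.40, 85.30, 120.15])

# Build the degree-2 feature matrix by hand: [x1, x2, x1*x2, x1^2, x2^2]
X = np.column_stack([x1, x2, x1 * x2, x1**2, x2**2])

# Ordinary least squares on the expanded features -> coefficients a1..a5
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coeffs)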
Practice with scikit-learn
You do not need to do all of this by hand in scikit-learn. Polynomial feature expansion is already available there (since version 0.15; check how to update it here).
from sklearn.preprocessing import PolynomialFeatures
from sklearn import linear_model

# Training data: two samples with two features each, and their targets
X = [[0.44, 0.68], [0.99, 0.23]]
vector = [109.85, 155.72]
# Point to predict (note: must be 2D, i.e. a list of samples)
predict = [[0.49, 0.18]]

# Expand the inputs to all degree-2 polynomial terms
poly = PolynomialFeatures(degree=2)
X_ = poly.fit_transform(X)
predict_ = poly.transform(predict)

# Ordinary least squares on the expanded features
clf = linear_model.LinearRegression()
clf.fit(X_, vector)
print(clf.predict(predict_))
Either you use Support Vector Regression, sklearn.svm.SVR, and set the appropriate kernel (see here).
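For instance, a minimal sketch with an RBF (Gaussian) kernel might look like this; the data and the C/gamma values here are made up and not tuned:
from sklearn.svm import SVR

# Toy training data (made-up values, for illustration only)
X = [[0.44, 0.68], [0.99, 0.23], [0.35, 0.81]]
y = [109.85, 155.72, 98.10]

# Gaussian (RBF) kernel; kernel='poly' with a degree parameter would give a polynomial kernel
svr = SVR(kernel='rbf', C=100.0, gamma=0.1)
svr.fit(X, y)
print(svr.predict([[0.49, 0.18]]))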
Or you install the latest master version of sklearn and use the recently added sklearn.preprocessing.PolynomialFeatures (see here) and then OLS or Ridge on top of that.
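A minimal sketch of that second option, chaining the two steps in a pipeline (again with made-up data and an arbitrary regularization strength):
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge

# Toy data (made-up values)
X = [[0.44, 0.68], [0.99, 0.23], [0.35, 0.81]]
y = [109.85, 155.72, 98.10]

# Expand to degree-2 polynomial features, then fit a ridge regression on top
model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1.0))
model.fit(X, y)
print(model.predict([[0.49, 0.18]]))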