I am using LinearSVC from the scikit-learn library, and I wonder if it is possible to pull out the support vectors my model uses after training to make predictions. I tried to Google it for some time, but without any luck. Does anyone know?
Unfortunately, there seems to be no way to do that. LinearSVC calls liblinear (see the relevant code), but it doesn't retrieve the support vectors, only the coefficients and the intercept.
One alternative would be to use SVC with the 'linear' kernel (libsvm-based instead of liblinear-based). The poly, rbf and sigmoid kernels also expose the support vectors:
from sklearn import svm
X = [[0, 0], [1, 1]]
y = [0, 1]
clf = svm.SVC(kernel='linear')
clf.fit(X, y)
print(clf.support_vectors_)
Output:
[[0. 0.]
 [1. 1.]]
liblinear scales better to a large number of samples, but otherwise the two are mostly equivalent.
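If switching to SVC is not an option, one common workaround (a sketch, not an exact liblinear attribute) is to recover the margin vectors yourself: for a soft-margin linear SVM, the support vectors are exactly the training points that lie on or inside the margin, i.e. those with signed decision value at most 1. The dataset below is made up for illustration:

```python
import numpy as np
from sklearn import svm

# Toy data: two classes in 2D (hypothetical example)
X = np.array([[0, 0], [1, 1], [2, 2], [0, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

clf = svm.LinearSVC(C=1.0, max_iter=10000)
clf.fit(X, y)

# Support vectors satisfy y_signed * f(x) <= 1,
# where f(x) = coef_.dot(x) + intercept_ and y_signed is -1/+1.
decision = clf.decision_function(X)
y_signed = np.where(y == 0, -1, 1)
support_mask = y_signed * decision <= 1
support_vectors = X[support_mask]
print(support_vectors)
```

Note that liblinear's regularization differs slightly from libsvm's, so the set recovered this way may not match SVC's `support_vectors_` exactly.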
I am not sure if it helps, but I was searching for something similar and concluded that when:
clf = svm.LinearSVC()
Then this:
clf.decision_function(x)
is equal to this:
clf.coef_.dot(x) + clf.intercept_
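This identity is easy to check numerically. The sketch below (with made-up data) fits a LinearSVC and verifies that the manual computation matches decision_function:

```python
import numpy as np
from sklearn import svm

# Hypothetical toy data for the check
X = np.array([[0, 0], [1, 1], [2, 0], [0, 2]], dtype=float)
y = np.array([0, 1, 1, 0])

clf = svm.LinearSVC(max_iter=10000)
clf.fit(X, y)

# Manual linear decision function: w.x + b
manual = X.dot(clf.coef_.T) + clf.intercept_

# Should agree with the built-in decision_function
print(np.allclose(clf.decision_function(X), manual.ravel()))
```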