
scikit-learn, LinearSVC - how to get support vectors from the trained SVM?

I am using LinearSVC from the scikit-learn library, and I wonder if it is possible to somehow pull out the support vectors my model uses after training to make predictions. I tried to google it for some time but without any luck. Does anyone know?

asked Oct 08 '14 by Maksim Khaitovich


2 Answers

Unfortunately there seems to be no way to do that. LinearSVC calls liblinear (see relevant code) but doesn't retrieve the support vectors, only the coefficients and the intercept.
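
If you really need (approximate) support vectors from a LinearSVC model, one workaround is to derive them from the decision function after training: samples lying on or inside the margin (|decision| <= 1) are the ones contributing to the hinge-type loss. A minimal sketch, assuming a toy dataset built with make_blobs:

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.datasets import make_blobs

# toy two-class dataset (hypothetical example, any binary dataset works)
X, y = make_blobs(n_samples=40, centers=2, random_state=0)

clf = LinearSVC(C=1.0, max_iter=10000)
clf.fit(X, y)

# samples on or inside the margin satisfy |w.x + b| <= 1;
# these are the ones acting as support vectors
decision = clf.decision_function(X)
support_vector_indices = np.where(np.abs(decision) <= 1 + 1e-15)[0]
support_vectors = X[support_vector_indices]
print(support_vectors)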

One alternative would be to use SVC with the 'linear' kernel (libsvm based instead of liblinear based); the poly, rbf and sigmoid kernels also expose the support vectors:

from sklearn import svm

X = [[0, 0], [1, 1]]
y = [0, 1]

clf = svm.SVC(kernel='linear')
clf.fit(X, y)
print(clf.support_vectors_)

Output:

[[ 0.  0.]
 [ 1.  1.]]
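
Besides support_vectors_, the fitted SVC also exposes the indices of the support vectors and their dual coefficients, which can be handy if you want to map back to your original samples. Continuing from the snippet above:

print(clf.support_)      # indices of the support vectors in X
print(clf.n_support_)    # number of support vectors for each class
print(clf.dual_coef_)    # dual coefficients of the support vectors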

liblinear scales better to a large number of samples, but otherwise the two are mostly equivalent.

answered Sep 23 '22 by elyase

I am not sure if it helps, but I was searching for something similar and concluded that when:

clf = svm.LinearSVC()

Then this:

clf.decision_function(x)

Is equal to this:

clf.coef_.dot(x) + clf.intercept_
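
A quick way to check this equivalence, as a small sketch where x is a single sample:

import numpy as np
from sklearn import svm

X = [[0, 0], [1, 1]]
y = [0, 1]

clf = svm.LinearSVC()
clf.fit(X, y)

x = np.array([0.5, 0.7])  # a single sample
# decision_function expects a 2D array, hence the [x]
print(np.allclose(clf.decision_function([x]), clf.coef_.dot(x) + clf.intercept_))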

answered Sep 22 '22 by Ravid Cohen