
An SVM implementation supporting non-linear kernels and multi-label classification in a one-vs.-rest scheme

I'm looking for an SVM implementation with support for non-linear kernels and a one-vs.-rest scenario, to perform multi-label classification. Preferably written in Python, or one that I can call from Python through wrappers.

I was looking into sklearn, which offers two SVM implementations for classification:

sklearn.svm.LinearSVC - supports multi-label classification in a one-vs.-rest scenario, but it's based on liblinear and therefore only supports linear kernels.

sklearn.svm.SVC - based on libsvm, supports non-linear kernels, but multi-class classification is done with a one-vs.-one reduction: it trains K(K − 1)/2 binary classifiers for a K-way multiclass problem.

More info also here: http://scikit-learn.org/stable/modules/multiclass.html

Does anyone know of any other SVM implementations that directly support multi-label classification and non-linear kernels?

One possible solution could also be to adapt the code based on sklearn.svm.SVC to perform one-vs.-rest; has this been attempted before?
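
For reference, a minimal sketch of that idea using sklearn's own OneVsRestClassifier wrapper around SVC (the names train_x, train_y, and test_x are placeholders, with train_y as a binary label-indicator matrix) would look something like this:

from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

# One binary SVC (non-linear RBF kernel) is fitted independently per label
ovr = OneVsRestClassifier(SVC(kernel='rbf'))
ovr.fit(train_x, train_y)          # train_y: binary label-indicator matrix
predictions = ovr.predict(test_x)  # also returned as a label-indicator matrix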


1 Answer

The Binary Relevance problem transformation method uses a one-vs.-rest approach to perform multi-label classification. You can easily combine it with an SVM using non-linear kernels via the scikit-multilearn library. The following is sample Python code to do so, where each row of train_y is a binary indicator vector marking the labels that apply (for instance, [0, 0, 1, 0, 1, 0]):

from skmultilearn.problem_transform import BinaryRelevance
from sklearn.svm import SVC

# Base classifier: libsvm-backed SVC with a non-linear (RBF) kernel
svm = SVC(kernel='rbf')

# Binary Relevance trains one binary SVC per label (one-vs.-rest)
cls = BinaryRelevance(classifier=svm)

cls.fit(train_x, train_y)
predictions = cls.predict(test_x)
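
Note that scikit-multilearn's predict typically returns a SciPy sparse matrix; if you need a dense 0/1 array (for example to pass to sklearn metrics), converting it is straightforward:

predictions_dense = predictions.toarray()  # dense label-indicator matrix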