How to determine feature importance for non-linear kernels in SVM

I am using the following code to calculate feature importance.

from matplotlib import pyplot as plt
from sklearn import svm

def feature_importances(coef, names):
    # Sort the features by coefficient value and plot them as a bar chart
    imp, names = zip(*sorted(zip(coef, names)))
    plt.barh(range(len(names)), imp, align='center')
    plt.yticks(range(len(names)), names)
    plt.show()

features_names = ['input1', 'input2']
clf = svm.SVC(kernel='linear')  # renamed from `svm` to avoid shadowing the module
clf.fit(X, Y)  # X, Y: the training data
feature_importances(clf.coef_[0], features_names)

How can I calculate feature importance for a non-linear kernel? The code above does not give the expected result in that case.

asked Jan 13 '17 by Jibin Mathew

1 Answer

Short answer: it's not possible, at least with the present libraries. Feature importance can be read off a linear SVM but not a non-linear one. The reason is that with a non-linear kernel the dataset is implicitly mapped into a higher-dimensional feature space, quite different from the original input space, and the separating hyperplane is found in that space. Its weights therefore describe the transformed features, not the original ones, so there is no direct way to express the importance of this SVM in terms of the original dataset's features.
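
To make this concrete in scikit-learn, here is a minimal sketch (the toy data and the name clf are made up for illustration): accessing coef_ on an SVC fitted with a non-linear kernel raises an AttributeError, precisely because the weights live in the implicit kernel feature space rather than the input space.

from sklearn import svm

# Made-up toy data, just to have something to fit
X = [[0, 0], [1, 1], [1, 0], [0, 1]]
Y = [0, 1, 1, 0]

clf = svm.SVC(kernel='rbf')
clf.fit(X, Y)

try:
    print(clf.coef_)  # only defined for kernel='linear'
except AttributeError as e:
    print(e)  # scikit-learn: coef_ is only available when using a linear kernel

What the fitted model does expose is dual_coef_, but those are dual coefficients attached to the support vectors, i.e. weights on training samples, not weights on the original features.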

answered Sep 30 '22 by Jibin Mathew