 

How to use a custom SVM kernel?

I'd like to implement my own Gaussian kernel in Python, just as an exercise. I'm using sklearn.svm.SVC(kernel=my_kernel), but I really don't understand what is going on.

I expect the function my_kernel to be called with the columns of the X matrix as parameters; instead, it gets called with X and X as arguments. Looking at the examples, things are not any clearer.

What am I missing?

This is my code:

'''
Created on 15 Nov 2014

@author: Luigi
'''
import scipy.io
import numpy as np
from sklearn import svm
import matplotlib.pyplot as plt

def svm_class(fileName):

    # Load the training data: the .mat file provides 'X' and 'y'
    data = scipy.io.loadmat(fileName)
    X = data['X']
    y = data['y']

    # Train with the built-in RBF kernel
    f = svm.SVC(kernel='rbf', gamma=50, C=1.0)
    f.fit(X, y.flatten())
    plotData(np.hstack((X, y)), X, f)

    return

def plotData(arr, X, f):

    ax = plt.subplot(111)

    ax.scatter(arr[arr[:,2]==0][:,0], arr[arr[:,2]==0][:,1], c='r', marker='o', label='Zero')
    ax.scatter(arr[arr[:,2]==1][:,0], arr[arr[:,2]==1][:,1], c='g', marker='+', label='One')

    h = .02  # step size in the mesh
    # create a mesh to plot in
    x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                         np.arange(y_min, y_max, h))


    # Plot the decision boundary. For that, we will assign a color to each
    # point in the mesh [x_min, x_max] x [y_min, y_max].
    Z = f.predict(np.c_[xx.ravel(), yy.ravel()])

    # Put the result into a color plot
    Z = Z.reshape(xx.shape)
    plt.contour(xx, yy, Z)



    plt.xlim(np.min(arr[:,0]), np.max(arr[:,0]))
    plt.ylim(np.min(arr[:,1]), np.max(arr[:,1]))
    plt.show()
    return


def gaussian_kernel(x1, x2):
    # Gaussian similarity between two sample vectors
    sigma = 0.5
    return np.exp(-np.sum((x1 - x2)**2) / (2*sigma**2))

if __name__ == '__main__':

    fileName = 'ex6data2.mat'
    svm_class(fileName)
asked Nov 16 '14 by Luigi Tiburzi


People also ask

How are kernels used in SVM?

The term "kernel" refers to a set of mathematical functions that give a Support Vector Machine a window through which to manipulate the data. A kernel function transforms the training data so that a non-linear decision surface in the input space corresponds to a linear decision boundary in a higher-dimensional space.
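As an illustration of that idea (my own sketch, not part of the original answer; the dataset and the added feature are assumptions), lifting concentric-circle data with a hand-crafted feature makes it linearly separable:

import numpy as np
from sklearn.datasets import make_circles
from sklearn.svm import LinearSVC

X, y = make_circles(n_samples=200, noise=0.05, factor=0.5, random_state=0)

# In the original 2-D space the classes are concentric rings: no straight
# line separates them, so a linear SVM scores poorly.
print(LinearSVC(max_iter=10000).fit(X, y).score(X, y))

# Append the squared radius x1^2 + x2^2 as a third feature; in the lifted
# 3-D space a plane separates the two rings almost perfectly.
X_mapped = np.hstack([X, (X ** 2).sum(axis=1, keepdims=True)])
print(LinearSVC(max_iter=10000).fit(X_mapped, y).score(X_mapped, y))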

How do I know which SVM kernel to use?

In practice, a nonlinear kernel is often less useful for efficiency reasons, both computational and predictive. So the rule of thumb is: use linear SVMs (or logistic regression) for linear problems, and nonlinear kernels, such as the Radial Basis Function (RBF) kernel, for non-linear problems.
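A small sketch of that rule of thumb (dataset and parameters are my own choices for illustration):

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, LinearSVC

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The moons are not linearly separable, so the RBF kernel should score
# noticeably higher than the linear model here.
print("linear:", LinearSVC(max_iter=10000).fit(X_tr, y_tr).score(X_te, y_te))
print("rbf:", SVC(kernel="rbf").fit(X_tr, y_tr).score(X_te, y_te))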

What is SVM kernel trick?

The kernel trick is a simple method in which non-linear data is projected into a higher-dimensional space so that it becomes easier to classify: there, it can be divided linearly by a plane. Mathematically, this is achieved through the Lagrangian formulation, using Lagrange multipliers.
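A concrete way to see the trick (a sketch of my own, using the homogeneous quadratic kernel K(x, z) = (x . z)^2, whose explicit feature map for 2-D inputs is phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)):

import numpy as np

def phi(x):
    # Explicit feature map matching the quadratic kernel
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 4.0])

# Inner product computed in the lifted feature space...
print(phi(x).dot(phi(z)))   # 121.0
# ...and the same value via the kernel, without ever building phi(x).
print(x.dot(z) ** 2)        # 121.0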

What is the difference between SVM and kernel SVM?

A kernelized SVM is equivalent to a linear SVM that operates in feature space rather than input space. Conceptually, you can think of this as mapping the data (possibly nonlinearly) into feature space, then using a linear SVM.
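That equivalence can be made concrete with an explicit (approximate) feature map; the following is my own illustration using scikit-learn's Nystroem transformer, not part of the original answer:

from sklearn.datasets import make_moons
from sklearn.kernel_approximation import Nystroem
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC, LinearSVC

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)

# Kernelized SVM working in input space...
kernelized = SVC(kernel="rbf", gamma=1.0).fit(X, y)

# ...versus a linear SVM in an explicit approximation of the RBF
# feature space: the two should reach similar accuracy.
feature_space = make_pipeline(
    Nystroem(kernel="rbf", gamma=1.0, n_components=100, random_state=0),
    LinearSVC(max_iter=10000),
).fit(X, y)

print(kernelized.score(X, y), feature_space.score(X, y))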


2 Answers

After reading the answer above, and some other questions and sites (1, 2, 3, 4, 5), I put this together for a Gaussian kernel in svm.SVC().

Call svm.SVC() with kernel="precomputed".

Then compute a Gram matrix, a.k.a. kernel matrix (often abbreviated as K).

Then use this Gram matrix as the first argument (i.e. X) to svm.SVC().fit().

I start with the following code:

C = 0.1
model = svmTrain(X, y, C, "gaussian")

that calls sklearn.svm.SVC() in svmTrain(), and then sklearn.svm.SVC().fit():

from sklearn import svm

def svmTrain(X, y, C, kernelFunction):
    if kernelFunction == "gaussian":
        clf = svm.SVC(C=C, kernel="precomputed")
        return clf.fit(gaussianKernelGramMatrix(X, X), y)

The Gram matrix computation, used as a parameter to sklearn.svm.SVC().fit(), is done in gaussianKernelGramMatrix():

import numpy as np

def gaussianKernelGramMatrix(X1, X2, K_function=gaussianKernel):
    """(Pre)calculates Gram Matrix K"""

    gram_matrix = np.zeros((X1.shape[0], X2.shape[0]))
    for i, x1 in enumerate(X1):
        for j, x2 in enumerate(X2):
            gram_matrix[i, j] = K_function(x1, x2)
    return gram_matrix

which uses gaussianKernel() to get a radial basis function kernel between x1 and x2 (a measure of similarity based on a Gaussian distribution centered on x1, with sigma=0.1):

def gaussianKernel(x1, x2, sigma=0.1):

    # Ensure that x1 and x2 are flat 1-D vectors
    x1 = x1.flatten()
    x2 = x2.flatten()

    # Gaussian similarity: exp(-||x1 - x2||^2 / (2 * sigma^2))
    sim = np.exp(-np.sum(np.power((x1 - x2), 2)) / float(2 * (sigma**2)))

    return sim

Then, once the model is trained with this custom kernel, we predict with "the [custom] kernel between the test data and the training data":

predictions = model.predict( gaussianKernelGramMatrix(Xval, X) )

In short, to use a custom Gaussian SVM kernel, you can use this snippet:

import numpy as np
from sklearn import svm

def gaussianKernelGramMatrixFull(X1, X2, sigma=0.1):
    """(Pre)calculates Gram Matrix K"""

    gram_matrix = np.zeros((X1.shape[0], X2.shape[0]))
    for i, x1 in enumerate(X1):
        x1 = x1.flatten()
        for j, x2 in enumerate(X2):
            x2 = x2.flatten()
            gram_matrix[i, j] = np.exp(-np.sum(np.power((x1 - x2), 2)) / float(2 * (sigma**2)))
    return gram_matrix

X=...
y=...
Xval=...

C = 0.1
clf = svm.SVC(C=C, kernel="precomputed")
model = clf.fit(gaussianKernelGramMatrixFull(X, X), y)

p = model.predict(gaussianKernelGramMatrixFull(Xval, X))
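As a sanity check on this snippet (my own addition): scikit-learn's built-in RBF kernel computes exp(-gamma * ||x - z||^2), so with gamma = 1 / (2 * sigma**2) it is exactly the Gaussian kernel above, and the precomputed model's predictions should match it:

sigma = 0.1
builtin = svm.SVC(C=C, kernel="rbf", gamma=1.0 / (2 * sigma**2)).fit(X, y)

# Fraction of agreeing predictions; should be 1.0 up to numerical noise.
print((builtin.predict(Xval) == p).mean())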
answered Sep 28 '22 by arturomp


For efficiency reasons, SVC assumes that your kernel is a function accepting two matrices of samples, X and Y (it will be called with two identical matrices only during training), and that you return a matrix G where:

G_ij = K(X_i, Y_j)

and K is your "point-level" kernel function.

So either implement a Gaussian kernel that works in such a generic, matrix-level way, or add a "proxy" function like:

import numpy as np

def proxy_kernel(X, Y, K):
    """Builds the Gram matrix G_ij = K(X_i, Y_j) from a point-level kernel K"""
    gram_matrix = np.zeros((X.shape[0], Y.shape[0]))
    for i, x in enumerate(X):
        for j, y in enumerate(Y):
            gram_matrix[i, j] = K(x, y)
    return gram_matrix

and use it like:

from functools import partial
correct_gaussian_kernel = partial(proxy_kernel, K=gaussian_kernel)
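For completeness, a minimal usage sketch of both options (the vectorized kernel is my own addition, built on scipy's cdist; sigma=0.5 matches the asker's gaussian_kernel):

import numpy as np
from scipy.spatial.distance import cdist
from sklearn import svm

# Option 1: the proxied point-level kernel as a callable.
clf = svm.SVC(kernel=correct_gaussian_kernel)

# Option 2: a Gaussian kernel that already works at the matrix level,
# with no Python double loop (much faster on real data).
def gaussian_kernel_matrix(X, Y, sigma=0.5):
    # cdist(..., "sqeuclidean") returns the matrix of ||X_i - Y_j||^2,
    # so exponentiating yields G_ij = K(X_i, Y_j) in one shot.
    return np.exp(-cdist(X, Y, "sqeuclidean") / (2 * sigma**2))

clf = svm.SVC(kernel=gaussian_kernel_matrix)
# clf.fit(X, y) and clf.predict(Xtest) then work like any other SVC.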
answered Sep 28 '22 by lejlot