I am training a linear SVM classifier with the fitcsvm function in MATLAB:
cvFolds = crossvalind('Kfold', labels, nrFolds); % assign each instance to one of nrFolds folds
accuracy = zeros(nrFolds, 1); % preallocate per-fold accuracy
for i = 1:nrFolds % iterate through each fold
    testIdx = (cvFolds == i); % indices of test instances
    trainIdx = ~testIdx; % indices of training instances
    cl = fitcsvm(features(trainIdx,:), labels(trainIdx), ...
        'KernelFunction', kernel, 'Standardize', true, ...
        'BoxConstraint', C, 'ClassNames', [0,1], 'Solver', solver);
    [labelPred, scores] = predict(cl, features(testIdx,:));
    eq = sum(labelPred == labels(testIdx));
    accuracy(i) = eq / numel(labels(testIdx));
end
As this part of the code shows, the trained SVM model is stored in cl. Looking through the model properties of cl, I cannot tell which one corresponds to the classifier weights, i.e. the parameters of a linear classifier that reflect the importance of each feature. Which property represents the classification weights? The MATLAB documentation says "The vector β contains the coefficients that define an orthogonal vector to the hyperplane", so does cl.Beta represent the classification weights?
As you can see in the fitcsvm documentation, the equation of the separating hyperplane is

f(x) = x'β + b = 0

which is exactly the familiar form of a linear classifier,

f(x) = w'x + b = 0

So β is equal to w, the vector of classification weights; for a linear kernel it is stored in cl.Beta, with the bias term b in cl.Bias.
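To make this concrete, here is a minimal sketch (assuming, as in the question, that features is an N-by-p matrix and labels an N-by-1 vector of 0/1 class labels; the specific option values are placeholders) showing how the weights could be read off a trained linear model and used to rank features. Note that with 'Standardize',true the coefficients in Beta refer to the standardized predictors, whose mean and standard deviation are stored in cl.Mu and cl.Sigma:

% Minimal sketch: train a linear SVM and inspect its weights
cl = fitcsvm(features, labels, 'KernelFunction', 'linear', ...
    'Standardize', true, 'BoxConstraint', 1, 'ClassNames', [0,1]);

w = cl.Beta;   % classification weights, one coefficient per feature
b = cl.Bias;   % bias term b in f(x) = x*Beta + Bias

% Rank features by the absolute size of their weights
[~, featureRank] = sort(abs(w), 'descend');

% Because 'Standardize' is true, evaluate f(x) on standardized features
xStd  = (features - cl.Mu) ./ cl.Sigma;  % requires R2016b+ implicit expansion
fVals = xStd * w + b;                    % decision values f(x) per observation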