I am looking for the meaning of the verbose log abbreviations printed by the SVC function in scikit-learn.
If nSV is the number of support vectors and #iter is the number of iterations, what do nBSV, rho, and obj mean?
This is an example:
    import numpy as np
    from sklearn.svm import SVR

    sets = np.loadtxt(r'data\Exp Rot.txt')  # reading data (raw string so the backslash is not treated as an escape)
    model = SVR(kernel='rbf', C=100, gamma=1, max_iter=100000, verbose=True)
    model.fit(sets[:, :2], sets[:, 2])
    print(model.score(sets[:, :2], sets[:, 2]))  # score is a method and must be called with X and y
and here is the result:
scikit-learn uses libsvm's implementation of support-vector machines (LinearSVC uses liblinear, by the same authors). The official website has its own FAQ answering this here.
Excerpt:
Q: The output of training C-SVM is like the following. What do they mean?

    optimization finished, #iter = 219
    nu = 0.431030
    obj = -100.877286, rho = 0.424632
    nSV = 132, nBSV = 107
    Total nSV = 132
- obj is the optimal objective value of the dual SVM problem.
- rho is the bias term in the decision function sgn(w^T x - rho).
- nSV and nBSV are the numbers of support vectors and bounded support vectors (i.e., alpha_i = C).
- nu-SVM is a somewhat equivalent form of C-SVM where C is replaced by nu; nu simply shows the corresponding parameter. More details are in the libsvm document.
Link to the libsvm document mentioned above (PDF!)
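As a rough illustration of how these log values map onto a fitted scikit-learn model (using hypothetical toy data, not the 'Exp Rot.txt' set from the question): support_vectors_ gives nSV, the dual coefficients that sit at the box bound |alpha_i| = C give nBSV, and intercept_ stores the bias b = -rho from the decision function above.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical toy data: two interleaved classes (an assumption for illustration)
rng = np.random.RandomState(0)
X = rng.randn(60, 2)
y = (X[:, 0] * X[:, 1] > 0).astype(int)

C = 100.0
model = SVC(kernel='rbf', C=C, gamma=1, verbose=False)
model.fit(X, y)

# nSV: total number of support vectors
n_sv = model.support_vectors_.shape[0]

# nBSV: bounded support vectors, i.e. those whose |alpha_i| hit the bound C
n_bsv = int(np.sum(np.isclose(np.abs(model.dual_coef_), C)))

# rho: libsvm's bias term; scikit-learn exposes the decision-function
# intercept b, where b = -rho
rho = -model.intercept_[0]

print(n_sv, n_bsv, rho)
```

Setting verbose=True on the same model would print the corresponding nSV/nBSV/rho lines from libsvm, so the two views can be cross-checked.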