
Plotting data from an svm fit - hyperplane

Tags: r, svm

I used svm to find a best-fit regression hyperplane dependent on q, where I have 4 dimensions: x, y, z and q.

fit <- svm(q ~ ., data = data, kernel = 'linear')

and here is my fit object:

Call:
svm(formula = q ~ ., data = data, kernel = "linear")


Parameters:
   SVM-Type:  C-classification 
 SVM-Kernel:  linear 
       cost:  1 
      gamma:  0.3333333 

Number of Support Vectors:  1800

I have a 3d plot of my data made with plot3d, where the 4th dimension is color. How can I overlay the hyperplane that svm found on this plot? I'd like to visualize the regression hyperplane.

asked Nov 05 '11 by CodeGuy

2 Answers

You wrote:

I used svm to find a best-fit regression hyperplane

But according to:

Call:
svm(formula = q ~ ., data = data, kernel = "linear")

Parameters:
SVM-Type:  C-classification

you are doing classification.

So, first of all, decide what you need: to classify or to fit a regression. From ?svm, we see:

type: ‘svm’ can be used as a classification machine, as a
      regression machine, or for novelty detection.  Depending of
      whether ‘y’ is a factor or not, the default setting for
      ‘type’ is ‘C-classification’ or ‘eps-regression’,
      respectively, but may be overwritten by setting an explicit
      value.

Since you apparently didn't change the type parameter from its default value, you are probably doing classification, so I will show how to visualize the result for classification.
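(If you actually want regression on q, make sure q is numeric and/or set the type explicitly. A minimal sketch, assuming q is a numeric column in your data frame:

fit <- svm(q ~ ., data = data, kernel = 'linear', type = 'eps-regression')

The rest of this answer deals with the classification case.)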

Let's assume there are 2 classes and generate some data:

> require(e1071) # for svm()
> require(rgl)   # for 3d graphics
> set.seed(12345)
> seed <- .Random.seed
> t <- data.frame(x=runif(100), y=runif(100), z=runif(100), cl=NA)
> t$cl <- 2 * t$x + 3 * t$y - 5 * t$z
> t$cl <- as.factor(ifelse(t$cl>0,1,-1))
> t[1:4,]
           x         y         z cl
 1 0.7209039 0.2944654 0.5885923 -1
 2 0.8757732 0.6172537 0.8925918 -1
 3 0.7609823 0.9742741 0.1237949  1
 4 0.8861246 0.6182120 0.5133090  1

Since you want kernel='linear', the boundary must be the hyperplane w1*x + w2*y + w3*z - w0 = 0. Our task divides into 2 subtasks: 1) evaluate the equation of this boundary plane; 2) draw the plane.

1) Evaluating the equation of the boundary plane

First, let's run svm():

> svm_model <- svm(cl~x+y+z, t, type='C-classification', kernel='linear',scale=FALSE)

I wrote type='C-classification' explicitly here just to emphasize that we want classification. scale=FALSE means that we want svm() to work directly with the provided data, without scaling it first (as it does by default). I did this so the evaluations below become simpler.

Unfortunately, svm_model doesn't store the equation of the boundary plane (or even just its normal vector), so we must evaluate it ourselves. From the SVM algorithm we know that we can evaluate these weights with the following formula:

w <- t(svm_model$coefs) %*% svm_model$SV

The negative intercept is stored in svm_model, and accessed via svm_model$rho.
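As a quick sanity check (a sketch, assuming the svm_model fitted above with scale=FALSE), the manually reconstructed plane should reproduce the decision values that predict() returns:

w <- t(svm_model$coefs) %*% svm_model$SV                       # 1 x 3 weight vector
f_manual <- as.matrix(t[, c("x","y","z")]) %*% t(w) - svm_model$rho
f_svm <- attr(predict(svm_model, t[, c("x","y","z")], decision.values=TRUE),
              "decision.values")
all.equal(as.numeric(f_manual), as.numeric(f_svm))             # should be TRUE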

2) Drawing the plane

I didn't find any helpful function like plane3d, so, again, we have to do a bit of manual work: we take a grid of (x, y) pairs and evaluate the corresponding z value on the boundary plane.

detalization <- 100   # grid resolution
grid <- expand.grid(seq(from=min(t$x), to=max(t$x), length.out=detalization),
                    seq(from=min(t$y), to=max(t$y), length.out=detalization))
z <- (svm_model$rho - w[1,1]*grid[,1] - w[1,2]*grid[,2]) / w[1,3]

plot3d(grid[,1], grid[,2], z)  # this draws the plane
# add the data points to the plot, coloured by class
points3d(t$x[which(t$cl==-1)], t$y[which(t$cl==-1)], t$z[which(t$cl==-1)], col='red')
points3d(t$x[which(t$cl==1)], t$y[which(t$cl==1)], t$z[which(t$cl==1)], col='blue')

We did this with the rgl package, so you can rotate the image and enjoy it :)

(screenshot: the fitted boundary plane with the red and blue point clouds on either side)
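As an alternative to building the plane by hand (a hedged aside, assuming a reasonably recent rgl), rgl also provides planes3d(), which adds the plane a*x + b*y + c*z + d = 0 to an existing scene, clipped to its bounding box. After plotting the points you could add the same hyperplane with:

planes3d(w[1,1], w[1,2], w[1,3], -svm_model$rho, alpha=0.4)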

answered Nov 16 '22 by Max


I'm just starting out in R myself, but there's a decent tutorial on using the e1071 package in R for regression rather than classification:

http://eric.univ-lyon2.fr/~ricco/tanagra/fichiers/en_Tanagra_Support_Vector_Regression.pdf

with a zip file of the test dataset and R script in:

http://eric.univ-lyon2.fr/~ricco/tanagra/fichiers/qsar.zip

Skip the first section on Tanagra and head straight to section 6 (page 14). It has its faults, but it gives examples of using R for linear regression, SVR with epsilon-regression and with nu-regression. It also makes a stab at demonstrating the tune() method (but could be done better, IMHO).
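To give a flavour of the pattern that tutorial walks through (a rough sketch, with illustrative column names rather than the qsar data):

library(e1071)
# support vector regression on a numeric response q
fit <- svm(q ~ ., data = data, type = 'eps-regression', kernel = 'linear')
# grid search over cost and epsilon with tune()
tuned <- tune(svm, q ~ ., data = data,
              ranges = list(cost = 2^(-2:4), epsilon = c(0.05, 0.1, 0.2)))
summary(tuned)
best_fit <- tuned$best.model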

(Note: if you choose to run the examples in that paper, don't bother trying to find a working copy of xlsReadWrite -- it's much easier to export qsar.xls as a .csv file and just use read.csv() to load the dataset.)

answered Nov 16 '22 by fearless_fool