 

plot decision boundary matplotlib

I am very new to matplotlib and am working on simple projects to get acquainted with it. I was wondering how I might plot the decision boundary, which is the weight vector of the form [w1, w2] that separates the two classes, let's say C1 and C2, using matplotlib.

Is it as simple as plotting a line from (0, 0) to the point (w1, w2), since W is the weight "vector"? If so, how do I extend this line in both directions if I need to?

Right now all I am doing is:

import matplotlib.pyplot as plt
plt.plot([0, w1], [0, w2])
plt.show()

Thanks in advance.

Asked Sep 27 '13 by anonuser0428

People also ask

How do you plot a decision boundary?

Single-Line Decision Boundary: The basic strategy to draw the Decision Boundary on a Scatter Plot is to find a single line that separates the data-points into regions signifying different classes.
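As a rough sketch of that strategy (the weights w, the bias b, and the toy C1/C2 points below are made-up values for illustration, not taken from any particular model):

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical linear classifier: w[0]*x1 + w[1]*x2 + b = 0 is the boundary
w = np.array([1.5, -2.0])
b = 0.5

# Two toy classes, C1 and C2
rng = np.random.default_rng(0)
C1 = rng.normal(loc=[1.0, 2.0], scale=0.5, size=(30, 2))
C2 = rng.normal(loc=[3.0, 0.0], scale=0.5, size=(30, 2))

plt.scatter(C1[:, 0], C1[:, 1], label='C1')
plt.scatter(C2[:, 0], C2[:, 1], label='C2')

# Solve w[0]*x1 + w[1]*x2 + b = 0 for x2 across the full x-range of the data,
# so the line spans the whole plot instead of starting at the origin
x1 = np.linspace(plt.xlim()[0], plt.xlim()[1], 100)
x2 = -(w[0] * x1 + b) / w[1]
plt.plot(x1, x2, 'k-', label='decision boundary')
plt.legend()
plt.show()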

How do you find the decision boundary for logistic regression in Python?

This could be achieved by calculating the prediction associated with ŷ for a mesh of (x1, x2) points and plotting a contour plot (see e.g. this scikit-learn example). Alternatively, one can think of the decision boundary as the line x2 = m*x1 + c, defined by the points for which ŷ = 0.5 and hence z = 0.
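For example, a minimal sketch of that second view (assuming a fitted scikit-learn LogisticRegression on two features, so z = w0 + w1*x1 + w2*x2 and ŷ = 0.5 exactly where z = 0; the toy dataset is only for illustration):

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy two-feature dataset and a fitted logistic regression model
X, y = make_classification(n_features=2, n_informative=2, n_redundant=0,
                           n_clusters_per_class=1, random_state=0)
clf = LogisticRegression().fit(X, y)

# z = w0 + w1*x1 + w2*x2 = 0  =>  x2 = -(w0 + w1*x1) / w2
w1, w2 = clf.coef_[0]
w0 = clf.intercept_[0]
x1 = np.linspace(X[:, 0].min(), X[:, 0].max(), 100)
x2 = -(w0 + w1 * x1) / w2

plt.scatter(X[:, 0], X[:, 1], c=y)
plt.plot(x1, x2, 'k--')  # points where the predicted probability is 0.5
plt.show()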

What is meant by decision boundary?

A decision boundary is the region of a problem space in which the output label of a classifier is ambiguous. If the decision surface is a hyperplane, then the classification problem is linear, and the classes are linearly separable. Decision boundaries are not always clear cut.


1 Answer

A decision boundary is generally much more complex than just a line, so (even in the 2-dimensional case) it is better to use code for the generic case, which will also work well with linear classifiers. The simplest idea is to plot a contour plot of the decision function:

import numpy as np
import matplotlib.pyplot as plt

# X - some data in a 2-dimensional np.array
# Y - the corresponding class labels
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1

# step size of the mesh
h = 0.02
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))

# here "model" is your model's prediction (classification) function
Z = model(np.c_[xx.ravel(), yy.ravel()])

# Put the result into a color plot
Z = Z.reshape(xx.shape)
plt.contourf(xx, yy, Z, cmap=plt.cm.Paired)
plt.axis('off')

# Plot also the training points
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired)
plt.show()
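For instance, here is a minimal end-to-end sketch wiring the snippet above to a scikit-learn classifier (the SVC, the make_blobs toy data, and model = clf.predict are illustrative choices, not part of the original answer):

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Toy 2-dimensional data and a fitted classifier
X, Y = make_blobs(n_samples=100, centers=2, random_state=0)
clf = SVC(kernel='linear').fit(X, Y)
model = clf.predict  # the "model" function used in the snippet above
h = 0.02             # mesh step size

x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))

Z = model(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
plt.contourf(xx, yy, Z, cmap=plt.cm.Paired, alpha=0.5)
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired)
plt.show()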

Some examples of such plots can be found in the sklearn documentation.


Answered Sep 30 '22 by lejlot