 

Geometric representation of Perceptrons (Artificial neural networks)

I am taking the Neural Networks course on Coursera taught by Geoffrey Hinton (no longer offered).

I have a very basic question about weight spaces. See page 18 of the lecture slides: https://d396qusza40orc.cloudfront.net/neuralnets/lecture_slides%2Flec2.pdf

If I have a weight vector (bias is 0) of [w1=1, w2=2] and training cases {1,2,-1} and {2,1,1}, where I guess {1,2} and {2,1} are the input vectors and the last entries are the labels, how can this be represented geometrically?

I am unable to visualize it. Why does a training case give a plane that divides the weight space in two? Could somebody explain this in a three-dimensional coordinate system?

The following is the text from the ppt:

1. Weight-space has one dimension per weight.

2. A point in the space has a particular setting for all the weights.

3. Assuming that we have eliminated the threshold, each training case can be represented as a hyperplane through the origin.

My doubt is in the third point above. Kindly help me understand.

kosmos asked Mar 01 '14 22:03



1 Answer

The equation of a plane passing through the origin has the form:

ax + by + cz = 0

If a = 1, b = 2, c = 3, the equation of the plane is:

x + 2y + 3z = 0

Now, in the weight space, every dimension represents a weight. So, if the perceptron has 10 weights, the weight space is 10-dimensional.

The perceptron's decision rule:

ax + by + cz <= 0  ==>  Class 0
ax + by + cz > 0   ==>  Class 1

In this case, a, b and c are the weights, and x, y and z are the input features.

In the weight space, a, b and c are the variables (the axes).
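The decision rule above can be sketched in a few lines of Python (a minimal illustration; the specific weight and feature values below are arbitrary examples, not taken from the question):

```python
def classify(weights, features):
    """Perceptron with zero bias: class 1 if w . x > 0, else class 0."""
    activation = sum(w * x for w, x in zip(weights, features))
    return 1 if activation > 0 else 0

# Example weights (a, b, c) and input features (x, y, z)
w = [1.0, 2.0, 3.0]
print(classify(w, [2.0, 1.0, 0.5]))    # 1*2 + 2*1 + 3*0.5 = 5.5 > 0  -> class 1
print(classify(w, [-2.0, -1.0, 0.0]))  # 1*(-2) + 2*(-1) + 3*0 = -4   -> class 0
```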

So, for every training example, e.g. (x,y,z) = (2,3,4), a hyperplane is formed in the weight space whose equation is:

2a + 3b + 4c = 0

and it passes through the origin.
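To connect this to the numbers in the question (weight vector [w1=1, w2=2], training cases (1,2) with label -1 and (2,1) with label +1): each training case defines a line through the origin in the 2-D weight space, and the sign of w . x tells you which side of that line the current weight vector lies on. A small sketch, assuming a zero bias and the convention that label +1 requires w . x > 0 and label -1 requires w . x <= 0:

```python
def side_of_hyperplane(weights, case_inputs):
    """Sign of w . x: which side of the training case's hyperplane
    (through the origin in weight space) the weight vector lies on."""
    s = sum(w * x for w, x in zip(weights, case_inputs))
    return 1 if s > 0 else (-1 if s < 0 else 0)

w = [1, 2]                           # weight vector from the question (bias = 0)
cases = [([1, 2], -1), ([2, 1], 1)]  # (input vector, label)

for x, label in cases:
    s = side_of_hyperplane(w, x)
    print(f"case {x}: sign of w.x = {s}, label = {label}, "
          f"correct = {s == label}")
```

Here w . (1,2) = 1 + 4 = 5 > 0, so the weight vector lies on the positive side of the first case's hyperplane even though its label is -1; w . (2,1) = 2 + 2 = 4 > 0 matches the second case's label +1. So the weight vector is on the wrong side of the first training case's hyperplane, which is exactly the situation the perceptron learning rule fixes by moving w across that hyperplane.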

I hope this makes it clear now.

Khusaal giri answered Oct 13 '22 12:10