How to solve XOR problem with MLP neural network?

Tomorrow morning I have to take my neural network final exam, but there is a problem: I cannot solve the XOR problem with an MLP. I don't know how to assign the weights and bias values :(

asked Jun 27 '11 by Maysam

People also ask

Can MLP solve XOR problem?

Yes. An MLP solves the XOR problem because its hidden layer transforms the input points into a new representation in which the two classes become linearly separable, so the output unit can then draw a single decision boundary between them.

How do you solve XOR with single perceptron?

Yes, a single layer neural network with a non-monotonic activation function can solve the XOR problem. More specifically, a periodic function would cut the XY plane more than once. Even an Abs or Gaussian activation function will cut it twice.
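As a quick sanity check of that claim, here is a tiny sketch: with weights (1, -1) and bias 0, a single unit with an absolute-value activation computes abs(x1 - x2), which is exactly XOR on boolean inputs.

```python
# A single unit with a non-monotonic |.| activation computes XOR:
# with weights (1, -1) and bias 0, abs(x1 - x2) equals XOR on {0, 1}.
def xor_abs_unit(x1, x2):
    w1, w2, bias = 1, -1, 0
    return abs(w1 * x1 + w2 * x2 + bias)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_abs_unit(a, b))
```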

Why XOR Cannot be solved by perceptron?

A "single-layer" perceptron can't implement XOR. The reason is that the classes in XOR are not linearly separable: you cannot draw a straight line to separate the points (0,0), (1,1) from the points (0,1), (1,0). This limitation led to the invention of multi-layer networks.
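You can illustrate this with a brute-force search: try a grid of weight and bias settings for a single linear-threshold unit and check whether any of them reproduces XOR (the grid values here are arbitrary; since no weights work at all, none will be found).

```python
# Brute-force search for a single linear-threshold unit
# step(w1*x1 + w2*x2 + b) that computes XOR.  None exists,
# illustrating that XOR is not linearly separable.
import itertools

def step(z):
    return 1 if z > 0 else 0

xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

grid = [i / 2 for i in range(-8, 9)]  # weights/biases from -4.0 to 4.0
found = any(
    all(step(w1 * x1 + w2 * x2 + b) == y for (x1, x2), y in xor.items())
    for w1, w2, b in itertools.product(grid, repeat=3)
)
print(found)  # False
```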


1 Answer

So, seeing as you posted this 2 days ago, I guess I'm a lil late to help with your exam :(

However, learning is always a good thing, and learning about neural nets doubly so!

Normally I'd answer this question by telling you to use a network with 2 input units (one for each boolean), 2 hidden units, and 1 output unit (for the boolean answer), and then directing you towards the wikipedia article on the backprop learning algorithm to find the correct weights.
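If you go the backprop route, a minimal sketch of that 2-2-1 setup in numpy might look like the following (sigmoid units, mean-squared error; the seed, learning rate, and epoch count are illustrative, not prescribed):

```python
import numpy as np

# Minimal 2-2-1 MLP trained on XOR with plain gradient descent.
# Hyperparameters (seed, learning rate, epochs) are illustrative.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)   # input -> hidden
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)   # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out = forward(X)
loss0 = np.mean((out - y) ** 2)  # loss before training

lr = 0.5
for _ in range(10000):
    h, out = forward(X)
    # Backprop of the mean-squared error through both sigmoid layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

_, out = forward(X)
loss_final = np.mean((out - y) ** 2)
print(np.round(out.ravel(), 2))
```

The point is just to see the mechanics; in practice a library would handle the gradients for you.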

However, your phrasing -- "I cannot solve" -- makes it sound like your teacher wants you to find the weights yourself. In that case, a solution would be to think of one hidden unit as representing an OR gate and the other as representing an AND gate. The connections from those units to the output then let you say 'fire if the OR gate fires and the AND gate doesn't', which is exactly the definition of XOR. Anyway, that's just the intuition; the actual net is shown below.

Notice that the thresholds of some of the units in the diagram aren't 0 as they normally are -- this is just shorthand for having the bias unit connected to those units with the threshold as the weight.

[Diagram: a 2-2-1 network in which one hidden unit computes OR, the other computes AND, and the output unit fires when the OR unit is active and the AND unit is not.]
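Written out as code, one explicit weight assignment matching that intuition (the exact numbers are just one choice that works; any thresholds separating the same cases would do):

```python
# XOR as a 2-2-1 network of step units with hand-set weights:
# hidden unit 1 = OR (threshold 0.5), hidden unit 2 = AND (threshold 1.5),
# output fires when OR is on and AND is off (threshold 0.5).
def step(z):
    return 1 if z > 0 else 0

def xor_mlp(x1, x2):
    h_or  = step(1 * x1 + 1 * x2 - 0.5)      # OR gate
    h_and = step(1 * x1 + 1 * x2 - 1.5)      # AND gate
    return step(1 * h_or - 1 * h_and - 0.5)  # OR and not AND

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))
```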

answered Nov 29 '22 by zergylord