 

Why is a bias neuron necessary for a backpropagating neural network that recognizes the XOR operator?

I posted a question yesterday regarding issues that I was having with my backpropagating neural network for the XOR operator. I did a little more work and realized that it may have to do with not having a bias neuron.

My question is, what is the role of the bias neuron in general, and what is its role in a backpropagating neural network that recognizes the XOR operator? Is it possible to create one without a bias neuron?

Vivin Paliath asked Nov 07 '11


People also ask

Why do we need a bias neuron?

A bias is an additional parameter in the neural network that is used to adjust the output along with the weighted sum of the inputs to the neuron. The bias is therefore a constant that helps the model fit the given data as well as possible.
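As an illustration (a minimal sketch, not from the original answer; the names `x`, `w`, and `b` are illustrative), here is how the bias enters a single neuron's computation:

```python
import numpy as np

def neuron(x, w, b):
    # output = activation(weighted sum of inputs + bias);
    # here the activation is a simple step function.
    return 1.0 if np.dot(w, x) + b > 0 else 0.0

# Without b, the decision boundary w.x = 0 is forced through the origin;
# the bias shifts that boundary, which is what lets the model fit the data.
```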

Why is the XOR problem important in neural networks?

The XOR, or “exclusive or”, problem is a classic problem in ANN research. It is the problem of using a neural network to predict the outputs of XOR logic gates given two binary inputs. An XOR function should return a true value if the two inputs are not equal and a false value if they are equal.
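For concreteness, this is the truth table the network has to reproduce (a trivial sketch using Python's built-in `^` operator):

```python
# XOR returns 1 exactly when the two binary inputs differ.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", a ^ b)  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```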

How many neurons does it take to solve XOR?

Research has demonstrated that single human layer 2/3 cortical neurons can compute the XOR operation.

What is bias in back propagation?

“Biases are values that are added to the sums calculated at each node (except input nodes) during the feed-forward phase.” That is, the bias associated with a particular node is added to that node's score Sj (the weighted sum of its inputs) prior to applying the activation function at that same node.
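A short sketch of where the bias sits in the feed-forward sum and what its gradient looks like during backpropagation (the variable names are illustrative assumptions, not from the quoted source):

```python
import numpy as np

def feed_forward(x, W, b):
    # Score at each node j: S_j = sum_i(W[j, i] * x[i]) + b[j],
    # computed before the activation function is applied.
    s = W @ x + b
    return 1.0 / (1.0 + np.exp(-s))  # sigmoid activation

# During backpropagation, dS_j/db_j = 1, so the gradient of the loss with
# respect to the bias at node j is simply that node's error term delta_j:
def bias_gradient(delta):
    return delta
```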


1 Answer

It's possible to create a neural network without a bias neuron, and it would work just fine. For more information, I recommend reading the answers to this question:

Role of Bias in Neural Networks

Update: the role of the bias neuron in a neural net that attempts to model XOR is to minimize the size of the net. Usually, for "primitive" (not sure if this is the correct term) logic functions such as AND, OR, NAND, etc., you are trying to create a neural network with 2 input neurons, 2 hidden neurons, and 1 output neuron. This can't be done for XOR, because the simplest way you can model an XOR is with two NANDs:

[Image: an XOR gate built from NAND gates — inputs A and B feed a first NAND, whose output is combined with A and B in two further NANDs that feed a final NAND producing the output.]

You can consider A and B as your input neurons, the gate in the middle is your "bias" neuron, the two gates following it are your "hidden" neurons, and finally you have the output neuron. You can solve XOR without a bias neuron, but it would require increasing the number of hidden neurons to a minimum of 3; in that case, the 3rd neuron essentially acts as a bias neuron. A hand-crafted 2-2-1 solution with biases is sketched below. Here is another question that discusses the bias neuron with regard to XOR: XOR problem solvable with 2x2x1 neural network without bias?
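To make the "bias makes a 2-2-1 net sufficient" point concrete, here is a hand-crafted sketch of a 2-2-1 network with biases that computes XOR exactly. The weights are chosen by hand rather than learned by backpropagation, and the step activation is an assumption for readability:

```python
def step(s):
    return 1 if s > 0 else 0

def xor_net(x1, x2):
    # Hidden layer: two neurons, each with its own bias (threshold).
    h_or  = step(1 * x1 + 1 * x2 - 0.5)   # bias -0.5: fires like OR
    h_and = step(1 * x1 + 1 * x2 - 1.5)   # bias -1.5: fires like AND
    # Output neuron (bias -0.5): OR and not AND, i.e. XOR.
    return step(1 * h_or - 2 * h_and - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```

Without the bias terms, each step(w·x) boundary would have to pass through the origin, which is why dropping the bias forces a third hidden neuron to stand in for it.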

Kiril answered Oct 10 '22