
What are the weights and bias for the AND perceptron?

I am implementing an AND perceptron and having difficulty choosing the weights and bias so that the output matches the AND truth table.

Here's the code that I have written:

import pandas as pd

# Set weight1, weight2, and bias
weight1 = 2.0
weight2 = -1.0
bias = -1.0

# Inputs and outputs
test_inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
correct_outputs = [False, False, False, True]
outputs = []

# Generate and check output
for test_input, correct_output in zip(test_inputs, correct_outputs):
    linear_combination = weight1 * test_input[0] + weight2 * test_input[1] + bias
    output = int(linear_combination >= 0)
    is_correct_string = 'Yes' if output == correct_output else 'No'
    outputs.append([test_input[0], test_input[1], linear_combination, output, is_correct_string])

# Print output
num_wrong = len([output[4] for output in outputs if output[4] == 'No'])
output_frame = pd.DataFrame(outputs, columns=['Input 1', '  Input 2', '  Linear Combination', '  Activation Output', '  Is Correct'])
if not num_wrong:
    print('Nice!  You got it all correct.\n')
else:
    print('You got {} wrong.  Keep trying!\n'.format(num_wrong))
print(output_frame.to_string(index=False))

I have to decide weight1, weight2, and bias, but with the values above I get one output wrong: for the input (1, 0) the perceptron fires when it should not.

Thanks for helping.

asked Nov 26 '25 by ParthS007

2 Answers

  • The equation is symmetric in the two inputs, so they are functionally equivalent: a single weight w can serve both.
  • Taking your weights as the variables, you have four (after symmetry, three) inequalities in three (after symmetry, two) variables. Where are you stuck on solving that system?

System:

w = weight (same for both inputs)
b = bias

0*w + 0*w + b <  0
1*w + 0*w + b <  0
1*w + 1*w + b >= 0

(The strict/non-strict sides match your activation, `linear_combination >= 0`, which outputs 1 exactly on the boundary.) This leaves you with

w + b < 0
2*w + b >= 0

You should be able to characterize the possible solutions from there.
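To make that characterization concrete, here is a minimal sketch that picks one solution of the reduced system (w = 1.0, b = -1.5 is an arbitrary choice from the infinite solution family, not the only answer) and checks it against the AND truth table using the `>= 0` step activation from the question's code:

```python
# One candidate from the solution family: w + b < 0 and 2*w + b >= 0.
# w = 1.0, b = -1.5 is just an example; any pair satisfying both
# inequalities works.
w, b = 1.0, -1.5

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    linear_combination = w * x1 + w * x2 + b
    output = int(linear_combination >= 0)  # same activation as the question
    assert output == (x1 and x2)           # matches the AND truth table

print("w = {}, b = {} implements AND".format(w, b))
```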

answered Nov 28 '25 by Prune


AND Perceptron:

weight1 = 1.0
weight2 = 1.0
bias = -2.0

OR Perceptron:

weight1 = 1.0
weight2 = 1.0
bias = -1

NOT Perceptron (ignores the first input and negates the second):

weight1 = 1.0
weight2 = -2.0
bias = 0

The bias acts as an intercept, shifting the decision boundary of the linear equation.
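As a quick sanity check, the three weight/bias triples above can be verified against their truth tables with the same `>= 0` step activation used in the question (the expected NOT outputs below assume it negates the second input):

```python
# Step-activation perceptron, matching the question's code:
# output 1 when the linear combination is >= 0, else 0.
def perceptron(w1, w2, b, x1, x2):
    return int(w1 * x1 + w2 * x2 + b >= 0)

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

# (weight1, weight2, bias) and expected outputs per input row
gates = {
    'AND': ((1.0, 1.0, -2.0), [0, 0, 0, 1]),
    'OR':  ((1.0, 1.0, -1.0), [0, 1, 1, 1]),
    'NOT': ((1.0, -2.0, 0.0), [1, 0, 1, 0]),  # NOT of input 2
}

for name, ((w1, w2, b), expected) in gates.items():
    got = [perceptron(w1, w2, b, x1, x2) for x1, x2 in inputs]
    assert got == expected, name
    print(name, got)
```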

answered Nov 28 '25 by Jaison Francisco