SCXT 350

An exercise in perceptrons

Due:  Monday, 24 (in class)

This due date is selected so that I can have the results back to you in time for your review for Exam #3. 

In recent class discussions, we saw how to construct one- and two-input perceptrons that implement the logical functions NOT, AND, and OR.  Computer circuitry, however, is generally described in terms of NAND (NOT AND) and NOR (NOT OR).  With this in mind, find weights for a two-input perceptron (with bias) that implement each of the following functions:

Perceptron #1 (NAND):

X  Y  Bias  Out
0  0   1     1
0  1   1     1
1  0   1     1
1  1   1     0
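As a check on your answer, a perceptron with a step activation (output 1 when the weighted sum is positive, 0 otherwise) can be sketched in a few lines of Python.  The weights below (w0 = 3, w1 = -2, w2 = -2) are one possible choice for NAND, not the only one:

```python
# A single perceptron with a step activation: output 1 when the weighted
# sum is positive, 0 otherwise.  The threshold convention and the sample
# weights are assumptions; other choices also realize NAND.

def perceptron(w0, w1, w2, x, y, bias=1):
    """Return 1 if w0*bias + w1*x + w2*y > 0, else 0."""
    return 1 if w0 * bias + w1 * x + w2 * y > 0 else 0

def nand(x, y):
    # One possible set of NAND weights: bias weight 3, input weights -2 each.
    return perceptron(3, -2, -2, x, y)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "->", nand(x, y))
```

Running the loop reproduces the truth table above.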

Perceptron #2 (NOR):

X  Y  Bias  Out
0  0   1     1
0  1   1     0
1  0   1     0
1  1   1     0
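The same kind of check works for NOR.  Again, the weights below (w0 = 1, w1 = -2, w2 = -2) are only one possible answer:

```python
# Same step-activation perceptron as before; the NOR weights here are
# one possible choice (an assumption), not the unique solution.

def perceptron(w0, w1, w2, x, y, bias=1):
    """Return 1 if w0*bias + w1*x + w2*y > 0, else 0."""
    return 1 if w0 * bias + w1 * x + w2 * y > 0 else 0

def nor(x, y):
    # Sample NOR weights: bias weight 1, input weights -2 each.
    return perceptron(1, -2, -2, x, y)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "->", nor(x, y))
```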

For each perceptron, sketch the line

    w0 * bias + w1 * X + w2 * Y = 0

where w0 is the weight you have placed on the bias, w1 the weight on the input X, and w2 the weight on the input Y (X and Y correspond to our earlier x_1 and x_2).
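To see what the line looks like numerically, take one hypothetical set of NAND weights, w0 = 3, w1 = -2, w2 = -2 (an assumption; your own weights will give a different line).  Solving the equation for Y gives Y = -(w0 + w1*X)/w2, and each truth-table point falls on one side of that line or the other:

```python
# Hypothetical NAND weights used only for illustration.
w0, w1, w2 = 3, -2, -2

def boundary_y(x):
    """Y-coordinate of the line w0 + w1*X + w2*Y = 0 at a given X (w2 != 0)."""
    return -(w0 + w1 * x) / w2

def side(x, y):
    """+1 if (x, y) is on the positive side of the line, -1 negative, 0 on it."""
    s = w0 + w1 * x + w2 * y
    return (s > 0) - (s < 0)

print("Y-intercept:", boundary_y(0))          # the line at X = 0
print("(0,0) side:", side(0, 0))              # the three '1' outputs sit here
print("(1,1) side:", side(1, 1))              # the lone '0' output sits here
```

The point of the sketch is exactly this: the three inputs that output 1 lie on one side of the line, and the single input that outputs 0 lies on the other.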

Finally, consider the truth function for XOR:

X  Y  Bias  Out
0  0   1     0
0  1   1     1
1  0   1     1
1  1   1     0

This was the example that stalled research in perceptrons, and in neural networks generally, for some twenty years.  It cannot be done with a single perceptron, but there are several ways to do it with more than one.  Draw a picture of a network of perceptrons (with weights) which does the trick.  You will need at least two perceptrons, and should not need more than three.
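One common construction, sketched here in Python, uses three perceptrons: XOR(X, Y) = AND(OR(X, Y), NAND(X, Y)).  The step rule and the particular weights are assumptions; other networks of two or three perceptrons also work:

```python
# XOR built from three step-activation perceptrons:
# two "hidden" units (OR and NAND) feed an AND unit.
# All weights shown are one possible choice, not the only one.

def perceptron(w0, w1, w2, x, y, bias=1):
    """Return 1 if w0*bias + w1*x + w2*y > 0, else 0."""
    return 1 if w0 * bias + w1 * x + w2 * y > 0 else 0

def xor(x, y):
    h1 = perceptron(-1, 2, 2, x, y)      # OR unit
    h2 = perceptron(3, -2, -2, x, y)     # NAND unit
    return perceptron(-3, 2, 2, h1, h2)  # AND of the two hidden outputs

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "->", xor(x, y))
```

The intuition: OR is 1 everywhere except (0, 0), NAND is 1 everywhere except (1, 1), and their AND is 1 exactly on the two mixed inputs.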