CSci 431

Exercise set #5

Due: Wednesday, Dec. 8


Please write, in almost any language you like (languages such as MATLAB or Mathematica are excluded), a feed-forward network that uses backpropagation to solve the XOR problem. Use two nodes in the hidden layer, and stop when every output is within 0.1 of its expected value.

In my experience, this network sometimes converges to the inclusive OR instead of XOR. If that happens, you will need to start over with a new random set of weights. If a run does not converge within something like 9,000 epochs, it probably never will.

In the header documentation of your program, record the final results (how closely the outputs converged to the expected values) and how many training runs it took to get there.
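For reference, here is a minimal sketch of the kind of program intended, written in Python (one of many acceptable languages). The learning rate, weight-initialization range, and restart loop are my own choices for illustration, not requirements; it is a 2-2-1 sigmoid network trained with plain backpropagation, stopping when all four outputs are within 0.1 of their targets.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR training data: ((input1, input2), target)
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(wh, wo, x1, x2):
    """Forward pass: two sigmoid hidden nodes, one sigmoid output.
    Each weight vector's last entry is a bias."""
    h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in wh]
    return h, sigmoid(wo[0] * h[0] + wo[1] * h[1] + wo[2])

def train_once(lr=0.5, max_epochs=9000):
    """One training run from random weights.
    Returns (outputs, epochs) on convergence, or None if it fails
    (e.g., stuck at inclusive OR) so the caller can restart."""
    wh = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    wo = [random.uniform(-1, 1) for _ in range(3)]
    for epoch in range(1, max_epochs + 1):
        for (x1, x2), t in XOR:
            h, o = forward(wh, wo, x1, x2)
            # Backpropagate: output delta, then hidden deltas.
            do = (t - o) * o * (1 - o)
            dh = [do * wo[j] * h[j] * (1 - h[j]) for j in range(2)]
            # Gradient-descent weight updates.
            for j in range(2):
                wo[j] += lr * do * h[j]
                wh[j][0] += lr * dh[j] * x1
                wh[j][1] += lr * dh[j] * x2
                wh[j][2] += lr * dh[j]
            wo[2] += lr * do
        # Stop when every output is within 0.1 of its target.
        outs = [forward(wh, wo, x1, x2)[1] for (x1, x2), _ in XOR]
        if all(abs(o - t) < 0.1 for o, (_, t) in zip(outs, XOR)):
            return outs, epoch
    return None

# Restart with fresh random weights until a run converges; record
# the final outputs and the number of runs, as the header requires.
tries = 0
result = None
while result is None:
    tries += 1
    result = train_once()
outputs, epochs = result
print("Converged in %d epochs after %d run(s): %s" % (epochs, tries, outputs))
```

The restart loop mirrors the advice above: a run that has not converged by roughly 9,000 epochs is abandoned and a new random weight set is drawn.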


Any questions? Please ask!