Computer experiments play a critical role in the study of neural networks. Many topics covered in this class can be fully understood only through computer simulation. Programming assignments help you gain first-hand experience with the algorithms introduced in the class. You may use any computer language for the implementation, but NOT a neural network toolbox.
Implement a two-layer perceptron with the backpropagation algorithm to solve the parity problem. The desired output for the parity problem is 1 if an input pattern contains an odd number of 1's and 0 otherwise. Follow the algorithm introduced in class and consult the textbook. Use a network with 4 binary input elements, 4 hidden units in the first layer, and one output unit in the second layer. Stop the learning procedure when the absolute error (the difference between desired and actual output) is at most 0.05 for every input pattern. Other implementation details are:
Initialize all weights and biases to random numbers between -1 and 1.
Use a logistic sigmoid with a = 1 as the activation function for all units.
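The network and training loop described above might be sketched as follows. This is only an illustrative outline, not a reference solution: the class name, seeding, and online (pattern-by-pattern) update order are my assumptions, and you should follow the algorithm exactly as presented in class.

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid with slope parameter a = 1.
    return 1.0 / (1.0 + np.exp(-x))

class TwoLayerPerceptron:
    """Illustrative sketch: 4 inputs -> 4 hidden units -> 1 output,
    trained by online backpropagation. Names are hypothetical."""
    def __init__(self, n_in=4, n_hid=4, seed=0):
        rng = np.random.default_rng(seed)
        # Weights and biases initialized uniformly in [-1, 1].
        self.W1 = rng.uniform(-1, 1, (n_hid, n_in))
        self.b1 = rng.uniform(-1, 1, n_hid)
        self.W2 = rng.uniform(-1, 1, n_hid)
        self.b2 = rng.uniform(-1, 1)

    def forward(self, x):
        self.h = sigmoid(self.W1 @ x + self.b1)       # hidden activations
        self.y = sigmoid(self.W2 @ self.h + self.b2)  # output activation
        return self.y

    def backward(self, x, d, eta):
        # Output delta for squared error E = 0.5*(y - d)^2 with a sigmoid:
        # delta_o = (y - d) * y * (1 - y)
        delta_o = (self.y - d) * self.y * (1.0 - self.y)
        # Hidden deltas, computed BEFORE W2 is changed.
        delta_h = delta_o * self.W2 * self.h * (1.0 - self.h)
        self.W2 -= eta * delta_o * self.h
        self.b2 -= eta * delta_o
        self.W1 -= eta * np.outer(delta_h, x)
        self.b1 -= eta * delta_h

def parity_data(n=4):
    # All 2^n binary patterns; target is 1 iff the pattern has an odd
    # number of 1's.
    xs = np.array([[(i >> b) & 1 for b in range(n)]
                   for i in range(2 ** n)], dtype=float)
    ds = xs.sum(axis=1) % 2
    return xs, ds
```

A possible training loop then presents every pattern once per epoch and stops when all 16 absolute errors fall below 0.05 (an epoch cap is a sensible safeguard against non-convergent runs for some weight initializations).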
After programming is done, do the following:
Vary the value of η from 0.05 to 0.5 in increments of 0.05, and report the number of epochs needed to converge for each choice of η.
Include a momentum term in the weight update with α = 0.9 and report its effect on the speed of training for each value of η.
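The momentum modification reuses a fraction α of the previous weight change: Δw(t) = -η∇E + αΔw(t-1). A minimal sketch of one such step, assuming you keep the previous update alongside each weight array (the function name is hypothetical):

```python
import numpy as np

def momentum_step(w, grad, dw_prev, eta, alpha=0.9):
    """One momentum-augmented gradient step:
    dw(t) = -eta * grad + alpha * dw(t-1); returns (new_w, dw)."""
    dw = -eta * grad + alpha * dw_prev
    return w + dw, dw

# In the training loop, dw_prev starts at zero and is carried between
# pattern presentations for each weight array (W1, b1, W2, b2).
```

With α = 0.9 the accumulated step can be much larger than a single gradient step in directions where the gradient is consistent, which is what typically speeds up convergence.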
What you need to turn in: (1) a 1-2 page summary report as a PDF document; (2) test results of your implementation as an appendix to your report; and (3) your source program. Package all of these in a ZIP file and upload it to the dropbox on the Carmen website.