Lab 6: Artificial Neural Networks III Solution




Problem Description




In C++, implement the multi-layered Artificial Neural Network shown in figure 1, using Sigmoid activation functions for the hidden layer and output nodes.




Given the inputs, x1 = 0, x2 = 1, target outputs t1 = 1, t2 = 0 (for output nodes: y1, y2, respectively), and connection weight values: v11, v12, v21, v22, w11, w12, w21, w22 (shown in figure 1), use the Back-Propagation algorithm (chapter 4 [Mitchell, 1997]) to do one forward pass and one backward pass, and calculate the following:




Hidden node outputs (activations) in first forward pass.

Outputs (y1, y2) in first forward pass.




Error for each output node after first forward pass.




New weights for layer 2 connections (hidden to output node weights: w11, w12, w21, w22) in first backward pass.




Hidden node errors in first backward pass.




New weights for layer 1 connections (input to hidden node weights: v11, v12, v21, v22) in first backward pass.




In a ZIP file, place your source code, makefile, and a text file containing answers to the above node output, error, and weight calculations.




Upload the ZIP file to Vula before 10.00 AM, Friday 21 September.

Figure 1: Multi-layered Artificial Neural Network with Sigmoid activation function for hidden and output nodes. Initial connection weights (v11, v12, v21, v22, w11, w12, w21, w22) and input values (x1 = 0, x2 = 1) are shown.













References




[Mitchell, 1997] Mitchell, T. (1997). Machine Learning. McGraw Hill, New York, USA.


