Lab 1: Back-propagation


Lab Objective:

In this lab, you will need to understand and implement a simple neural network with a forward pass and backpropagation, using two hidden layers. Notice that you can only use NumPy and the Python standard library; any other frameworks (e.g., TensorFlow, PyTorch) are not allowed in this lab.

Figure 1. Two-layer neural network

Important Dates:

    1. Experiment Report Submission Deadline: 3/30 (Tue) 12:00 p.m.

    2. Demo date: 3/30 (Thu)



Turn in:

    1. Experiment Report (.pdf)

    2. Source code

Notice: zip all files into one archive and name it like 「DLP_LAB1_studentID_name.zip」, e.g., 「DLP_LAB1_309551009_陳璽存.zip」


Requirements:

    1. Implement a simple neural network with two hidden layers.

    2. You must use backpropagation in this neural network, and you may only use NumPy and the Python standard library in your implementation.

    3. Plot a comparison figure that shows the predicted results and the ground truth.

Implementation Details:

Figure 2. Forward pass

    • In Figure 2, we use the following definitions for the notations:

        1. $x_1, x_2$: the inputs of the network
        2. $X$: the input vector $[x_1, x_2]$
        3. $y$: the ground-truth label
        4. $\hat{y}$: the prediction of the network
        5. $L(\theta)$: the loss function
        6. $W_1, W_2, W_3$: the weight matrices of the network

    • Here are the computations represented:

        $z_1 = \sigma(X W_1)$,  $z_2 = \sigma(z_1 W_2)$,  $\hat{y} = \sigma(z_2 W_3)$

    • In the equations, $\sigma$ is the sigmoid function, which refers to the special case of the logistic function and is defined by the formula:

        $\sigma(x) = \frac{1}{1 + e^{-x}}$


    • Input / Test:

The inputs are of two kinds, shown below.

You need to use the following generate functions to create your inputs x, y.
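
A minimal sketch of what such generate functions typically look like is given below; the names generate_linear and generate_XOR_easy and the exact point layouts are assumptions for illustration, not the course's official code:

    import numpy as np

    def generate_linear(n=100):
        # hypothetical linearly separable data: the label depends on which
        # side of the diagonal y = x each random point falls on
        pts = np.random.uniform(0, 1, (n, 2))
        labels = (pts[:, 0] > pts[:, 1]).astype(int)
        return pts, labels.reshape(n, 1)

    def generate_XOR_easy():
        # hypothetical XOR-patterned data: class 0 along one diagonal of the
        # unit square, class 1 along the other
        inputs, labels = [], []
        for i in range(11):
            inputs.append([0.1 * i, 0.1 * i])
            labels.append(0)
            if i == 5:  # skip the center point, which lies on both diagonals
                continue
            inputs.append([0.1 * i, 1 - 0.1 * i])
            labels.append(1)
        return np.array(inputs), np.array(labels).reshape(-1, 1)
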
Function usage
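
For example, assuming the generator sketch above, the calls would look like:

    x, y = generate_linear(n=100)  # linearly separable case
    x, y = generate_XOR_easy()     # XOR case
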
During training, you need to print the loss values; during testing, you need to show your predictions.

Visualize the predictions and the ground truth at the end of the training process. The comparison figure should look like the one produced by the visualization code below.

You can refer to the following visualization code:

    x: inputs (2-dimensional array)

    y: ground-truth labels (1-dimensional array)

    pred_y: outputs of the neural network (1-dimensional array)
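
A minimal sketch of such a visualization function, assuming matplotlib is available for plotting (the NumPy-only restriction in the requirements applies to the network itself):

    import numpy as np
    import matplotlib.pyplot as plt

    def show_result(x, y, pred_y):
        # flatten the labels so both (n,) and (n, 1) shapes are accepted
        y, pred_y = np.asarray(y).ravel(), np.asarray(pred_y).ravel()
        plt.subplot(1, 2, 1)
        plt.title('Ground truth', fontsize=18)
        for i in range(x.shape[0]):
            # red dot = class 0, blue dot = class 1
            plt.plot(x[i][0], x[i][1], 'ro' if y[i] == 0 else 'bo')
        plt.subplot(1, 2, 2)
        plt.title('Predict result', fontsize=18)
        for i in range(x.shape[0]):
            plt.plot(x[i][0], x[i][1], 'ro' if pred_y[i] == 0 else 'bo')
        plt.show()
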
    • Sigmoid functions:

        1. A sigmoid function is a mathematical function having a characteristic "S"-shaped curve or sigmoid curve. It is a bounded, differentiable, real function that is defined for all real input values and has a non-negative derivative at each point. In general, a sigmoid function is monotonic, and has a first derivative which is bell shaped.

        2. (hint) You may write the function like this:
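
For example, a standard NumPy formulation (a sketch, not the handout's exact code):

    import numpy as np

    def sigmoid(x):
        # elementwise 1 / (1 + e^(-x))
        return 1.0 / (1.0 + np.exp(-x))
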
        3. (hint) The derivative of the sigmoid function:
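
A common trick is to express the derivative through the sigmoid's own output, since $\sigma'(x) = \sigma(x)(1 - \sigma(x))$; for example:

    def derivative_sigmoid(s):
        # s is assumed to already be a sigmoid output, i.e. s = sigmoid(x)
        return np.multiply(s, 1.0 - s)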

    • Back Propagation (Gradient computation)

Backpropagation is a method used in artificial neural networks to calculate the gradients that are needed to update the weights of the network. It is a generalization of the delta rule to multi-layered feedforward networks, made possible by using the chain rule to iteratively compute gradients for each layer. The backpropagation learning algorithm can be divided into two parts: propagation and weight update.

Part 1: Propagation

Each propagation involves the following steps:

    1. Propagation forward through the network to generate the output value
    2. Calculation of the cost L(θ) (error term)

    3. Propagation of the output activations back through the network, using the training pattern's target, in order to generate the deltas (the differences between the target and actual output values) of all output and hidden neurons.



Part 2: Weight update

For each weight-synapse, follow the steps below:

    1. Multiply its output delta and input activation to get the gradient of the weight.

    2. Subtract a ratio (percentage) of the gradient from the weight.

    3. This ratio (percentage) influences the speed and quality of learning; it is called the learning rate. The greater the ratio, the faster the neuron trains; the lower the ratio, the more accurate the training is. The sign of a weight's gradient indicates where the error is increasing, which is why the weight must be updated in the opposite direction.


Repeat parts 1 and 2 until the performance of the network is satisfactory. The equations below make both parts concrete for this lab's network.
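
Assuming a squared-error loss $L(\theta) = (\hat{y} - y)^2$ and the sigmoid activations from the forward pass, the chain rule gives the following deltas and gradients (a sketch; $\odot$ denotes element-wise multiplication and $\eta$ the learning rate):

    $\delta_3 = 2(\hat{y} - y) \odot \hat{y} \odot (1 - \hat{y})$, $\qquad \partial L / \partial W_3 = z_2^{\top} \delta_3$

    $\delta_2 = (\delta_3 W_3^{\top}) \odot z_2 \odot (1 - z_2)$, $\qquad \partial L / \partial W_2 = z_1^{\top} \delta_2$

    $\delta_1 = (\delta_2 W_2^{\top}) \odot z_1 \odot (1 - z_1)$, $\qquad \partial L / \partial W_1 = X^{\top} \delta_1$

    $W_i \leftarrow W_i - \eta \, \partial L / \partial W_i \quad$ for $i = 1, 2, 3$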


Pseudocode:
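
A minimal runnable sketch following the equations above, assuming a squared-error loss, sigmoid activations, and the helper functions sketched earlier (sigmoid, derivative_sigmoid, generate_XOR_easy, show_result); it is one possible implementation, not the handout's reference code:

    import numpy as np

    class SimpleNet:
        """Two-hidden-layer network matching the forward pass in Figure 2."""
        def __init__(self, in_dim=2, h1=8, h2=8, lr=0.1):
            self.lr = lr
            self.W1 = np.random.randn(in_dim, h1)  # weights of hidden layer 1
            self.W2 = np.random.randn(h1, h2)      # weights of hidden layer 2
            self.W3 = np.random.randn(h2, 1)       # weights of the output layer

        def forward(self, X):
            # cache each layer's output for the backward pass
            self.X = X
            self.z1 = sigmoid(X @ self.W1)
            self.z2 = sigmoid(self.z1 @ self.W2)
            self.y_hat = sigmoid(self.z2 @ self.W3)
            return self.y_hat

        def backward(self, y):
            # deltas from the chain rule (see the gradient equations above)
            d3 = 2 * (self.y_hat - y) * derivative_sigmoid(self.y_hat)
            d2 = (d3 @ self.W3.T) * derivative_sigmoid(self.z2)
            d1 = (d2 @ self.W2.T) * derivative_sigmoid(self.z1)
            # weight update: subtract a ratio (the learning rate) of each gradient
            self.W3 -= self.lr * (self.z2.T @ d3)
            self.W2 -= self.lr * (self.z1.T @ d2)
            self.W1 -= self.lr * (self.X.T @ d1)

    def train(net, X, y, epochs=10000):
        for epoch in range(epochs):
            pred = net.forward(X)
            loss = np.mean((pred - y) ** 2)  # print the loss during training
            net.backward(y)
            if epoch % 500 == 0:
                print(f'epoch {epoch:5d} loss : {loss:.6f}')
        return (net.forward(X) > 0.5).astype(int)  # threshold to class labels

    # example run, under the same assumptions
    x, y = generate_XOR_easy()
    pred_y = train(SimpleNet(lr=0.5), x, y)
    show_result(x, y, pred_y)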

Report Spec

    1. Introduction (20%)

    2. Experiment setups (30%):

        A. Sigmoid functions

        B. Neural network

        C. Backpropagation

    3. Results of your testing (20%)

        A. Screenshot and comparison figure

        B. Show the accuracy of your prediction

        C. Learning curve (loss vs. epoch)

        D. Anything you want to present

    4. Discussion (30%)

        A. Try different learning rates

        B. Try different numbers of hidden units

        C. Try without activation functions

        D. Anything you want to share

    5. Extra (10%)

        A. Implement different optimizers. (2%)

        B. Implement different activation functions. (3%)

        C. Implement convolutional layers. (5%)



Score:

60% demo score (experimental results & questions) + 40% report. For the experimental results, you have to achieve at least 90% accuracy to get the demo score.

If the zip file name or the report violates the format spec, 5 points will be deducted (-5).


Reference:

    1. Logistic regression: http://www.bogotobogo.com/python/scikit-learn/logistic_regression.php

    2. Python tutorial: https://docs.python.org/3/tutorial/

    3. NumPy tutorial: https://www.tutorialspoint.com/numpy/index.htm

    4. Python Standard Library: https://docs.python.org/3/library/index.html

    5. Backpropagation lecture slides (NTU ML 2016): http://speech.ee.ntu.edu.tw/~tlkagk/courses/ML_2016/Lecture/BP.pdf

    6. Sigmoid function (Wikipedia): https://en.wikipedia.org/wiki/Sigmoid_function

    7. Backpropagation (Wikipedia): https://en.wikipedia.org/wiki/Backpropagation
