1. [2 points] Consider a single-input neuron
The input to the neuron is 3.0, its weight is 2.3, and its bias is -3.0.
a) What is the net input to the transfer function, tot?
b) Using an activation function of your choice, determine the output of the neuron.
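A minimal numerical sketch for parts (a) and (b), assuming the standard single-input neuron model tot = w*x + b and picking the logistic sigmoid as the activation (the variable names are illustrative, not part of the assignment):

```python
import math

# Single-input neuron: net input tot = w*x + b (standard model assumed)
x, w, b = 3.0, 2.3, -3.0
tot = w * x + b                      # 2.3 * 3.0 + (-3.0) = 3.9

# Part (b): logistic sigmoid chosen as the activation function
y = 1.0 / (1.0 + math.exp(-tot))
print(f"tot = {tot}, output y = {y:.4f}")   # tot = 3.9, y ~ 0.9802
```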
2. [3 points] Consider two single-neuron perceptrons with the same weight and bias values
The first perceptron uses the unipolar hardlimit function, hlu, and the second uses the bipolar hardlimit function, hlb. If the networks are given the same input x and updated with the perceptron learning rule, will their weights continue to have the same values?
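A sketch of how one might check this empirically, assuming the standard perceptron learning rule (e = t - y, w <- w + e*x, b <- b + e) with targets in each function's own output range; the starting values and the helper names are assumptions for illustration only:

```python
import numpy as np

def hlu(n):            # unipolar hardlimit: outputs 0 or 1
    return 1.0 if n >= 0 else 0.0

def hlb(n):            # bipolar hardlimit: outputs -1 or +1
    return 1.0 if n >= 0 else -1.0

def perceptron_update(w, b, x, t, f):
    """One step of the perceptron learning rule: e = t - y, w += e*x, b += e."""
    y = f(np.dot(w, x) + b)
    e = t - y
    return w + e * x, b + e

# Same starting weights/bias and the same input for both perceptrons
w0, b0 = np.array([0.5]), 0.5
x = np.array([1.0])

# The target class expressed in each function's range: "0" for hlu, "-1" for hlb
w_u, b_u = perceptron_update(w0, b0, x, t=0.0,  f=hlu)
w_b, b_b = perceptron_update(w0, b0, x, t=-1.0, f=hlb)
print("unipolar:", w_u, b_u)   # error magnitude is at most 1
print("bipolar: ", w_b, b_b)   # error magnitude can be 2
```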
3. [5 points] Consider two types of activation functions
Logistic sigmoid y = 1 / (1 + e^(-tot)) (covered in class), and Elliott y = tot / (1 + |tot|) (new in this assignment).
a) Determine the derivatives of these functions,
b) Plot graphs of the functions and their derivatives,
c) Compare the functions and describe your observations.
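A plotting sketch for parts (a) and (b), assuming NumPy and Matplotlib are available; the closed-form derivatives used below follow from the quotient and chain rules (sigmoid: y(1 - y); Elliott: 1 / (1 + |tot|)^2):

```python
import numpy as np
import matplotlib.pyplot as plt

tot = np.linspace(-6, 6, 500)

# Logistic sigmoid and its derivative y' = y * (1 - y)
sig = 1.0 / (1.0 + np.exp(-tot))
dsig = sig * (1.0 - sig)

# Elliott function and its derivative y' = 1 / (1 + |tot|)^2
ell = tot / (1.0 + np.abs(tot))
dell = 1.0 / (1.0 + np.abs(tot)) ** 2

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(tot, sig, label="logistic sigmoid")
ax1.plot(tot, ell, label="Elliott")
ax1.set_title("Activation functions")
ax1.set_xlabel("tot")
ax1.legend()

ax2.plot(tot, dsig, label="sigmoid derivative")
ax2.plot(tot, dell, label="Elliott derivative")
ax2.set_title("Derivatives")
ax2.set_xlabel("tot")
ax2.legend()

plt.tight_layout()
plt.show()
```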