Homework 2 (Week 4) Solution


Note: for a dataset, classification accuracy is defined as number of correctly classified data points divided by total number of data points.


    1. In this 3-class problem, you will use the one vs. rest method for multiclass classification. Let the discriminant functions be:
g1(x) = −x1 − x2 + 5
g2(x) = x1 − 3
g3(x) = −x1 + x2 − 1

In this problem, use the OvR decision rule given in lecture.

Draw the decision boundaries and label the decision regions Γi and any indeterminate regions.

Classify the points x = (2, 4), (4, 3), and (1, 2). If there are indeterminate regions, show that a point in (one of) the region(s) doesn’t get classified according to the OvR decision rule. If there is no indeterminate region, so state.
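As a quick numerical sketch, the three discriminants can be evaluated at the query points. The exact OvR decision rule comes from lecture; the version below (assign class i when g_i(x) is the only strictly positive discriminant, otherwise indeterminate) is an assumption:

```python
# Sketch: evaluate the three OvR discriminants at the query points.
# Assumed rule: assign class i iff g_i(x) is the only discriminant > 0;
# otherwise the point falls in an indeterminate region.

def g1(x1, x2):
    return -x1 - x2 + 5

def g2(x1, x2):
    return x1 - 3

def g3(x1, x2):
    return -x1 + x2 - 1

def ovr_classify(x1, x2):
    scores = [g1(x1, x2), g2(x1, x2), g3(x1, x2)]
    positive = [i for i, s in enumerate(scores) if s > 0]
    if len(positive) == 1:
        return positive[0] + 1   # classes are numbered 1..3
    return None                  # indeterminate under this assumed rule

for point in [(2, 4), (4, 3), (1, 2)]:
    print(point, "->", ovr_classify(*point))
```

Under this rule each of the three given points has exactly one positive discriminant, so none of them lands in an indeterminate region.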


    2. For the wine dataset (from Homework 1 data files), code up a nearest-means classifier with the following multiclass approach: one vs. one. Use the original unnormalized data given with Homework 1, and use the decision rule given in lecture. Note that the class means should always be defined by the training data. Run the classifier using only the following two features: 1 and 2.

Note that the same guidelines as Homework 1 apply to coding the classifier(s) yourself vs. using available packages or routines, with one possible exception*.

Give the following:

        (a) Classification accuracy on training set and on test set.

        (b) Plots showing each resulting 2-class decision boundary and regions (Si vs. Sj).
        (c) A plot showing the final decision boundaries and regions (Γ1, Γ2, Γ3, indeterminate if any). Please note that decision boundaries (which have area = 0) don’t count as indeterminate regions.

Hint 1: For (b) and (c), you can use PlotDecBoundaries(). Modify it if necessary.

Hint 2: *If using Python, you may optionally use scipy.spatial.distance.cdist in calculating Euclidean distance between matrix elements.
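The wine data files themselves are not included here, so the sketch below runs on a tiny synthetic dataset. The data loading, feature selection (columns 1 and 2), and tie-breaking rule are all assumptions; the lecture's OvO decision rule should take precedence:

```python
import numpy as np

def ovo_nearest_means(X_train, y_train, X_test):
    """Nearest-means classifier with one-vs-one voting (a sketch).

    Each pair of classes (a, b) votes for whichever class mean is
    closer (Euclidean) to the test point; the class with the most
    votes wins. Ties are broken toward the smaller class label,
    which is an assumption -- the lecture's rule may differ.
    Class means are computed from the training data only.
    """
    classes = np.unique(y_train)
    means = {c: X_train[y_train == c].mean(axis=0) for c in classes}
    preds = []
    for x in X_test:
        votes = dict.fromkeys(classes, 0)
        for i, a in enumerate(classes):
            for b in classes[i + 1:]:
                winner = a if (np.linalg.norm(x - means[a])
                               <= np.linalg.norm(x - means[b])) else b
                votes[winner] += 1
        preds.append(min(classes, key=lambda c: (-votes[c], c)))
    return np.array(preds)

# Tiny synthetic check with three well-separated clusters:
X = np.array([[0., 0.], [0., 1.], [10., 0.], [10., 1.], [0., 10.], [1., 10.]])
y = np.array([1, 1, 2, 2, 3, 3])
X_query = np.array([[0., 0.5], [10., 0.5], [0.5, 10.]])
print(ovo_nearest_means(X, y, X_query))
# Accuracy on a labeled set would then be np.mean(preds == labels).
```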




    3. (a) Derive an expression for the discriminant function g(x) for a 2-class nearest-means classifier, based on Euclidean distance, for class means µ1 and µ2. Keep the number of dimensions variable. Express in simplest form.1 Is the classifier linear2?

Hints: 1Remember that the expression for g(x) is not unique; choose an expression that has a simple form. What matters is how g(x) compares to 0.

2You can check your answer by comparing with a plot of the decision boundary.
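One way the derivation in (a) might go (a sketch, with one choice of sign convention: decide class 1 when g(x) > 0):

```latex
g(\mathbf{x}) = \|\mathbf{x}-\boldsymbol{\mu}_2\|^2 - \|\mathbf{x}-\boldsymbol{\mu}_1\|^2
             = 2(\boldsymbol{\mu}_1-\boldsymbol{\mu}_2)^T\mathbf{x}
               + \|\boldsymbol{\mu}_2\|^2 - \|\boldsymbol{\mu}_1\|^2
```

The quadratic term xTx cancels, so g(x) is affine in x, which is what makes the "is the classifier linear?" question answerable by inspection.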

    (b) Continuing from part (a), for the following class means:

µ1 = (0, −2)T , µ2 = (0, 1)T
Plot the decision boundaries and label the decision regions.
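Reading the means in (b) as µ1 = (0, −2)T and µ2 = (0, 1)T, the boundary can be sanity-checked numerically before plotting (a sketch; the sign convention g > 0 for class 1 is an assumption):

```python
import numpy as np

mu1 = np.array([0., -2.])
mu2 = np.array([0., 1.])

def g(x):
    # g(x) = ||x - mu2||^2 - ||x - mu1||^2  (positive -> nearer to mu1)
    return np.sum((x - mu2) ** 2) - np.sum((x - mu1) ** 2)

# Points equidistant from the two means satisfy g(x) = 0; here that
# should be the horizontal line x2 = -1/2 (midway between -2 and 1).
for x1 in [-3.0, 0.0, 4.0]:
    print(x1, g(np.array([x1, -0.5])))
```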

    (c) Repeat part (a) except for a 3-class classifier, using the maximal value method (MVM): find the three discriminant functions g1 (x), g2 (x), g3 (x) , given three class means µ1, µ2 , and µ3 . Express in simplest form. Is the classifier linear?

        (d) Continuing from part (c) using MVM, for the following class means:

µ1 = (0, −2)T , µ2 = (0, 1)T , µ3 = (2, 0)T
Plot the decision boundaries and label the decision regions.
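Reading the means in (d) as µ1 = (0, −2)T, µ2 = (0, 1)T, µ3 = (2, 0)T: under MVM the predicted class is argmax of the g_i, which for nearest means reduces to choosing the closest mean. A sketch for spot-checking points before plotting (tie behavior on boundaries is an assumption):

```python
import numpy as np

means = {1: np.array([0., -2.]),
         2: np.array([0., 1.]),
         3: np.array([2., 0.])}

def mvm_classify(x):
    # Maximal value method: largest g_i(x); for nearest-means
    # discriminants this is the same as the closest class mean.
    # Ties go to the smaller label here (an assumption).
    return min(means, key=lambda c: (np.linalg.norm(x - means[c]), c))

for p in [(0., -2.), (0., 1.), (3., 0.)]:
    print(p, "->", mvm_classify(np.array(p)))
```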

        (e) Repeat part (a) except for a 3-class classifier, using the one vs. one (OvO) method: find the three discriminant functions g1,2(x), g1,3(x), g2,3(x), given three class means µ1, µ2, and µ3. Express in simplest form. Is the classifier linear?

    4. (a) Let p( x ) be a scalar function of a D-dimensional vector x , and f ( p) be a scalar function of p. Prove that:
∇x f(p(x)) = (df(p)/dp) ∇x p(x)



i.e., prove that the chain rule applies in this way. [Hint: you can show it for the ith component of the gradient vector, for any i. It can be done in a couple lines.]
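A sketch of the componentwise argument the hint suggests:

```latex
\left[\nabla_{\mathbf{x}} f(p(\mathbf{x}))\right]_i
  = \frac{\partial}{\partial x_i} f(p(\mathbf{x}))
  = \frac{df}{dp}\,\frac{\partial p(\mathbf{x})}{\partial x_i}
  = \frac{df}{dp}\,\left[\nabla_{\mathbf{x}} p(\mathbf{x})\right]_i
```

The middle step is the ordinary scalar chain rule; since the identity holds for every component i, the vector identity follows.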

(b)    Use relation (4) of “expressions” in Discussion 2 to find ∇x (xT x).

    (c) Prove your result of ∇x (xT x ) in part (b) by, instead, writing out the components.

    (d) Use (a) and (b) to find ∇x  (xT x)3    in terms of x .
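Combining (a) with p = xTx and f(p) = p3 suggests the candidate answer ∇x (xTx)3 = 3(xTx)2 · 2x = 6(xTx)2 x. A finite-difference check of that candidate (a sketch; the closed form is the thing being verified, not given in the text):

```python
import numpy as np

def numerical_gradient(func, x, eps=1e-6):
    # Central finite differences, one coordinate at a time.
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        grad[i] = (func(x + e) - func(x - e)) / (2 * eps)
    return grad

f = lambda x: float(x @ x) ** 3
x = np.array([1.0, -2.0, 0.5])

analytic = 6.0 * (x @ x) ** 2 * x        # candidate closed form
numeric = numerical_gradient(f, x)
print(np.allclose(analytic, numeric, rtol=1e-4))
```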


5.    (a) Use the relations above to find ∇w ‖w‖2. Express your answer in terms of ‖w‖2 where possible. Hint: let p = wT w; what is f ?
    (b) Find: ∇w ‖Mw − b‖2. Express your result in simplest form. Hint: first choose p (remember it must be a scalar).
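Reading the norm in (a) as the Euclidean norm ‖w‖2, the chain rule with p = wT w and f(p) = √p suggests ∇w ‖w‖2 = w / ‖w‖2. A finite-difference check of that candidate (a sketch under that reading):

```python
import numpy as np

def num_grad(func, w, eps=1e-6):
    # Central finite differences, one coordinate at a time.
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (func(w + e) - func(w - e)) / (2 * eps)
    return g

w = np.array([3.0, -4.0])
analytic = w / np.linalg.norm(w)         # candidate answer: w / ||w||_2
numeric = num_grad(np.linalg.norm, w)
print(np.allclose(analytic, numeric))
```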
























































