Homework #1 Solution

1. [4 points] Intro to Machine Learning

Consider the task of classifying an image as one of a set of objects. Suppose we use a convolutional neural network to do so (you will learn what this is later in the semester).

(a) For this setup, what is the data (often referred to as x^(i))?

Your answer:

(b) For this setup, what is the label (often referred to as y^(i))?

Your answer:

(c) For this setup, what is the model?

Your answer:

(d) What is the distinction between inference and learning for this task?

Your answer:

2. [8 points] K-Nearest Neighbors

K-Nearest Neighbors is an extension of the Nearest-Neighbor classification algorithm. Given a set of points with assigned labels, a new point is classified by considering the K points closest to it (according to some metric) and selecting the most common label among those points. One common metric to use for KNN is the squared Euclidean distance, i.e.

 



d(x^(1), x^(2)) = ||x^(1) − x^(2)||_2^2                                        (1)

 

For this problem, consider the following set of points in R^2, each of which is assigned a label y ∈ {1, 2}:

 

 

  x1     x2     y
  1.0    1.0    2
  0.4    5.2    1
 −2.8   −1.1    2
  3.2    1.4    1
 −1.3    3.2    1
 −3.0    3.1    2
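As a worked illustration of the rule described above, here is a minimal sketch of a KNN classifier under the squared Euclidean distance of Eq. (1), using the training points from the table. (This is an illustrative sketch, not the official course solution; the function name `knn_classify` is made up for this example.)

```python
from collections import Counter

def knn_classify(train, query, k):
    """Classify `query` by majority vote among its k nearest training
    points under squared Euclidean distance.
    `train` is a list of ((x1, x2), label) pairs."""
    # Squared Euclidean distance d(a, b) = ||a - b||_2^2, as in Eq. (1);
    # since it is monotone in the true distance, the square root is unneeded.
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), y)
        for x, y in train
    )
    # Majority label among the k closest points.
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Training set from the table above.
train = [((1.0, 1.0), 2), ((0.4, 5.2), 1), ((-2.8, -1.1), 2),
         ((3.2, 1.4), 1), ((-1.3, 3.2), 1), ((-3.0, 3.1), 2)]
```

Calling `knn_classify(train, point, 1)` applies the Nearest-Neighbor rule of part (a), and `k=3` applies the 3-Nearest Neighbor rule of part (b).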

 

(a) Classify each of the following points using the Nearest-Neighbor rule (i.e., K = 1) with the squared Euclidean distance metric.

  x1     x2     y
 −2.6    6.6    ?
  1.4    1.6    ?
 −2.5    1.2    ?
 
 
Your answer:

(b) Classify each of the following points using the 3-Nearest Neighbor rule with the squared Euclidean distance metric.

  x1     x2     y
 −2.6    6.6    ?
  1.4    1.6    ?
 −2.5    1.2    ?
 
 
Your answer:

(c) Given a dataset containing n points, what is the outcome of classifying any additional point using the n-Nearest Neighbors algorithm?

Your answer:

(d) How many parameters are learned when applying K-nearest neighbors?

Your answer:
