Problem 1: Principal Component Analysis (100%)
Principal component analysis (PCA) is a dimensionality-reduction technique that linearly maps data onto a lower-dimensional space such that the variance of the projected data along the retained dimensions is maximized. In this problem, you will perform PCA on a dataset of face images.
The folder p1_data contains face images of 40 different subjects (classes), with 10 grayscale images per subject, each of size (56, 46) pixels. Note that i_j.png is the j-th image of the i-th person, which is denoted as person{i}image{j} for simplicity (so 2_1.png is person2image1).
First, split the dataset into two subsets (i.e., training and testing sets). The first subset contains the first 9 images of each subject, while the second subset contains the remaining image of each subject. Thus, a total of 9 × 40 = 360 images are in the training set, and 1 × 40 = 40 images are in the testing set.
In this problem, you will compute the eigenfaces of the training set and project face images from both the training and testing sets onto the same reduced-dimensional feature space.
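As a starting point, here is a minimal loading-and-splitting sketch. It assumes the folder layout and i_j.png naming described above; NumPy and Pillow are one possible choice of tools (any equivalent works), and the helper name load_split is purely illustrative.

    import numpy as np
    from PIL import Image

    H, W = 56, 46                        # image size given in the problem
    N_SUBJECTS, N_PER_SUBJECT = 40, 10

    def load_split(root="p1_data"):
        """Load all images, flatten them, and split into train/test as specified."""
        train_x, train_y, test_x, test_y = [], [], [], []
        for i in range(1, N_SUBJECTS + 1):
            for j in range(1, N_PER_SUBJECT + 1):
                img = np.asarray(Image.open(f"{root}/{i}_{j}.png"), dtype=np.float64)
                vec = img.reshape(-1)            # 56 * 46 = 2576-dimensional vector
                if j <= 9:                       # first 9 images -> training set
                    train_x.append(vec); train_y.append(i)
                else:                            # remaining image -> testing set
                    test_x.append(vec); test_y.append(i)
        return np.stack(train_x), np.array(train_y), np.stack(test_x), np.array(test_y)

    train_x, train_y, test_x, test_y = load_split()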
1. (20%) Perform PCA on the training set. Plot the mean face and the first four eigenfaces. (A code sketch for this step is given after the problem list.)
2. (20%) If the last digit of your student ID number is odd, take person2image1; if it is even, take person8image1. Project it onto the PCA eigenspace you obtained above, and reconstruct it using the first n = 3, 50, 170, 240, and 345 eigenfaces. Plot the five reconstructed images. (A reconstruction/MSE sketch follows the list.)
3. (20%) For each of the five images you obtained in 2., compute the mean squared error (MSE) between the reconstructed image and the original image. Record the corresponding MSE values in your report.
4. (20%) Now, apply the k-nearest neighbors (k-NN) algorithm to classify the testing set images. First, you will need to determine the best k and n values by 3-fold cross-validation on the training set. For simplicity, the hyperparameter choices are k ∈ {1, 3, 5} and n ∈ {3, 50, 170}. Show the cross-validation results and explain your choice of (k, n). (A cross-validation sketch follows the list.)
5. (20%) Use the hyperparameters chosen in 4. and report the recognition rate on the testing set.
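The sketch below illustrates one way to carry out task 1, continuing from the load_split sketch above. PCA is obtained via an SVD of the mean-centered training matrix (an eigendecomposition of the covariance matrix is equally valid); the mean face and the first four eigenfaces are then displayed with a grayscale colormap, per the hints below.

    import numpy as np
    import matplotlib.pyplot as plt

    mean_face = train_x.mean(axis=0)
    # Rows of vt are the principal axes (eigenfaces), already ordered from the
    # most dominant to the least dominant singular value.
    _, _, vt = np.linalg.svd(train_x - mean_face, full_matrices=False)
    eigenfaces = vt                                      # shape (360, 2576)

    faces = [mean_face, *eigenfaces[:4]]
    titles = ["mean face", "eigenface 1", "eigenface 2", "eigenface 3", "eigenface 4"]
    fig, axes = plt.subplots(1, 5, figsize=(15, 4))
    for ax, face, title in zip(axes, faces, titles):
        ax.imshow(face.reshape(56, 46), cmap="gray")     # grayscale colormap
        ax.set_title(title)
        ax.axis("off")
    plt.show()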
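For tasks 2 and 3, a possible projection/reconstruction and MSE computation is sketched below, continuing from the variables above (mean_face, eigenfaces). The file 2_1.png (person2image1) is used as an example; swap in 8_1.png if your student ID is even. Pixel values stay in [0, 255], as the hints require, and the helper name reconstruct is illustrative.

    import numpy as np
    from PIL import Image

    def reconstruct(x, mean_face, eigenfaces, n):
        # Project the centered image onto the first n eigenfaces, then map back.
        coeffs = (x - mean_face) @ eigenfaces[:n].T
        return mean_face + coeffs @ eigenfaces[:n]

    target = np.asarray(Image.open("p1_data/2_1.png"), dtype=np.float64).reshape(-1)
    for n in (3, 50, 170, 240, 345):
        rec = reconstruct(target, mean_face, eigenfaces, n)
        mse = np.mean((rec - target) ** 2)               # pixel values in [0, 255]
        print(f"n = {n:3d}: MSE = {mse:.2f}")
        # plot each reconstruction with imshow(rec.reshape(56, 46), cmap="gray")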
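Finally, a sketch for tasks 4 and 5 using scikit-learn's KNeighborsClassifier and cross_val_score; any k-NN implementation works just as well. For simplicity it projects with the eigenfaces fitted on the full training set before cross-validating; refitting PCA inside each fold is a stricter alternative.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    best = (-1.0, None, None)
    for n in (3, 50, 170):
        proj = (train_x - mean_face) @ eigenfaces[:n].T
        for k in (1, 3, 5):
            acc = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                                  proj, train_y, cv=3).mean()
            print(f"k = {k}, n = {n}: 3-fold CV accuracy = {acc:.3f}")
            if acc > best[0]:
                best = (acc, k, n)

    _, k, n = best                                       # chosen hyperparameters
    proj_train = (train_x - mean_face) @ eigenfaces[:n].T
    proj_test = (test_x - mean_face) @ eigenfaces[:n].T
    clf = KNeighborsClassifier(n_neighbors=k).fit(proj_train, train_y)
    print(f"testing set recognition rate: {clf.score(proj_test, test_y):.3f}")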
Hints
• When plotting eigenfaces, be sure to order them from the most dominant to the least dominant. Note that the computed eigenvalues may be sorted in either ascending or descending order, depending on the programming language/packages you use.
• Display your output faces with a grayscale colormap rather than any other colormap.
• When calculating MSE, your pixel values should be in the range of [0, 255].
Remarks
• For this homework, we will not grade your source code. Thus, you may use any programming language you feel comfortable with. You are also allowed to use any related packages, libraries, and functions for your implementation. However, you must provide image outputs with detailed discussions or explanations.
• Please convert your report into a single .pdf file (with an arbitrary file name) and upload it to NTU COOL before the deadline. You do NOT need to upload anything other than the .pdf report for this homework.