Problem Set 5 (Logistic Regression)

Goals. The goal of this exercise is to

Do classification using linear regression.

Implement and debug logistic regression using gradient descent. Compare it to linear regression.

Implement Newton's method for logistic regression.




Setup, data and sample code. Obtain the folder labs/ex05 of the course GitHub repository

github.com/epfml/ML_course

We will use the dataset "height_weight_genders.csv" in this exercise, and we have provided sample code templates that already contain useful snippets of code required for this exercise.










Classification Using Linear Regression



We will try to use linear regression to do classification. Although this is not a good idea in general (as discussed in class), it will work for simple data. We will use the height-weight data from the first exercise. For better visualization, we will randomly sub-select 200 data points from the data.




Exercise 1: Classification using linear regression.




Use the least squares code finished in the previous exercise to compute w. Please COPY your previous implementation into the template least_squares.py.
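For reference, recall that least squares has the closed-form solution w = (XᵀX)⁻¹Xᵀy. A minimal sketch of the solver, and of how its predictions could be thresholded for classification (the 0.5 threshold assumes labels encoded as 0/1, which is an assumption about your encoding, not part of the template):

```python
import numpy as np

def least_squares(y, tx):
    """Least squares via the normal equations: solve (X^T X) w = X^T y."""
    a = tx.T @ tx
    b = tx.T @ y
    return np.linalg.solve(a, b)

# Classify by thresholding the linear prediction
# (assumes labels encoded as 0/1; adapt the threshold for +/-1 labels).
# w = least_squares(y, tx)
# predicted_labels = (tx @ w > 0.5).astype(int)
```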




Visualize the data points and the decision boundary, as in Figure 1.
















Logistic Regression



Exercise 2: Implement logistic regression using gradient descent.




Fill in the notebook function sigmoid().
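A possible one-line implementation (a sketch; for very negative t, np.exp(-t) can overflow, so you may want a numerically stable variant):

```python
import numpy as np

def sigmoid(t):
    """Apply sigma(t) = 1 / (1 + exp(-t)) element-wise."""
    return 1.0 / (1.0 + np.exp(-t))
```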




Fill in the two notebook functions calculate_loss() and calculate_gradient(). The first function should return the negative of the log-likelihood, while the second should return the corresponding gradient.
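For labels yₙ ∈ {0, 1}, the negative log-likelihood and its gradient are

L(w) = -∑ₙ [yₙ log σ(xₙᵀw) + (1 - yₙ) log(1 - σ(xₙᵀw))],    ∇L(w) = Xᵀ(σ(Xw) - y).

A sketch building on the sigmoid() above (the 0/1 label encoding is an assumption):

```python
import numpy as np

def calculate_loss(y, tx, w):
    """Negative log-likelihood, assuming labels y in {0, 1}."""
    pred = sigmoid(tx @ w)
    return -np.sum(y * np.log(pred) + (1 - y) * np.log(1 - pred))

def calculate_gradient(y, tx, w):
    """Gradient of the negative log-likelihood: X^T (sigma(Xw) - y)."""
    return tx.T @ (sigmoid(tx @ w) - y)
```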




Implement gradient descent in the function learning_by_gradient_descent() for logistic regression. You should calculate the loss and the gradient, update the weight w, and return the loss and the updated weight.
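One way to structure a single step, building on the helpers sketched above (the step-size argument gamma is an assumed name; check your template's signature):

```python
def learning_by_gradient_descent(y, tx, w, gamma):
    """Do one step of gradient descent on the logistic loss.
    Returns the loss at the current w and the updated w."""
    loss = calculate_loss(y, tx, w)
    grad = calculate_gradient(y, tx, w)
    w = w - gamma * grad
    return loss, w
```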







Figure 1: Classification by least squares.




Plot your predictions to get a visualization similar to the right panel of Figure 1. Check whether you get similar or different results.




Do your results make sense?







Once you have gradient descent, it is also straightforward to implement Newton's method.




Exercise 3: Newton's method for logistic regression.




Fill in the notebook function calculate_hessian(). Then integrate your implementations, i.e., calculate_loss(), calculate_gradient() and calculate_hessian(), into the function logistic_regression() for future use. The function should return the loss, gradient, and Hessian together.
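The Hessian of the negative log-likelihood is XᵀSX, where S is diagonal with Sₙₙ = σ(xₙᵀw)(1 - σ(xₙᵀw)). A sketch reusing the helpers above:

```python
import numpy as np

def calculate_hessian(y, tx, w):
    """Hessian of the negative log-likelihood: X^T S X,
    where S is diagonal with S_nn = sigma_n * (1 - sigma_n)."""
    pred = sigmoid(tx @ w)
    s = pred * (1 - pred)                  # diagonal of S, as a vector
    return tx.T @ (tx * s[:, np.newaxis])

def logistic_regression(y, tx, w):
    """Return the loss, gradient, and Hessian at w."""
    return calculate_loss(y, tx, w), calculate_gradient(y, tx, w), calculate_hessian(y, tx, w)
```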




Your gradient descent code can now be turned into Newton's method. Please fill in the notebook function learning_by_newton_method(). The function should return the loss and the updated weight.
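A sketch of one Newton step (the step size gamma is an assumption; the pure Newton update uses gamma = 1):

```python
import numpy as np

def learning_by_newton_method(y, tx, w, gamma):
    """Do one step of Newton's method: w <- w - gamma * H^{-1} grad.
    Returns the loss at the current w and the updated w."""
    loss, grad, hessian = logistic_regression(y, tx, w)
    w = w - gamma * np.linalg.solve(hessian, grad)
    return loss, w
```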










Exercise 4: Penalized logistic regression.




Fill in the notebook function penalized_logistic_regression(). Note that it can be done by adding the regularization term λ‖w‖². Set λ to a low value and check whether it gives the same result as before. Once that is done, please fill in the notebook function learning_by_penalized_gradient(), then increase the value of λ and check whether ‖w‖ is shrinking or not.
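A sketch of both functions, assuming the penalty λ‖w‖² is added to the loss (so 2λw is added to the gradient); the argument names lambda_ and gamma are assumptions:

```python
import numpy as np

def penalized_logistic_regression(y, tx, w, lambda_):
    """Loss and gradient of the logistic loss plus the penalty lambda * ||w||^2."""
    loss = calculate_loss(y, tx, w) + lambda_ * np.sum(w ** 2)
    grad = calculate_gradient(y, tx, w) + 2 * lambda_ * w
    return loss, grad

def learning_by_penalized_gradient(y, tx, w, gamma, lambda_):
    """Do one step of gradient descent on the penalized loss.
    Returns the loss at the current w and the updated w."""
    loss, grad = penalized_logistic_regression(y, tx, w, lambda_)
    w = w - gamma * grad
    return loss, w
```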




Check whether this gives the same answer as plain gradient descent. To debug, print the function value and the norm of the gradient at every iteration; all of these values should decrease from one iteration to the next.







Exercise 5: BONUS: Implement IRLS. Follow the pseudo-code given in the lecture notes.
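The exact pseudo-code is in the lecture notes; as a rough sketch (the function name and the details below are assumptions, not the notes' version), each IRLS iteration solves a weighted least-squares problem, which is equivalent to a full Newton step on the logistic loss:

```python
import numpy as np

def irls(y, tx, max_iter=100):
    """IRLS sketch: repeatedly solve the weighted least-squares problem
    (X^T S X) w = X^T S z with working response z = Xw + S^{-1}(y - sigma(Xw))."""
    w = np.zeros(tx.shape[1])
    for _ in range(max_iter):
        pred = sigmoid(tx @ w)
        s = pred * (1 - pred)          # IRLS weights; may need a floor to avoid division by zero
        z = tx @ w + (y - pred) / s    # working response
        a = tx.T @ (tx * s[:, np.newaxis])
        b = tx.T @ (s * z)
        w = np.linalg.solve(a, b)
    return w
```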










