Problem Set #6 Solution



    1. We are writing a WebMD program that is slightly larger than the one we worked through in class. In this program we predict whether a user has the flu (F = 1) or a cold (C = 1) based on knowing any subset of 10 potential binary symptoms (e.g., headache, sniffles, fatigue, cough) and a subset of binary risk factors (exposure, stress).
        ◦ We know the prior probability for Stress is 0.5 and Exposure is 0.1.

        ◦ The functions probCold(s, e) and probFlu(s, e) return the probability that a patient has a cold or flu, given the state of the risk factors stress (s) and exposure (e).
        ◦ The function probSymptom(i, f, c) returns the probability that the ith symptom (Xi) takes on value 1, given the state of cold (c) and flu (f): P(Xi = 1 | F = f, C = c).

We would like to write pseudocode to calculate the probability of flu conditioned on observing that the patient has had exposure to a sick friend and that they are experiencing Symptom 2 (sore throat). In terms of random variables, this is P(Flu = 1 | Exposure = 1, X2 = 1):


def inferProbFlu(): # P(Flu = 1 | Exposure = 1, X2 = 1)

Write pseudocode that calculates inferProbFlu() using Rejection Sampling.
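
A sketch of one possible answer follows. It is written as Python-style pseudocode and assumes a hypothetical helper bernoulli(p) that returns 1 with probability p; probFlu, probCold, and probSymptom are the functions described above, and the priors are those given for Stress and Exposure.

import random

def bernoulli(p):
    # Hypothetical helper: returns 1 with probability p, else 0.
    return 1 if random.random() < p else 0

def inferProbFlu(numSamples=100000):
    # Estimate P(Flu = 1 | Exposure = 1, X2 = 1) by rejection sampling:
    # sample the network top-down, discard any sample that contradicts
    # the evidence, and average Flu over the samples that remain. The
    # nine unobserved symptoms never need to be sampled; they simply
    # marginalize out.
    accepted = 0
    fluCount = 0
    for _ in range(numSamples):
        s = bernoulli(0.5)                    # Stress prior
        e = bernoulli(0.1)                    # Exposure prior
        f = bernoulli(probFlu(s, e))          # Flu given risk factors
        c = bernoulli(probCold(s, e))         # Cold given risk factors
        x2 = bernoulli(probSymptom(2, f, c))  # Symptom 2 given diseases
        if e != 1 or x2 != 1:
            continue                          # reject: contradicts evidence
        accepted += 1
        fluCount += f
    return fluCount / accepted

Note that roughly 90% of samples are rejected on Exposure alone, which is the usual inefficiency of rejection sampling when the evidence has low prior probability.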



    2. Consider the Exponential distribution. It is your friend... really. Specifically, consider a sample of I.I.D. exponential random variables X1, X2, ..., Xn, where each Xi ~ Exp(λ). Derive the maximum likelihood estimate for the parameter λ in the Exponential distribution.
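
A sketch of the standard derivation, assuming the density f(x; λ) = λe^{-λx} for x ≥ 0:

L(\lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i} = \lambda^{n} e^{-\lambda \sum_{i=1}^{n} x_i}

\ell(\lambda) = \log L(\lambda) = n \log \lambda - \lambda \sum_{i=1}^{n} x_i

\frac{d\ell}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i = 0
  \;\Longrightarrow\;
  \hat{\lambda} = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{1}{\bar{x}}

The second derivative, -n/λ², is negative for all λ > 0, so this critical point is indeed a maximum.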


    3. Say you have a set of binary input features/variables X1, X2, ..., Xm that can be used to make a prediction about a discrete binary output variable Y (i.e., each of the Xi as well as Y can only take on the values 0 or 1). Say that the first k input variables X1, X2, ..., Xk are actually all identical copies of each other, so that when one has the value 0 or 1, they all do. Explain informally, but precisely, why this may be problematic for the model learned by the Naïve Bayes classifier.
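
To see the over-counting concretely, here is a sketch assuming the standard Naïve Bayes decision rule:

P(Y \mid X_1, \ldots, X_m) \propto P(Y) \prod_{i=1}^{m} P(X_i \mid Y)
  = P(Y) \, P(X_1 \mid Y)^{k} \prod_{i=k+1}^{m} P(X_i \mid Y)

Because the conditional-independence assumption is violated, the likelihood term for the duplicated feature enters the product k times instead of once, so its evidence is raised to the k-th power and can swamp both the prior and the genuinely distinct features.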


    4. Implement a Naïve Bayes classifier. Detailed instructions are provided in the comments of the starter code; a rough sketch of both functions follows the sub-items below.

        a. [Coding] Implement the function fit in naive_bayes.py.

        b. [Coding] Implement the function predict in naive_bayes.py.
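
A minimal sketch of both functions, assuming binary features in a NumPy array, Laplace (add-one) smoothing, and standalone signatures; the interfaces expected by the starter code may differ.

import numpy as np

def fit(X, y):
    # X: (n, m) binary feature matrix; y: (n,) binary label vector.
    # Estimate the class prior and each P(X_j = 1 | Y = y), with Laplace
    # smoothing so no estimated probability is exactly 0 or 1.
    prior = y.mean()
    p1 = (X[y == 1].sum(axis=0) + 1) / ((y == 1).sum() + 2)
    p0 = (X[y == 0].sum(axis=0) + 1) / ((y == 0).sum() + 2)
    return prior, p0, p1

def predict(X, prior, p0, p1):
    # Choose the class with the larger log joint probability
    # log P(Y = y) + sum_j log P(X_j = x_j | Y = y).
    lj1 = np.log(prior) + (X * np.log(p1) + (1 - X) * np.log(1 - p1)).sum(axis=1)
    lj0 = np.log(1 - prior) + (X * np.log(p0) + (1 - X) * np.log(1 - p0)).sum(axis=1)
    return (lj1 > lj0).astype(int)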

    5. Implement a Logistic Regression classifier. Specifically, you should implement the gradient ascent algorithm described in class. Detailed instructions are provided in the comments of the starter code; a rough sketch follows the sub-items below.

        a. [Coding] Implement the function fit in logistic_regression.py.

        b. [Coding] Implement the function predict in logistic_regression.py.
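
A minimal sketch, again assuming a NumPy interface and batch gradient ascent on the log-likelihood; the learning rate and step count are placeholder values, and the starter code's signatures may differ.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit(X, y, lr=0.01, numSteps=10000):
    # Batch gradient ascent on the logistic log-likelihood. The gradient
    # has the closed form X^T (y - sigmoid(X theta)).
    X = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend intercept column
    theta = np.zeros(X.shape[1])
    for _ in range(numSteps):
        theta += lr * X.T @ (y - sigmoid(X @ theta))
    return theta

def predict(X, theta):
    # Classify as 1 whenever P(Y = 1 | x) >= 0.5, i.e. whenever theta . x >= 0.
    X = np.hstack([np.ones((X.shape[0], 1)), X])
    return (sigmoid(X @ theta) >= 0.5).astype(int)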
