
# Warning: this is a replica of the homework page for testing purposes. Please use M6 for homework submission.


# T6 Written (Math) Problems

📗 Enter your ID (the wisc email ID without @wisc.edu) here: and click (or hit the enter key).
📗 The same ID should generate the same set of questions. Your answers are not saved when you close the browser. You could print the page: , solve the problems, and then enter all your answers at the end.
📗 Please do not refresh the page: your answers will not be saved.

# Warning: please enter your ID before you start!


# Question 1

📗 [3 points] There are two biased coins in my pocket: coin A has \(\mathbb{P}\left\{H | A\right\}\) = , coin B has \(\mathbb{P}\left\{H | B\right\}\) = . I took a coin out of the pocket at random; the probability that it is coin A is . I flipped it twice and the outcome is . What is the probability that the coin was ?
Hint: See Spring 2018 Final Q22 Q23, Fall 2018 Midterm Q11, Fall 2017 Final Q20, Spring 2017 Final Q6, Fall 2010 Final Q18. For example, the Bayes rule for the probability that the coin is \(A\) given the outcome \(H T H\) is \(\mathbb{P}\left\{A | H T H\right\} = \dfrac{\mathbb{P}\left\{H T H, A\right\}}{\mathbb{P}\left\{H T H\right\}}\) = \(\dfrac{\mathbb{P}\left\{H T H | A\right\} \mathbb{P}\left\{A\right\}}{\mathbb{P}\left\{H T H | A\right\} \mathbb{P}\left\{A\right\} + \mathbb{P}\left\{H T H | B\right\} \mathbb{P}\left\{B\right\}}\) = \(\dfrac{\mathbb{P}\left\{H | A\right\} \mathbb{P}\left\{T | A\right\} \mathbb{P}\left\{H | A\right\} \mathbb{P}\left\{A\right\}}{\mathbb{P}\left\{H | A\right\} \mathbb{P}\left\{T | A\right\} \mathbb{P}\left\{H | A\right\} \mathbb{P}\left\{A\right\} + \mathbb{P}\left\{H | B\right\} \mathbb{P}\left\{T | B\right\} \mathbb{P}\left\{H | B\right\} \mathbb{P}\left\{B\right\}}\). Note that \(\mathbb{P}\left\{H T H | A\right\}\) can be split into three probabilities because the flips of the same coin are independent.
📗 Answer: .
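
📗 Below is a minimal Python sketch of the hint's Bayes-rule computation, with made-up values standing in for the blanks (the actual numbers are generated from your ID):

```python
# Bayes rule for P(coin A | observed flips), assuming independent flips.
# All numeric values below are placeholders, not the graded question's values.
p_h_given_a = 0.7   # P(H | A), assumed
p_h_given_b = 0.4   # P(H | B), assumed
p_a = 0.5           # prior probability of picking coin A, assumed
outcome = "HTH"     # observed flip sequence, assumed

def likelihood(seq, p_h):
    """P(seq | coin) when each flip lands H with probability p_h."""
    prob = 1.0
    for flip in seq:
        prob *= p_h if flip == "H" else (1 - p_h)
    return prob

numerator = likelihood(outcome, p_h_given_a) * p_a
denominator = numerator + likelihood(outcome, p_h_given_b) * (1 - p_a)
print(numerator / denominator)  # P(A | outcome)
```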

# Question 2

📗 [2 points] You have a vocabulary with \(n\) = word types. You want to estimate the unigram probability \(p_{w}\) for each word type \(w\) in the vocabulary. In your corpus the total word token count \(\displaystyle\sum_{w} c_{w}\) is , and \(c_{\text{tenet}}\) = . Using add-one smoothing with \(\delta\) = (Laplace smoothing), compute \(p_{\text{tenet}}\).
Hint: See Fall 2018 Midterm Q12. The smoothed estimate of \(p_{w}\) is \(\dfrac{c_{w} + \delta}{\displaystyle\sum_{w'} c_{w'} + n \delta}\) (the maximum likelihood estimate is the special case \(\delta = 0\)).
📗 Answer: .
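
📗 A short Python sketch of the smoothed estimate, with assumed counts in place of the blanks:

```python
# Add-delta (Laplace) smoothed unigram probability of "tenet".
# All values are placeholders, not the graded question's values.
n = 10000              # number of word types in the vocabulary, assumed
total_tokens = 500000  # total word token count (sum of c_w), assumed
c_tenet = 3            # count of the word "tenet", assumed
delta = 1              # smoothing parameter, assumed

p_tenet = (c_tenet + delta) / (total_tokens + n * delta)
print(p_tenet)
```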

# Question 3

📗 [2 points] A traffic light repeats the following cycle: green seconds, yellow seconds, red seconds. A driver saw at a random moment. What is the probability that one second later the light became ?
Hint: See Fall 2017 Midterm Q1. Among the total of \(t\) seconds that the light has a particular color, only during one of them (the last second) will the light change in the next second.
📗 Answer: .
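
📗 A small Python sketch of the hint's argument, using an assumed cycle:

```python
# Probability that the light changes to the next color within one second,
# given the color the driver saw. Cycle lengths are assumed placeholders.
green, yellow, red = 30, 5, 25  # seconds per color, assumed

# The random moment is uniform over the seconds of the observed color, and the
# light changes in the next second only if that moment was the last second of
# the phase. (If the asked-for color is not the next one in the cycle, the
# probability is 0.)
p_green_to_yellow = 1 / green
print(p_green_to_yellow)
```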

# Question 4

📗 [2 points] Let \(A \in\) and \(B \in\) . What is the least number of probabilities needed to fully specify the conditional probability table of \(B\) given \(A\) (\(\mathbb{P}\left\{B | A\right\}\))?
Hint: See Fall 2017 Midterm Q2, Fall 2014 Final Q3. Suppose \(A\) can take the values \(1, 2, ..., n_{A}\) and \(B\) can take the values \(1, 2, ..., n_{B}\). Given \(\mathbb{P}\left\{B = 1 | A = 1\right\}\), \(\mathbb{P}\left\{B = 2 | A = 1\right\}\), ..., \(\mathbb{P}\left\{B = n_{B} - 1 | A = 1\right\}\), the probability \(\mathbb{P}\left\{B = n_{B} | A = 1\right\}\) can be calculated as \(1 - \displaystyle\sum_{i=1}^{n_{B} - 1} \mathbb{P}\left\{B = i | A = 1\right\}\). On the other hand, given \(\mathbb{P}\left\{B = 1 | A = 1\right\}\), \(\mathbb{P}\left\{B = 1 | A = 2\right\}\), ..., \(\mathbb{P}\left\{B = 1 | A = n_{A} - 1\right\}\), the probability \(\mathbb{P}\left\{B = 1 | A = n_{A}\right\}\) cannot be calculated.
📗 Answer: .
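
📗 The hint's counting argument as a short Python computation, with assumed domain sizes:

```python
# Least number of probabilities needed to specify the CPT P(B | A).
# Domain sizes are placeholders, not the graded question's values.
n_a = 3  # number of values A can take, assumed
n_b = 4  # number of values B can take, assumed

# For each fixed value of A, the row of the table sums to 1, so only
# n_b - 1 of its entries are free; there are n_a such rows.
print(n_a * (n_b - 1))
```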

# Question 5

📗 [2 points] In a corpus with word tokens, the phrase "San Francisco" appeared times. In particular, "San" appeared times and "Francisco" appeared times. If we estimate probability by frequency (the maximum likelihood estimate), what is the estimated probability P(Francisco | San)?
Hint: See Fall 2017 Midterm Q7, Fall 2016 Final Q4. The maximum likelihood estimate of \(\mathbb{P}\left\{B | A\right\} = \dfrac{\mathbb{P}\left\{A B\right\}}{\mathbb{P}\left\{A\right\}}\) is \(\dfrac{n_{A B}}{n_{A}}\).
📗 Answer: .
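
📗 The same ratio in Python, with assumed counts:

```python
# Maximum likelihood estimate of the bigram probability P(Francisco | San).
# Counts are placeholders, not the graded question's values.
count_san_francisco = 80  # occurrences of the phrase "San Francisco", assumed
count_san = 100           # occurrences of the word "San", assumed

print(count_san_francisco / count_san)
```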

# Question 6

📗 [2 points] According to Zipf's law, if a word \(w_{1}\) has rank and \(w_{2}\) has rank , what is the ratio \(\dfrac{f_{1}}{f_{2}}\) between the frequencies (or counts) of the two words?
Hint: See Fall 2017 Final Q1. Zipf's law says \(f \cdot r = c\) for some constant \(c\). Therefore, \(f_{1} r_{1} = f_{2} r_{2} = c\), where \(r_{1}, r_{2}\) are the ranks of the two words.
📗 Answer: .
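
📗 A two-line Python check of the hint's relation, with assumed ranks:

```python
# Zipf's law: f * r is constant, so f1 / f2 = r2 / r1.
# Ranks are placeholders, not the graded question's values.
r1, r2 = 2, 8   # ranks of w1 and w2, assumed
print(r2 / r1)  # the ratio f1 / f2
```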

# Question 7

📗 [2 points] In your day vacation, the counts of days are:

| rainy | warm | bighorn (saw sheep) | days |
|-------|------|---------------------|------|
| N | N | N | |
| N | N | Y | |
| N | Y | N | |
| N | Y | Y | |
| Y | N | N | |
| Y | N | Y | |
| Y | Y | N | |
| Y | Y | Y | |

Using the maximum likelihood estimate (no smoothing), estimate the probability P(bighorn = | rainy = , warm = ).
Hint: See Fall 2017 Final Q3, Fall 2006 Final Q19, Fall 2005 Final Q19. For example, the maximum likelihood estimate of \(\mathbb{P}\left\{A | \neg B, C\right\} = \dfrac{\mathbb{P}\left\{A, \neg B, C\right\}}{\mathbb{P}\left\{\neg B, C\right\}}\), for binary variables \(A, B, C\), is \(\dfrac{n_{A, \neg B, C}}{n_{A, \neg B, C} + n_{\neg A, \neg B, C}}\).
📗 Answer: .
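
📗 A short Python sketch of the hint's conditional-frequency computation, with made-up day counts:

```python
# Maximum likelihood estimate of P(bighorn = Y | rainy = N, warm = Y).
# The day counts are placeholders, not the graded question's values.
counts = {  # keys are (rainy, warm, bighorn), values are numbers of days
    ("N", "N", "N"): 2, ("N", "N", "Y"): 1,
    ("N", "Y", "N"): 3, ("N", "Y", "Y"): 4,
    ("Y", "N", "N"): 5, ("Y", "N", "Y"): 2,
    ("Y", "Y", "N"): 1, ("Y", "Y", "Y"): 2,
}

numerator = counts[("N", "Y", "Y")]                              # rainy = N, warm = Y, bighorn = Y
denominator = counts[("N", "Y", "Y")] + counts[("N", "Y", "N")]  # all days with rainy = N, warm = Y
print(numerator / denominator)
```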

# Question 8

📗 [2 points] \(C\) is the Boolean indicating whether you have COVID-19 or not. \(F\) is the Boolean indicating whether you have a fever or not. Let \(\mathbb{P}\left\{F = 1\right\}\) = , \(\mathbb{P}\left\{C = 1\right\}\) = , \(\mathbb{P}\left\{F = 0 | C = 1\right\}\) = . Given that you have COVID-19, what is the probability that you have a fever? Note: this question uses random fake data; please refer to the CDC for actual data.
Hint: See Fall 2017 Midterm Q6. The probability that you have a fever given that you have COVID-19 is \(\mathbb{P}\left\{F = 1 | C = 1\right\} = 1 - \mathbb{P}\left\{F = 0 | C = 1\right\}\).
📗 Answer: .
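
📗 The complement rule from the hint in Python, with an assumed value:

```python
# P(F = 1 | C = 1) = 1 - P(F = 0 | C = 1).
p_no_fever_given_covid = 0.3  # P(F = 0 | C = 1), assumed placeholder
print(1 - p_no_fever_given_covid)
```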

# Question 9

📗 [3 points] Given two Boolean random variables, \(A\) and \(B\), where \(\mathbb{P}\left\{A\right\}\) = , \(\mathbb{P}\left\{B\right\}\) = , and \(\mathbb{P}\left\{A| \neg B\right\}\) = , what is \(\mathbb{P}\left\{A|B\right\}\)?
Hint: See Spring 2018 Final Q20, Fall 2016 Final Q3, Fall 2009 Final Q6. The following relations may be useful: \(\mathbb{P}\left\{A | \neg B\right\} = \dfrac{\mathbb{P}\left\{A, \neg B\right\}}{\mathbb{P}\left\{\neg B\right\}}\), \(\mathbb{P}\left\{A | B\right\} = \dfrac{\mathbb{P}\left\{A, B\right\}}{\mathbb{P}\left\{B\right\}}\), and \(\mathbb{P}\left\{A\right\} = \mathbb{P}\left\{A, B\right\} + \mathbb{P}\left\{A, \neg B\right\}\).
📗 Answer: .
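
📗 The hint's relations chained together in Python, with assumed probabilities:

```python
# Compute P(A | B) from P(A), P(B), and P(A | not B).
# All values are placeholders, not the graded question's values.
p_a = 0.5              # P(A), assumed
p_b = 0.4              # P(B), assumed
p_a_given_not_b = 0.3  # P(A | not B), assumed

p_a_and_not_b = p_a_given_not_b * (1 - p_b)  # P(A, not B) = P(A | not B) * P(not B)
p_a_and_b = p_a - p_a_and_not_b              # P(A) = P(A, B) + P(A, not B)
print(p_a_and_b / p_b)                       # P(A | B) = P(A, B) / P(B)
```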

# Question 10

📗 [1 point] Please enter any comments and suggestions, including possible mistakes and bugs with the questions and the auto-grading, and any material relevant to solving the questions that you think is not covered well during the lectures. If you have no comments, please enter "None": do not leave it blank.
📗 Answer: .

# Grade




📗 You could save the text in the above text box to a file using the button, or copy and paste it into a file yourself.
📗 You could load your answers from the text (or txt file) in the text box below using the button. The first two lines should be "##m: 6" and "##id: your id", and the format of the remaining lines should be "##1: your answer to question 1" newline "##2: your answer to question 2", etc. Please make sure that your answers are loaded correctly before submitting them.
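
📗 For example, a saved answer file follows the format described above; the ID and answer values below are made up:

```
##m: 6
##id: bbadger
##1: 0.5
##2: 0.0001
##3: 0.04
##4: 6
##5: 0.8
##6: 4
##7: 0.5
##8: 0.7
##9: 0.8
##10: None
```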







Last Updated: April 29, 2024 at 1:11 AM