Official Due Date: June 28

# Written (Math) Problems

📗 Enter your ID here: and click
📗 The same ID should generate the same set of questions. Your answers are not saved when you close the browser. You could print the page: , solve the problems, then enter all your answers at the end.
📗 Some of the referenced past exams can be found on Professor Zhu's and Professor Dyer's websites: Link and Link.
📗 Please do not refresh the page: your answers will not be saved. You can save and load your answers (only fill-in-the-blank questions) using the buttons at the bottom of the page.
📗 Please report any bugs on Piazza.

# Warning: please enter your ID before you start!


# Question 1 [4 points]

📗 (Fall 2016 Final Q18, Fall 2011 Midterm Q20) Consider a classification problem with \(n\) = classes \(y \in \left\{1, 2, ..., n\right\}\) and two binary features \(x_{1}, x_{2} \in \left\{0, 1\right\}\). Suppose \(\mathbb{P}\left\{Y = y\right\}\) = , \(\mathbb{P}\left\{x_{1} = 1 | Y = y\right\}\) = , \(\mathbb{P}\left\{x_{2} = 1 | Y = y\right\}\) = . Which class will the naive Bayes classifier predict for a test item with \(x_{1}\) = and \(x_{2}\) = ?
📗 Hint: make sure you read and understand Fall 2016 Final Q18.
📗 Answer: .
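📗 A quick way to check your answer. The probabilities below are illustrative placeholders (the quiz generates its own values per ID); the classifier picks the class maximizing prior times per-feature likelihoods:

```python
# Naive Bayes with two binary features: choose the y maximizing
# P(Y = y) * P(x1 | Y = y) * P(x2 | Y = y).
# All numbers below are illustrative placeholders, not the quiz's values.
prior = {1: 0.4, 2: 0.6}   # P(Y = y)
p_x1 = {1: 0.7, 2: 0.2}    # P(x1 = 1 | Y = y)
p_x2 = {1: 0.1, 2: 0.8}    # P(x2 = 1 | Y = y)
x1, x2 = 1, 0              # observed test item (placeholders)

def score(y):
    # Use p when the feature equals 1, and (1 - p) when it equals 0.
    s1 = p_x1[y] if x1 == 1 else 1 - p_x1[y]
    s2 = p_x2[y] if x2 == 1 else 1 - p_x2[y]
    return prior[y] * s1 * s2

y_hat = max(prior, key=score)
```

Because naive Bayes assumes the features are conditionally independent given the class, the score is just a product of the prior and one factor per feature; no normalization is needed to compare classes.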

# Question 2 [3 points]

📗 (Fall 2019 Final Q22, Fall 2019 Final Q23, Fall 2014 Final Q9) Consider the following directed graphical model over binary variables: \(A \to  B \leftarrow C\). Given the CPTs:
\(\mathbb{P}\left\{A = 1\right\}\) = 
\(\mathbb{P}\left\{C = 1\right\}\) = 
\(\mathbb{P}\left\{B = 1 | A = C = 1\right\}\) = 
\(\mathbb{P}\left\{B = 1 | A = 0, C = 1\right\}\) = 
\(\mathbb{P}\left\{B = 1 | A = 1, C = 0\right\}\) = 
\(\mathbb{P}\left\{B = 1 | A = C = 0\right\}\) = 

What is the joint probability \(\mathbb{P}\left\{A = a, B = b, C = c\right\}\) for \(a, b, c\) = ?
📗 Answer: .
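📗 A sketch of the computation with illustrative placeholder CPT values (the quiz generates its own). In the v-structure \(A \to B \leftarrow C\), the roots \(A\) and \(C\) are independent, so the joint factors as \(\mathbb{P}\left\{a\right\} \mathbb{P}\left\{c\right\} \mathbb{P}\left\{b | a, c\right\}\):

```python
# Joint probability under A -> B <- C.
# CPT values here are illustrative placeholders, not the quiz's values.
pA = 0.3                          # P(A = 1)
pC = 0.6                          # P(C = 1)
pB = {(1, 1): 0.9, (0, 1): 0.5,   # P(B = 1 | A = a, C = c)
      (1, 0): 0.4, (0, 0): 0.1}

def joint(a, b, c):
    pa = pA if a == 1 else 1 - pA
    pc = pC if c == 1 else 1 - pC
    pb1 = pB[(a, c)]
    pb = pb1 if b == 1 else 1 - pb1
    return pa * pc * pb

# Sanity check: the eight joint probabilities must sum to 1.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

With these placeholder values, for example, \(\mathbb{P}\left\{A = 1, B = 1, C = 1\right\} = 0.3 \cdot 0.6 \cdot 0.9 = 0.162\).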

# Question 3 [5 points]

📗 (Fall 2019 Final Q22, Fall 2019 Final Q23, Fall 2014 Final Q9) Consider the following directed graphical model over binary variables: \(A \to  B \to  C\). Given the CPTs:
\(\mathbb{P}\left\{A = 1\right\}\) = 
\(\mathbb{P}\left\{B = 1 | A = 1\right\}\) = 
\(\mathbb{P}\left\{B = 1 | A = 0\right\}\) = 
\(\mathbb{P}\left\{C = 1 | B = 1\right\}\) = 
\(\mathbb{P}\left\{C = 1 | B = 0\right\}\) = 

What is the conditional probability \(\mathbb{P}\left\{A = a | C = c\right\}\) for \(a, c\) = ?
📗 Answer: .
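📗 A sketch of the computation with illustrative placeholder CPT values (the quiz generates its own). In the chain \(A \to B \to C\), the joint factors as \(\mathbb{P}\left\{a\right\} \mathbb{P}\left\{b | a\right\} \mathbb{P}\left\{c | b\right\}\); marginalize out \(B\) and apply Bayes' rule, \(\mathbb{P}\left\{a | c\right\} = \mathbb{P}\left\{a, c\right\} / \mathbb{P}\left\{c\right\}\):

```python
# P(A = a | C = c) in the chain A -> B -> C by enumeration over B.
# CPT values are illustrative placeholders, not the quiz's values.
pA = 0.2                  # P(A = 1)
pB = {1: 0.7, 0: 0.4}     # P(B = 1 | A = a)
pC = {1: 0.9, 0: 0.3}     # P(C = 1 | B = b)

def joint(a, b, c):
    pa = pA if a == 1 else 1 - pA
    pb = pB[a] if b == 1 else 1 - pB[a]
    pc = pC[b] if c == 1 else 1 - pC[b]
    return pa * pb * pc

def cond_a_given_c(a, c):
    num = sum(joint(a, b, c) for b in (0, 1))                     # P(a, c)
    den = sum(joint(ap, b, c) for ap in (0, 1) for b in (0, 1))   # P(c)
    return num / den
```

The same enumeration works for any small network: sum the joint over the unobserved variables, then divide by the probability of the evidence.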

# Question 4 [5 points]

📗 (Fall 2019 Final Q22, Fall 2019 Final Q23, Fall 2014 Final Q9) Consider the following directed graphical model over binary variables: \(A \leftarrow B \to  C\). Given the CPTs:
\(\mathbb{P}\left\{B = 1\right\}\) = 
\(\mathbb{P}\left\{C = 1 | B = 1\right\}\) = 
\(\mathbb{P}\left\{C = 1 | B = 0\right\}\) = 
\(\mathbb{P}\left\{A = 1 | B = 1\right\}\) = 
\(\mathbb{P}\left\{A = 1 | B = 0\right\}\) = 

What is the conditional probability \(\mathbb{P}\left\{A = a | C = c\right\}\) for \(a, c\) = ?
📗 Answer: .
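📗 The same enumeration approach works here, but with the common-cause factorization \(\mathbb{P}\left\{a, b, c\right\} = \mathbb{P}\left\{b\right\} \mathbb{P}\left\{a | b\right\} \mathbb{P}\left\{c | b\right\}\) for \(A \leftarrow B \to C\). CPT values below are illustrative placeholders (the quiz generates its own):

```python
# P(A = a | C = c) under the common-cause model A <- B -> C.
# CPT values are illustrative placeholders, not the quiz's values.
pB = 0.5                  # P(B = 1)
pA = {1: 0.8, 0: 0.1}     # P(A = 1 | B = b)
pC = {1: 0.6, 0: 0.2}     # P(C = 1 | B = b)

def joint(a, b, c):
    pb = pB if b == 1 else 1 - pB
    pa = pA[b] if a == 1 else 1 - pA[b]
    pc = pC[b] if c == 1 else 1 - pC[b]
    return pb * pa * pc

def cond_a_given_c(a, c):
    num = sum(joint(a, b, c) for b in (0, 1))                     # P(a, c)
    den = sum(joint(ap, b, c) for ap in (0, 1) for b in (0, 1))   # P(c)
    return num / den
```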

# Question 5 [2 points]

📗 (Fall 2011 Midterm Q15) Consider the network \(A \to  B \to  C\), where A can take on values, B can take on values, and C can take on values. Write down the minimum number of conditional probabilities needed to define the CPTs.
📗 Answer: .
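📗 One way to count, assuming "minimum" means free parameters (each distribution's last entry is determined because its probabilities sum to 1). If A, B, C take \(k_{A}, k_{B}, k_{C}\) values, the chain needs \(\left(k_{A} - 1\right) + k_{A} \left(k_{B} - 1\right) + k_{B} \left(k_{C} - 1\right)\) numbers; the arities used below are illustrative placeholders:

```python
# Free parameters for the chain A -> B -> C:
#   (kA - 1)        for P(A),
#   kA * (kB - 1)   for P(B | A)  (one row per value of A),
#   kB * (kC - 1)   for P(C | B)  (one row per value of B).
def chain_cpt_params(kA, kB, kC):
    return (kA - 1) + kA * (kB - 1) + kB * (kC - 1)

# Placeholder example: all three variables binary.
n_params = chain_cpt_params(2, 2, 2)
```

With all variables binary this gives \(1 + 2 + 2 = 5\). If you instead count every stored CPT entry without the sum-to-1 reduction, the count would be \(k_{A} + k_{A} k_{B} + k_{B} k_{C}\); use the convention from lecture.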

# Question 6 [3 points]

📗 (Spring 2018 Final Q21, Fall 2011 Midterm Q16) You roll a 6-sided die times and observe the following counts: . Use Laplace smoothing (i.e. add-1 smoothing) to estimate the probability of each side.
📗 Answer (comma separated vector, 6 numbers):
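📗 Add-1 smoothing adds one pseudo-count to each of the 6 sides, so \(\hat{p}_{i} = \left(c_{i} + 1\right) / \left(n + 6\right)\). The counts below are illustrative placeholders (the quiz generates its own):

```python
# Laplace (add-1) smoothing for a 6-sided die.
# Counts are illustrative placeholders, not the quiz's values.
counts = [3, 0, 2, 5, 1, 4]          # observed counts for sides 1..6
n = sum(counts)                      # total number of rolls
probs = [(c + 1) / (n + 6) for c in counts]
```

Note that a side never observed still gets nonzero probability \(1 / \left(n + 6\right)\), which is the point of smoothing.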

# Question 7 [2 points]

📗 (Fall 2011 Midterm Q18) An n-gram language model computes the probability \(\mathbb{P}\left\{w_{n} | w_{1}, w_{2}, ..., w_{n-1}\right\}\). How many parameters need to be estimated for a -gram language model given a vocabulary size of ?
📗 Answer: .
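📗 A counting sketch: an n-gram model conditions on the previous \(n - 1\) words, so there are \(V^{n-1}\) contexts, each with a distribution over \(V\) next words. That gives \(V^{n-1} \left(V - 1\right)\) free parameters, or \(V^{n}\) if every table entry is counted; check which convention the lecture uses. The helper below supports both, with placeholder values:

```python
# Parameter count for an n-gram model with vocabulary size V.
# free=True counts free parameters (each context's distribution
# loses one degree of freedom to the sum-to-1 constraint);
# free=False counts all V**n table entries.
def ngram_params(n, V, free=True):
    return V ** (n - 1) * ((V - 1) if free else V)
```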

# Question 8 [2 points]

📗 (Fall 2010 Final Q19) We have a biased coin with probability of producing Heads. We create a predictor as follows: generate a random number uniformly distributed in (0, 1). If the random number is less than , we predict Heads; otherwise, we predict Tails. What is this predictor's accuracy in predicting the coin's outcome?
📗 Answer: .
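📗 The predictor's guess is independent of the coin, so if the coin lands Heads with probability \(p\) and the predictor says Heads with probability \(q\), the accuracy is \(p q + \left(1 - p\right)\left(1 - q\right)\) (both Heads, or both Tails). The values of \(p\) and \(q\) below are illustrative placeholders, with a Monte Carlo sanity check of the closed form:

```python
import random

# Randomized predictor for a biased coin: predict Heads with prob q,
# independently of the coin (Heads prob p).
# p and q are illustrative placeholders, not the quiz's values.
p, q = 0.7, 0.7

# Closed form: agree on Heads with prob p*q, on Tails with prob (1-p)*(1-q).
exact = p * q + (1 - p) * (1 - q)

# Monte Carlo check: count how often coin and prediction agree.
random.seed(0)
trials = 100_000
hits = sum((random.random() < p) == (random.random() < q)
           for _ in range(trials))
approx = hits / trials
```

Note that the accuracy-maximizing deterministic strategy is to always predict the more likely side, which this randomized predictor does not achieve unless \(q\) is 0 or 1.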

# Question 9 [2 points]

📗 Use the Transformer: Link to generate some interesting (funny, nonsensical, etc) stories and share one on Piazza: Link. You should start with a sentence instead of a word or a phrase.
📗 Note: We did not talk about attention mechanisms or Transformers during the RNN lecture, but you can read about them on the "Talk to Transformer" website.
📗 The initial sentence is: and I have participated in the discussion on Piazza.

# Question 10 [1 point]

📗 Please enter any comments and suggestions including possible mistakes and bugs with the questions and the auto-grading, and materials relevant to solving the questions that you think are not covered well during the lectures. If you have no comments, please enter "None": do not leave it blank.
📗 Answer: .

# Grade


 ***** ***** ***** ***** ***** 

 ***** ***** ***** ***** ***** 

📗 Please copy and paste the text between the *****s (not including the *****s) and submit it on Canvas under M7.
📗 You could save the text as a text file using the button, or just copy and paste it into a text file.
📗 Warning: the load button does not function properly for all questions; please recheck everything after you load. You could load your answers using the button from the text field:







Last Updated: November 09, 2021 at 12:30 AM