Official Due Date: June 14

# Written (Math) Problems

📗 Enter your ID here: and click
📗 The same ID should generate the same set of questions. Your answers are not saved when you close the browser. You could print the page: , solve the problems, then enter all your answers at the end.
📗 Some of the referenced past exams can be found on Professor Zhu's and Professor Dyer's websites: Link and Link.
📗 Please do not refresh the page: your answers will not be saved. You can save and load your answers (only fill-in-the-blank questions) using the buttons at the bottom of the page.
📗 Please report any bugs on Piazza.

# Warning: please enter your ID before you start!


# Question 1 [3 points]

📗 (Fall 2014 Midterm Q6) What is the city-block distance (also known as L1 distance or Manhattan distance) between two points and ?
📗 Answer: .
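📗 The L1 distance is just the sum of absolute coordinate differences. A minimal sketch (the two points below are made up; the actual points are generated from your ID):

```python
# Manhattan (city-block, L1) distance between two points.
def manhattan(p, q):
    return sum(abs(a - b) for a, b in zip(p, q))

# Hypothetical example points:
print(manhattan((1, 2), (4, 6)))  # |1-4| + |2-6| = 3 + 4 = 7
```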

# Question 2 [3 points]

📗 (Fall 2013 Final Q5) Consider binary classification in 2D where the intended label of a point \(x = \begin{bmatrix} x_{1} \\ x_{2} \end{bmatrix}\) is positive (1) if \(x_{1} > x_{2}\) and negative (0) otherwise. Let the training set be all points of the form \(x\) = where \(a, b\) are integers. Each training item has the correct label that follows the rule above. With a 1NN classifier (Euclidean distance), which of the following points are labeled positive?
📗 Hint: if multiple instances have the same distance to the new point, use the instances with larger x values.
📗 Choices:





None of the above
📗 Calculator: .

# Question 3 [3 points]

📗 (Fall 2012 Final Q4) Consider points in 2D with binary labels. Given the training data in the table below, using Manhattan distance with 1NN, which of the following points in 2D are classified as 1? Answer the question by first drawing the decision boundaries (the drawing is not graded).
| index | \(x_{1}\) | \(x_{2}\) | label |
|---|---|---|---|
| 1 | -1 | -1 | |
| 2 | -1 | 1 | |
| 3 | 1 | -1 | |
| 4 | 1 | 1 | |



📗 Hint: as discussed in the lectures, if multiple instances have the same distance to the new point, use the ones with smaller indices.
📗 Choices:





None of the above
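📗 A minimal 1NN sketch using Manhattan distance and the smaller-index tie-breaking from the hint (the labels below are made up for illustration; the actual labels are generated from your ID):

```python
def manhattan(p, q):
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def predict_1nn(train, x):
    # train: list of (x1, x2, label) in index order; min() keeps the
    # first minimum, so distance ties go to the smaller index.
    best = min(train, key=lambda t: manhattan((t[0], t[1]), x))
    return best[2]

# Points from the table; the labels (last entry) are hypothetical.
train = [(-1, -1, 1), (-1, 1, 0), (1, -1, 0), (1, 1, 1)]
print(predict_1nn(train, (2, 2)))  # nearest point is (1, 1) -> label 1
```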

# Question 4 [3 points]

📗 (Fall 2016 Final Q11) Consider a training set with 8 items. The first dimension of their feature vectors is: . However, this dimension is continuous (i.e. it is a real number). To build a decision tree, one may ask questions of the form "Is \(x_{1} \geq \theta\)?" where \(\theta\) is a threshold value. Ideally, what is the maximum number of different \(\theta\) values we should consider for the first dimension \(x_{1}\)?
📗 Answer: .
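📗 One common convention: only thresholds strictly between consecutive distinct sorted values can produce different splits, so \(k\) distinct values give at most \(k - 1\) useful \(\theta\) candidates (often taken as midpoints). A sketch with made-up values (the actual values are generated from your ID):

```python
def candidate_thresholds(values):
    # Midpoints between consecutive distinct sorted values;
    # k distinct values give k - 1 candidate thresholds.
    v = sorted(set(values))
    return [(a + b) / 2 for a, b in zip(v, v[1:])]

# Hypothetical first-dimension values for 8 items:
xs = [1, 2, 2, 3, 5, 5, 6, 8]
print(len(candidate_thresholds(xs)))  # 6 distinct values -> 5 candidates
```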

# Question 5 [3 points]

📗 (Fall 2014 Midterm Q9, Fall 2012 Final Q6) A decision tree has depth \(d\) = (a decision tree where the root is a leaf node has \(d\) = 0). Each of its internal nodes has children. The tree is also complete, meaning all leaf nodes are at depth \(d\). If we require each leaf node to contain at least training examples, what is the minimum size of the training set?
📗 Answer: .
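📗 A complete tree of depth \(d\) where every internal node has \(b\) children has \(b^{d}\) leaves, so the training set must hold at least \(b^{d}\) times the per-leaf minimum. A sketch with made-up numbers (the actual ones are generated from your ID):

```python
def min_training_size(depth, branching, min_per_leaf):
    # A complete tree of this depth has branching ** depth leaves,
    # each required to hold at least min_per_leaf examples.
    return (branching ** depth) * min_per_leaf

# Hypothetical values: depth 3, binary splits, 5 examples per leaf.
print(min_training_size(3, 2, 5))  # 2 ** 3 = 8 leaves, 8 * 5 = 40
```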

# Question 6 [3 points]

📗 (Fall 2014 Midterm Q10) A bag contains different colored balls. Randomly draw a ball from the bag with equal probability. What is the entropy of the outcome?
📗 Hint: log base 2 of x can be computed as log(x) / log(2).
📗 Answer: .
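📗 A uniform draw over \(n\) equally likely outcomes has entropy \(\log_{2} n\) bits. A sketch (the number of balls below is made up; the actual number is generated from your ID):

```python
from math import log2

def entropy(probs):
    # Shannon entropy in bits; terms with p == 0 contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)

n = 4  # hypothetical number of balls
print(entropy([1 / n] * n))  # uniform over 4 outcomes -> log2(4) = 2.0
```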

# Question 7 [3 points]

📗 (Fall 2013 Final Q12, Fall 2011 Midterm Q4, Fall 2010 Final Q10) Statistically, December 18 is the cloudiest day of the year in Madison, Wisconsin. Your professor (not me) is not making this up. On that day, the sky is overcast, mostly cloudy, or partly cloudy of the time (C = 0), and clear or mostly clear of the time (C = 1). What is the entropy of the binary random variable C?
📗 Hint: log base 2 of x can be computed as log(x) / log(2).
📗 Answer: .

# Question 8 [3 points]

📗 (Fall 2012 Final Q5, Fall 2011 Midterm Q5) The RDA Corporation has a prison with many cells. Without justification, you're about to be randomly thrown into a cell with equal probability. Cells to have Toruks that eat prisoners. Cells to are safe. With sufficient bribe, the warden will answer your question "Will I be in cell 1?" What's the mutual information (we call it information gain) between the warden's answer and your encounter with the Toruks?
📗 Note: I didn't write the stories in these questions, so I don't know the reference either.
📗 Compute the information gain based on entropy of Toruks and conditional entropy of Toruks given whether you are in cell 1. See Fall 2011 Midterm Q5 for more hints.
📗 Answer: .
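📗 The information gain is \(H(\text{Toruk}) - H(\text{Toruk} \mid \text{answer})\), where the conditional entropy averages over the warden's two possible answers. A sketch with made-up counts (the actual ones are generated from your ID): 4 cells in total, with Toruks in cells 1 to 2.

```python
from math import log2

def H(probs):
    # Shannon entropy in bits; p == 0 terms contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)

n, danger = 4, 2                  # hypothetical: 4 cells, Toruks in cells 1-2
p_toruk = danger / n
h_toruk = H([p_toruk, 1 - p_toruk])

p_yes = 1 / n                     # warden says "yes": you are in cell 1
h_yes = H([1.0])                  # cell 1 has a Toruk for sure -> 0 bits
p_t_no = (danger - 1) / (n - 1)   # P(Toruk | answer is "no")
h_no = H([p_t_no, 1 - p_t_no])
cond = p_yes * h_yes + (1 - p_yes) * h_no
print(round(h_toruk - cond, 4))   # information gain in bits
```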

# Question 9 [2 points]

📗 Go to the MobileNet demo: Link. Find an image that is classified incorrectly by MobileNet and share it (and the incorrect label) on Piazza (a link to the post will be added here). Please try to avoid posting inappropriate or offensive images on Piazza: Link
📗 The incorrect label is: and I have participated in the discussion on Piazza.

# Question 10 [1 point]

📗 Please enter any comments and suggestions including possible mistakes and bugs with the questions and the auto-grading, and materials relevant to solving the questions that you think are not covered well during the lectures. If you have no comments, please enter "None": do not leave it blank.
📗 Answer: .

# Grade


 * * * *

 * * * *

📗 Please copy and paste the text between the *s (not including the *s) and submit it on Canvas, M5.
📗 You could save the text as a text file using the button, or just copy and paste it into a text file.
📗 Warning: the load button does not function properly for all questions; please recheck everything after you load. You could load your answers using the button from the text field:







Last Updated: July 14, 2024 at 8:37 PM