
# M8 Written (Math) Problems

📗 Enter your ID (your wisc email ID without @wisc.edu) here: and click (or hit the enter key).
📗 The official deadline is July 25, but you can submit or resubmit without penalty until August 10.
📗 The same ID will generate the same set of questions. Your answers are not saved when you close the browser. You can print the page: , solve the problems, then enter all your answers at the end.
📗 Please do not refresh the page: your answers will not be saved.
📗 Please report any bugs on Piazza.

# Warning: please enter your ID before you start!


# Question 1

📗 [2 points] You performed PCA (Principal Component Analysis) in \(\mathbb{R}^{3}\). If the first principal component is \(u_{1}\) = \(\approx\) and the second principal component is \(u_{2}\) = \(\approx\) , what are the new 2D coordinates (the new features created by PCA) for the point \(x\) = ?

📗 In the diagram, the black axes are the original axes, the green axes are the PCA axes, the red vector is \(x\), and the red point is the reconstruction \(\hat{x}\) using the PCA axes.
Hint See Fall 2018 Midterm Q13, Fall 2017 Final Q10. Coordinate \(i\) is given by the projection of \(x\) onto principal component \(i\). If the principal component is a unit vector \(u_{i}\), use the simplified formula: \(u_{i}^\top x\); otherwise, for a principal component \(v_{i}\), use the formula: \(\dfrac{v_{i}^\top x}{v_{i}^\top v_{i}}\).
📗 Answer (comma separated vector): .
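📗 A quick way to check your arithmetic is to apply the hint's projection formula numerically. Below is a minimal NumPy sketch using made-up unit-length principal components and a made-up point (the actual values generated for your ID will differ):

```python
import numpy as np

# Hypothetical example values -- substitute the ones generated for your ID.
u1 = np.array([1.0, 0.0, 0.0])   # first principal component (unit vector)
u2 = np.array([0.0, 1.0, 0.0])   # second principal component (unit vector)
x = np.array([2.0, 3.0, 5.0])    # original point in R^3

# New 2D coordinates: the projection of x onto each principal component.
# For unit-length components this is just the dot product u_i^T x.
new_coords = np.array([u1 @ x, u2 @ x])
print(new_coords)                # -> [2. 3.]
```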
# Question 2

📗 [3 points] Let \(x\) = and \(v\) = . The projection of \(x\) onto \(v\) is the point \(y\) in the direction of \(v\) such that the line connecting \(x\) and \(y\) is perpendicular to \(v\). Compute \(y\).
Hint See Fall 2018 Midterm Q14. To compute the projection: if \(v\) is a unit vector, \(\left\|v\right\| = 1\), use the simplified formula: \(\left(v^\top x\right) v\); otherwise, use the formula: \(\dfrac{v^\top x}{v^\top v} v\).
📗 Answer (comma separated vector): .
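📗 The general projection formula in the hint can also be checked numerically. A minimal sketch with hypothetical values for \(x\) and \(v\):

```python
import numpy as np

# Hypothetical example values -- substitute the ones generated for your ID.
x = np.array([2.0, 4.0])
v = np.array([3.0, 0.0])              # v need not be a unit vector

# Projection of x onto v: ((v^T x) / (v^T v)) v.
y = (v @ x) / (v @ v) * v
print(y)                               # -> [2. 0.]

# Sanity check: the residual x - y is perpendicular to v.
print(np.isclose((x - y) @ v, 0.0))    # -> True
```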
# Question 3

📗 [3 points] Perform k-means clustering on six points: \(x_{1}\) = , \(x_{2}\) = , \(x_{3}\) = , \(x_{4}\) = , \(x_{5}\) = , \(x_{6}\) = . Initially the cluster centers are at \(c_{1}\) = , \(c_{2}\) = . Run k-means for one iteration (assign the points, update the centers once, and reassign the points once). Break ties in distances by putting the point in the cluster with the smaller index (i.e. favor cluster 1). What is the reduction in total distortion? Use Euclidean distance and calculate the total distortion by summing the squares of the individual distances to the center.

📗 Note: the red points are the cluster centers and the other points are the training items.
Hint See Spring 2018 Midterm Q7, Fall 2016 Final Q9, Fall 2014 Midterm Q5, Fall 2012 Final Q3. (1) Find which cluster each \(x_{i}\) belongs to (call it \(k_{i}\)): it is the cluster whose center is closest to the point. (2) Compute the total distortion as \(\displaystyle\sum_{i=1}^{6} \left\|x_{i} - c_{k_{i}}\right\|^{2}\). (3) Compute the new cluster centers \(c'_{1}, c'_{2}\) as \(c'_{k} = \dfrac{1}{\displaystyle\sum_{k_{i} = k} 1} \displaystyle\sum_{k_{i} = k} x_{i}\). Then repeat (1) and (2). Take the difference between the two distortions.
📗 Answer: .
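📗 The steps in the hint can be scripted to check your hand computation. The sketch below runs one iteration of k-means on hypothetical 1D points (if your generated question uses 2D points, replace the absolute differences with Euclidean distances):

```python
import numpy as np

# Hypothetical 1D example -- substitute the points and centers for your ID.
points = np.array([1.0, 2.0, 4.0, 5.0, 7.0, 8.0])
centers = np.array([0.0, 10.0])

def assign(points, centers):
    # Distance from every point to every center; np.argmin returns the first
    # minimum, so ties go to the cluster with the smaller index.
    return np.abs(points[:, None] - centers[None, :]).argmin(axis=1)

def distortion(points, centers, labels):
    # Total distortion: sum of squared distances to the assigned centers.
    return np.sum((points - centers[labels]) ** 2)

labels = assign(points, centers)
before = distortion(points, centers, labels)

# Update each center to the mean of its assigned points, then reassign.
new_centers = np.array([points[labels == k].mean() for k in range(2)])
new_labels = assign(points, new_centers)
after = distortion(points, new_centers, new_labels)

print(before - after)    # reduction in total distortion
```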
# Question 4

📗 [4 points] Consider the four points: \(x_{1}\) = , \(x_{2}\) = , \(x_{3}\) = , \(x_{4}\) = . Let there be two initial cluster centers \(c_{1}\) = , \(c_{2}\) = . Use Euclidean distance. Break ties in distances by putting the point in the cluster with the smaller index (i.e. favor cluster 1). If a cluster contains no points, do not move the cluster center (it stays at the initial position). Write down the cluster centers after one iteration of k-means: the first cluster center (comma separated vector) on the first line and the second cluster center (comma separated vector) on the second line.

📗 Note: the red points are the cluster centers and the other points are the training items.
Hint See Fall 2019 Midterm Q22, Spring 2018 Midterm Q7, Fall 2017 Final Q22, Spring 2017 Midterm Q5, Fall 2014 Final Q20, Fall 2013 Final Q14, Fall 2006 Final Q14, Fall 2005 Final Q14. Find which cluster each \(x_{i}\) belongs to (call it \(k_{i}\)): it is the cluster whose center is closest to the point. Compute the new cluster centers \(c'_{1}, c'_{2}\) as \(c'_{k} = \dfrac{1}{\displaystyle\sum_{k_{i} = k} 1} \displaystyle\sum_{k_{i} = k} x_{i}\).
📗 Answer (matrix with multiple lines, each line is a comma separated vector): .
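📗 This question adds one rule on top of the previous sketch: a cluster that receives no points keeps its initial center. A minimal sketch of the assignment and update steps, with hypothetical 2D points:

```python
import numpy as np

# Hypothetical 2D example -- substitute the points and centers for your ID.
points = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [9.0, 9.0]])
centers = np.array([[0.0, 0.0], [100.0, 100.0]])   # cluster 2 may end up empty

# Assign each point to the nearest center (Euclidean distance); argmin takes
# the first minimum, so ties favor cluster 1.
dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
labels = dists.argmin(axis=1)

# Update: mean of the assigned points, or keep the initial center if empty.
new_centers = np.array([
    points[labels == k].mean(axis=0) if np.any(labels == k) else centers[k]
    for k in range(len(centers))
])
print(new_centers)   # first line is c'_1, second line is c'_2
```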
# Question 5

📗 [4 points] Perform hierarchical clustering with linkage in one-dimensional space on the following points: , , , , , . Break ties in distances by first combining the instances with the smallest index (the one that appears earliest in the list). Draw the cluster tree.
📗 Note: the node \(C_{1}\) should be the first cluster formed, \(C_{2}\) should be the second, and so on. All edges should point to the instances (or other clusters) that belong to the cluster.
Hint See Fall 2019 Midterm Q20, Spring 2018 Midterm Q5, Fall 2017 Final Q17, Fall 2016 Midterm Q10, Fall 2016 Final Q8, Fall 2014 Midterm Q1, Fall 2012 Final Q2, Fall 2010 Final Q12, Fall 2006 Midterm Q4. Start with 6 clusters with one point each. (1) Find the two clusters that are closest to each other (measure the distance between two clusters by either the smallest pairwise distance between points in the two clusters (single linkage) or the largest pairwise distance (complete linkage)). Draw edges from a new cluster node \(C_{i}\) to these two existing clusters (or instances). (2) Repeat (1) until all instances are in one cluster.
📗 Answer: 



📗 Note: to erase an edge, draw the same edge again.
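📗 If you want to double-check the merge order you drew, the hint's procedure can be sketched in a few lines of Python. The example below uses hypothetical 1D points and single linkage (switch `linkage` to `max` for complete linkage); ties are broken roughly as in the question because pairs are scanned in index order:

```python
import itertools

# Hypothetical 1D points -- substitute the six values generated for your ID.
points = [1.0, 2.0, 4.0, 8.0, 9.0, 10.0]
linkage = min    # single linkage; use max for complete linkage

# Each cluster is a list of point indices; start with one cluster per point.
clusters = [[i] for i in range(len(points))]
step = 0
while len(clusters) > 1:
    # Closest pair of clusters under the chosen linkage; scanning pairs in
    # index order means ties keep the pair that appears earliest.
    a, b = min(
        itertools.combinations(range(len(clusters)), 2),
        key=lambda ab: linkage(
            abs(points[i] - points[j])
            for i in clusters[ab[0]] for j in clusters[ab[1]]
        ),
    )
    step += 1
    print(f"C{step}: merge {clusters[a]} and {clusters[b]}")
    clusters[a] = clusters[a] + clusters[b]
    del clusters[b]
```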
# Question 6

📗 [4 points] Perform hierarchical clustering with linkage in one-dimensional space on the following points: , , , , , . Break ties in distances by first combining the instances with the smallest index (the one that appears earliest in the list). Draw the cluster tree.
📗 Note: the node \(C_{1}\) should be the first cluster formed, \(C_{2}\) should be the second, and so on. All edges should point to the instances (or other clusters) that belong to the cluster.
Hint See Fall 2019 Midterm Q20, Spring 2018 Midterm Q5, Fall 2017 Final Q17, Fall 2016 Midterm Q10, Fall 2016 Final Q8, Fall 2014 Midterm Q1, Fall 2012 Final Q2, Fall 2010 Final Q12, Fall 2006 Midterm Q4. Start with 6 clusters with one point each. (1) Find the two clusters that are closest to each other (measure the distance between two clusters by either the smallest pairwise distance between points in the two clusters (single linkage) or the largest pairwise distance (complete linkage)). Draw edges from a new cluster node \(C_{i}\) to these two existing clusters (or instances). (2) Repeat (1) until all instances are in one cluster.
📗 Answer: 



📗 Note: to erase an edge, draw the same edge again.
# Question 7

📗 [3 points] Consider the 1D data set: \(x_{i} = i\) for \(i\) = to . To select good initial centers for k-means where \(k\) = , let's set \(c_{1}\) = . Then select each \(c_{j}\) from the unused points in the data set so that it is farthest from the already-selected centers \(c_{1}, ..., c_{j-1}\), in the sense that it maximizes the distance to its nearest selected center (i.e. \(c_{j} = \mathop{\mathrm{argmax}}_{x_{i}} \displaystyle\min\left\{d\left(c_{1}, x_{i}\right), d\left(c_{2}, x_{i}\right), ..., d\left(c_{j-1}, x_{i}\right)\right\}\)). Enter the initial centers (including \(c_{1}\)) in increasing order (from the smallest to the largest). In case of ties, select the smaller number.
Hint See Fall 2011 Midterm Q2. Apply the formula repeatedly: \(c_{j} = \mathop{\mathrm{argmax}}_{x_{i}} \displaystyle\min\left\{d\left(c_{1}, x_{i}\right), d\left(c_{2}, x_{i}\right), ..., d\left(c_{j-1}, x_{i}\right)\right\}\).
📗 Answer (comma separated vector): .
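📗 The repeated argmax-min in the hint is easy to automate. A minimal sketch, assuming the hypothetical setting \(x_{i} = i\) for \(i = 1, ..., 10\), \(k = 3\), and \(c_{1} = 1\) (substitute the values generated for your ID):

```python
import numpy as np

# Hypothetical setting: x_i = i for i = 1..10, k = 3, c_1 = 1.
data = np.arange(1, 11, dtype=float)
k = 3
centers = [1.0]

while len(centers) < k:
    # Distance from each point to its nearest already-selected center.
    nearest = np.min(np.abs(data[:, None] - np.array(centers)[None, :]), axis=1)
    # np.argmax returns the first maximum; since the data is listed in
    # increasing order, ties select the smaller number, as required.
    centers.append(data[np.argmax(nearest)])

print(sorted(centers))   # initial centers in increasing order
```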
# Question 8

📗 [4 points] You are given the distance table. Consider the next iteration of hierarchical agglomerative clustering (another name for the hierarchical clustering method we covered in the lectures) using linkage. What will the new values be in the resulting distance table corresponding to the new clusters? If you merge two columns (rows), put the new distances in the column (row) with the smaller index. For example, if you merge columns 2 and 4, the new column 2 should contain the new distances and column 4 should be removed, i.e. the columns and rows should be in the order (1), (2 and 4), (3), (5).
\(d\) =
Hint See Spring 2017 Midterm Q4. The resulting matrix should have 4 columns and 4 rows. Find the smallest non-zero number in the pairwise distance matrix; suppose it is in row \(i\) and column \(j\). Merge columns \(i\) and \(j\) and rows \(i\) and \(j\) at the same time: for single linkage, take the minimum of the numbers in the two rows and columns; for complete linkage, take the maximum.
📗 Answer (matrix with multiple lines, each line is a comma separated vector): .
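📗 The row/column merge described in the hint can be sketched with NumPy. The 5×5 distance table below is made up (use the one generated for your ID), and `np.minimum` implements single linkage (swap in `np.maximum` for complete linkage):

```python
import numpy as np

# Hypothetical symmetric 5x5 distance table -- substitute yours.
d = np.array([
    [ 0.,  2.,  6., 10.,  9.],
    [ 2.,  0.,  5.,  9.,  8.],
    [ 6.,  5.,  0.,  4.,  5.],
    [10.,  9.,  4.,  0.,  3.],
    [ 9.,  8.,  5.,  3.,  0.],
])
linkage = np.minimum   # single linkage; use np.maximum for complete linkage

# Find the smallest non-zero entry (the closest pair of clusters).
i, j = np.unravel_index(np.argmin(np.where(d > 0, d, np.inf)), d.shape)
i, j = min(i, j), max(i, j)

# Merge row/column j into row/column i using the chosen linkage, then drop j.
d[i, :] = linkage(d[i, :], d[j, :])
d[:, i] = d[i, :]
d = np.delete(np.delete(d, j, axis=0), j, axis=1)
d[i, i] = 0.0
print(d)   # the resulting 4x4 distance table
```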
# Question 9

📗 [4 points] Suppose K-Means with \(K = 2\) is used to cluster the data set, and the initial cluster centers are \(c_{1}\) = and \(c_{2}\) = \(x\). What is the value of \(x\) if cluster 1 has \(n\) = points initially (before updating the cluster centers)? Break ties by assigning the point to cluster 2.
Hint The \(n\) points on the left (or right, depending on the question) should be assigned to cluster 1. The \(n + 1\)-th point from the left (or right), call it \(x_{n + 1}\), can be equidistant from the cluster 1 center and the cluster 2 center, because when the distances to the two centers are equal, the point is assigned to cluster 2 due to the tie-breaking rule. Therefore, \(x_{n + 1} = \dfrac{1}{2} \left(c_{1} + c_{2}\right)\), i.e. \(c_{2} = 2 x_{n + 1} - c_{1}\), can be used to solve for \(c_{2}\).
📗 Answer: .
# Question 10

📗 [1 point] Please enter any comments and suggestions, including possible mistakes and bugs with the questions and the auto-grading, and materials relevant to solving the questions that you think are not covered well during the lectures. If you have no comments, please enter "None": do not leave it blank.
📗 Answer: .

# Grade



# Submission


📗 Please do not modify the content in the above text field: use the "Grade" button to update.


📗 Please wait for the message "Successful submission." to appear after you click the "Submit" button. If there is an error message, or no message appears after 10 seconds, please save the text in the text box above to a file using the button, or copy and paste it into a file yourself, and submit it to Canvas Assignment M8. You can submit multiple times (but please do not submit too often): only the latest submission will be counted.
📗 You can load your answers from the text (or a txt file) in the text box below using the button . The first two lines should be "##m: 8" and "##id: your id", and each of the remaining lines should have the format "##1: your answer to question 1", then "##2: your answer to question 2" on a new line, and so on. Please make sure that your answers are loaded correctly before submitting them.



# Solutions

📗 Some of the past exams referenced in the Hints can be found on Professor Zhu's and Professor Dyer's websites: Link and Link.
📗 Some of the questions are from last year, and I recorded videos going through them; the links are at the bottom of the Week 1 to Week 8 pages, for example: W4 and W8.
📗 Links to the solutions that students volunteer to share on Piazza will be collected in this post around the official deadline: Link.





Last Updated: November 18, 2024 at 11:43 PM