📗 Enter your ID (the wisc email ID without @wisc.edu) here: and click (or hit the "Enter" key).
📗 The same ID should generate the same set of questions. Your answers are not saved when you close the browser. You could print the page, solve the problems, then enter all your answers at the end.
📗 Please do not refresh the page: your answers will not be saved.
📗 [3 points] Let \(x\) = and \(v\) = . The projection of \(x\) onto \(v\) is the point \(y\) on the direction of \(v\) such that the line connecting \(x, y\) is perpendicular to \(v\). Compute \(y\).
Hint
See Fall 2018 Midterm Q14. To compute the projection: if \(v\) is a unit vector \(\left\|v\right\| = 1\), use the simplified formula: \(\left(v^\top x\right) v\); otherwise, use the formula: \(\dfrac{v^\top x}{v^\top v} v\).
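The projection formula in the hint can be sketched in a few lines of NumPy; the values of \(x\) and \(v\) below are hypothetical stand-ins for the ones the quiz generates.

```python
import numpy as np

# Hypothetical values for x and v (the quiz supplies its own).
x = np.array([2.0, 3.0])
v = np.array([1.0, 1.0])

# Projection of x onto v: (v^T x / v^T v) v.  If v is a unit vector,
# the denominator v^T v is 1 and this reduces to (v^T x) v.
y = (v @ x) / (v @ v) * v
print(y)  # [2.5 2.5]
```

As a sanity check, the difference \(x - y\) should be perpendicular to \(v\), i.e. `(x - y) @ v` should be (numerically) zero.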
📗 Answer (comma separated vector): .
📗 [2 points] You performed PCA (Principal Component Analysis) in \(\mathbb{R}^{3}\). Suppose the first principal component is \(u_{1}\) = \(\approx\) and the second principal component is \(u_{2}\) = \(\approx\) . What are the new 2D coordinates (new features created by PCA) for the point \(x\) = ?
📗 In the diagram, the black axes are the original axes, the green axes are the PCA axes, the red vector is \(x\), the red point is the reconstruction \(\hat{x}\) using the PCA axes.
Hint
See Fall 2018 Midterm Q13, Fall 2017 Final Q10. Coordinate \(i\) is given by the projection of \(x\) onto the principal component \(v_{i}\). If the principal component is a unit vector \(u_{i}\), use the simplified formula: \(u_{i}^\top x\); otherwise, use the formula: \(\dfrac{v_{i}^\top x}{v_{i}^\top v_{i}}\).
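The new-coordinate computation from the hint is just two dot products; the components and point below are hypothetical placeholders for the generated values, with \(u_{1}, u_{2}\) assumed to be unit vectors.

```python
import numpy as np

# Hypothetical unit principal components and point (the quiz supplies its own).
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
x = np.array([3.0, 4.0, 5.0])

# New coordinate i is u_i^T x when u_i is a unit vector.
new_coords = np.array([u1 @ x, u2 @ x])
print(new_coords)  # [3. 4.]
```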
📗 Answer (comma separated vector): .
📗 [3 points] Given the variance matrix \(\hat{\Sigma}\) = . If one original data point is \(x\) = . What is the reconstructed vector using only the first principal components?
Hint
First find the principal components, call them \(u_{1}, u_{2}, u_{3}\): the first principal component is the eigenvector corresponding to the largest eigenvalue (for diagonal matrices, the eigenvalues are just the diagonal entries), and the \(i\)th principal component is the eigenvector corresponding to the \(i\)th largest eigenvalue. Here, the eigenvector corresponding to the first eigenvalue is \(\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}\), the eigenvector corresponding to the second eigenvalue is \(\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}\), and the eigenvector corresponding to the third eigenvalue is \(\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}\). Then find the new feature vector in the \(K\) dimensional space: the \(i\)th component in the new feature vector is \(u_{i}^\top x\), meaning the new feature vector here (with \(K = 2\)) is \(\begin{bmatrix} v_{1} \\ v_{2} \end{bmatrix}\) = \(\begin{bmatrix} u_{1}^\top x \\ u_{2}^\top x \end{bmatrix}\). At the end, the reconstructed vector is given by \(v_{1} u_{1} + v_{2} u_{2}\) = \(u_{1}^\top x u_{1} + u_{2}^\top x u_{2}\).
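The full pipeline in the hint (eigenvectors sorted by eigenvalue, project, reconstruct) can be sketched as follows; the diagonal variance matrix and point are hypothetical stand-ins, with \(K = 2\) as in the hint.

```python
import numpy as np

# Hypothetical diagonal variance matrix; its eigenvectors are the standard
# basis vectors and its eigenvalues are the diagonal entries.
Sigma = np.diag([9.0, 4.0, 1.0])
x = np.array([2.0, 3.0, 4.0])

eigvals, eigvecs = np.linalg.eigh(Sigma)   # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]          # reorder: largest eigenvalue first
U = eigvecs[:, order[:2]]                  # keep the first K = 2 components

# Reconstruction: sum_i (u_i^T x) u_i, written as one matrix product.
x_hat = U @ (U.T @ x)
print(x_hat)  # [2. 3. 0.]
```

The third coordinate is zeroed out because the dropped component (smallest eigenvalue) is the third basis vector.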
📗 Answer (comma separated vector):
📗 [4 points] Perform hierarchical clustering with linkage in one-dimensional space on the following points: , , , , , . Break ties in distances by first combining the instances with the smallest index (appears earliest in the list). Draw the cluster tree.
📗 Note: the node \(C_{1}\) should be the first cluster formed, \(C_{2}\) should be the second and so on. All edges should point to the instances (or other clusters) that belong to the cluster.
Hint
See Fall 2019 Midterm Q20, Spring 2018 Midterm Q5, Fall 2017 Final Q17, Fall 2016 Midterm Q10, Fall 2016 Final Q8, Fall 2014 Midterm Q1, Fall 2012 Final Q2, Fall 2010 Final Q12, Fall 2006 Midterm Q4. Start with 6 clusters with one point each. (1) Find the two clusters that are the closest to each other (measure the distance between two clusters by either the smallest pairwise distance of points in the clusters (single linkage) or the largest pairwise distance (complete linkage)). Draw edges from a new cluster node \(C_{i}\) to these two existing clusters (or instances). (2) Repeat (1) until all instances are in one cluster.
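The repeat-until-one-cluster loop in the hint can be sketched on hypothetical 1D points; pass `linkage=min` for single linkage or `linkage=max` for complete linkage. Iterating pairs in index order with a strict `<` comparison reproduces the tie-breaking rule (earliest pair wins).

```python
# A minimal sketch of agglomerative clustering on 1D points.
def hierarchical(points, linkage=min):
    # start with one singleton cluster per point
    clusters = [[p] for p in points]
    merges = []  # merges[i] records the two clusters joined to form C_{i+1}
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # cluster distance: min (single) or max (complete) pairwise distance
                d = linkage(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:  # strict <: ties keep earliest pair
                    best = (d, i, j)
        _, i, j = best
        merges.append((clusters[i][:], clusters[j][:]))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return merges

# Hypothetical points; single linkage merges {1,2}, then {1,2}+{4}, then +{8}.
print(hierarchical([1, 2, 4, 8]))
```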
📗 Answer:
graph
📗 Note: to erase an edge, draw the same edge again.
📗 [4 points] Perform hierarchical clustering with linkage in one-dimensional space on the following points: , , , , , . Break ties in distances by first combining the instances with the smallest index (appears earliest in the list). Draw the cluster tree.
📗 Note: the node \(C_{1}\) should be the first cluster formed, \(C_{2}\) should be the second and so on. All edges should point to the instances (or other clusters) that belong to the cluster.
Hint
See Fall 2019 Midterm Q20, Spring 2018 Midterm Q5, Fall 2017 Final Q17, Fall 2016 Midterm Q10, Fall 2016 Final Q8, Fall 2014 Midterm Q1, Fall 2012 Final Q2, Fall 2010 Final Q12, Fall 2006 Midterm Q4. Start with 6 clusters with one point each. (1) Find the two clusters that are the closest to each other (measure the distance between two clusters by either the smallest pairwise distance of points in the clusters (single linkage) or the largest pairwise distance (complete linkage)). Draw edges from a new cluster node \(C_{i}\) to these two existing clusters (or instances). (2) Repeat (1) until all instances are in one cluster.
📗 Answer:
graph
📗 Note: to erase an edge, draw the same edge again.
📗 [4 points] You are given the distance table. Consider the next iteration of hierarchical clustering using linkage. What will the new values be in the resulting distance table corresponding to the new clusters? If you merge two columns (rows), put the new distances in the column (row) with the smaller index. For example, if you merge columns 2 and 4, the new column 2 should contain the new distances and column 4 should be removed, i.e. the columns and rows should be in the order (1), (2 and 4), (3).
\(d\) = .
Hint
See Spring 2017 Midterm Q4. The resulting matrix should have 4 columns and 4 rows. Find the smallest non-zero number in the pairwise distance matrix, say at row \(i\) and column \(j\); merge columns \(i\) and \(j\) and rows \(i\) and \(j\) at the same time: for single linkage, take the minimum of the numbers in the two rows and columns; for complete linkage, take the maximum.
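The merge step from the hint can be sketched on a hypothetical 5-by-5 distance table (the quiz supplies its own); swap `np.minimum` for `np.maximum` to get complete linkage instead of single linkage.

```python
import numpy as np

# Hypothetical symmetric distance table (the quiz supplies its own).
d = np.array([
    [0., 4., 6., 3., 7.],
    [4., 0., 2., 5., 8.],
    [6., 2., 0., 9., 1.],
    [3., 5., 9., 0., 6.],
    [7., 8., 1., 6., 0.],
])

# Find the smallest non-zero entry (mask the zero diagonal with infinity).
n = d.shape[0]
mask = d + np.where(np.eye(n, dtype=bool), np.inf, 0)
i, j = np.unravel_index(np.argmin(mask), d.shape)
i, j = min(i, j), max(i, j)

# Merge row/column j into row/column i (the smaller index):
# single linkage takes the element-wise minimum of the two rows.
merged = np.minimum(d[i], d[j])
d[i, :], d[:, i] = merged, merged
d[i, i] = 0.
d = np.delete(np.delete(d, j, axis=0), j, axis=1)
print(d)
```

Here the smallest non-zero distance is 1 between points 3 and 5 (rows 2 and 4, 0-indexed), so row/column 4 is folded into row/column 2 and then removed, leaving a 4-by-4 table.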
📗 Answer (matrix with multiple lines, each line is a comma separated vector): .
📗 [4 points] Consider the four points: \(x_{1}\) = , \(x_{2}\) = , \(x_{3}\) = , \(x_{4}\) = . Let there be two initial cluster centers \(c_{1}\) = , \(c_{2}\) = . Use Euclidean distance. Break ties in distances by putting the point in the cluster with the smaller index (i.e. favor cluster 1). If a cluster contains no points, do not move the cluster center (it stays at the initial position). Write down the cluster centers after one iteration of k-means, the first cluster center (comma separated vector) on the first line and the second cluster center (comma separated vector) on the second line.
📗 Note: the red points are the cluster centers and the other points are the training items.
Hint
See Fall 2019 Midterm Q22, Spring 2018 Midterm Q7, Fall 2017 Final Q22, Spring 2017 Midterm Q5, Fall 2014 Final Q20, Fall 2013 Final Q14, Fall 2006 Final Q14, Fall 2005 Final Q14. Find which cluster each \(x_{i}\) belongs to (call it \(k_{i}\)): it's the cluster center that is the closest to the point. Compute the new cluster centers \(c'_{1}, c'_{2}\) as \(c'_{k} = \dfrac{1}{\displaystyle\sum_{k_{i} = k} 1} \displaystyle\sum_{k_{i} = k} x_{i}\).
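One k-means iteration as described in the hint can be sketched on hypothetical 2D points and centers; `np.argmin` returns the first minimizer, which matches the "favor cluster 1" tie-breaking rule.

```python
import numpy as np

# Hypothetical points and initial centers (the quiz supplies its own).
X = np.array([[0., 0.], [1., 0.], [4., 0.], [5., 0.]])
c = np.array([[0., 0.], [5., 0.]])

# Assign each point to the nearest center (Euclidean distance);
# argmin breaks ties toward the smaller cluster index.
dist = np.linalg.norm(X[:, None, :] - c[None, :, :], axis=2)
assign = np.argmin(dist, axis=1)

# Move each center to the mean of its assigned points;
# a cluster with no points keeps its old center.
new_c = np.array([X[assign == k].mean(axis=0) if np.any(assign == k) else c[k]
                  for k in range(len(c))])
print(new_c)  # [[0.5 0. ] [4.5 0. ]]
```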
📗 Answer (matrix with multiple lines, each line is a comma separated vector): .
📗 [3 points] Perform k-means clustering on six points: \(x_{1}\) = , \(x_{2}\) = , \(x_{3}\) = , \(x_{4}\) = , \(x_{5}\) = , \(x_{6}\) = . Initially the cluster centers are at \(c_{1}\) = , \(c_{2}\) = . Run k-means for one iteration (assign the points, update center once and reassign the points once). Break ties in distances by putting the point in the cluster with the smaller index (i.e. favor cluster 1). What is the reduction in total distortion? Use Euclidean distance and calculate the total distortion by summing the squares of the individual distances to the center.
📗 Note: the red points are the cluster centers and the other points are the training items.
Hint
See Spring 2018 Midterm Q7, Fall 2016 Final Q9, Fall 2014 Midterm Q5, Fall 2012 Final Q3. (1) Find which cluster each \(x_{i}\) belongs to (call it \(k_{i}\)): it's the cluster center that is the closest to the point. (2) Compute the total distortion as \(\displaystyle\sum_{i=1}^{6} \left(x_{i} - c_{k_{i}}\right)^{2}\). (3) Compute the new cluster centers \(c'_{1}, c'_{2}\) as \(c'_{k} = \dfrac{1}{\displaystyle\sum_{k_{i} = k} 1} \displaystyle\sum_{k_{i} = k} x_{i}\). Then repeat (1) and (2). Take the difference between the two distortions.
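Steps (1)–(3) of the hint can be sketched on hypothetical 1D points; this example assumes neither cluster ends up empty, so each new center is a plain mean.

```python
import numpy as np

def distortion(X, c, assign):
    # total distortion: sum of squared Euclidean distances to assigned centers
    return np.sum((X - c[assign]) ** 2)

# Hypothetical 1D points and initial centers (the quiz supplies its own).
X = np.array([[0.], [2.], [3.], [7.], [8.], [10.]])
c = np.array([[0.], [10.]])

# (1) assign points (argmin breaks ties toward cluster 1), (2) distortion before
assign = np.argmin(np.abs(X - c.T), axis=1)
before = distortion(X, c, assign)

# (3) update centers to the cluster means, reassign, distortion after
c = np.array([X[assign == k].mean(axis=0) for k in range(2)])
assign = np.argmin(np.abs(X - c.T), axis=1)
after = distortion(X, c, assign)
print(before - after)  # the reduction in total distortion
```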
📗 Answer: .
📗 [3 points] Consider the 1D data set: \(x_{i} = i\) for \(i\) = to . To select good initial centers for k-means where \(k\) = , let's set \(c_{1}\) = . Then select \(c_{j}\) from the unused points in the data set, so that it is farthest from any already-selected centers \(c_{1}, ..., c_{j-1}\) (i.e. \(c_{j} = \mathop{\mathrm{argmax}}_{x_{i}} \displaystyle\min\left\{d\left(c_{1}, x_{i}\right), d\left(c_{2}, x_{i}\right), ..., d\left(c_{j-1}, x_{i}\right)\right\}\)). Enter the initial centers (including \(c_{1}\)) in increasing order (from the smallest to the largest). In case of ties, select the smaller number.
Hint
See Fall 2011 Midterm Q2. Use the formula repeatedly: \(c_{j} = \mathop{\mathrm{argmax}}_{x_{i}} \displaystyle\min\left\{d\left(c_{1}, x_{i}\right), d\left(c_{2}, x_{i}\right), ..., d\left(c_{j-1}, x_{i}\right)\right\}\).
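Applying the argmax-of-min formula repeatedly can be sketched as below, with hypothetical values \(x_{i} = i\) for \(i = 1\) to \(10\), \(k = 3\), and \(c_{1} = 1\); scanning the points in increasing order with a strict `>` comparison implements the "select the smaller number" tie-break.

```python
# Farthest-point initialization on hypothetical data (the quiz supplies its own).
points = list(range(1, 11))
centers = [1]
k = 3
while len(centers) < k:
    # pick the unused point maximizing the distance to its nearest center
    best, best_d = None, -1
    for p in points:
        if p in centers:
            continue
        d = min(abs(p - c) for c in centers)
        if d > best_d:  # strict >: ties keep the smaller number
            best, best_d = p, d
    centers.append(best)
print(sorted(centers))  # [1, 5, 10]
```

With these values, 10 is farthest from 1, and then 5 and 6 tie at distance 4 from the nearest center, so 5 is selected.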
📗 Answer (comma separated vector): .
📗 [4 points] Suppose K-Means with \(K = 2\) is used to cluster the data set and the initial cluster centers are \(c_{1}\) = and \(c_{2}\) = \(x\). What is the smallest value of \(x\) such that cluster 1 contains \(n\) = points initially (before the cluster centers are updated)? Break ties by assigning the point to cluster 2.
📗 Note: the red points are the cluster centers and the other points are the training items.
Hint
The \(n\) points on the left (or right, depending on the question) should be assigned to cluster 1. The \(n + 1\)-th point from the left (or right), call it \(x_{n + 1}\), can be equidistant from the cluster 1 center and the cluster 2 center, because when the distances to the two centers are equal, the point is assigned to cluster 2 due to the tie-breaking rule. Therefore, \(x_{n + 1} = \dfrac{1}{2} \left(c_{1} + c_{2}\right)\) can be used to solve for \(c_{2}\).
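Solving the midpoint equation in the hint for \(c_{2}\) is one line of algebra; the values below are hypothetical (suppose \(c_{1} = 0\) and the \((n+1)\)-th point is at 5).

```python
# The boundary case puts x_{n+1} exactly halfway between the centers:
#   x_{n+1} = (c1 + c2) / 2   =>   c2 = 2 * x_{n+1} - c1
# Hypothetical values: c1 = 0 and x_{n+1} = 5 (the quiz supplies its own).
c1, x_next = 0.0, 5.0
c2 = 2 * x_next - c1
print(c2)  # 10.0
```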
📗 Answer: .
📗 [1 point] Please enter any comments and suggestions, including possible mistakes and bugs with the questions and the auto-grading, and materials relevant to solving the questions that you think are not covered well during the lectures. If you have no comments, please enter "None": do not leave it blank.
📗 You could save the text in the above text box to a file using the button, or copy and paste it into a file yourself.
📗 You could load your answers from the text (or txt file) in the text box below using the button. The first two lines should be "##m: 8" and "##id: your id", and the format of the remaining lines should be "##1: your answer to question 1" newline "##2: your answer to question 2", etc. Please make sure that your answers are loaded correctly before submitting them.