📗 Enter your ID here: and click one of the buttons 1, 2, 3, 4, 5, 6, 7, 8, 9, 10.
📗 The same ID should generate the same set of questions. Your answers are not saved when you close the browser. You could print the page, solve the problems, then enter all your answers at the end.
📗 Some of the referenced past exams can be found on Professor Zhu's and Professor Dyer's websites: Link and Link.
📗 Please do not refresh the page: your answers will not be saved. You can save and load your answers (only fill-in-the-blank questions) using the buttons at the bottom of the page.
📗 (Fall 2018 Midterm Q13) You performed PCA in \(\mathbb{R}^{3}\). If the first principal component is \(v_{1}\) = and the second principal component is \(v_{2}\) = , what are the new 2D coordinates for the point \(x\) = ?
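The generated vectors are filled in by the page, so the values below are hypothetical stand-ins. When the principal components are orthonormal (as in PCA), the new 2D coordinates are just the dot products of \(x\) with \(v_{1}\) and \(v_{2}\):

```python
import numpy as np

# Hypothetical values; the question generates its own v1, v2, and x.
v1 = np.array([1.0, 0.0, 0.0])   # first principal component (unit vector)
v2 = np.array([0.0, 1.0, 0.0])   # second principal component (unit vector)
x = np.array([2.0, 3.0, 5.0])

# With orthonormal components, the new coordinates are the dot products.
coords = np.array([x @ v1, x @ v2])
print(coords)  # [2. 3.]
```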
📗 (Fall 2018 Midterm Q14) Let \(x\) = and \(v\) = . The projection of \(x\) onto \(v\) is the point \(y\) on the direction of \(v\) such that the line connecting \(x, y\) is perpendicular to \(v\). Compute \(y\):
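A sketch of the projection formula with hypothetical values for \(x\) and \(v\) (the page fills in its own): \(y = \frac{x \cdot v}{v \cdot v} v\), which makes \(x - y\) perpendicular to \(v\).

```python
import numpy as np

# Hypothetical x and v; the question generates its own values.
x = np.array([3.0, 4.0])
v = np.array([1.0, 1.0])

# Projection of x onto the direction of v: y = (x.v / v.v) v,
# so that the residual (x - y) is perpendicular to v.
y = (x @ v) / (v @ v) * v
print(y)                              # [3.5 3.5]
print(np.isclose((x - y) @ v, 0.0))   # perpendicularity check: True
```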
📗 (Fall 2016 Final Q9, Fall 2014 Midterm Q5, Fall 2012 Final Q3) Perform k-means clustering on six points: \(x_{1}\) = , \(x_{2}\) = , \(x_{3}\) = , \(x_{4}\) = , \(x_{5}\) = , \(x_{6}\) = . Initially the cluster centers are at \(c_{1}\) = , \(c_{2}\) = . Run k-means for one iteration (assign the points, update the centers once, and reassign the points once). Break ties in distances by putting the point in the cluster with the smaller index (i.e. favor cluster 1). What is the reduction in total distortion?
📗 Note: in 1D, use the Manhattan distances, so do not square the distances when computing the distortion.
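A minimal sketch of one such iteration, with hypothetical points and initial centers (the generated values are blank above). Centers are updated as cluster means here; the distortion is computed with Manhattan (absolute) distances, per the note:

```python
# One iteration of 1D k-means: assign, update centers (mean), reassign.
# Points and initial centers are hypothetical stand-ins.
xs = [0.0, 1.0, 2.0, 6.0, 7.0, 8.0]
centers = [0.0, 8.0]

def assign(points, cs):
    # min() keeps the first center on ties, i.e. favors cluster 1
    return [min(range(len(cs)), key=lambda j: abs(x - cs[j])) for x in points]

def distortion(points, cs, labels):
    # Manhattan distortion: sum of absolute distances, not squared
    return sum(abs(x - cs[lab]) for x, lab in zip(points, labels))

labels = assign(xs, centers)
d_before = distortion(xs, centers, labels)

centers = [sum(x for x, lab in zip(xs, labels) if lab == k)
           / sum(1 for lab in labels if lab == k)
           for k in range(len(centers))]
labels = assign(xs, centers)
d_after = distortion(xs, centers, labels)
print(d_before - d_after)  # reduction in total distortion: 2.0
```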
📗 (Fall 2017 Final Q22, Fall 2014 Final Q20, Fall 2013 Final Q14) Consider the four points: \(x_{1}\) = , \(x_{2}\) = , \(x_{3}\) = , \(x_{4}\) = . Let there be two initial cluster centers \(c_{1}\) = , \(c_{2}\) = . Use Euclidean distance. Break ties in distances by putting the point in the cluster with the smaller index (i.e. favor cluster 1). Write down the cluster centers after one iteration of k-means, the first cluster center (comma separated vector) on the first line and the second cluster center (comma separated vector) on the second line.
📗 Answer (matrix with multiple lines, each line is a comma separated vector): .
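The same iteration in 2D with Euclidean distance can be sketched as follows; the four points and two centers are hypothetical, since the generated values are blank above:

```python
import numpy as np

# One assignment + one center update with Euclidean distance;
# argmin returns the first index on ties, which favors cluster 1.
X = np.array([[0.0, 0.0], [0.0, 1.0], [4.0, 0.0], [4.0, 1.0]])
centers = np.array([[0.0, 0.0], [4.0, 1.0]])

dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
labels = dists.argmin(axis=1)

centers = np.array([X[labels == k].mean(axis=0) for k in range(len(centers))])
print(centers)  # first line: cluster 1 center; second line: cluster 2 center
```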
📗 (Fall 2017 Final Q17, Fall 2016 Midterm Q10, Fall 2016 Final Q8, Fall 2014 Midterm Q1, Fall 2012 Final Q2, Fall 2010 Final Q12) Perform hierarchical clustering with single linkage in one-dimensional space on the following points: , , , , , . Break ties in distances by first combining the instances with the smallest index (appears earliest in the list). Draw the cluster tree.
📗 Note: the node \(C_{1}\) should be the first cluster formed, \(C_{2}\) should be the second, and so on. All edges should point to the instances (or other clusters) that belong to the cluster.
📗 Answer:
📗 Note: to use the eraser, drag it from one node to another to remove the (directed) edge in between. This answer cannot be saved and loaded using the "download" and "load" buttons at the bottom of the page.
📗 (Fall 2017 Final Q17, Fall 2016 Midterm Q10, Fall 2016 Final Q8, Fall 2014 Midterm Q1, Fall 2012 Final Q2, Fall 2010 Final Q12) Perform hierarchical clustering with complete linkage in one-dimensional space on the following points: , , , , , . Break ties in distances by first combining the instances with the smallest index (appears earliest in the list). Draw the cluster tree.
📗 Note: the node \(C_{1}\) should be the first cluster formed, \(C_{2}\) should be the second, and so on. All edges should point to the instances (or other clusters) that belong to the cluster.
📗 Answer:
📗 Note: to use the eraser, drag it from one node to another to remove the (directed) edge in between. This answer cannot be saved and loaded using the "download" and "load" buttons at the bottom of the page.
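The two hierarchical clustering questions above differ only in the linkage: single linkage merges the pair of clusters with the smallest minimum pairwise distance, complete linkage the pair with the smallest maximum pairwise distance. A sketch with hypothetical 1D points (the generated values are blank above); ties are broken by keeping the earliest pair found, which favors smaller indices:

```python
# Agglomerative clustering with a pluggable linkage ('single' or 'complete').
def agglomerate(points, linkage, k=1):
    clusters = [[i] for i in range(len(points))]  # each cluster: point indices
    merges = []
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                ds = [abs(points[i] - points[j])
                      for i in clusters[a] for j in clusters[b]]
                d = min(ds) if linkage == 'single' else max(ds)
                if best is None or d < best[0]:
                    best = (d, a, b)  # strict < keeps the earliest pair on ties
        d, a, b = best
        merges.append((clusters[a], clusters[b], d))
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return merges

pts = [1.0, 2.0, 4.0, 7.0, 11.0, 16.0]  # hypothetical 1D points
merges = agglomerate(pts, 'single')      # or 'complete'
for m in merges:
    print(m)  # each line: the two clusters merged and the linkage distance
```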
📗 (Fall 2011 Midterm Q2) Consider the 1D data set: \(x_{i} = i\) for \(i\) = to . To select good initial centers for k-means where \(k\) = , let's set \(c_{1}\) = . Then select each subsequent \(c_{j}\) from the unused points in the data set so that it is farthest from its nearest already-selected center among \(c_{1}, ..., c_{j-1}\) (i.e. \(\displaystyle\max_{c_{j}} \displaystyle\min\left\{d\left(c_{1}, c_{j}\right), d\left(c_{2}, c_{j}\right), ..., d\left(c_{j-1}, c_{j}\right)\right\}\)). Enter the initial centers (including \(c_{1}\)) in increasing order (from the smallest to the largest). In case of ties, select the smaller number.
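This farthest-first selection can be sketched as follows; \(n\), \(k\), and \(c_{1}\) are hypothetical, since the generated values are blank above:

```python
# Farthest-first selection of initial k-means centers on x_i = i, i = 1..n.
n, k = 10, 3
data = list(range(1, n + 1))
centers = [1]  # suppose c1 = 1

while len(centers) < k:
    unused = [x for x in data if x not in centers]
    # pick the point farthest from its nearest already-selected center;
    # the -x in the key breaks distance ties toward the smaller number
    best = max(unused, key=lambda x: (min(abs(x - c) for c in centers), -x))
    centers.append(best)

print(sorted(centers))  # [1, 5, 10]
```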
📗 (Spring 2017 Midterm Q4) You are given the distance table. Consider the next iteration of hierarchical agglomerative clustering (another name for the hierarchical clustering method we covered in the lectures) using linkage. What will the new values be in the resulting distance table corresponding to the four new clusters? If you merge two columns (rows), put the new distances in the column (row) with the smaller index. For example, if you merge columns 2 and 4, the new column 2 should contain the new distances and column 4 should be removed, i.e. the columns and rows should be in the order (1), (2 and 4), (3), (5).
\(d\) =
📗 Hint: the resulting matrix should have 4 columns and 4 rows.
📗 Answer (matrix with multiple lines, each line is a comma separated vector): .
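Updating the distance table after one merge can be sketched as follows; the table and the merged pair are hypothetical (the question generates its own), and the example mirrors the merge of clusters 2 and 4 described above. Single linkage takes the element-wise minimum of the two merged rows, complete linkage the maximum:

```python
import numpy as np

# Hypothetical symmetric 5x5 distance table.
d = np.array([[ 0.,  2.,  6., 10.,  9.],
              [ 2.,  0.,  5.,  9.,  8.],
              [ 6.,  5.,  0.,  4.,  5.],
              [10.,  9.,  4.,  0.,  3.],
              [ 9.,  8.,  5.,  3.,  0.]])
i, j = 1, 3                        # merge clusters 2 and 4 (0-indexed rows)
merged = np.minimum(d[i], d[j])    # single linkage; np.maximum for complete

d[i], d[:, i] = merged, merged     # new distances go to the smaller index
d = np.delete(np.delete(d, j, axis=0), j, axis=1)  # drop row/column j
d[i, i] = 0.0
print(d)  # 4x4 table in the order (1), (2 and 4), (3), (5)
```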
📗 Go to the K-means clustering demo Link and find an example in which the algorithm converges to a local minimum that is not a global minimum. (You can use the same one I gave during the lecture, but make sure you can replicate it.) Post a screenshot on Piazza.
📗 The data set you used is (e.g. mine is Gaussian Mixture): and I have participated in the discussion on Piazza.
📗 Please enter any comments and suggestions including possible mistakes and bugs with the questions and the auto-grading, and materials relevant to solving the questions that you think are not covered well during the lectures. If you have no comments, please enter "None": do not leave it blank.
📗 Please copy and paste the text between the *s (not including the *s) and submit it on Canvas, M8.
📗 You could save the text as a text file using the button, or just copy and paste it into a text file.
📗 Warning: the load button does not function properly for all questions; please recheck everything after you load. You could load your answers using the button from the text field: