➩ Results in a binary tree with close clusters as children.
TopHat Discussion
📗 [1 points] Given the following dataset, use hierarchical clustering to divide the points into groups. Drag one point to another point to merge them into one cluster. Click on a point to move it out of the cluster.
📗 [1 points] Move the green point so that it is within 100 pixels of the red point measured by the selected distance. Highlight the region containing all points within 100 pixels of the red point.
📗 The distance between two clusters (groups of points) can be measured by the single linkage distance, complete linkage distance, or average linkage distance (a short code sketch follows the list below).
➩ Single linkage distance: the shortest distance from any item in one cluster to any item in the other cluster: Wikipedia.
➩ Complete linkage distance: the longest distance from any item in one cluster to any item in the other cluster: Wikipedia.
➩ Average linkage distance: the average distance from any item in one cluster to any item in the other cluster (average of distances, not distance between averages): Wikipedia.
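📗 The following is a minimal sketch (not from the lecture) of how the three linkage distances could be computed with NumPy; the function names and the two example clusters are made up for illustration.
```python
import numpy as np

def pairwise_distances(A, B):
    """All Euclidean distances between points in cluster A and points in cluster B."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    return np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)

def single_linkage(A, B):
    return pairwise_distances(A, B).min()    # shortest item-to-item distance

def complete_linkage(A, B):
    return pairwise_distances(A, B).max()    # longest item-to-item distance

def average_linkage(A, B):
    return pairwise_distances(A, B).mean()   # average of all item-to-item distances

# Two small 2D clusters (made-up points).
red = [(0, 0), (1, 0)]
blue = [(3, 0), (4, 0)]
print(single_linkage(red, blue))    # 2.0
print(complete_linkage(red, blue))  # 4.0
print(average_linkage(red, blue))   # 3.0
```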
TopHat Discussion
📗 [1 points] Highlight the Euclidean distance between the two clusters (red and blue) measured by the selected linkage distance.
TopHat Quiz
(Past Exam Question)
📗 [4 points] You are given the distance table. Consider the next iteration of hierarchical agglomerative clustering (another name for the hierarchical clustering method we covered in the lectures) using linkage. What will the new values be in the resulting distance table corresponding to the new clusters? If you merge two columns (rows), put the new distances in the column (row) with the smaller index. For example, if you merge columns 2 and 4, the new column 2 should contain the new distances and column 4 should be removed, i.e. the columns and rows should be in the order (1), (2 and 4), (3), (5).
\(d\) =
📗 Answer (matrix with multiple lines, each line is a comma-separated vector):
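📗 The following is a hypothetical sketch of the distance-table update this kind of question asks about: find the closest pair of clusters, recompute the merged cluster's distances (single linkage is assumed here; complete linkage would replace min with max), keep them in the row and column with the smaller index, and delete the other row and column. The example 4-by-4 table is made up.
```python
import numpy as np

def merge_step(d):
    """One agglomerative merge step on a symmetric distance table (single linkage)."""
    d = np.asarray(d, float).copy()
    n = d.shape[0]
    # Find the closest pair of clusters, ignoring the zero diagonal.
    masked = d + np.diag([np.inf] * n)
    i, j = np.unravel_index(np.argmin(masked), masked.shape)
    i, j = min(i, j), max(i, j)
    # Single-linkage update: the distance from the merged cluster to every
    # other cluster is the minimum of the two old distances.
    d[i, :] = np.minimum(d[i, :], d[j, :])
    d[:, i] = d[i, :]
    d[i, i] = 0.0
    # Remove the row and column with the larger index.
    return np.delete(np.delete(d, j, axis=0), j, axis=1)

# Made-up 4-by-4 distance table: clusters 1 and 2 (distance 2) merge first.
d = np.array([[0., 2., 6., 10.],
              [2., 0., 5., 9.],
              [6., 5., 0., 4.],
              [10., 9., 4., 0.]])
print(merge_step(d))
# [[0. 5. 9.]
#  [5. 0. 4.]
#  [9. 4. 0.]]
```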
📗 The number of clusters should be chosen based on prior knowledge about the dataset.
📗 The algorithm can also stop merging as soon as all the between-cluster distances are larger than some fixed threshold.
📗 The binary tree generated by hierarchical clustering is often called a dendrogram: Wikipedia.
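📗 The following is a brief sketch (an assumed SciPy workflow, not the course's own code) of how the dendrogram and the fixed distance-threshold stopping rule mentioned above look in practice; the data is randomly generated for illustration.
```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram, fcluster
import matplotlib.pyplot as plt

# Three made-up 2D blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal((0, 0), 0.5, (10, 2)),
               rng.normal((5, 0), 0.5, (10, 2)),
               rng.normal((0, 5), 0.5, (10, 2))])

# Hierarchical agglomerative clustering with single linkage.
Z = linkage(X, method="single")

# Stop merging once all between-cluster distances exceed a fixed threshold.
labels = fcluster(Z, t=2.0, criterion="distance")
print(labels)  # cluster label for each point

# The full merge tree is the dendrogram.
dendrogram(Z)
plt.show()
```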
📗 Notes and code adapted from the course taught by Professors Jerry Zhu, Yingyu Liang, and Charles Dyer.
📗 Content from note blocks marked "optional" and content from Wikipedia and other demo links are helpful for understanding the materials, but will not be explicitly tested on the exams.
📗 If there is an issue with TopHat during the lectures, please submit your answers on paper (include your Wisc ID and answers) or this Google form Link at the end of the lecture.