# Summary
📗 Examples and quizzes:
E8
📗 Math homework:
M8
📗 Programming homework:
P4
📗 Monday lecture: 5:30 to 8:30,
Guest Link
📗 Tuesday programming office hours: 5:00 to 6:00,
Java Guest Link,
Python Guest Link
📗 Wednesday math homework office hours: 5:00 to 6:00,
Guest Link
📗 Thursday math homework office hours: 5:00 to 6:00,
Guest Link
📗 Friday office hours for other things: 5:00 to 6:00,
Guest Link
# Lectures
📗 Slides
Lecture 11:
Slides.
Lecture 12:
Slides.
📗 Videos
Lecture 11 Part 1:
Link
Lecture 11 Part 2:
Link
Lecture 11 Part 3:
Link
Lecture 12 Part 1:
Link
Lecture 12 Part 2:
Link
Lecture 12 Part 3:
Link
📗 Note
The video going through P4 grading:
Link.
The total distortion is sometimes defined as the sum of squared Euclidean distances; that is the definition used in the gradient descent step derivation.
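A minimal sketch of that definition, assuming the sum-of-squared-distances distortion mentioned above (all function names and the toy data are illustrative, not from the lecture code):

```python
import numpy as np

def total_distortion(X, centers, labels):
    # Sum of squared Euclidean distances from each point
    # to the center of its assigned cluster.
    return float(np.sum((X - centers[labels]) ** 2))

def gradient_descent_step(X, centers, labels, lr=0.1):
    # d(distortion)/d(center_k) = -2 * sum_{x in cluster k} (x - center_k),
    # so a gradient step moves each center toward the mean of its cluster.
    new_centers = centers.copy()
    for k in range(len(centers)):
        members = X[labels == k]
        if len(members) > 0:
            grad = -2.0 * np.sum(members - centers[k], axis=0)
            new_centers[k] = centers[k] - lr * grad
    return new_centers
```

With learning rate 1/(2 * cluster size), one step jumps each center exactly to its cluster mean, which recovers the usual K-means center update.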
# Other Materials
📗 Relevant websites
Image Segmentation:
Link 1,
Link 2
K Means Clustering:
Link
K Gaussian Mixture:
Link
Tree of Life:
Link 1,
Link 2
Generative Adversarial Net:
Link
Principal Component:
Link 1,
Link 2
Eigen Face:
Link 1,
Link 2
t-distributed Stochastic Neighbor Embedding:
Link
Swiss Roll:
Link
tSNE Demo:
Link
PCA Proofs from Professor Jerry Zhu's 540 notes:
PDF File
📗 YouTube videos from 2019
What is the relationship between Naive Bayes and Logistic Regression?
Link
What is the relationship between K Means and Gradient Descent?
Link
Why is PCA solving eigenvalues and eigenvectors?
Part 1,
Part 2,
Part 3
How to update distance table for hierarchical clustering?
Link
How to update cluster centers for K-means clustering?
Link
How to compute projection?
Link
How to compute new features based on PCA?
Link
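The distance-table question above can be sketched in code. This assumes single linkage, where the merged cluster's distance to any other cluster is the minimum of the two old distances (the function name and matrix layout are illustrative):

```python
import numpy as np

def single_linkage_update(D, a, b):
    # D is a symmetric distance table between current clusters.
    # When clusters a and b merge, the new cluster's distance to any other
    # cluster c is min(D[a, c], D[b, c]); complete linkage would use max,
    # and average linkage a size-weighted mean.
    D = D.copy()
    merged = np.minimum(D[a], D[b])
    D[a, :] = merged          # reuse row/column a for the merged cluster
    D[:, a] = merged
    D[a, a] = 0.0
    keep = [i for i in range(len(D)) if i != b]
    return D[np.ix_(keep, keep)]   # drop cluster b's row and column
```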
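The projection and new-features questions above can also be sketched together. This assumes the standard eigendecomposition view of PCA: center the data, diagonalize the covariance matrix, and read the new features off as dot products with the principal directions (the toy data is illustrative):

```python
import numpy as np

# Toy data: rows are points.
X = np.array([[2.0, 0.0], [0.0, 1.0], [-2.0, 0.0], [0.0, -1.0]])

# Center the data, then take eigenvectors of the covariance matrix;
# the principal components are the eigenvectors with the largest eigenvalues.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / len(X)
eigvals, eigvecs = np.linalg.eigh(cov)     # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]             # columns = principal directions

# Projection of a centered point x onto a unit direction u is (x . u) u;
# the "new features" are the scalar coordinates x . u for each component.
new_features = Xc @ components
```

The first column of `new_features` is the coordinate along the top principal component; keeping only the first few columns gives the reduced-dimension representation.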
Last Updated: November 09, 2021 at 12:30 AM