CS760, Fall 2017
Department of Computer Sciences
University of Wisconsin–Madison
This schedule is tentative and subject to change. Please check back often.
The reading is not required but strongly recommended for all students. Readings explicitly noted as optional are intended for students interested in that specific topic. "A, B; C; D" means read (A OR B) AND C AND D. Text in red links to the reading material. Abbreviations for textbooks:
Date | Lecture | Readings | Homework Released | Homework Due |
---|---|---|---|---|
Wed, Sep 6 | course overview [Slides] | | Homework 1 (background test) | |
Fri, Sep 8 | machine learning overview [Slides] | Mitchell chapter 1, Murphy chapter 1; Dietterich, Nature Encyclopedia of Cognitive Science, 2003; Jordan and Mitchell, Science, 2015 | ||
Mon, Sep 11 | decision tree learning part 1 [Slides] | Mitchell chapter 3, Murphy chapter 16.2, Shalev-Shwartz and Ben-David chapter 18 | ||
Wed, Sep 13 | decision tree learning part 2 [Slides] | Optional papers to read: 1. CART paper: Breiman, Leo; Friedman, J. H.; Olshen, R. A.; Stone, C. J. (1984). Classification and regression trees. Monterey, CA: Wadsworth & Brooks/Cole Advanced Books & Software. ISBN 978-0-412-04841-8. 2. ID3 paper: Quinlan, J. R. 1986. Induction of Decision Trees. Mach. Learn. 1, 1 (Mar. 1986), 81–106. 3. C4.5 paper: Quinlan, J. R. C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers, 1993. | ||
Fri, Sep 15 | instance-based methods [Slides] | Mitchell chapter 8, Shalev-Shwartz and Ben-David chapter 19 | ||
Mon, Sep 18 | evaluating learning algorithms part 1 [Slides] | Mitchell chapter 5, Murphy 5.7.2, 6.5.3; Manning et al., Sections 8.3-8.4 | ||
Wed, Sep 20 | evaluating learning algorithms part 2 [Slides] | | | Homework 1 due |
Fri, Sep 22 | no class | |||
Mon, Sep 25 | Naive Bayes [Slides] | Mitchell 6.1-6.10, Murphy 3 | ||
Wed, Sep 27 | regression (linear and logistic) [Slides] | Murphy 8.1-3 and 8.6; Maximum Likelihood, Logistic Regression, and Stochastic Gradient Training | ||
Fri, Sep 29 | neural networks part 1 [Slides] | Mitchell chapter 4, Murphy 16.5 and 28, Bishop 5.1-5.3, Goodfellow-Bengio-Courville 6; LeCun et al., Nature, 2015 | Homework 2 | |
Mon, Oct 2 | neural networks part 2 [Slides] | Mohri-Rostamizadeh-Talwalkar Appendix B (Convex Optimization), Bishop Appendix E (Lagrange Multipliers); Goodfellow-Bengio-Courville chapters 7 and 8; Optional paper to read: 1. Bishop, Neural Computation, 1995. | ||
Wed, Oct 4 | neural networks part 3 [Slides] | Goodfellow-Bengio-Courville chapter 9; Optional papers to read: 1. LeNet 2. AlexNet 3. ResNet | ||
Fri, Oct 6 | neural networks part 4 [Slides] | Goodfellow-Bengio-Courville chapter 10; Optional papers to read: 1. LSTM 2. GRU | ||
Mon, Oct 9 | neural networks part 5 [Slides] | Optional papers to read: 1. Deep Boltzmann Machines (DBM) 2. Generative Adversarial Networks (GAN) | ||
Wed, Oct 11 | learning theory part 1: PAC model [Slides] | Mohri-Rostamizadeh-Talwalkar Chapter 2, Mitchell chapter 7 | ||
Fri, Oct 13 | learning theory part 2: mistake-bound model [Slides] | Optional paper to read: 1. Littlestone, N.; Warmuth, M. (1994). The Weighted Majority Algorithm. Information and Computation. | Homework 3 | Homework 2 due |
Mon, Oct 16 | learning theory part 3: bias-variance decomposition [Slides] | Geman et al., Neural Computation, 1992 (Sections 1-3) | ||
Wed, Oct 18 | Bayesian networks part 1 [Slides] | Mitchell chapter 6, Bishop chapter 8.1, Shalev-Shwartz and Ben-David chapter 24 | ||
Fri, Oct 20 | Bayesian networks part 2 [Slides] | Heckerman Tutorial | ||
Mon, Oct 23 | Bayesian networks part 3 [Slides] | Optional papers to read: 1. TAN algorithm (Friedman et al., Machine Learning, 1997) 2. Sparse Candidate algorithm (Friedman, Nachman, and Peer, UAI, 1999) | ||
Wed, Oct 25 | discriminative vs. generative learning [Slides] | Ng and Jordan, NIPS 2001 | ||
Fri, Oct 27 | support vector machines part 1 [Slides] | Andrew Ng's note on SVM, Ben-Hur and Weston's note; Mohri-Rostamizadeh-Talwalkar Appendix B (Convex Optimization), Bishop Appendix E (Lagrange Multipliers) | Homework 4 | Homework 3 due |
Mon, Oct 30 | support vector machines part 2 [Slides] | Bishop chapters 6.1, 6.2, and 7.1, or Shalev-Shwartz and Ben-David chapters 15 and 16 | ||
Wed, Nov 1 | ensemble methods [Slides] | Dietterich, AI Magazine 1997 | ||
Fri, Nov 3 | feature selection [Slides] | |||
Mon, Nov 6 | dimension reduction [Slides] | Bishop 12.1 and 12.3, or Shalev-Shwartz and Ben-David 22 and 23 | ||
Wed, Nov 8 | reinforcement learning part 1 [Slides] | Mitchell Chapter 13 | ||
Fri, Nov 10 | reinforcement learning part 2 [Slides] | Reinforcement Learning: A Survey by Kaelbling et al. | ||
Mon, Nov 13 | active learning [Slides] | Two faces of active learning by Sanjoy Dasgupta, 2011. Optional: Introduction to Semi-Supervised Learning by Jerry Zhu, Chapters 1-3, 5. | Homework 5 | |
Wed, Nov 15 | machine learning in practice [Slides] | Domingos, CACM, 2012 | | Project proposal due |
Fri, Nov 17 | | | | Homework 4 due |
Mon, Nov 27 | | | | Homework 5 due |
Dec 19 | | | | Project report due |
Dec 19, 2:45–4:45 PM | Final Exam (Place: PSYCHOLOGY 113) | | | |