Machine Learning

CS760, Fall 2018
Department of Computer Sciences
University of Wisconsin–Madison


Important Notes

This schedule is tentative and subject to change, so please check back often. In particular, the deadlines for the homework sets and the project are tentative; see Canvas for the actual deadlines.

The homework problem sets can be found on the Coursework page.

How to interpret the Readings column in the Tentative Schedule

The readings are not required but are strongly recommended for all students; those explicitly noted as optional are intended for students interested in that specific topic. "A, B; C; D" means read (A or B), and C, and D. Text in red links to the reading material. Abbreviations for textbooks:
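The reading-list convention above can be sketched as a small parser (a minimal illustration only; `parse_readings` is a hypothetical helper, not part of the course materials):

```python
def parse_readings(spec: str) -> list[list[str]]:
    """Parse a readings string like "A, B; C; D" into groups of
    alternatives: semicolons separate required items (AND), and
    commas within an item separate interchangeable choices (OR).
    "A, B; C; D" -> [["A", "B"], ["C"], ["D"]],
    i.e. read (A or B), and C, and D."""
    return [[alt.strip() for alt in group.split(",")]
            for group in spec.split(";")]

# Example: the Sep 7 readings — pick one of the two chapters,
# and read both of the survey articles.
groups = parse_readings(
    "Mitchell chapter 1, Murphy chapter 1; "
    "Dietterich 2003; Jordan and Mitchell 2015")
# groups == [["Mitchell chapter 1", "Murphy chapter 1"],
#            ["Dietterich 2003"], ["Jordan and Mitchell 2015"]]
```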

Tentative Schedule

Please view the PDF slides in Adobe PDF Reader; some notation does not render correctly in other viewers, such as Chrome's built-in one.
Each entry lists the date, lecture topic [Slides], readings, and any homework released or due.
Wed, Sep 5 course overview [Slides] (Homework 1 released)
Fri, Sep 7 machine learning overview [Slides] Mitchell chapter 1, Murphy chapter 1;
Dietterich, Nature Encyclopedia of Cognitive Science, 2003;
Jordan and Mitchell, Science, 2015
Mon, Sep 10 decision tree learning part 1 [Slides] Mitchell chapter 3, Murphy chapter 16.2, Shalev-Shwartz and Ben-David chapter 18
Wed, Sep 12 decision tree learning part 2 [Slides] Optional papers to read:
1. CART paper: Breiman, Leo; Friedman, J. H.; Olshen, R. A.; Stone, C. J. (1984). Classification and regression trees. Monterey, CA: Wadsworth & Brooks/Cole Advanced Books & Software. ISBN 978-0-412-04841-8.
2. ID3 paper: Quinlan, J. R. 1986. Induction of Decision Trees. Mach. Learn. 1, 1 (Mar. 1986), 81–106
3. C4.5 paper: Quinlan, J. R. C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers, 1993.
Fri, Sep 14 instance-based methods [Slides] Mitchell chapter 8, Shalev-Shwartz and Ben-David chapter 19
Mon, Sep 17 evaluating learning algorithms part 1 [Slides] Mitchell chapter 5, Murphy 5.7.2, 6.5.3;
Manning et al., Sections 8.3-8.4
Wed, Sep 19 evaluating learning algorithms part 2 [Slides] (Homework 2 released; Homework 1 due)
Fri, Sep 21 Naive Bayes [Slides] Mitchell 6.1-6.10, Murphy 3
Mon, Sep 24 regression (linear and logistic) [Slides] Murphy 8.1-3 and 8.6;
Maximum Likelihood, Logistic Regression, and Stochastic Gradient Training
Wed, Sep 26 neural networks part 1 [Slides] Mitchell chapter 4, Murphy 16.5 and 28, Bishop 5.1-5.3, Goodfellow-Bengio-Courville 6;
LeCun et al., Nature, 2015
Fri, Sep 28 TA explaining Homework 1
Mon, Oct 1 neural networks part 2 [Slides] Mohri-Rostamizadeh-Talwalkar Appendix B (Convex Optimization), Bishop Appendix E (Lagrange Multipliers);
Goodfellow-Bengio-Courville chapter 7 and 8;
Optional paper to read:
1. Bishop, Neural Computation, 1995.
Wed, Oct 3 neural networks part 3 [Slides] Goodfellow-Bengio-Courville chapter 9;
Optional papers to read:
1. LeNet
2. AlexNet
3. ResNet
(Homework 3 released; Homework 2 due)
Fri, Oct 5 neural networks part 4 [Slides] Goodfellow-Bengio-Courville chapter 10;
Optional papers to read for part 4:
1. LSTM
2. GRU
Mon, Oct 8 neural networks part 5 [Slides] Goodfellow's tutorial on GANs;
Optional papers to read for part 5:
1. Deep Boltzmann Machines (DBM)
2. Generative Adversarial Networks (GAN)
Wed, Oct 10 learning theory part 1: PAC model [Slides] Mohri-Rostamizadeh-Talwalkar Chapter 2, Mitchell chapter 7
Fri, Oct 12 TA explaining Homework 2
Mon, Oct 15 learning theory part 3: bias-variance decomposition [Slides] Optional paper to read: Geman et al., Neural Computation, 1992 (Sections 1-3)
Wed, Oct 17 learning theory part 2: mistake-bound model [Slides]
Optional paper to read:
1. Littlestone, N.; Warmuth, M. (1994). The Weighted Majority Algorithm. Information and Computation.
(Homework 4 released; Homework 3 due)
Fri, Oct 19 Bayesian networks part 1 [Slides] Mitchell chapter 6, Bishop chapter 8.1, Shalev-Shwartz and Ben-David chapter 24; Heckerman Tutorial
Mon, Oct 22 Bayesian networks part 2 [Slides] Optional papers to read:
1. TAN algorithm (Friedman et al., Machine Learning, 1997)
2. Sparse Candidate algorithm (Friedman, Nachman, and Peer, UAI, 1999)
Project proposal due
Wed, Oct 24 Bayesian networks part 3 [Slides]
Fri, Oct 26 discriminative vs. generative learning [Slides] Ng and Jordan, NIPS 2001
Mon, Oct 29 support vector machines part 1 [Slides] Andrew Ng's note on SVM, Ben-Hur and Weston's note;
Mohri-Rostamizadeh-Talwalkar Appendix B (Convex Optimization), Bishop Appendix E (Lagrange Multipliers)
Wed, Oct 31 support vector machines part 2 [Slides] Bishop chapter 6.1,6.2,7.1, or Shalev-Shwartz and Ben-David chapter 15, 16
Fri, Nov 2 TA explaining Homework 3
Mon, Nov 5 ensemble methods [Slides] Dietterich, AI Magazine 1997
Wed, Nov 7 feature selection [Slides]; dimension reduction [Slides]
Bishop 12.1 and 12.3, or Shalev-Shwartz and Ben-David 22 and 23 (Homework 5 released; Homework 4 due)
Fri, Nov 9 reinforcement learning part 1 [Slides] Mitchell Chapter 13
Mon, Nov 12 reinforcement learning part 2 [Slides] Reinforcement Learning: A Survey, by Kaelbling et al.
Wed, Nov 14 machine learning in practice [Slides] Domingos, CACM, 2012
Fri, Nov 16 TA explaining Homework 4
Wed, Nov 21 Homework 5 due
Wed, Nov 28 TA explaining Homework 5
Wed, Dec 12 Project report due