CS 760: Machine Learning (Spring 2017)

Instructor:
David Page
page@biostat.wisc.edu
(Please put cs760 in email subject line; otherwise it's easy to overlook emails)
Office Hours: 1pm-2pm Tuesdays and Thursdays in 1157 WID

TAs:
Kirthanaa Raghuraman
kraghuraman (at) wisc (dot) edu
Office Hour: TBD
Heemanshu Suri
hsuri (at) wisc (dot) edu
Office Hour: TBD
 Important Dates:
 Prerequisite: CS 540 or equivalent
 Meeting Time and Location: 11am MWF, 132 Noland
 Textbook:
 Tom Mitchell (1997). Machine Learning. McGraw-Hill.
 The following textbook is freely available for download, and you are welcome to try it as an alternative: Shalev-Shwartz and Ben-David (2014). Understanding Machine Learning: From Theory to Algorithms. Let me know after the semester how it worked for you.

Course Overview
Many of the same technologies underlie adaptive autonomous robots,
scientific knowledge discovery, adaptive game playing and discovery
from databases. This course will focus on these key underlying
technologies, particularly supervised learning. The course will
cover support vector machines, decision tree learners, neural network
learning and Bayesian classifiers, among others. It also will
address reinforcement learning and learning from relational data, including statistical
relational learning and inductive logic programming.
It will cover correct evaluation methodology, including case studies
of methodological errors.
Course Outline
 Course Overview, Feature Vector Representation, Unsupervised Learning Overview (Mitchell Ch. 1)
 Decision Trees (Mitchell Ch. 3)
 Instance-Based Learning, k-Nearest Neighbor (Mitchell Ch. 8.1 and 8.2)
 Brief Introduction to Probability (Mitchell Ch. 6, supplementary background notes on probability and Bayesian Networks: 1, 2, 3)
 Bayesian Network Learning including Naive Bayes and TAN (Heckerman Tutorial; Recommended: Friedman, Geiger & Goldszmidt, Machine Learning Journal 1997; Friedman, Nachman & Pe'er, UAI-99; Mitchell Ch. 6; additional lecture notes on Gibbs Sampling and MCMC theory [PDF])
 Machine Learning Methodology (Mitchell Ch. 5; Optional Supplements: The Case Against Accuracy Estimation for Comparing Induction Algorithms by F. Provost, T. Fawcett, and R. Kohavi, Proc. ICML-98; The Relationship Between Precision-Recall and ROC Curves by J. Davis and M. Goadrich, Proc. ICML-06)
 Computational Learning Theory [PDF] (Mitchell Ch. 7)
 Ensemble Methods [PDF] (Dietterich, 2002)
 Neural Networks and Deep Learning (Mitchell Ch. 4, Andrew Ng's Deep Learning Tutorial)
 Support Vector Machines (Ben-Hur and Weston, 2010; Alternative SVM Lecture by Gautam Kunapuli (optional); Chris Burges's tutorial (optional))
 SVM by Sequential Minimal Optimization (SMO) [PDF] (Platt's original SMO paper)
 Reinforcement Learning (Mitchell Ch. 13)
 Rule Learning and Relational Learning (Mitchell Ch. 10)
 Markov Networks. Try this tutorial on log-linear models by Frank Ferraro and Jason Eisner.
 Statistical Relational Learning
 The lectures below for ILP and SRL will not be used in class, but are left here for background.
 Background for Rule Learning and Inductive Logic Programming (Mitchell Ch. 10; for added background see De Raedt & Muggleton)
 Rule Learning and Inductive Logic Programming (Mitchell Ch. 10; for added background see De Raedt & Muggleton)
 Statistical Relational Learning
 Regression (Linear and Logistic, including LASSO-penalized forms)
 Dimensionality Reduction
 Remaining Topics: Active Learning, Causal Discovery, Temporal Models, MultipleInstance Learning (if time permits)
Course Requirements
The grading for the course will be based on:
 Homework Assignments (5 anticipated): 40%
 Exam 1: 30%
 Exam 2: 30%
Homework Policy
The programming assignments are to be done
individually.
You may communicate with
other class members about the problem, but please do not
seek or receive help from people not in the class, and
please do not share answers or code.
Your programs may be in C, C++, Java, or Python. Other languages may be allowed with approval from the TAs.
You must submit both a Linux executable and the source code; your program should run on the CS Dept. lab computers. Assignments are to be submitted on the
course Moodle site.
Homework assignments are due at the start of class on the assigned due date,
and late homeworks will be penalized 20 points (out of 100) for
each lecture that passes after the assigned due date. Homeworks cannot be submitted more than one week late; the submission site will be locked at that time. At the
start of the course every student will be given 3 "free" days,
each of which may be used to offset a 20point late penalty.
Only 1 free day can be used for any given written assignment, so that solutions can be posted at the next class period.
Free days are nontransferable, and no credit will be given for
unused free days. Nevertheless, please use them sparingly because
the late penalty is strictly enforced.
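As a worked illustration of the arithmetic above, here is a minimal sketch of how a late penalty combines with free days (the function name and interface are my own, purely illustrative, and not part of the course):

```python
def late_penalty(lectures_late, free_days_available, max_free_days=1):
    """Return (penalty out of 100, free days remaining) for one assignment.

    Policy sketch: 20 points per lecture past the due date; each free day
    offsets one 20-point penalty. For written assignments, at most one
    free day may be applied (max_free_days=1).
    """
    if lectures_late <= 0:
        return 0, free_days_available
    # Apply as many free days as the policy and the student's balance allow.
    used = min(lectures_late, free_days_available, max_free_days)
    penalty = 20 * (lectures_late - used)
    return penalty, free_days_available - used
```

For example, a written assignment turned in two lectures late by a student with all 3 free days left incurs `late_penalty(2, 3)` → a 20-point penalty (one free day absorbs the first 20 points), leaving 2 free days.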
Assignment Weights
 Assignment 1: 4%
 Assignments 2 and 5 (Programming): 10% each
 Assignments 3 and 4 (Written): 8% each
Homework Assignments
Assignment 1: Assigned 1/20, Due 1/29
Assignment 2: Assigned 2/1, Due 2/15
Assignment 3: Assigned 2/16, Due 2/24
Assignment 4: Assigned 3/9, Due 3/18
Assignment 5: Assigned 4/4, Due 4/18
Sample Exams
Additional Sample Exercises