CS 540: Introduction to Artificial Intelligence
Section 2
Spring 2017


CS 540 Examinations
Schedule
 Midterm Examination
Thursday, March 9, 7:15 p.m. – 9:15 p.m., room 105 Psychology
(Makeup Midterm: Thursday, March 9, 5:00 – 7:00 p.m., room 22 Ingraham Hall)
Closed book.
Bring a calculator. No phones allowed.
One 8.5 x 11 sheet of paper with notes on both sides allowed.
Covers topics in the first half of the course, including
readings, lectures, and assignments.
That is, covers
Search (Chapters 3.1 – 3.6, 4.1, 5.1 – 5.3, 5.5)
and some Machine Learning (Chapters 18.1 – 18.4, 18.8.1).
The following readings were assigned but will not be covered in
the exam: Chapters 1, 2.
You are responsible for topics covered in lecture even if there
are no readings associated with a topic.
Also, you are responsible for topics
covered in the readings, except those explicitly excluded,
even if they were not covered in class or in the lecture notes.
A summary of topics is given HERE.
 Final Examination
Sunday, May 7, 5:05 p.m. – 7:05 p.m., room 272 Bascom Hall
Closed book. Not cumulative – covers topics since the Midterm Examination,
though it will also refer back to basic concepts from the first half of the
course. One 8.5 x 11 sheet of paper with notes on both sides allowed.
Bring a calculator (not on a phone).
Emphasizes topics since the Midterm Examination, including
readings, lectures, and assignments.
That is, covers readings (Chapters
6.1 – 6.4,
13, 14.1, 14.2, 14.4, 15.1 – 15.3, 18.6.3, 18.6.4, 18.7, 18.9, 18.10, 23.5)
and the required papers on deep learning and HMMs.
The papers on Bayesian Networks and Eigenfaces are recommended but not required.
You should have knowledge sufficient
to work through simple examples using the algorithms covered in class
including
constraint satisfaction problems, neural networks, Perceptrons, backpropagation, deep learning, convolutional neural networks,
support vector machines, support vectors, margin, slack variables, kernel trick,
basic probability, uncertainty reasoning, full joint probability distribution,
marginalization, summing out, conditioning rule, product rule, chain rule, conditionalized version of chain rule,
Bayes's rule, conditionalized version of Bayes's rule, independence, conditional independence,
Bayesian networks, inference by enumeration, Naive Bayes,
Markov model, Hidden Markov model, speech recognition, language model, acoustic model,
AdaBoost, the Viola-Jones face detection algorithm, and face recognition using eigenfaces.
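As one illustration of the kind of "work through a simple example" skill expected for the probability topics above, the following sketch computes marginalization and conditioning from a small full joint distribution. The two Boolean variables (Cavity and Toothache) and the numbers are made up here for illustration; they are not from the course materials.

```python
# Hypothetical full joint distribution over two Boolean variables,
# Cavity and Toothache. The four entries sum to 1.
joint = {
    (True, True): 0.12,
    (True, False): 0.08,
    (False, True): 0.08,
    (False, False): 0.72,
}

# Marginalization (summing out Toothache): P(Cavity = true)
p_cavity = sum(p for (cavity, _), p in joint.items() if cavity)

# Marginalization (summing out Cavity): P(Toothache = true)
p_toothache = sum(p for (_, toothache), p in joint.items() if toothache)

# Conditioning: P(Cavity = true | Toothache = true)
#             = P(cavity, toothache) / P(toothache)
p_cavity_given_toothache = joint[(True, True)] / p_toothache

print(p_cavity)                   # 0.20
print(p_cavity_given_toothache)   # 0.12 / 0.20 = 0.6
```

The same table also lets you check Bayes's rule directly, since P(Cavity | Toothache) must equal P(Toothache | Cavity) P(Cavity) / P(Toothache).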
A summary of topics is given HERE.
The exam will focus on material since the midterm.
Questions may, however, refer back to issues brought up before
the midterm, so you should refresh your memory about the main
ideas and methods from the material associated with the
Midterm Examination.
For example, you should be able to relate general search questions to
the topics in this part of the course (e.g., what is the search
space and what is the search method).
Old Exams
