## CS 228: Probabilistic Graphical Models

Stanford / Computer Science / Winter 2016-2017

## Announcements

Important announcements will be posted on Piazza.

## General Information

Time/Location:

- Lectures: Tue/Thu 9:00-10:20am, Skilling Auditorium
- Office Hours: See calendar
- Final Exam: March 22, 8:30-11:30

Instructor: Stefano Ermon

Course Assistants:

- Aditya Grover (aditya28@stanford.edu)
- Alex Bishara (abishara@cs.stanford.edu)
- Ethan Chan (ethancys@stanford.edu)
- Kratarth Goel (kratarth@stanford.edu)
- Xiaocheng Li (chengli1@stanford.edu)
- Bo Wang (bowang87@stanford.edu)

Calendar: Click here for detailed information on all lectures, office hours, and due dates.

Contact: Please use Piazza for all questions related to lectures and coursework. SCPD students should email scpdsupport@stanford.edu or call 650-741-1542.

## Coursework

Course Description:

Probabilistic graphical models are a powerful framework for representing complex domains using probability distributions, with numerous applications in machine learning, computer vision, natural language processing and computational biology. Graphical models bring together graph theory and probability theory, and provide a flexible framework for modeling large collections of random variables with complex interactions. This course will provide a comprehensive survey of the topic, introducing the key formalisms and main techniques used to construct these models, make predictions, and support decision-making under uncertainty.

The aim of this course is to develop the knowledge and skills necessary to design, implement and apply these models to solve real problems. The course will cover: (1) Bayesian networks, undirected graphical models and their temporal extensions; (2) exact and approximate inference methods; (3) estimation of the parameters and the structure of graphical models.
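As a small preview of the kind of model and inference the course covers, consider the classic rain/sprinkler/wet-grass Bayesian network. This is a standard textbook illustration, not official course material, and the probability values below are made up for the sketch. The network factorizes the joint as P(R, S, W) = P(R) P(S) P(W | R, S), and a posterior such as P(rain | grass is wet) can be computed by exact enumeration:

```python
# Toy Bayesian network: Rain (R) and Sprinkler (S) are independent causes
# of WetGrass (W). All numbers are illustrative, not from the course.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
# Conditional probability table P(W = wet | R, S)
P_wet = {(True, True): 0.99, (True, False): 0.9,
         (False, True): 0.8, (False, False): 0.0}

def joint(r, s, w):
    """Joint probability via the chain rule over the DAG:
    P(r, s, w) = P(r) * P(s) * P(w | r, s)."""
    pw = P_wet[(r, s)] if w else 1.0 - P_wet[(r, s)]
    return P_rain[r] * P_sprinkler[s] * pw

# Exact inference by enumeration:
# P(rain | wet) = sum_s P(rain, s, wet) / sum_{r,s} P(r, s, wet)
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(round(num / den, 3))  # posterior probability of rain given wet grass
```

Enumeration like this is exponential in the number of variables; much of the course is about exact and approximate inference methods that exploit the graph structure to do better.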

Prerequisites: Students are expected to have a background in basic probability theory, statistics, programming, and algorithm design and analysis.

Required Textbook: Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman. MIT Press.

Lecture notes: Lecture notes are available here and will be periodically updated throughout the quarter.

Further Readings:

Modeling and Reasoning with Bayesian Networks by Adnan Darwiche.

Pattern Recognition and Machine Learning by Chris Bishop.

Machine Learning: A Probabilistic Perspective by Kevin P. Murphy.

Information Theory, Inference, and Learning Algorithms by David J. C. MacKay. Available online.

Bayesian Reasoning and Machine Learning by David Barber. Available online.

Graphical Models, Exponential Families, and Variational Inference by Martin J. Wainwright and Michael I. Jordan. Available online.

Grading Policy:

Homeworks (70%): There will be five homeworks with both written and programming parts. Each homework is centered around an application and will also deepen your understanding of the theoretical concepts. Homeworks will be posted on Piazza.

Final Exam (30%): March 22, 8:30-11:30

Piazza: You will be awarded up to 3% extra credit if you answer other students' questions in a substantial and helpful way, or contribute to the lecture notes with pull requests.

Assignments:

Written Assignments: Homeworks should be written up clearly and succinctly; you may lose points if your answers are unclear or unnecessarily complicated. You are encouraged to use LaTeX to write up your homeworks (here is a template), but this is not a requirement.

Collaboration Policy and Honor Code: You are free to form study groups and discuss homeworks and projects. However, you must write up homeworks and code from scratch independently, without referring to any notes from the joint session. You should not copy, refer to, or look at solutions from previous years' homeworks in preparing your answers. It is an honor code violation to intentionally refer to a previous year's solutions, whether official or written up by another student. Anybody violating the honor code will be referred to the Office of Judicial Affairs.

Submission Instructions:

We will be using the GradeScope online submission system. All students (non-SCPD and SCPD) should submit their assignments electronically via GradeScope. Students can typeset or scan their homeworks.

To register for GradeScope,

- Create an account on GradeScope if you don't have one already.
- Join the CS228 course using entry code 9VX7J9.
- Fill in this form.

Here are some tips for submitting through Gradescope.

Late Homework: You have 6 late days which you can use at any time during the term without penalty. You may use at most two late days on any single homework. Once you run out of late days, you will incur a 25% penalty for each extra late day you use. Each late homework should be clearly marked as "Late" on the first page.

Regrade Policy: You may submit a regrade request if you believe that the course staff made an error in grading. Any regrade requests should be submitted through Gradescope within one week of receiving your grade. Please try to be as specific as possible with your regrade request.

## Syllabus (tentative, periodically updated throughout the quarter)

Many thanks to David Sontag, Adnan Darwiche, Vibhav Gogate, and Tamir Hazan for sharing material used in slides and homeworks.

| Week | Date | Topic | Readings | Assignments |
|------|------|-------|----------|-------------|
| 1 | Jan. 10-12 | Introduction, Probability Theory, Bayesian Networks | Chapters 1-3 | Homework 1 released. Due January 24. |
| 2 | Jan. 17-19 | Undirected models | Chapter 4 | |
| 3 | Jan. 24-26 | Learning Bayes Nets | Chapters 16, 17 | Homework 2 released. Due February 3. |
| 4 | Jan. 31-Feb. 2 | Exact Inference; Message Passing | Chapters 9, 10 | Homework 3 released. Due February 17. |
| 5 | Feb. 7-9 | Sampling | Chapter 12 | |
| 6 | Feb. 14-16 | MAP Inference; Structured prediction | Chapter 13 | Homework 4 released. Due March 3. |
| 7 | Feb. 21-23 | Parameter Learning | Chapters 20, 19 | |
| 8 | Feb. 28-Mar. 2 | Bayesian Learning; Structure Learning | Chapters 17, 18 | |
| 9 | Mar. 7-9 | Exponential families; variational inference | Chapters 8, 11; Graphical Models, Exponential Families, and Variational Inference (Section 3) | |
| 10 | Mar. 14-16 | Advanced topics and conclusions | | |

## Other Resources

There are many software packages available that can greatly simplify the use of graphical models. Here are a few examples: