CS 320 Probabilistic Graphical Models
Prerequisite: Students are expected to be familiar with probability theory, algorithms, machine learning, and programming.

This is a research-oriented, graduate-level course on probabilistic graphical models (PGMs). The course covers the two main types of PGMs: directed and undirected. For directed PGMs, we will cover Bayesian networks together with one of their most important variants, hidden Markov models. For undirected PGMs, we will cover Markov networks (also called Markov random fields) together with one of their most important variants, conditional random fields. The course therefore consists of four parts: Bayesian networks, hidden Markov models, Markov networks, and conditional random fields. Each part introduces the motivation, ideas, definitions, examples, properties, representations, inference algorithms, and applications of the corresponding model, presented in lectures by the instructor. In the following two lectures of each part, students will present recommended research papers and lead in-class discussions. The last lecture of each part is an in-class quiz, whose purpose is not to judge students' ability at calculation or memorization, but to push them to think more deeply about the material introduced in the lectures. The course concludes with a final-exam lecture and two project-presentation lectures. Projects are expected to apply PGMs to a real research problem or to make a serious theoretical contribution.