Practical Probabilistic Modeling with Graphical Models

EE 639 Advanced Topics in Signal Processing and Communication

Fall 2009 





The lecture has moved, effective today. Please refer to the new location and time below.


Visit course website at and our discussion group at  



Dr. Sen-ching Cheung (cheung at

Office: FPAT 687B (x7-9113)

Office hours: MWF 9-11am

Office: Room 831 VisCenter at Kentucky Utility Building (7-1257 ext. 80299)
Office hours: By appointment only


Regular class: MW 3:00pm-4:15pm (Relocated to the small conference room 869 at VisCenter)

Final Examination: Take-home final (will be issued on 12/14, due 12/16)


Course Description

A central tenet of any empirical science is to construct probabilistic models for prediction and estimation based on available data. Enormous advances in computing, sensing, and networking technologies provide us with an unprecedented capability to collect and store vast amounts of data. Much of this data is noisy, inter-related, and high-dimensional. For decades, mathematicians and engineers in different disciplines have developed specialized probabilistic models to characterize and utilize various types of data. One particular framework has gradually emerged as the most appropriate tool to unify these disparate techniques and to build complex models in a modular and algorithmic fashion. This framework is the Probabilistic Graphical Model, the focus of EE 639 this semester.


Probabilistic graphical models are probabilistic models that encode local conditional-independence relationships among a large number of random variables. By choosing an appropriate (sparse) graph to describe the data, powerful and rigorous techniques become available for prediction, estimation, and data fusion, as well as for handling uncertainty and missing data. Many classical multivariate systems in pattern recognition, information theory, statistics, and statistical mechanics are special cases of this general framework; examples include hidden Markov models, regression, mixture models, Kalman filters, and Ising models. In this course, we will study how graph theory and probability can be elegantly combined under this framework to represent many commonly used probabilistic tools, and how they can be applied to solve many practical problems. We will put equal emphasis on theoretical understanding of the subject and practical know-how for using it in real applications. Grading is based on three components: a number of homework assignments throughout the semester, a take-home final, and a final project.
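
To make the idea of a graph encoding conditional independence concrete, here is a minimal sketch of a three-node chain model A → B → C. The probability tables are made-up numbers for illustration only; the point is that the joint distribution factorizes along the graph, and the graph structure guarantees that C is independent of A given B.

```python
# A minimal sketch of a chain graphical model A -> B -> C.
# All probability tables are hypothetical, for illustration only.
import itertools

p_a = {0: 0.6, 1: 0.4}                  # P(A)
p_b_a = {0: {0: 0.7, 1: 0.3},           # P(B | A)
         1: {0: 0.2, 1: 0.8}}
p_c_b = {0: {0: 0.9, 1: 0.1},           # P(C | B)
         1: {0: 0.5, 1: 0.5}}

def joint(a, b, c):
    """The joint factorizes along the graph: P(a,b,c) = P(a) P(b|a) P(c|b)."""
    return p_a[a] * p_b_a[a][b] * p_c_b[b][c]

# Verify the independence encoded by the graph: P(c | a, b) == P(c | b),
# i.e. once B is observed, A carries no further information about C.
for a, b, c in itertools.product([0, 1], repeat=3):
    p_ab = sum(joint(a, b, cc) for cc in [0, 1])
    p_c_given_ab = joint(a, b, c) / p_ab
    assert abs(p_c_given_ab - p_c_b[b][c]) < 1e-12
```

The same factorization idea, applied to larger and more structured graphs, is what makes hidden Markov models and Kalman filters tractable: inference only needs to touch local neighborhoods rather than the full joint table.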

Tentative Topics

  1. Basic Probability and Statistics
  2. Introduction to Graphical Models
  3. Simple Graphical Models: Linear Classification and Regression
  4. Parameter Estimation for Completely Observed Graphical Models
  5. Expectation Maximization: Parameter Estimation for Incomplete Graphical Models
  6. Exact Inference on Graphical Models
  7. Factor Analysis, Hidden Markov Model and Kalman Filtering
  8. Approximate Inference on Graphical Models
  9. Kernel Methods and Sparse Kernel Machines
  10. Applications: Error Control Coding, Image Segmentation, Speech Processing and others.



Your grade will be based on:

  1. Homework assignments (throughout the semester)
  2. Take-home Final (due 12/16)
  3. Final Project (poster session and report due 12/4)


Required Text
Sen-ching Samson Cheung

Last modified: August 16, 2009.