Probabilistic Graphical
Models
EE 639 Advanced Topics in Signal
Processing and Communication
Fall 2014

Dr. Sen-ching Cheung (cheung at engr.uky.edu)
Office: DMB 217 (218-0299)
Office hours: MF 12:15-1:30 pm (NEW TIME!)
Regular class: WF 9:00am-10:15am at FPAT 460 (NEW TIME! NEW ROOM!)
Course website: http://www.vis.uky.edu/~cheung/courses/ee639
Final Examination: No final exam
A central tenet of any empirical science is to construct probabilistic models for prediction and estimation based on available data. Enormous advances in computing, sensing, and networking technologies provide us with an unprecedented capability to collect and store vast amounts of data. Many of these data are noisy, interrelated, and high-dimensional. One particular framework has gradually emerged as the most appropriate tool to handle uncertainty and to build complex models in a modular and algorithmic fashion. This framework is the Probabilistic Graphical Model, the focus of the EE 639 course this semester.
Probabilistic graphical models are probabilistic models that encode local conditional independence relationships among a large number of random variables. By choosing an appropriate (sparse) graph to describe the data, powerful and rigorous techniques exist to perform prediction, estimation, and data fusion, as well as to handle uncertainty and missing data. Many classical multivariate systems in pattern recognition, information theory, statistics, and statistical mechanics are special cases of this general framework; examples include hidden Markov models, regression, mixture models, Kalman filters, and Ising models. In this course, we will study how graph theory and probability can be elegantly combined under this framework to represent many commonly used probabilistic tools, and how they can be applied to solve many practical problems. We will put equal emphasis on theoretical understanding of the subject and practical know-how for using it in real applications. The grading is based on three components: homework assignments throughout the semester, two midterms, and a final project.
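To make the idea of encoding conditional independence concrete, here is a minimal sketch in plain Python (no libraries) of the simplest directed graphical model, a chain A -> B -> C. The conditional probability tables below are hypothetical values chosen only for illustration; the point is that the factorization P(A)P(B|A)P(C|B) forces C to be conditionally independent of A given B, which the code verifies by brute-force enumeration.

```python
from itertools import product

# Hypothetical CPTs for binary variables (values for illustration only).
p_a = {0: 0.6, 1: 0.4}                        # P(A)
p_b_given_a = {0: {0: 0.7, 1: 0.3},           # P(B | A=a)
               1: {0: 0.2, 1: 0.8}}
p_c_given_b = {0: {0: 0.9, 1: 0.1},           # P(C | B=b)
               1: {0: 0.4, 1: 0.6}}

def joint(a, b, c):
    """Factorized joint of the chain: P(A,B,C) = P(A) P(B|A) P(C|B)."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

def cond_c(b, c, a=None):
    """P(C=c | B=b), or P(C=c | A=a, B=b) when a is given, by enumeration."""
    a_vals = [a] if a is not None else [0, 1]
    num = sum(joint(x, b, c) for x in a_vals)
    den = sum(joint(x, b, y) for x in a_vals for y in [0, 1])
    return num / den

# Conditioning on B renders A irrelevant to C: P(C|B) == P(C|A,B).
for a, b, c in product([0, 1], repeat=3):
    assert abs(cond_c(b, c) - cond_c(b, c, a)) < 1e-12
print("Conditional independence A ⊥ C | B verified by enumeration.")
```

The same enumeration idea scales (inefficiently) to any discrete network; the course's inference algorithms exploit the graph structure to avoid this exponential cost.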
Your grade will be based on:

Homework: 30%
Midterm 1: 20%
Midterm 2: 20%
Final Project: 30%
Required Text: D. Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012.
Recommended Text: D. Koller and N. Friedman, Probabilistic Graphical Models: Principles and Techniques, MIT Press, 2009.
Additional material, including research papers and programming examples, will be provided throughout the semester.