Probabilistic Graphical Models

EE 639 Advanced Topics in Signal Processing and Communication

Fall 2014 





Professor

Dr. Sen-ching Cheung (cheung at engr.uky.edu)

Office:            DMB 217 (218-0299)

Office hours:   M-F 12:15-1:30 pm (NEW TIME!)


Schedule

Regular class:             WF 9:00am-10:15am at FPAT 460 (NEW TIME! NEW ROOM!)

Course website:          http://www.vis.uky.edu/~cheung/courses/ee639

Final Examination:     No final exam

 


Course Description

A central tenet of any empirical science is to construct probabilistic models for prediction and estimation based on available data. Enormous advances in computing, sensing, and networking technologies provide us with an unprecedented capability to collect and store vast amounts of data. Much of this data is noisy, inter-related, and high-dimensional. One framework has gradually emerged as the most appropriate tool for handling uncertainty and for building complex models in a modular and algorithmic fashion: the probabilistic graphical model, the focus of EE 639 this semester.

 

Probabilistic graphical models are probabilistic models that encode local conditional-independence relationships among a large number of random variables. By choosing an appropriate (sparse) graph to describe the data, powerful and rigorous techniques become available for prediction, estimation, and data fusion, as well as for handling uncertainty and missing data. Many classical multivariate systems in pattern recognition, information theory, statistics, and statistical mechanics are special cases of this general framework; examples include hidden Markov models, regression, mixture models, Kalman filters, and Ising models. In this course, we will study how graph theory and probability combine elegantly under this framework to represent many commonly used probabilistic tools, and how they can be applied to solve practical problems. We will put equal emphasis on theoretical understanding of the subject and practical know-how for using it in real applications. Grading is based on three components: homework assignments throughout the semester, two midterms, and a final project.
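To make the factorization idea concrete, here is a minimal Python sketch (not course material; all probability values below are made up for illustration) of a three-node chain A -> B -> C. The graph encodes that C is conditionally independent of A given B, so the joint distribution factorizes as P(A)P(B|A)P(C|B), and marginals follow by summing out variables.

    # A 3-node chain A -> B -> C of binary variables, illustrating how a
    # graphical model factorizes a joint distribution and how the encoded
    # conditional independence (C ⊥ A | B) can be checked numerically.
    import numpy as np

    # Hypothetical numbers, chosen only for illustration.
    pA = np.array([0.6, 0.4])                # P(A)
    pB_given_A = np.array([[0.7, 0.3],       # P(B | A=0)
                           [0.2, 0.8]])      # P(B | A=1)
    pC_given_B = np.array([[0.9, 0.1],       # P(C | B=0)
                           [0.4, 0.6]])      # P(C | B=1)

    # The graph licenses the factorization P(A,B,C) = P(A) P(B|A) P(C|B).
    joint = pA[:, None, None] * pB_given_A[:, :, None] * pC_given_B[None, :, :]
    assert np.isclose(joint.sum(), 1.0)

    # Inference by marginalization: P(C) = sum over a, b of P(a, b, c).
    pC = joint.sum(axis=(0, 1))
    print("P(C) =", pC)

    # Check the encoded independence: P(C | A, B) does not depend on A.
    pC_given_AB = joint / joint.sum(axis=2, keepdims=True)
    assert np.allclose(pC_given_AB[0], pC_given_AB[1])

This brute-force bookkeeping becomes intractable when done naively over hundreds of variables; exploiting the graph structure is what makes inference feasible, which is the central theme of the course.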


Tentative Topics

  1. Mathematical Preliminaries and Probabilistic Reasoning
  2. Overview of Graphical Models
  3. Efficient Inference in Trees (see the sketch after this list)
  4. Junction Tree Algorithm
  5. Basic Machine Learning Concepts
  6. Learning as Inference
  7. Learning with Hidden Variables
  8. Nearest Neighbor Classification
  9. Linear Dimension Reduction
  10. Mixture Models
  11. Discrete-State Markov Models
  12. Continuous-State Markov Models
  13. Approximate inference with Sampling
  14. Approximate inference with Variational techniques
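
As a small preview of topic 3 (and related to topics 4 and 11), the following Python snippet runs sum-product message passing on a chain of four binary variables. The chain length and the pairwise potentials are illustrative assumptions, not course-provided material.

    # Sum-product (belief propagation) on a chain X1 - X2 - X3 - X4 with
    # pairwise potentials only: one forward and one backward pass of
    # messages yield every single-node marginal exactly.
    import numpy as np

    n = 4
    psi = [np.array([[2.0, 1.0],             # psi(x_k, x_{k+1}); the same
                     [1.0, 3.0]])            # made-up potential on every edge
           for _ in range(n - 1)]

    # Forward pass: m_f[i] is the message arriving at node i from the left.
    m_f = [np.ones(2)]
    for k in range(n - 1):
        msg = psi[k].T @ m_f[-1]
        m_f.append(msg / msg.sum())          # normalize for stability

    # Backward pass: m_b[i] is the message arriving at node i from the right.
    m_b = [np.ones(2)]
    for k in reversed(range(n - 1)):
        msg = psi[k] @ m_b[0]
        m_b.insert(0, msg / msg.sum())

    # Each marginal is proportional to the product of incoming messages.
    for i in range(n):
        p = m_f[i] * m_b[i]
        print(f"P(X{i+1}) =", p / p.sum())

On a tree (here, a chain) these two passes compute all marginals exactly in time linear in the number of nodes; the junction tree algorithm of topic 4 generalizes the same idea to graphs with cycles.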

 


Grading

Your grade will be based on:

 

Homework          30%
Midterm 1         20%
Midterm 2         20%
Final Project     30%

The numerical score is computed using the weights above. A's will be in the 90s, B's in the 80s, and so on, unless, in my opinion, the difficulty of the material/tests justifies curving the grades.


Course Policy


Text

Required Text

D. Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012.

Recommended Text

D. Koller and N. Friedman, Probabilistic Graphical Models: Principles and Techniques, MIT Press, 2009.

Additional material, including research papers and programming examples, will be provided throughout the semester.


Prerequisites: