
University of Washington, Department of Electrical Engineering

 EE 512: Graphical Models -- Spring 2006 

Instructor

Prof. Jeff A. Bilmes (bilmes |AT| ee |DOT| washington .DOT. edu)
Office: 418 EE/CS Bldg., 221-5236
Office hours: Wednesdays, 11:30am-12:30pm, 418 EE/CS Bldg.

TA

Chris Bartels (bartels |AT| ee |DOT| washington .DOT. edu)
Office: Sieg Tutoring Center (ground floor)
Office hours: Thursdays, 3:30-4:30pm (or by appointment), Sieg Tutoring Center
Discussion Section: Fridays, 9:30-10:30am, Sieg 128


Class BLOG/Announcements


  • (May 15th) Reminder that we have class again tomorrow. Also, your progress reports are due this Thursday, 5/18. I'd like to see some substantial progress this round.
  • (May 12th) Midterm is now due Monday, May 15th, at 5:00pm at my (JB) office. A version of the midterm with the two typos in problem 5 fixed is posted below.
  • (May 5th) The midterm is posted.
  • (April 29th) Homework 4, problem 5 has been slightly updated for clarity.
  • (April 28th) Homework 4 is ready, see below.
  • (April 24th) Reading for this week: Jordan Chapters 4,10,12,17,18 and HMM handout.
  • (April 23rd) Next homework will come out this week, to be due May 5th.
  • (April 23rd) Reminder: Take home midterm, out May 5th, due one week after that.
  • (April 20th) Reading Assignment: Please read Chapters 11,12,15 from Jordan.
  • (April 14th) Handout 2 on the Hammersley-Clifford theorem is now online.
  • (April 14th) Homework 3 is now online (see below).
  • (April 7th) Homework 2 is now online (see below).
  • (April 3rd) Reading Assignment: Please read Chapter 3 from Jordan.
  • (April 2nd) HW1 is now ready, due Friday. Since I was late getting it out, it is relatively short (and easy :).
  • (March 28th) Lecture slides are now available here.
  • (March 27th) Reading Assignment: Please read Chapters 1 and 2 from Jordan. (Chapter 1 is just a pointer to one of Jordan's papers, available in ps or pdf. You'll have to purchase the book from the copy center to get Chapter 2 and all subsequent chapters.)


    Information

    Description: This course will serve as a thorough overview of graphical models including Bayesian networks, undirected models, and factor graphs.

    1. Intro. Graph types: conditional independence; directed, undirected, and factor models; algorithms for conditional independence (e.g., Bayes-ball, d-separation, Markov properties on graphs, factorization, Hammersley-Clifford theorems).
    2. Models: linear Gaussian models, mixture models, factor analysis, probabilistic decision trees, Markov Random Fields, Gibbs distributions, static conditional random fields (CRFs), multivariate Gaussians as graphical models.
    3. Dynamic (temporal) models: Hidden Markov Models, Kalman filtering and linear-Gaussian HMMs, linear dynamic models, dynamic Bayesian networks (DBNs), label and observation bias in natural language processing, dynamic conditional random fields (CRFs), and general dynamic graphical models.
    4. Chordal Graph Theory: moralization; triangulated, decomposable, and intersection graphs, $k$-trees, hyper-graphs. Tree-width and path-width parameters of a graph. Graph separators and their crossings. Relational data-base systems and joins, and phylogenetic tree construction.
    5. Exact Probabilistic Inference: junction trees, belief propagation in its various forms (including Pearl's formulation, Hugin, Shafer-Shenoy, Bucket-elimination, etc.); join-trees and data-structures; optimal triangulations. The elimination family of algorithms. Generalized distributed law (GDL). Relation to dynamic programming. Generality (such as Viterbi, MPE, the fast Fourier transform). NP hardness results, and NP hardness results for finding the best form.
    6. Inference as search: Search-based approaches to inference and how it relates to junction trees, message passing, and BP. Results from the constraint satisfaction (CSP) and SAT literatures, constraint solvers, and heuristics used in these domains. Cutset and recursive conditioning, comparison with elimination. Branch-and-bound.
    7. Approximate Probabilistic Inference: loopy belief propagation (BP), expectation propagation (EP), generalized belief propagation (GBP), cluster GBP, comparisons with mean-field and variational approaches, statistical physics, Bethe and Kikuchi free-energies, fixed points, recent theoretical results and open questions. Also, sampling approaches such as MCMC and particle algorithms (such as Rao-Blackwellized particle filtering). Pruning based approaches.
    8. Learning: learning Bayesian networks, hardness results for learning networks in the maximum likelihood sense, learning in the PAC (probably approximately correct) framework, EM for parameter and structure learning, alternating minimization, discriminative parameter and structure learning, CRF training vs. discriminatively trained HMM, other learning methods. Information theory and graphical models.
    9. Models in practice: Real-world static graphs. HMMs and DBNs in speech, language, and bioinformatics, QMR and Factorial HMMs, turbo-coding, low-density parity check codes, other codes on graphs, belief propagation algorithms on these graphs. Practical issues such as data structures and algorithms useful for performing inference.
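
    As a small taste of the exact-inference material in topics 3 and 5, here is a minimal Python sketch of the HMM forward recursion (sum-product elimination along a chain). The transition and emission numbers are invented purely for illustration; this is not course-provided code.

```python
# Forward algorithm for a 2-state HMM: compute p(y_1..y_T) by summing
# out the hidden chain left to right.  All probabilities are made up.

init = [0.6, 0.4]        # p(x_1 = s)
trans = [[0.7, 0.3],     # trans[r][s] = p(x_t = s | x_{t-1} = r)
         [0.4, 0.6]]
emit = [[0.9, 0.1],      # emit[s][y] = p(y_t = y | x_t = s)
        [0.2, 0.8]]

def forward(obs):
    """Return p(y_1, ..., y_T) for a binary observation sequence."""
    # alpha[s] = p(y_1..y_t, x_t = s), updated one time step at a time
    alpha = [init[s] * emit[s][obs[0]] for s in range(2)]
    for y in obs[1:]:
        alpha = [emit[s][y] * sum(alpha[r] * trans[r][s] for r in range(2))
                 for s in range(2)]
    return sum(alpha)

print(forward([0, 1, 0]))   # likelihood of observing the sequence 0, 1, 0
```

    The same pattern (multiply in local factors, sum out one variable at a time) is what the junction-tree and elimination algorithms of topic 5 generalize to arbitrary graphs.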


    Prerequisites: basic probability, statistics, and random processes (e.g., EE505 or a Stat 5xx class or consent of the instructor). Knowledge of matlab. The course is open to students in all UW departments.

    Course Format: Four hours of lecture (TTh 1:30-3:20, EE1-031) per week.

    Texts: We will use one main text, supplemented by handouts:

    1. The main text will be "An Introduction to Graphical Models", by Mike Jordan. This text has not yet been published, but we will use a printed version available in the communications copy center.
    2. We will also use handouts.

    A printable course overview is available in PS and PDF formats; it gives more information (such as the grading policy, other interesting texts, etc.).


    Homework

    1. Midterm (solutions, statistics).
    2. HW4 is now ready in PDF form. (hw4 solutions, statistics).
    3. HW3 is now ready in PDF form. (hw3 solutions, statistics).
    4. HW2 is now ready in PDF form. (hw2 solutions, statistics).
    5. HW1 is now ready in PDF form. (hw1 solutions, statistics).

    Links

    EE512 EPost bulletin board (you will need to log in with your UW netid). It is ok to discuss (but not give answers to) problem sets and project ideas, but no discussion of the midterm is allowed. Instructions on how to use EPost are here.

    What HMMs can do

    Uncertainty in Artificial Intelligence (UAI) Archive

    Mike Jordan's home page

    Dave Heckerman's page

    Steffen L. Lauritzen's page

    The GMTK page

    Kevin Murphy's Bayes net toolbox and collection of other software

    Various Graphical Models Papers (including structure learning)



    Maintained by Jeff Bilmes and Chris Bartels. Last updated: $Date: 2006/06/17 23:10:42 $ UTC (subtract 8 hours from UTC to get PST, 7 hours to get PDT)