Reading Assignment 3: Approximate Inference, DBNs & DBMs
Chapter 19 (complete), with focus on:
- Why is inference hard/intractable?
- ELBO, pros/cons, conditions under which the bound is tight
- (optional) Expectation-Maximization algorithm (approach & pros/cons)
- Maximum a posteriori (MAP) inference (approach & pros/cons; Section 19.3 is optional)
- What is variational inference/variational learning?
- (Sections 19.4.1-19.4.3 can be skipped)
- How does using approximate inference influence learning?
- Wake-Sleep algorithm
- What is an Inference Network and where is it needed?
- Inference networks vs. mean-field fixed-point equations
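As a quick reference for the ELBO bullets above, the bound from the chapter can be summarized as follows (notation as in the book: visible variables v, hidden variables h, approximate posterior q):

```latex
\mathcal{L}(v, \theta, q)
  = \mathbb{E}_{h \sim q}\!\left[\log p(h, v; \theta)\right] + H(q)
  = \log p(v; \theta) - D_{\mathrm{KL}}\!\left(q(h \mid v) \,\|\, p(h \mid v; \theta)\right)
  \le \log p(v; \theta)
```

Since the KL divergence is nonnegative, the bound is tight exactly when q(h | v) = p(h | v; theta), i.e. when the approximate posterior matches the true posterior.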
Chapter 20.3 on Deep Belief Networks, with focus on:
- Structure
- Layer-wise pre-training
- Generative (wake-sleep) and discriminative (DBN to MLP) fine-tuning
- Use cases, pros & cons
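To make the layer-wise pre-training bullet concrete, here is a minimal NumPy sketch of greedy DBN pre-training as a stack of binary RBMs trained with CD-1. The toy data, layer sizes, and hyperparameters are made up for illustration; this is a sketch of the idea, not a full DBN implementation (no fine-tuning stage shown).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Binary restricted Boltzmann machine trained with CD-1."""
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible bias
        self.c = np.zeros(n_hidden)    # hidden bias
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0):
        # positive phase: hidden probabilities given the data
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # negative phase: one Gibbs step back to the visible layer
        pv1 = self.visible_probs(h0)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = self.hidden_probs(v1)
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)

# Greedy layer-wise pre-training: train the first RBM on the data,
# then train the second RBM on the first layer's hidden activations.
data = (rng.random((64, 20)) < 0.5).astype(float)  # toy binary data
rbm1, rbm2 = RBM(20, 12), RBM(12, 6)
for _ in range(50):
    rbm1.cd1_step(data)
h1 = rbm1.hidden_probs(data)
for _ in range(50):
    rbm2.cd1_step(h1)
print(rbm2.hidden_probs(h1).shape)  # top-level representation: (64, 6)
```

For discriminative fine-tuning, the trained weights would initialize an MLP (each RBM becomes one sigmoid layer) that is then trained with backpropagation on labels.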
Bonus:
Chapter 20.4-20.4.4 on Deep Boltzmann Machines, with focus on:
- Variational Learning with Mean Field Approach (especially for DBM training)
- What is the difference between mean-field fixed-point updates and Gibbs sampling?
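The last bullet can be illustrated with a toy two-hidden-layer DBM (fixed, made-up weights; visible layer clamped). Mean-field fixed-point updates iterate deterministic real-valued means, each layer conditioned on its neighbours' current means; Gibbs sampling uses the same conditionals but draws binary states stochastically.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy two-hidden-layer DBM with fixed random weights; visible layer clamped.
W1 = rng.normal(0, 0.5, size=(8, 5))   # visible  <-> hidden layer 1
W2 = rng.normal(0, 0.5, size=(5, 3))   # hidden 1 <-> hidden 2
v = (rng.random(8) < 0.5).astype(float)

# Mean-field fixed-point updates: deterministic; each layer's mean is
# updated from the current means of its neighbouring layers until the
# updates reach a fixed point.
mu1, mu2 = np.full(5, 0.5), np.full(3, 0.5)
for _ in range(50):
    mu1 = sigmoid(v @ W1 + mu2 @ W2.T)
    mu2 = sigmoid(mu1 @ W2)

# Gibbs sampling: the same conditionals, but binary states are sampled;
# averaging over a long chain gives a Monte Carlo estimate of the
# posterior marginals, whereas mean field gives a deterministic
# (factorized) approximation to them.
h1 = (rng.random(5) < 0.5).astype(float)
h2 = (rng.random(3) < 0.5).astype(float)
acc1 = np.zeros(5)
steps = 20000
for _ in range(steps):
    h1 = (rng.random(5) < sigmoid(v @ W1 + h2 @ W2.T)).astype(float)
    h2 = (rng.random(3) < sigmoid(h1 @ W2)).astype(float)
    acc1 += h1
gibbs_estimate = acc1 / steps

print("mean-field:", mu1)
print("Gibbs estimate:", gibbs_estimate)
```

The key contrast: mean field converges to one deterministic answer (and is what DBM training uses for the positive phase), while Gibbs sampling is stochastic and only approaches the true marginals in the limit of many samples (and is used for the negative phase).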
Here are some additional/alternative resources if you want more visual explanations than the book provides.
DBN
- Salakhutdinov & Murray (2008):
- Hinton:
- Hugo Larochelle:
Bonus: DBM training