Reading Assignment 2: More Sampling & Latent Variable Models
Sampling
Start by reading Section 14.3.
This explains sampling in so-called energy-based models, an important prerequisite for understanding Boltzmann Machines
(a topic that will be treated in the lecture).
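To build intuition before reading, here is a minimal sketch of one common way to sample from an energy-based model: unadjusted Langevin dynamics, which repeatedly takes a small gradient step on the energy plus Gaussian noise. The quadratic energy function and the step size below are illustrative choices, not taken from the book, and the book's treatment may use a different algorithm or notation.

```python
import numpy as np

rng = np.random.default_rng(1)

def energy_grad(x):
    # Gradient of the toy energy E(x) = x^2 / 2, whose Boltzmann
    # distribution exp(-E(x)) is a standard Gaussian.
    return x

# Unadjusted Langevin dynamics:
#   x <- x - eps * grad E(x) + sqrt(2 * eps) * noise
eps = 0.01
x = 5.0  # deliberately poor initialization
samples = []
for t in range(20000):
    x = x - eps * energy_grad(x) + np.sqrt(2 * eps) * rng.normal()
    if t > 2000:  # discard burn-in before the chain reaches the target
        samples.append(x)

samples = np.array(samples)
# For this energy the samples should look approximately standard normal.
```

Note that consecutive samples of such a chain are correlated; in practice one runs many steps (or several chains) to get effectively independent draws.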
Latent Variables
Most of the models we learn about in this class use so-called latent variables.
This topic is treated in Chapters 15 and 16 and forms an important basis for many future topics.
We recommend the following:
- Generally, Chapter 16 on continuous variables is more relevant to us than Chapter 15.
- Sections 16.1 and 16.2 introduce PCA and a probabilistic interpretation of it, respectively.
You can skip everything from Section 16.2.4 onward.
- Section 16.4 gives important general background on nonlinear latent variable models, such as the ones we will see in this
class.
- Chapter 15 is also interesting.
In particular, Sections 15.1 and 15.2 treat K-Means and Gaussian Mixture Models from a probabilistic perspective and show a
simple application of the famous Expectation-Maximization (EM) algorithm.
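As a companion to that reading, the following is a minimal sketch of EM for a one-dimensional Gaussian Mixture Model with two components. The data, initialization, and number of iterations are illustrative assumptions; the book develops the same E-step/M-step updates in full generality.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data drawn from two well-separated Gaussians.
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(3.0, 0.5, 200)])

# Initial guesses for mixture weights, means, and variances.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: responsibilities r[n, k] = p(component k | x_n).
    dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    r = pi * dens
    r /= r.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from the responsibilities.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

# The estimated means should end up close to the true values -2 and 3.
```

Running K-Means instead amounts to replacing the soft responsibilities with hard 0/1 assignments, which is exactly the connection Sections 15.1 and 15.2 draw.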
Further Alternative Reading
In earlier iterations of this course, we relied on specific sections of the
Deep Learning Book that also cover these topics.
If you are interested, please refer to the
reading assignment from the 2024 edition of this course.