Reading Assignment 7: Normalizing Flows
Overview
Below are several options for getting the general idea of normalizing flows,
along with an overview of different methods.
- An approachable introduction to normalizing flows can be found in
this blog post by
Lilian Weng. You can skip the parts on autoregressive models (PixelRNN, WaveNet).
- Next, this two-part blog post by
Eric Jang provides another view; in particular, the discussion of the meaning
of the Jacobian determinant in the first part is a nice addition. Much of the
information repeats the first blog post. You can (and should) skip the
code sections; they are heavily outdated.
- Finally, Murphy’s book has an
in-depth chapter devoted to normalizing flows.
- If you need a refresher on the substitution rule for integrals,
the Wikipedia article
should be sufficient. The change-of-variables theorem used for normalizing flows
is the multi-dimensional version of this.
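For reference, the multi-dimensional change-of-variables formula (a standard result, not specific to any one of the readings above) for an invertible, differentiable map $f$ from data $x$ to latent $z$ with base density $p_Z$ reads:

```latex
% Change of variables: z = f(x), f invertible and differentiable.
% The Jacobian determinant accounts for how f stretches or compresses volume.
p_X(x) = p_Z\bigl(f(x)\bigr) \, \left| \det \frac{\partial f(x)}{\partial x} \right|
```

Flow models are designed so that both $f^{-1}$ (for sampling) and this Jacobian determinant (for likelihood training) are cheap to compute.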
Specific Models
This can be considered optional reading, but try to read at least one of these to
get a detailed view on a specific flow model.
- NICE: A relatively simple and not very
powerful model, but it arguably laid the groundwork for all deep flow models.
- RealNVP: A massive step up from NICE.
- Glow: Yet another iteration on RealNVP,
using invertible 1x1 convolutions.
- Parallel WaveNet: A neat application of
IAFs (inverse autoregressive flows) for efficient WaveNet sampling.
- An entirely different use
of normalizing flows is to improve the variational posteriors in VAEs.
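To make the core idea behind NICE and RealNVP concrete, here is a minimal NumPy sketch of an additive coupling layer (the names `coupling_forward`, `coupling_inverse`, and the toy `shift` function are illustrative assumptions, not from any of the papers' codebases). Half of the input passes through unchanged; the other half is shifted by an arbitrary function of the first half, which makes the layer exactly invertible with Jacobian determinant 1.

```python
import numpy as np

def coupling_forward(x, shift_fn):
    """Additive coupling layer: x has even length, shift_fn maps R^d -> R^d."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    y1 = x1                      # identity on the first half
    y2 = x2 + shift_fn(x1)       # additive shift on the second half
    return np.concatenate([y1, y2], axis=-1)

def coupling_inverse(y, shift_fn):
    """Exact inverse: recompute the shift from the unchanged half and subtract."""
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    x1 = y1
    x2 = y2 - shift_fn(y1)
    return np.concatenate([x1, x2], axis=-1)

# Toy "network" as the shift function; in practice this is a deep net,
# which is fine because shift_fn itself never needs to be inverted.
shift = lambda h: 2.0 * np.tanh(h)

x = np.random.default_rng(0).normal(size=6)
y = coupling_forward(x, shift)
assert np.allclose(coupling_inverse(y, shift), x)  # exact reconstruction
```

Because the Jacobian of this map is triangular with ones on the diagonal, its log-determinant is zero; RealNVP extends this with a multiplicative scale on the second half to get a non-trivial (but still cheap) determinant.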