Reading Assignment 3: Approximate Inference & Variational Autoencoders

Preliminaries

You can start by reading the introduction to chapter 19 and section 19.1 for some context on what we mean by “inference”, why it is difficult, and how it connects (at least somewhat) to the previous topics. You may want to look up the definition of the Kullback-Leibler divergence first.

You can also read the remainder of the chapter if you want; however, it is very hard to follow. A somewhat simpler explanation can be found in this short blog post. At the end of the day, though, variational inference is a highly mathematical topic, so formulas are hard to avoid.

The main point is for you to understand that we will use a lower bound on the log-likelihood, namely the evidence lower bound (ELBO), as a training objective.
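As a reference point while reading, one standard way to write the bound (using the notation common in the VAE literature, with an approximate posterior $q_\phi(z \mid x)$ and a model $p_\theta$; your readings may use slightly different symbols) is

$$\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] \;-\; D_{\mathrm{KL}}\big(q_\phi(z \mid x) \,\|\, p(z)\big),$$

where the first term is a reconstruction term and the second penalizes the approximate posterior for deviating from the prior. The gap between the two sides is exactly $D_{\mathrm{KL}}(q_\phi(z \mid x) \,\|\, p_\theta(z \mid x))$, which is why maximizing the ELBO also pushes $q_\phi$ toward the true posterior.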

Finally, read section 20.9 in the Deep Learning Book on backpropagating through random operations. This is an important “trick” for successfully training VAEs. You can skip section 20.9.1 on discrete operations.
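To make the idea concrete, here is a minimal NumPy sketch of the reparameterization trick from that section. The specific numbers for `mu` and `log_sigma` are made up for illustration; in a VAE they would come from the encoder network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend an encoder produced these Gaussian parameters for a latent z
# (hypothetical values, just for illustration).
mu, log_sigma = 0.5, -1.0

# Sampling z ~ N(mu, sigma^2) directly is a random operation, so the
# gradient w.r.t. mu and sigma is undefined. The reparameterization
# trick instead draws parameter-free noise eps ~ N(0, 1) and computes
# z as a deterministic, differentiable function of (mu, sigma, eps):
eps = rng.standard_normal()
z = mu + np.exp(log_sigma) * eps

# Now dz/dmu = 1 and dz/dlog_sigma = exp(log_sigma) * eps, so the
# gradient of any loss involving z can flow back into the encoder.
print(z)
```

The point is that all randomness is moved into `eps`, which does not depend on the parameters, so backpropagation treats `z` like any other deterministic computation.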

Variational Autoencoders

We offer you a variety of readings on VAEs. These explain the concept at different levels of detail and from different perspectives. See which ones work for you!