Assignment 9: Flow

Discussion: June 29th

In this assignment, we want to implement some simple flow models on toy datasets, as well as attempt to fit simple “real” datasets like MNIST. Please note that there is a notebook in the GitLab with some starter/example code that may be useful!

NICE

The OG flow model is the NICE model from this paper. It is also very simple, making it a good candidate for a first experience with these models. Recall that one of the readings from the lecture gives code examples for implementing flows in TensorFlow Probability. However, here you should build the model from the ground up yourself.

But first, a note on terminology: In principle, it doesn’t matter which direction of the flow you call forward or backward, in which direction a function f is applied and in which its inverse, etc. However, it’s easy to get confused here because people use different conventions. I will strictly stick to the convention from the NICE paper, which is: the forward direction f maps a data point x to its latent code h = f(x), and sampling applies the inverse, x = f^{-1}(h), to a sample h drawn from the prior.

You might want to proceed as follows:
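As a rough sketch of the core building block, here is an additive coupling layer in plain NumPy. This is only illustrative: a real implementation would use TF layers, and the function names, the one-hidden-layer coupling network, and the even/odd split are assumptions, not the notebook’s code.

```python
import numpy as np

def m(x_half, W, b):
    # Hypothetical coupling network: one hidden layer with ReLU.
    return np.maximum(x_half @ W[0] + b[0], 0.0) @ W[1] + b[1]

def coupling_forward(x, W, b):
    # Additive coupling (NICE): one half of the dimensions passes
    # through unchanged, the other half is shifted by a function
    # of the first half. Here: even columns vs. odd columns.
    x1, x2 = x[:, ::2], x[:, 1::2]
    y1 = x1
    y2 = x2 + m(x1, W, b)
    return np.concatenate([y1, y2], axis=1)

def coupling_inverse(y, W, b):
    # Inversion only requires subtracting the same shift; m itself
    # never needs to be inverted. The Jacobian determinant is 1.
    d = y.shape[1] // 2
    y1, y2 = y[:, :d], y[:, d:]
    x1 = y1
    x2 = y2 - m(y1, W, b)
    out = np.empty_like(y)
    out[:, ::2], out[:, 1::2] = x1, x2
    return out
```

Note that which half is shifted must alternate between stacked layers, otherwise half of the dimensions are never transformed.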

That takes care of the model itself. Once this works, setting up training is very simple!
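The loss follows from the change-of-variables formula: since the coupling layers have unit Jacobian determinant, only the final diagonal scaling layer contributes a log-det term. A minimal sketch, assuming a standard Gaussian prior for simplicity (the NICE paper itself uses a logistic prior) and a hypothetical function name:

```python
import numpy as np

def nice_log_likelihood(h, log_scale):
    # h: latent codes f(x) produced by the flow, shape (batch, dim).
    # log_scale: per-dimension log s_i of the final diagonal scaling layer.
    # log p(x) = log p_H(f(x)) + sum_i s_i, since the coupling layers
    # contribute nothing to the log-det.
    log_prior = -0.5 * np.sum(h ** 2, axis=1) \
                - 0.5 * h.shape[1] * np.log(2 * np.pi)
    return log_prior + np.sum(log_scale)
```

The negative of this quantity, averaged over a batch, is the training loss.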

With all this taken care of, your model is ready to train. First, try it on simple toy data. See the notebook for a sample dataset (parabola). Training proceeds as usual, by gradient descent. You can use the negative log likelihood as a loss function and use standard TF optimizers. Feel free to try other toy datasets as well. Make sure you can successfully fit such datasets before moving on! If training fails, there are likely problems with your implementation.
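To illustrate the training principle on an even smaller example than the parabola, here is a gradient-descent fit of a “flow” consisting of a single scaling parameter, in plain NumPy. A real run would use a standard TF optimizer on the full model; this toy setup and its closed-form gradient are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(scale=2.0, size=5000)   # toy 1-D data with std 2

# Flow: h = s * x with a standard Gaussian prior. The NLL per sample is
# 0.5 * (s * x)**2 + 0.5 * log(2 * pi) - log(s); its gradient w.r.t. s
# is s * mean(x**2) - 1/s, which vanishes at s = 1 / std(x).
s, lr = 1.0, 0.01
for _ in range(500):
    grad = s * np.mean(x ** 2) - 1.0 / s
    s -= lr * grad

print(s, 1.0 / np.std(x))  # the two values should roughly agree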

Moving on

Building a functional NICE model from the ground up is already quite an achievement. If you want to move beyond this, try the following: