This reading assignment focuses on unsupervised and self-supervised learning tasks as a main driver for representation learning.
Dive deeper into self-supervised representation learning approaches (like BERT) with Lilian Weng’s overview blog post “Self-Supervised Representation Learning”.
Continue with the follow-up post “Contrastive Representation Learning”, which covers more recent developments.
Optionally, the blog post “Unsupervised Cross-lingual Representation Learning” by Sebastian Ruder provides a glimpse of an exciting new NLP task that is being tackled with deep learning.
Also optional: the Deep Learning Book, Chapter 15: Representation Learning, addresses the topic as well, but does not cover the most recent work (since 2015).
For further in-depth reading, follow the links from the blog posts to the referenced articles.
The Deep Learning Book, Chapter 14: Autoencoders, also covers the topic in depth.
Another overview of Autoencoders is provided in the blog post “Introduction to autoencoders” by Jeremy Jordan.
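To make the core autoencoder idea from these readings concrete, here is a minimal sketch, assuming NumPy is available; the linear encoder/decoder, toy data, and learning rate are illustrative choices, not taken from any of the readings. It trains a linear autoencoder to compress 8-dimensional data into a 3-dimensional code by gradient descent on the squared reconstruction error.

```python
import numpy as np

# Minimal linear autoencoder sketch (illustrative, not from the readings):
# encode x -> h = x @ W_enc, decode h -> x_hat = h @ W_dec,
# trained by plain gradient descent on the mean squared reconstruction error.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))                # toy data: 100 samples, 8 features
W_enc = rng.normal(scale=0.1, size=(8, 3))   # compress to a 3-d code
W_dec = rng.normal(scale=0.1, size=(3, 8))   # reconstruct back to 8-d

def loss(X, W_enc, W_dec):
    X_hat = X @ W_enc @ W_dec
    return np.mean((X - X_hat) ** 2)

lr = 0.01
initial_loss = loss(X, W_enc, W_dec)
for _ in range(200):
    H = X @ W_enc                            # codes
    err = H @ W_dec - X                      # reconstruction error
    grad_dec = H.T @ err / len(X)            # gradient w.r.t. decoder weights
    grad_enc = X.T @ (err @ W_dec.T) / len(X)  # gradient w.r.t. encoder weights
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
final_loss = loss(X, W_enc, W_dec)
```

Because the code is lower-dimensional than the input, the network cannot simply copy its input and must learn a compressed representation, which is the sense in which autoencoders drive representation learning.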