This reading assignment focuses on unsupervised and self-supervised learning tasks as a main driver for representation learning.
The Deep Learning Book - Chapter 14: Autoencoders (optional reading!) covers the topic in depth.
To leave room for the second topic, however, the shorter blog post “Introduction to autoencoders” by Jeremy Jordan provides the most important details.
Refer to the book chapter if you would like to know further details!
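To make the core idea from the readings concrete: an autoencoder learns a compressed code by being trained to reconstruct its own input. The following is a minimal sketch (not taken from any of the assigned readings) of a linear autoencoder in NumPy, squeezing 8-dimensional vectors through a 3-dimensional bottleneck; all names and sizes here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d_in, d_code = 64, 8, 3          # samples, input dim, bottleneck dim
X = rng.normal(size=(n, d_in))      # toy data (illustrative)

# Encoder and decoder weights; linear layers keep the sketch short.
W_enc = rng.normal(scale=0.1, size=(d_in, d_code))
W_dec = rng.normal(scale=0.1, size=(d_code, d_in))

def loss(X, W_enc, W_dec):
    """Mean squared reconstruction error."""
    recon = X @ W_enc @ W_dec
    return np.mean((recon - X) ** 2)

lr = 0.01
initial = loss(X, W_enc, W_dec)
for _ in range(500):
    code = X @ W_enc               # encode: project into the bottleneck
    recon = code @ W_dec           # decode: reconstruct the input
    err = recon - X                # reconstruction error
    # Gradient steps on both weight matrices (scaled MSE gradients).
    W_dec -= lr * (code.T @ err) / n
    W_enc -= lr * (X.T @ (err @ W_dec.T)) / n

final = loss(X, W_enc, W_dec)
print(final < initial)             # training reduces reconstruction error
```

The bottleneck (`d_code < d_in`) is what forces the network to learn a useful representation rather than the identity map; the book chapter discusses undercomplete autoencoders and their regularized variants in detail.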
Dive deeper into self-supervised representation learning approaches (like BERT) with Lilian Weng’s overview blog post “Self-Supervised Representation Learning”.
Continue with the follow-up post “Contrastive Representation Learning” that covers very recent developments.
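As a small taste of the contrastive objectives discussed in that post: many recent methods train encoders with an InfoNCE-style loss, which pulls matching (positive) pairs of embeddings together while pushing them away from in-batch negatives. Below is a hedged NumPy sketch of such a loss; the function name, batch size, and temperature are illustrative assumptions, not code from the post.

```python
import numpy as np

rng = np.random.default_rng(1)

def info_nce(anchors, positives, temperature=0.1):
    """InfoNCE-style contrastive loss with cosine similarity
    and in-batch negatives (illustrative sketch)."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                     # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Matching (positive) pairs sit on the diagonal.
    return -np.mean(np.diag(log_probs))

batch, dim = 8, 16
z = rng.normal(size=(batch, dim))

# Positives built as slightly perturbed anchors vs. unrelated random vectors.
aligned = info_nce(z, z + 0.01 * rng.normal(size=(batch, dim)))
unrelated = info_nce(z, rng.normal(size=(batch, dim)))
print(aligned < unrelated)   # aligned pairs yield a much lower loss
```

In practice the anchor/positive pairs come from two augmented views of the same input (images) or masked versions of the same text, as the blog post describes.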
Optionally, the blog post “Unsupervised Cross-lingual Representation Learning” by Sebastian Ruder provides a glimpse of an exciting new NLP task tackled with deep learning.
As a further optional read, the Deep Learning Book - Chapter 15: Representation Learning also addresses the topic, but does not cover the most recent work (since 2015).