A notebook series introducing the theory and implementation of generative models, from classical latent variable models to modern deep architectures. Each notebook contains mathematical derivations and from-scratch implementations.
- 1.1 Mixture Models - model data as a mixture of simple distributions, introducing discrete latent variables to capture clusters (see the mixture-model sketch after the list).
- 1.2 Factor Analysis - explain data as linear combinations of continuous latent factors, uncovering hidden structure behind observed correlations (see the factor analysis sketch below).
- 1.3 Variational Autoencoders (VAEs) - introduce non-linear mappings with neural networks, using continuous latent variables and variational inference (see the VAE sketch below).
- 2.1 Discrete Flow Models (in progress)
- 2.2 Continuous Flow Models (coming soon)
- 3.1 Generative Adversarial Networks (GANs) (coming soon)
- 3.2 Score-Based Models (coming soon)
- 4.1 Denoising Diffusion Probabilistic Models (DDPM) - generate data by iteratively denoising a simple distribution through a learned reverse-time Markov chain (see the diffusion sketch below).
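
To make the discrete-latent idea in 1.1 concrete, here is a minimal sketch, not the notebook's implementation: ancestral sampling from a 1-D Gaussian mixture with made-up toy parameters, followed by the EM E-step that computes posterior responsibilities. It assumes only NumPy and SciPy.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
weights = np.array([0.5, 0.3, 0.2])   # mixing proportions pi_k
means = np.array([-2.0, 0.0, 3.0])    # component means mu_k
stds = np.array([0.5, 1.0, 0.8])      # component standard deviations sigma_k

# Ancestral sampling: draw the discrete latent z, then x given z.
z = rng.choice(len(weights), size=1000, p=weights)
x = rng.normal(means[z], stds[z])

# E-step of EM: posterior responsibilities p(z = k | x) via Bayes' rule.
resp = weights * norm.pdf(x[:, None], means, stds)
resp /= resp.sum(axis=1, keepdims=True)
print(resp[:3].round(3))
```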
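
For 1.2, a sketch of the factor analysis generative model under toy parameters; the names `W` and `psi` are illustrative, not taken from the notebooks. It samples x = W z + mu + eps and checks the implied marginal covariance W W^T + diag(psi) against a sample estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 5, 2                        # observed dimension, latent dimension
W = rng.normal(size=(d, k))        # factor loading matrix
mu = np.zeros(d)                   # data mean
psi = np.full(d, 0.1)              # diagonal noise variances

# Generate data: x = W z + mu + eps, z ~ N(0, I), eps ~ N(0, diag(psi)).
n = 10_000
z = rng.normal(size=(n, k))
eps = rng.normal(scale=np.sqrt(psi), size=(n, d))
x = z @ W.T + mu + eps

# The model implies cov(x) = W W^T + diag(psi); compare empirically.
model_cov = W @ W.T + np.diag(psi)
print(np.abs(np.cov(x.T) - model_cov).max())  # small for large n
```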
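
For 1.3, a forward-pass-only sketch of the VAE objective: the reparameterization trick and a one-sample ELBO estimate for a single datapoint. The linear "encoder" and "decoder" with random weights are untrained stand-ins, not the notebook's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 4, 2                        # data dimension, latent dimension
x = rng.normal(size=d)             # a toy datapoint

# "Encoder": linear maps to the variational mean and log-variance.
W_mu = rng.normal(size=(k, d))
W_lv = rng.normal(size=(k, d)) * 0.1
mu, logvar = W_mu @ x, W_lv @ x

# Reparameterization trick: z is a deterministic function of noise eps,
# so gradients could flow through mu and logvar during training.
eps = rng.normal(size=k)
z = mu + np.exp(0.5 * logvar) * eps

# "Decoder": map back to data space; unit-variance Gaussian likelihood.
W_dec = rng.normal(size=(d, k))
x_hat = W_dec @ z
log_px_z = -0.5 * np.sum((x - x_hat) ** 2 + np.log(2 * np.pi))

# KL(q(z|x) || N(0, I)) in closed form for diagonal Gaussians.
kl = 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar)

print(f"one-sample ELBO estimate: {log_px_z - kl:.3f}")
```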
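
For 4.1, a sketch of the DDPM forward (noising) process, assuming the linear beta schedule from the original DDPM paper. Because the forward chain is Gaussian, x_t can be sampled from x_0 in closed form without iterating.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
betas = np.linspace(1e-4, 0.02, T)      # linear noise schedule beta_t
alpha_bar = np.cumprod(1.0 - betas)     # cumulative product bar{alpha}_t

x0 = rng.normal(loc=2.0, size=512)      # toy "data" distribution
t = 500                                 # an intermediate timestep
noise = rng.normal(size=x0.shape)

# Closed form: x_t = sqrt(bar{alpha}_t) x_0 + sqrt(1 - bar{alpha}_t) eps.
xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * noise

# As t -> T, bar{alpha}_t -> 0 and x_t approaches a standard normal.
print(f"alpha_bar[{t}] = {alpha_bar[t]:.4f}, x_t mean = {xt.mean():.3f}")
```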
This series loosely follows the structure of *Deep Generative Modeling* by Jakub M. Tomczak (Springer, 2022) and the Generation part of *Probabilistic Machine Learning: Advanced Topics* by Kevin Murphy (MIT Press, 2023), alongside seminal papers.