PyTorch codebase for the paper *Training Latent Variable Models with Auto-encoding Variational Bayes: A Tutorial*.
```bibtex
@misc{zhihan2022aevb,
  url       = {https://arxiv.org/abs/2208.07818},
  author    = {Zhi-Han, Yang},
  title     = {Training Latent Variable Models with Auto-encoding Variational Bayes: A Tutorial},
  publisher = {arXiv},
  year      = {2022}
}
```
In the tutorial, we motivate the Auto-encoding Variational Bayes (AEVB) algorithm from the classic Expectation Maximization (EM) algorithm, and then derive from scratch the AEVB training procedure for the following models:
- Factor Analysis
- Variational Auto-Encoder (VAE)
- Conditional VAE
- Gaussian Mixture VAE by Rui Shu
- Variational RNN
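To make the shared training procedure concrete, here is a minimal sketch of the AEVB objective (the negative ELBO) for a Gaussian VAE, written in PyTorch. This is illustrative code, not code from this repo: the class and function names (`Encoder`, `Decoder`, `negative_elbo`) and the architecture choices are assumptions made for the example.

```python
# Minimal AEVB / negative-ELBO sketch for a Gaussian VAE (illustrative, not repo code).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Amortized inference network q(z|x): outputs mean and log-variance of z."""
    def __init__(self, x_dim, z_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, z_dim)
        self.logvar = nn.Linear(hidden, z_dim)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.logvar(h)

class Decoder(nn.Module):
    """Likelihood network p(x|z), here with fixed unit observation variance."""
    def __init__(self, z_dim, x_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, x_dim))

    def forward(self, z):
        return self.net(z)

def negative_elbo(x, enc, dec):
    mu, logvar = enc(x)
    # Reparameterization trick: z = mu + sigma * eps keeps the sample differentiable.
    eps = torch.randn_like(mu)
    z = mu + torch.exp(0.5 * logvar) * eps
    # Gaussian reconstruction term (up to an additive constant).
    recon = 0.5 * ((x - dec(z)) ** 2).sum(dim=1)
    # KL( q(z|x) || N(0, I) ) in closed form.
    kl = 0.5 * (mu ** 2 + logvar.exp() - 1.0 - logvar).sum(dim=1)
    return (recon + kl).mean()
```

Each model in the list above trains by minimizing a version of this loss with a stochastic optimizer, e.g. `loss = negative_elbo(x, enc, dec); loss.backward(); optimizer.step()`; the models differ in how the encoder, decoder, and prior are structured.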
This repo contains minimal PyTorch implementations of these models. Pre-trained weights are included for all models except Factor Analysis (which takes less than 10 seconds to train), so it's easy to play around; each of the other models takes less than 30 minutes to train from scratch. To run the notebooks, create a conda environment, install the required packages with `pip install -r requirements.txt`, and you should be ready.
(All plots below were created using the notebooks in this repo. You will likely get better-quality generations by training longer; I didn't train the models to convergence, to save time.)
Factor Analysis
Variational Auto-Encoder
Conditional VAE
Gaussian Mixture VAE by Rui Shu (clusters ordered manually in the plot)
Variational RNN