Minimal VAE, Conditional VAE (CVAE), Gaussian Mixture VAE (GMVAE) and Variational RNN (VRNN) in PyTorch, trained on MNIST.

AEVB Tutorial

Intro

PyTorch codebase for the paper Training Latent Variable Models with Auto-encoding Variational Bayes: A Tutorial (arXiv:2208.07818).

```bibtex
@misc{zhihan2022aevb,
  url = {https://arxiv.org/abs/2208.07818},
  author = {Zhi-Han, Yang},
  title = {Training Latent Variable Models with Auto-encoding Variational Bayes: A Tutorial},
  publisher = {arXiv},
  year = {2022}
}
```

In the tutorial, we motivate the Auto-encoding Variational Bayes (AEVB) algorithm from the classic Expectation Maximization (EM) algorithm, and then derive from scratch the AEVB training procedure for the following models:

- Factor Analysis
- Variational Auto-Encoder (VAE)
- Conditional VAE (CVAE)
- Gaussian Mixture VAE (GMVAE)
- Variational RNN (VRNN)
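
All of these models are trained with AEVB, i.e., by maximizing a variant of the evidence lower bound (ELBO). As a reference point, here is a minimal, generic PyTorch sketch of the single-sample negative ELBO with the reparameterization trick for a plain VAE on binarized MNIST; the module and function names are illustrative assumptions, not the code used in this repo.

```python
# Generic sketch (not this repo's code): Gaussian encoder q(z|x), Bernoulli decoder p(x|z).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, x_dim=784, z_dim=2, h_dim=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)       # mean of q(z|x)
        self.log_var = nn.Linear(h_dim, z_dim)  # log-variance of q(z|x)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.log_var(h)

class Decoder(nn.Module):
    def __init__(self, x_dim=784, z_dim=2, h_dim=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))  # Bernoulli logits over pixels

    def forward(self, z):
        return self.net(z)

def negative_elbo(x, encoder, decoder):
    """Single-sample negative ELBO; x is a batch of flattened images in [0, 1]."""
    mu, log_var = encoder(x)
    std = torch.exp(0.5 * log_var)
    z = mu + std * torch.randn_like(std)  # reparameterization trick: z ~ q(z|x)
    logits = decoder(z)
    # Reconstruction term: -E_q[log p(x|z)] under a Bernoulli likelihood
    recon = nn.functional.binary_cross_entropy_with_logits(
        logits, x, reduction="none").sum(dim=1)
    # KL(q(z|x) || N(0, I)) in closed form for a diagonal Gaussian posterior
    kl = 0.5 * (mu.pow(2) + log_var.exp() - log_var - 1).sum(dim=1)
    return (recon + kl).mean()
```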

This repo contains minimal PyTorch implementations of these models. Pre-trained weights are included for all models except Factor Analysis (which takes less than 10 seconds to train), so it's easy to play around with them. All other models also take less than 30 minutes to train from scratch. To run the notebooks, create a conda environment, install the required packages with pip install -r requirements.txt, and you should be ready to go.
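
With a trained (or pre-trained) model, generating new digits amounts to drawing a latent code from the prior and decoding it. The snippet below is a generic, hypothetical sketch of that step; the decoder interface and the sample_digits helper are assumptions for illustration, not an API from this repo.

```python
# Generic sketch: sample MNIST-like digits from a trained VAE decoder.
import torch

@torch.no_grad()
def sample_digits(decoder, n=16, z_dim=2):
    z = torch.randn(n, z_dim)                      # z ~ N(0, I), the prior
    logits = decoder(z)                            # Bernoulli logits over 784 pixels
    return torch.sigmoid(logits).view(n, 28, 28)   # per-pixel means in [0, 1]
```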

Visualizations

(All plots below were created using the notebooks in this repo. You would very likely get better-quality generations by training longer; to save time, I did not train the models to convergence.)

Factor analysis

Variational Auto-Encoder

Conditional VAE

Gaussian Mixture VAE by Rui Shu (clusters ordered manually in the plot)

Variational RNN
