Implementations of normalizing flows for variational inference using Python (3.6+) and TensorFlow (2.0+).
```
pip install git+https://github.com/bgroenks96/normalizing-flows
```
Take a look at the intro notebook for a gentle introduction to normalizing flows.
This library currently implements the following flows:
- Planar/radial flows (Rezende and Mohamed, 2015)
- Triangular Sylvester flows (van den Berg et al., 2018)
- Glow (Kingma et al., 2018)
- AlignFlow¹ (Grover et al., 2019)

¹ Implemented via `JointFlowLVM`; the flow architecture from the paper is not currently supported. However, Glow can (and possibly should) be used instead.
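As a concrete illustration of the simplest flow in the list above, here is a minimal NumPy sketch of a single planar flow step, f(z) = z + u·tanh(wᵀz + b), following Rezende and Mohamed (2015). The function name and parameter values are illustrative only and are not part of this library's API:

```python
import numpy as np

def planar_flow(z, u, w, b):
    """Apply a planar flow f(z) = z + u * tanh(w.z + b) to a batch of
    points z (shape [n, d]); return (f(z), log|det Jacobian|).
    Invertibility requires w.u >= -1 (see Rezende and Mohamed, 2015)."""
    a = z @ w + b                           # pre-activation, shape [n]
    f = z + np.outer(np.tanh(a), u)         # transformed samples, shape [n, d]
    psi = np.outer(1 - np.tanh(a) ** 2, w)  # psi(z) = h'(a) * w, shape [n, d]
    log_det = np.log(np.abs(1 + psi @ u))   # |det J| = |1 + u . psi(z)|
    return f, log_det

# Illustrative parameters (w.u = 0.44 >= -1, so the flow is invertible).
rng = np.random.default_rng(0)
z = rng.standard_normal((5, 2))
f, log_det = planar_flow(z, u=np.array([0.5, -0.3]),
                         w=np.array([1.0, 0.2]), b=0.1)
```

In a real model, the log-det-Jacobian terms of each flow step are summed and added to the base log-density to obtain the exact log-density of the transformed samples.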
The `normalizing_flows` package currently provides two interfaces for building flow-based models:
- Marginal inference (`FlowLVM`, `JointFlowLVM`)
- Variational autoencoder (`GatedConvVAE`)
Marginal inference models directly optimize the log-evidence of the data. VAE inference minimizes the evidence lower bound (ELBO) and thus requires only forward evaluations of the flow; flows are incorporated into a VAE via `FlowLayer` (see the intro notebook for an example). In theory, Glow could also be used in a VAE, but this has not been tested and is not currently supported by `FlowLayer`, which assumes the latent space to be dense and non-spatial.
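For intuition on how marginal inference can optimize the log-evidence exactly, recall the change-of-variables formula: log p(x) = log p_Z(f⁻¹(x)) + log|det ∂f⁻¹/∂x|, which is available in closed form for any invertible flow. A minimal sketch using a scalar affine flow x = exp(s)·z + t (the function and parameters are illustrative, not this library's API):

```python
import numpy as np

def affine_flow_log_prob(x, s, t):
    """Exact log-density of x under a standard normal base distribution
    pushed through the affine flow x = exp(s) * z + t.
    log p(x) = log p_Z(z) + log|dz/dx|, with z = f^{-1}(x)."""
    z = (x - t) * np.exp(-s)                    # inverse flow f^{-1}(x)
    log_pz = -0.5 * (z ** 2 + np.log(2 * np.pi))  # standard normal log-density
    log_det = -s                                # log|dz/dx| for this flow
    return log_pz + log_det

# The resulting density is a Gaussian with mean t and standard deviation exp(s).
lp = affine_flow_log_prob(np.array([0.0, 1.3]), s=0.4, t=-0.2)
```

Maximizing this quantity over the flow parameters (s, t here; the flow weights in general) is exactly maximum-likelihood training, with no lower bound involved.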
*Figure: t-SNE mapped latent space across 4 flow steps*
Please note that this package was developed for the purposes of the author's own research. As such, it might not be as fully featured or polished as other libraries. However, contributions and discussions about new features and flow implementations are both welcome and encouraged!
Additional types of flows under consideration for future versions:
- Orthogonal/Householder Sylvester flows
- Inverse Autoregressive Flows (Kingma et al., 2016)
- Neural Autoregressive Flows (Huang et al., 2018)
Currently, this library has no published documentation outside of docstrings in the code. This may change in the future.
Please feel free to create an issue if anything isn't clear or more documentation is needed on specific APIs.
This library is free and open source software under the MIT license.