This package implements (conditional) PDFs with Joint Autoregressive Manifold (MY) normalizing flows. It grew out of work for the paper Unifying supervised learning and VAEs - coverage, systematics and goodness-of-fit in normalizing-flow based neural network models for astro-particle reconstructions [arXiv:2008.05825]. Several other state-of-the-art flows are also implemented, sometimes with slight modifications or extensions.
For Euclidean manifolds, it includes an updated version of the official implementation of Gaussianization flows [arXiv:2003.01941], in which the inverse is now differentiable (Newton iterations are added to the bisection) and numerically more stable thanks to better approximations of the inverse Gaussian CDF.
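The bisection-plus-Newton inversion idea can be sketched for a generic strictly monotone 1-d transform. This is a simplified illustration, not the package's actual implementation:

```python
import math

def invert_monotone(f, df, y, lo=-10.0, hi=10.0, bisect_steps=20, newton_steps=5):
    """Invert a strictly increasing f via bisection, refined by Newton iterations."""
    # Bisection: robustly shrink a bracket around the root of f(x) - y = 0.
    for _ in range(bisect_steps):
        mid = 0.5 * (lo + hi)
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    x = 0.5 * (lo + hi)
    # Newton refinement: the update x <- x - (f(x) - y) / f'(x) converges fast
    # near the root and is the piece that makes the inverse differentiable.
    for _ in range(newton_steps):
        x = x - (f(x) - y) / df(x)
    return x

# Example: invert f(x) = x + 0.5*sin(x), which is strictly increasing.
f = lambda x: x + 0.5 * math.sin(x)
df = lambda x: 1.0 + 0.5 * math.cos(x)
x = invert_monotone(f, df, f(1.234))
```

The bisection guarantees a tight bracket regardless of initialization; the Newton steps then polish the estimate to floating-point accuracy.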
Neural spline flows can be modified to have smoothness constraints (as described in arXiv:2604.19846), which makes them numerically more stable in conditional settings.
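For background, the building block of neural spline flows is a monotone rational-quadratic interpolant per bin. A minimal single-bin sketch following the standard Gregory-Delbourgo construction (illustrative only, not the package code):

```python
def rq_spline_bin(x, x0, x1, y0, y1, d0, d1):
    """Monotone rational-quadratic interpolation on one bin [x0, x1] -> [y0, y1].

    d0 and d1 are the derivatives at the bin edges; monotonicity is
    guaranteed as long as both are positive.
    """
    s = (y1 - y0) / (x1 - x0)          # average slope of the bin
    xi = (x - x0) / (x1 - x0)          # normalized position in [0, 1]
    num = (y1 - y0) * (s * xi**2 + d0 * xi * (1.0 - xi))
    den = s + (d1 + d0 - 2.0 * s) * xi * (1.0 - xi)
    return y0 + num / den

# The bin maps its edges exactly and is strictly increasing in between.
ys = [rq_spline_bin(x / 10.0, 0.0, 1.0, 0.0, 2.0, 0.5, 3.0) for x in range(11)]
```

The smoothness constraints mentioned above concern how the bin widths, heights and edge derivatives are parametrized across conditional inputs, not this core formula.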
The package has a simple syntax that lets the user define a PDF and get going with a single line of code that should just work. To define a 10-d PDF with 4 Euclidean dimensions, followed by a 2-sphere, followed again by 4 Euclidean dimensions, one could for example write:

```python
import jammy_flows

pdf = jammy_flows.pdf("e4+s2+e4", "gggg+n+gggg")
```

The first argument describes the manifold structure, the second the flow layers for each manifold. Here "g" and "n" stand for specific pre-implemented normalizing-flow layers (see Features below). Each Euclidean part in this example uses four "g" layers.
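To illustrate the convention of the manifold string, here is a small stand-alone parser sketch. It is purely illustrative and not part of jammy_flows:

```python
import re

def parse_manifold_string(spec):
    """Split a definition like "e4+s2+e4" into (manifold type, dimension) pairs."""
    parts = []
    for token in spec.split("+"):
        m = re.fullmatch(r"([a-z])(\d+)", token)
        if m is None:
            raise ValueError(f"cannot parse token {token!r}")
        parts.append((m.group(1), int(m.group(2))))
    return parts

parts = parse_manifold_string("e4+s2+e4")
total_dim = sum(dim for _, dim in parts)  # 4 + 2 + 4 = 10
```

Each "+"-separated token names a manifold type ("e" for Euclidean, "s" for sphere) followed by its dimensionality; the dimensions add up to the total PDF dimension.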

Have a look at the script that generates the above animation.
The docs can be found here.
Also check out the example notebook.
- Autoregressive conditional structure is handled behind the scenes and connects the different manifolds
- Coverage is straightforward. Everything (including spherical, interval and simplex flows) is based on a Gaussian base distribution (arXiv:2008.05825).
- Bisection & Newton iterations for differentiable inverse (used for certain non-analytic inverse flow functions)
- amortizable MLPs that can use low-rank approximations
- amortizable PDFs - the total PDF can be the output of another neural network
- unit tests that make sure the forward and backward flow passes of all implemented flow layers agree
- option to include log-lambda as an additional flow parameter to define parametrized Poisson processes
- easily extensible: define new Euclidean or spherical flow layers by subclassing the respective base classes
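The idea behind amortization with low-rank approximations can be sketched in plain numpy: instead of a full d_out x d_in weight matrix, a conditioning network only has to predict two thin factors plus a bias. Function name and parameter packing here are hypothetical, for illustration only:

```python
import numpy as np

def amortized_lowrank_linear(params, x, d_in, d_out, rank):
    """Apply y = (U @ V) x + b, with U, V and b packed into the flat vector `params`.

    A conditioning network would output `params`; the low-rank factorization
    reduces the weight count from d_out*d_in to rank*(d_in + d_out).
    """
    n_u = d_out * rank
    n_v = rank * d_in
    U = params[:n_u].reshape(d_out, rank)
    V = params[n_u:n_u + n_v].reshape(rank, d_in)
    b = params[n_u + n_v:]
    return (U @ V) @ x + b

d_in, d_out, rank = 16, 16, 2
n_params = d_out * rank + rank * d_in + d_out  # 80, versus 272 for a full layer
rng = np.random.default_rng(0)
params = rng.normal(size=n_params)
y = amortized_lowrank_linear(params, rng.normal(size=d_in), d_in, d_out, rank)
```

Amortizing a whole PDF, as listed above, follows the same pattern: all flow parameters are flattened into one vector that another network predicts.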
- Generic affine flow (Multivariate normal distribution) ("t")
- Gaussianization flow arXiv:2003.01941 ("g")
- Moebius transformations (described in arXiv:2002.02428) ("m")
- Circular rational-quadratic splines (described in arXiv:2002.02428) ("o")
- Autoregressive flow for the 2-sphere based on rational-quadratic splines (neural spline flows) (arXiv:2002.02428) ("f" with specific options)
- smooth rational-quadratic splines with von-Mises-Fisher scaling functions (arXiv:2604.19846) ("f" with specific options)
- Exponential map flow (arXiv:0906.0874/arXiv:2002.02428) ("v")
- Neural Manifold Ordinary Differential Equations arXiv:2006.10254 ("c")
- Neural spline flows (rational-quadratic splines; smooth variant new in v1.1) arXiv:1906.04032 ("r")
- Autoregressive simplex flow (experimental) arXiv:2008.05456 ("w")
For a description of all flows and abbreviations, have a look in the docs here.
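For intuition about the circular flows, a Moebius transformation restricted to the unit circle can be written in a few lines. This is a schematic example; the actual layer parametrization in the package differs:

```python
import cmath

def moebius_circle(theta, w):
    """Map an angle theta through z -> (z - w) / (1 - conj(w) z) on the unit circle.

    For |w| < 1 this Blaschke factor is a bijection of the circle onto itself.
    """
    z = cmath.exp(1j * theta)
    out = (z - w) / (1.0 - w.conjugate() * z)
    return cmath.phase(out)

def moebius_circle_inverse(phi, w):
    """The inverse map z -> (z + w) / (1 + conj(w) z) is the Moebius map with -w."""
    return moebius_circle(phi, -w)

w = 0.3 + 0.2j          # any parameter with |w| < 1 keeps the map bijective
theta = 1.0
phi = moebius_circle(theta, w)
back = moebius_circle_inverse(phi, w)
```

The analytic inverse is one reason Moebius layers are attractive for spherical flows; flows without such an inverse fall back on the bisection/Newton scheme described above.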
- pytorch (>=1.7)
- numpy (>=1.18.5)
- scipy (>=1.5.4)
- matplotlib (>=3.3.3)
- torchdiffeq (>=0.2.1)
The package has been built and tested with these versions, but might work just fine with older ones.
```
pip install git+https://github.com/thoglu/jammy_flows.git@*tag*
```
e.g.
```
pip install git+https://github.com/thoglu/jammy_flows.git@1.0.0
```
to install release 1.0.0. To install the current main branch:
```
pip install git+https://github.com/thoglu/jammy_flows.git
```
If you want to implement your own layer, or have bug reports or feature suggestions, just file an issue.