FlowJax: Normalizing Flows in Jax

Documentation

Available at https://danielward27.github.io/flowjax/.

Short example

Training a flow can be done in a few lines of code:

from flowjax.flows import block_neural_autoregressive_flow
from flowjax.train import fit_to_data
from flowjax.distributions import Normal
from jax import random
import jax.numpy as jnp

data_key, flow_key, train_key = random.split(random.PRNGKey(0), 3)

x = random.uniform(data_key, (10000, 3))  # Toy data
base_dist = Normal(jnp.zeros(x.shape[1]))
flow = block_neural_autoregressive_flow(flow_key, base_dist=base_dist)

# Fit the flow to the data by maximum likelihood
flow, losses = fit_to_data(train_key, flow, x, learning_rate=1e-2)

# We can now evaluate the log-probability of arbitrary points
flow.log_prob(x)
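
Sampling from the trained flow works similarly; a minimal sketch continuing the example above (the key name is illustrative):

sample_key = random.PRNGKey(1)
samples = flow.sample(sample_key, (1000,))  # 1000 samples, each of dimension 3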

The package includes a range of bijections, distributions, and flow architectures; see the documentation for the full list.
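
As an illustration of composing these pieces directly, here is a minimal sketch (assuming the Affine bijection and the Transformed distribution wrapper from the flowjax API):

from flowjax.bijections import Affine
from flowjax.distributions import Normal, Transformed
import jax.numpy as jnp

# A standard normal, shifted and scaled by an affine bijection
dist = Transformed(Normal(jnp.zeros(2)), Affine(loc=jnp.ones(2), scale=2 * jnp.ones(2)))
dist.log_prob(jnp.zeros(2))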

Installation

pip install flowjax

Development

A development version can be installed as follows:

git clone https://github.com/danielward27/flowjax.git
cd flowjax
pip install -e .[dev]
sudo apt-get install pandoc  # Required for building documentation

Warning

This package is in its early stages of development and may undergo significant changes, including breaking changes, between major releases. Ideally the version would still be 0.y.z to reflect this, but the project has already moved beyond that stage.

TODO

A few limitations, and features that may be worth adding in the future:

  • Add ability to "reshape" bijections.

Related

We make use of the Equinox package, which facilitates object-oriented programming with Jax.
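
Because flows are Equinox modules (and hence JAX pytrees), they compose with Equinox's filtered transformations; a minimal sketch, assuming the trained flow from the example above:

import equinox as eqx
import jax.numpy as jnp

# Filter-jit the log-density: array leaves of `flow` are traced, static fields are not
jitted_log_prob = eqx.filter_jit(flow.log_prob)
jitted_log_prob(jnp.zeros(3))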

Authors

flowjax was written by Daniel Ward <danielward27@outlook.com>.