PyTorch-based library for Riemannian Manifold Hamiltonian Monte Carlo (RMHMC) and inference in Bayesian neural networks
- Perform HMC on user-defined log probabilities and on PyTorch neural networks (objects inheriting from `torch.nn.Module`); a minimal sketch follows the list of schemes below.
- Available sampling schemes:
- HMC
- No-U-Turn Sampler (currently adapts the step size only)
- Implicit RMHMC
- Explicit RMHMC
- Symmetric Split HMC
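
As a quick illustration of the log-probability interface, here is a minimal sketch in the spirit of the 2019 blog post linked below: it draws HMC samples from a 2D Gaussian via `hamiltorch.sample`. The target density and the hyperparameter values (step size, number of leapfrog steps, number of samples) are illustrative choices only.

```python
import torch
import hamiltorch

hamiltorch.set_random_seed(123)

# Log probability of a standard 2D Gaussian (up to an additive constant).
def log_prob(omega):
    return -0.5 * torch.sum(omega ** 2)

params_init = torch.zeros(2)

# Plain HMC over the user-defined log probability; returns a list of
# parameter tensors, one per sample. Hyperparameter values are illustrative.
params_hmc = hamiltorch.sample(
    log_prob_func=log_prob,
    params_init=params_init,
    num_samples=400,
    step_size=0.3,
    num_steps_per_sample=5,
)
```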
```
pip install git+https://github.com/AdamCobb/hamiltorch
```
There are currently two blog posts that describe how to use `hamiltorch`:
- For basic usage and an introduction, please refer to my earlier post from 2019, "hamiltorch: a PyTorch Python package for sampling"
- For a more recent summary and a focus on Bayesian neural networks, please see my post "Scaling HMC to larger data sets"
There are also notebook-style tutorials:
- Sampling from generic log probabilities
- Sampling from `torch.nn.Module` (basic); a rough sketch follows this list
- Bayesian neural networks and split HMC
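
For sampling the weights of a `torch.nn.Module`, the notebooks use `hamiltorch.sample_model`. The sketch below is a rough outline only, assuming a toy regression network; keyword arguments such as `model_loss`, `tau_out`, and `tau_list` (the output and prior precisions) should be set as in the notebook for your own model.

```python
import torch
import torch.nn as nn
import hamiltorch

hamiltorch.set_random_seed(123)

# A small toy network and dataset (illustrative only).
net = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
x_train = torch.linspace(-1., 1., 20).view(-1, 1)
y_train = torch.sin(3. * x_train) + 0.1 * torch.randn_like(x_train)

# One prior precision per parameter tensor of the model.
tau_list = torch.ones(len(list(net.parameters())))

# Flatten the network's parameters into a single vector for initialisation.
params_init = hamiltorch.util.flatten(net).clone()

# Sample network weights with HMC. All values here are illustrative; see the
# nn.Module notebook for settings that work well in practice.
params_hmc = hamiltorch.sample_model(
    net, x_train, y_train,
    params_init=params_init,
    model_loss='regression',
    num_samples=200,
    step_size=0.001,
    num_steps_per_sample=10,
    tau_out=100.,   # output (noise) precision for the Gaussian likelihood
    tau_list=tau_list,
)
```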
Please consider citing the following papers if you use `hamiltorch` in your research:
For symmetric splitting:
```bibtex
@article{cobb2020scaling,
  title={Scaling Hamiltonian Monte Carlo Inference for Bayesian Neural Networks with Symmetric Splitting},
  author={Cobb, Adam D and Jalaian, Brian},
  journal={Uncertainty in Artificial Intelligence},
  year={2021}
}
```
For RMHMC:
```bibtex
@article{cobb2019introducing,
  title={Introducing an Explicit Symplectic Integration Scheme for Riemannian Manifold Hamiltonian Monte Carlo},
  author={Cobb, Adam D and Baydin, At{\i}l{\i}m G{\"u}ne{\c{s}} and Markham, Andrew and Roberts, Stephen J},
  journal={arXiv preprint arXiv:1910.06243},
  year={2019}
}
```