Set up documentation (#23)
* Set up documentation

* Update docs/make.jl

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
devmotion and github-actions[bot] authored Oct 18, 2021
1 parent f083af8 commit efc3cd8
Showing 11 changed files with 221 additions and 95 deletions.
6 changes: 6 additions & 0 deletions .github/workflows/CI.yml
@@ -6,6 +6,12 @@ on:
- main
pull_request:

concurrency:
# Skip intermediate builds: always.
# Cancel intermediate builds: only if it is a pull request build.
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ startsWith(github.ref, 'refs/pull/') }}

jobs:
test:
name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }}
4 changes: 2 additions & 2 deletions .github/workflows/CompatHelper.yml
@@ -24,8 +24,8 @@ jobs:
- name: "Run CompatHelper"
run: |
import CompatHelper
CompatHelper.main()
CompatHelper.main(; subdirs=["", "docs"])
shell: julia --color=yes {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
COMPATHELPER_PRIV: ${{ secrets.COMPATHELPER_PRIV }}
COMPATHELPER_PRIV: ${{ secrets.DOCUMENTER_KEY }}
31 changes: 31 additions & 0 deletions .github/workflows/Docs.yml
@@ -0,0 +1,31 @@
name: Documentation

on:
push:
branches:
- main
tags: '*'
pull_request:

concurrency:
# Skip intermediate builds: always.
# Cancel intermediate builds: only if it is a pull request build.
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ startsWith(github.ref, 'refs/pull/') }}

jobs:
docs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@latest
with:
version: '1'
- name: Install dependencies
run: julia --project=docs/ -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()'
- name: Build and deploy
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # For authentication with GitHub Actions token
DOCUMENTER_KEY: ${{ secrets.DOCUMENTER_KEY }} # For authentication with SSH deploy key
JULIA_DEBUG: Documenter # Print `@debug` statements (https://github.com/JuliaDocs/Documenter.jl/issues/955)
run: julia --project=docs/ docs/make.jl
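
For reference, the `DOCUMENTER_KEY` deploy key used by the workflow above is typically generated with DocumenterTools.jl; a minimal sketch (run locally, not part of this commit — the exact invocation may differ):

```julia
using Pkg
Pkg.add("DocumenterTools")

using DocumenterTools
# Prints a public key to add as a repository deploy key (with write access)
# and a base64-encoded private key to store as the DOCUMENTER_KEY secret.
DocumenterTools.genkeys(; user="TuringLang", repo="git@github.com:TuringLang/EllipticalSliceSampling.jl.git")
```
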
26 changes: 26 additions & 0 deletions .github/workflows/DocsCleanupPreview.yml
@@ -0,0 +1,26 @@
name: DocsPreviewCleanup

on:
pull_request:
types: [closed]

jobs:
cleanup:
runs-on: ubuntu-latest
steps:
- name: Checkout gh-pages branch
uses: actions/checkout@v2
with:
ref: gh-pages
- name: Delete preview and history + push changes
run: |
if [ -d "previews/PR$PRNUM" ]; then
git config user.name "Documenter.jl"
git config user.email "documenter@juliadocs.github.io"
git rm -rf "previews/PR$PRNUM"
git commit -m "delete preview"
git branch gh-pages-new $(echo "delete history" | git commit-tree HEAD^{tree})
git push --force origin gh-pages-new:gh-pages
fi
env:
PRNUM: ${{ github.event.number }}
11 changes: 9 additions & 2 deletions .github/workflows/Format.yml
@@ -3,6 +3,12 @@ name: Format
on:
pull_request:

concurrency:
# Skip intermediate builds: always.
# Cancel intermediate builds: only if it is a pull request build.
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ startsWith(github.ref, 'refs/pull/') }}

jobs:
format:
runs-on: ubuntu-latest
@@ -11,14 +17,15 @@ jobs:
- uses: julia-actions/setup-julia@latest
with:
version: 1
- name: Install JuliaFormatter and format code
- name: Format code
run: |
using Pkg: Pkg
using Pkg
Pkg.add(; name="JuliaFormatter", uuid="98e50ef6-434e-11e9-1051-2b60c6c9e899")
using JuliaFormatter
format("."; verbose=true)
shell: julia --color=yes {0}
- uses: reviewdog/action-suggester@v1
if: github.event_name == 'pull_request'
with:
tool_name: JuliaFormatter
fail_on_error: true
1 change: 1 addition & 0 deletions .github/workflows/TagBot.yml
@@ -12,3 +12,4 @@ jobs:
- uses: JuliaRegistries/TagBot@v1
with:
token: ${{ secrets.GITHUB_TOKEN }}
ssh: ${{ secrets.DOCUMENTER_KEY }}
99 changes: 8 additions & 91 deletions README.md
@@ -2,7 +2,8 @@

Julia implementation of elliptical slice sampling.

[![Project Status: Active – The project has reached a stable, usable state and is being actively developed.](https://www.repostatus.org/badges/latest/active.svg)](https://www.repostatus.org/#active)
[![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://turinglang.github.io/EllipticalSliceSampling.jl/stable)
[![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://turinglang.github.io/EllipticalSliceSampling.jl/dev)
[![Build Status](https://github.com/TuringLang/EllipticalSliceSampling.jl/workflows/CI/badge.svg?branch=main)](https://github.com/TuringLang/EllipticalSliceSampling.jl/actions?query=workflow%3ACI%20branch%3Amain)
[![Codecov](https://codecov.io/gh/TuringLang/EllipticalSliceSampling.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/TuringLang/EllipticalSliceSampling.jl)
[![Coveralls](https://coveralls.io/repos/github/TuringLang/EllipticalSliceSampling.jl/badge.svg?branch=main)](https://coveralls.io/github/TuringLang/EllipticalSliceSampling.jl?branch=main)
@@ -16,99 +17,15 @@ This package implements elliptical slice sampling in the Julia language, as desc
Elliptical slice sampling is a "Markov chain Monte Carlo algorithm for performing
inference in models with multivariate Gaussian priors" (Murray, Adams & MacKay (2010)).

Without loss of generality, the originally described algorithm assumes that the Gaussian
prior has zero mean. For convenience we allow the user to specify arbitrary Gaussian
priors with non-zero means and handle the change of variables internally.
Please check the [documentation](https://turinglang.github.io/EllipticalSliceSampling.jl/stable)
for more details.

## Usage
## Poster at JuliaCon 2021

Most users will probably want to generate a Markov chain of samples from
the posterior distribution by calling
```julia
sample([rng, ]ESSModel(prior, loglikelihood), ESS(), N[; kwargs...])
```
which returns a vector of `N` samples approximating the posterior of a model
with Gaussian prior `prior` (from which one must be able to sample) and
log likelihood function `loglikelihood`.
[![EllipticalSliceSampling.jl: MCMC with Gaussian priors](http://img.youtube.com/vi/S5gUED7Uq2Q/0.jpg)](https://www.youtube.com/watch?v=S5gUED7Uq2Q)

You can sample multiple chains in parallel with multiple threads or processes
by running
```julia
sample([rng, ]ESSModel(prior, loglikelihood), ESS(), MCMCThreads(), N, nchains[; kwargs...])
```
or
```julia
sample([rng, ]ESSModel(prior, loglikelihood), ESS(), MCMCDistributed(), N, nchains[; kwargs...])
```
The slides are available as [Pluto notebook](https://talks.widmann.dev/2021/07/ellipticalslicesampling/).

If you want more control over the sampling procedure (e.g., if you only
want to save a subset of samples or want to use a different stopping
criterion), the function
```julia
AbstractMCMC.steps(
[rng,]
ESSModel(prior, loglikelihood),
ESS();
kwargs...
)
```
gives you access to an iterator from which you can generate an unlimited
number of samples.

For more details regarding `sample` and `steps`, please check the documentation of
[AbstractMCMC.jl](https://github.com/TuringLang/AbstractMCMC.jl).

### Prior

You may specify Gaussian priors with arbitrary means. EllipticalSliceSampling.jl
provides first-class support for the scalar and multivariate normal distributions
in [Distributions.jl](https://github.com/JuliaStats/Distributions.jl). For
instance, if the prior distribution is a standard normal distribution, you can
choose
```julia
prior = Normal()
```

However, custom Gaussian priors are supported as well. For instance, if you want to
use a custom distribution type `GaussianPrior`, the following methods should be
implemented:
```julia
# state that the distribution is actually Gaussian
EllipticalSliceSampling.isgaussian(::Type{<:GaussianPrior}) = true

# define the mean of the distribution
# alternatively implement `proposal(prior, ...)` and
# `proposal!(out, prior, ...)` (only if the samples are mutable)
Statistics.mean(dist::GaussianPrior) = ...

# define how to sample from the distribution
# only one of the following methods is needed:
# - if the samples are immutable (e.g., numbers or static arrays) only
# `rand(rng, dist)` should be implemented
# - otherwise only `rand!(rng, dist, sample)` is required
Base.rand(rng::AbstractRNG, dist::GaussianPrior) = ...
Random.rand!(rng::AbstractRNG, dist::GaussianPrior, sample) = ...
```

### Log likelihood

In addition to the prior, you have to specify a Julia implementation of
the log likelihood function. Here the predefined log densities and log
likelihood functions in
[Distributions.jl](https://github.com/JuliaStats/Distributions.jl) might
be useful.

### Progress monitor

If you use a package such as [Juno](https://junolab.org/) or
[TerminalLoggers.jl](https://github.com/c42f/TerminalLoggers.jl) that supports
progress logs created by the
[ProgressLogging.jl](https://github.com/JunoLab/ProgressLogging.jl) API, then you can
monitor the progress of the sampling algorithm. If you do not specify a progress
logging frontend explicitly,
[AbstractMCMC.jl](https://github.com/TuringLang/AbstractMCMC.jl) picks a frontend
for you automatically.

## Bibliography
## References

Murray, I., Adams, R. & MacKay, D. (2010). Elliptical slice sampling. Proceedings of Machine Learning Research, 9:541–548.
2 changes: 2 additions & 0 deletions docs/.gitignore
@@ -0,0 +1,2 @@
/Manifest.toml
/build/
5 changes: 5 additions & 0 deletions docs/Project.toml
@@ -0,0 +1,5 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"

[compat]
Documenter = "0.27"
17 changes: 17 additions & 0 deletions docs/make.jl
@@ -0,0 +1,17 @@
using Documenter
using EllipticalSliceSampling

makedocs(;
sitename="EllipticalSliceSampling",
format=Documenter.HTML(),
modules=[EllipticalSliceSampling],
pages=["Home" => "index.md"],
strict=true,
checkdocs=:exports,
)

deploydocs(;
repo="github.com/TuringLang/EllipticalSliceSampling.jl.git",
devbranch="main",
push_preview=true,
)
114 changes: 114 additions & 0 deletions docs/src/index.md
@@ -0,0 +1,114 @@
# EllipticalSliceSampling

*Julia implementation of elliptical slice sampling.*

## Overview

This package implements elliptical slice sampling in the Julia language, as described in
[Murray, Adams & MacKay (2010)](http://proceedings.mlr.press/v9/murray10a/murray10a.pdf).

Elliptical slice sampling is a "Markov chain Monte Carlo algorithm for performing
inference in models with multivariate Gaussian priors" (Murray, Adams & MacKay (2010)).

Without loss of generality, the originally described algorithm assumes that the Gaussian
prior has zero mean. For convenience we allow the user to specify arbitrary Gaussian
priors with non-zero means and handle the change of variables internally.

## Poster at JuliaCon 2021

[![EllipticalSliceSampling.jl: MCMC with Gaussian priors](http://img.youtube.com/vi/S5gUED7Uq2Q/0.jpg)](https://www.youtube.com/watch?v=S5gUED7Uq2Q)

The slides are available as [Pluto notebook](https://talks.widmann.dev/2021/07/ellipticalslicesampling/).

## Usage

Most users will probably want to generate a Markov chain of samples from
the posterior distribution by calling
```julia
sample([rng, ]ESSModel(prior, loglikelihood), ESS(), N[; kwargs...])
```
which returns a vector of `N` samples approximating the posterior of a model
with Gaussian prior `prior` (from which one must be able to sample) and
log likelihood function `loglikelihood`.
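
To make this concrete, here is a minimal, self-contained sketch (not part of this commit): the toy prior, data, and log likelihood below are made up for illustration, and it assumes `sample`, `ESSModel`, and `ESS` are available as in the snippet above.

```julia
using EllipticalSliceSampling
using Distributions
using Random

# Hypothetical toy model: scalar location parameter with a standard normal
# prior and ten made-up observations with unit observation noise.
rng = Random.MersenneTwister(42)
prior = Normal()
data = randn(rng, 10) .+ 0.5
loglik(θ) = sum(logpdf.(Normal(θ, 1), data))

# Draw 1_000 (approximate) posterior samples.
samples = sample(rng, ESSModel(prior, loglik), ESS(), 1_000)
```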

You can sample multiple chains in parallel with multiple threads or processes
by running
```julia
sample([rng, ]ESSModel(prior, loglikelihood), ESS(), MCMCThreads(), N, nchains[; kwargs...])
```
or
```julia
sample([rng, ]ESSModel(prior, loglikelihood), ESS(), MCMCDistributed(), N, nchains[; kwargs...])
```
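
Continuing the hypothetical toy model sketched above, drawing several chains with threads might look as follows (a sketch; it assumes Julia was started with multiple threads, e.g. `julia --threads 4`):

```julia
# Four chains of 1_000 samples each, distributed over the available threads.
chains = sample(ESSModel(prior, loglik), ESS(), MCMCThreads(), 1_000, 4)
```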

If you want more control over the sampling procedure (e.g., if you only
want to save a subset of samples or want to use a different stopping
criterion), the function
```julia
AbstractMCMC.steps(
[rng,]
ESSModel(prior, loglikelihood),
ESS();
kwargs...
)
```
gives you access to an iterator from which you can generate an unlimited
number of samples.
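
As a sketch of how such an iterator might be used with the hypothetical toy model from above (the thinning scheme here is made up for illustration):

```julia
using AbstractMCMC
using Random

# Create the sampling iterator; it yields one sample per iteration.
iterator = AbstractMCMC.steps(Random.default_rng(), ESSModel(prior, loglik), ESS())

# Keep every tenth sample out of the first 1_000, i.e. 100 samples in total.
kept = [s for (i, s) in enumerate(Iterators.take(iterator, 1_000)) if i % 10 == 0]
```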

For more details regarding `sample` and `steps`, please check the documentation of
[AbstractMCMC.jl](https://github.com/TuringLang/AbstractMCMC.jl).

### Prior

You may specify Gaussian priors with arbitrary means. EllipticalSliceSampling.jl
provides first-class support for the scalar and multivariate normal distributions
in [Distributions.jl](https://github.com/JuliaStats/Distributions.jl). For
instance, if the prior distribution is a standard normal distribution, you can
choose
```julia
prior = Normal()
```

However, custom Gaussian priors are supported as well. For instance, if you want to
use a custom distribution type `GaussianPrior`, the following methods should be
implemented:
```julia
# state that the distribution is actually Gaussian
EllipticalSliceSampling.isgaussian(::Type{<:GaussianPrior}) = true

# define the mean of the distribution
# alternatively implement `proposal(prior, ...)` and
# `proposal!(out, prior, ...)` (only if the samples are mutable)
Statistics.mean(dist::GaussianPrior) = ...

# define how to sample from the distribution
# only one of the following methods is needed:
# - if the samples are immutable (e.g., numbers or static arrays) only
# `rand(rng, dist)` should be implemented
# - otherwise only `rand!(rng, dist, sample)` is required
Base.rand(rng::AbstractRNG, dist::GaussianPrior) = ...
Random.rand!(rng::AbstractRNG, dist::GaussianPrior, sample) = ...
```
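
As a concrete (hypothetical) illustration of this interface, a shifted isotropic Gaussian prior N(m, I) with mutable vector samples might be wired up as follows; the type name and field are made up, and only the `rand!` variant is implemented since the samples are mutable:

```julia
using EllipticalSliceSampling
using Random
using Statistics

# Hypothetical custom Gaussian prior: N(m, I) with user-supplied mean vector `m`.
struct ShiftedIsotropicGaussian{T<:AbstractVector{<:Real}}
    m::T
end

# Declare that the distribution is Gaussian.
EllipticalSliceSampling.isgaussian(::Type{<:ShiftedIsotropicGaussian}) = true

# Mean of the distribution.
Statistics.mean(dist::ShiftedIsotropicGaussian) = dist.m

# In-place sampling: the samples are mutable vectors, so `rand!` suffices.
function Random.rand!(rng::Random.AbstractRNG, dist::ShiftedIsotropicGaussian, x::AbstractVector{<:Real})
    randn!(rng, x)
    x .+= dist.m
    return x
end
```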

### Log likelihood

In addition to the prior, you have to specify a Julia implementation of
the log likelihood function. Here the predefined log densities and log
likelihood functions in
[Distributions.jl](https://github.com/JuliaStats/Distributions.jl) might
be useful.
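
For instance (a made-up example), a Gaussian observation model for fixed data can be expressed via `logpdf` from Distributions.jl:

```julia
using Distributions

# Hypothetical observations and known observation noise.
data = [0.2, -0.4, 1.1]
σ = 0.5

# Log likelihood of a scalar location parameter θ: the data are modelled as N(θ, σ²).
ℓ(θ) = sum(logpdf.(Normal(θ, σ), data))
```

The resulting function can then be passed to the sampler, e.g. as `ESSModel(Normal(), ℓ)`.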

### Progress monitor

If you use a package such as [Juno](https://junolab.org/) or
[TerminalLoggers.jl](https://github.com/c42f/TerminalLoggers.jl) that supports
progress logs created by the
[ProgressLogging.jl](https://github.com/JunoLab/ProgressLogging.jl) API, then you can
monitor the progress of the sampling algorithm. If you do not specify a progress
logging frontend explicitly,
[AbstractMCMC.jl](https://github.com/TuringLang/AbstractMCMC.jl) picks a frontend
for you automatically.
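
For example, in a terminal session the progress bars can be enabled with TerminalLoggers.jl (a sketch, assuming the package is installed):

```julia
using Logging: global_logger
using TerminalLoggers: TerminalLogger

# Render progress logs as progress bars in the terminal.
global_logger(TerminalLogger())
```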

## References

Murray, I., Adams, R. & MacKay, D. (2010). Elliptical slice sampling. Proceedings of Machine Learning Research, 9:541–548.
