Documentation updates (#1391)
* Update quick-start.md

* Update performancetips.md

* Update advanced.md

* Update performancetips.md

* Update autodiff.md

* Update dynamichmc.md

* Update get-started.md

* Update guide.md

* Update compiler.md

* Document `@addlogprob!`

* Apply suggestions from code review

Co-authored-by: Hong Ge <hg344@cam.ac.uk>
Co-authored-by: Cameron Pfiffer <cpfiffer@gmail.com>

* Allow rebuilding docs manually

* Allow TagBot to trigger building the docs

Co-authored-by: Hong Ge <hg344@cam.ac.uk>
Co-authored-by: Cameron Pfiffer <cpfiffer@gmail.com>
3 people authored Aug 25, 2020
1 parent 38fd4b8 commit 34649c7
Showing 10 changed files with 255 additions and 250 deletions.
1 change: 1 addition & 0 deletions .github/workflows/Documentation.yml
@@ -5,6 +5,7 @@ on:
branches:
- 'master'
tags: '*'
workflow_dispatch:

jobs:
docs:
1 change: 1 addition & 0 deletions .github/workflows/TagBot.yml
@@ -9,3 +9,4 @@ jobs:
- uses: JuliaRegistries/TagBot@v1
with:
token: ${{ secrets.GITHUB_TOKEN }}
ssh: ${{ secrets.DOCUMENTER_KEY }}
281 changes: 122 additions & 159 deletions docs/src/for-developers/compiler.md

Large diffs are not rendered by default.

103 changes: 67 additions & 36 deletions docs/src/using-turing/advanced.md
@@ -82,17 +82,65 @@ The vectorization syntax follows `rv ~ [distribution]`, which requires `rand` an
Distributions.logpdf(d::Flat, x::AbstractVector{<:Real}) = zero(x)
```

## Update the accumulated log probability in the model definition

Turing accumulates log probabilities in an internal data structure that is accessible through
the internal variable `_varinfo` inside of the model definition (see below for more details about model internals).
However, since users should not have to deal with internal data structures, the macro `Turing.@addlogprob!` is provided
to increase the accumulated log probability. For instance, this allows you to
[include arbitrary terms in the likelihood](https://github.com/TuringLang/Turing.jl/issues/1332):

```julia
using Turing

myloglikelihood(x, μ) = loglikelihood(Normal(μ, 1), x)

@model function demo(x)
μ ~ Normal()
Turing.@addlogprob! myloglikelihood(x, μ)
end
```
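The model above can then be instantiated and sampled like any other Turing model. The sketch below assumes the `demo` model defined above; the synthetic data and sampler settings are illustrative, not part of the original docs:

```julia
using Turing

# Hypothetical data: ten draws from a standard normal.
x = randn(10)

# Sample the posterior of μ (sampler settings are illustrative).
chain = sample(demo(x), NUTS(0.65), 1000)
```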

and to [reject samples](https://github.com/TuringLang/Turing.jl/issues/1328):

```julia
using Turing
using LinearAlgebra

@model function demo(x)
m ~ MvNormal(length(x))
if dot(m, x) < 0
Turing.@addlogprob! -Inf
# Exit the model evaluation early
return
end

x ~ MvNormal(m, 1.0)
return
end
```

Note that `@addlogprob!` always increases the accumulated log probability, regardless of the provided
sampling context. For instance, if you do not want to apply `Turing.@addlogprob!` when evaluating the
prior of your model but only when computing the log likelihood and the log joint probability, then you
should [check the type of the internal variable `_context`](https://github.com/TuringLang/DynamicPPL.jl/issues/154)
as follows:
```julia
if !isa(_context, Turing.PriorContext)
Turing.@addlogprob! myloglikelihood(x, μ)
end
```

## Model Internals


The `@model` macro accepts a function definition and rewrites it such that a call of the function generates a `Model` struct for use by the sampler. Models can be constructed by hand without the use of a macro. Taking the `gdemo` model as an example, the macro-based definition

```julia
using Turing

@model function gdemo(x)
# Set priors.
s ~ InverseGamma(2, 3)
m ~ Normal(0, sqrt(s))
@@ -101,65 +101,48 @@ using Turing
@. x ~ Normal(m, sqrt(s))
end

model = gdemo([1.5, 2.0])
```

is equivalent to the macro-free version

```julia
using Turing

# Create the model function.
function modelf(rng, model, varinfo, sampler, context, x)
    # Assume s has an InverseGamma distribution.
    s = Turing.DynamicPPL.tilde_assume(
        rng,
        context,
        sampler,
        InverseGamma(2, 3),
        Turing.@varname(s),
        (),
        varinfo,
    )

    # Assume m has a Normal distribution.
    m = Turing.DynamicPPL.tilde_assume(
        rng,
        context,
        sampler,
        Normal(0, sqrt(s)),
        Turing.@varname(m),
        (),
        varinfo,
    )

    # Observe each value of x[i] according to a Normal distribution.
    Turing.DynamicPPL.dot_tilde_observe(context, sampler, Normal(m, sqrt(s)), x, varinfo)
end

# Instantiate a Model object with our data variables.
model = Turing.Model(modelf, (x = [1.5, 2.0],))
```
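As with the macro version, the hand-constructed model can be passed to `sample` in the usual way. This is a sketch continuing from the block above; the sampler settings mirror those used elsewhere in these docs and are illustrative:

```julia
# Sample the hand-constructed model with HMC (step size 0.1, 5 leapfrog steps).
chain = sample(model, HMC(0.1, 5), 1000)
```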


## Task Copying


Turing [copies](https://github.com/JuliaLang/julia/issues/4085) Julia tasks to deliver efficient inference algorithms, but it also provides alternative slower implementation as a fallback. Task copying is enabled by default. Task copying requires us to use the `CTask` facility which is provided by [Libtask](https://github.com/TuringLang/Libtask.jl) to create tasks.


4 changes: 2 additions & 2 deletions docs/src/using-turing/autodiff.md
@@ -23,7 +23,7 @@ Turing supports intermixed automatic differentiation methods for different varia
using Turing

# Define a simple Normal model with unknown mean and variance.
@model gdemo(x, y) = begin
@model function gdemo(x, y)
s ~ InverseGamma(2, 3)
m ~ Normal(0, sqrt(s))
x ~ Normal(m, sqrt(s))
@@ -37,7 +37,7 @@ c = sample(
HMC{Turing.ForwardDiffAD{1}}(0.1, 5, :m),
HMC{Turing.TrackerAD}(0.1, 5, :s)
),
1000
1000,
)
```

2 changes: 1 addition & 1 deletion docs/src/using-turing/dynamichmc.md
@@ -20,7 +20,7 @@ Here is a brief example of how to apply `DynamicNUTS`:
using LogDensityProblems, DynamicHMC, Turing

# Model definition.
@model gdemo(x, y) = begin
@model function gdemo(x, y)
s ~ InverseGamma(2, 3)
m ~ Normal(0, sqrt(s))
x ~ Normal(m, sqrt(s))
23 changes: 7 additions & 16 deletions docs/src/using-turing/get-started.md
@@ -12,31 +12,23 @@ To use Turing, you need to install Julia first and then install Turing.

### Install Julia

You will need to install Julia 1.3 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).


### Install Turing.jl

Turing is an officially registered Julia package, so you can install a stable version of Turing by running the following in the Julia REPL:

```julia
julia> ] add Turing
```


You can check if all tests pass by running

```julia
julia> ] test Turing
```


If all tests pass, you're ready to start using Turing.


### Example

Here's a simple example showing the package in action:
@@ -47,7 +39,7 @@ using Turing
using StatsPlots

# Define a simple Normal model with unknown mean and variance.
@model gdemo(x, y) = begin
@model function gdemo(x, y)
s ~ InverseGamma(2, 3)
m ~ Normal(0, sqrt(s))
x ~ Normal(m, sqrt(s))
@@ -57,11 +49,10 @@ end
# Run sampler, collect results
chn = sample(gdemo(1.5, 2), HMC(0.1, 5), 1000)

# Summarise results
describe(chn)

# Plot and save results
p = plot(chn)
savefig("gdemo-plot.png")
```

5 changes: 2 additions & 3 deletions docs/src/using-turing/guide.md
@@ -120,9 +120,8 @@ var_1 = mean(chn[:var_1]) # Taking the mean of a variable named var_1.


The key (`:var_1`) can be a `Symbol` or a `String`. For example, to fetch `x[1]`, one can use `chn[Symbol("x[1]")]` or `chn["x[1]"]`.

If you want to retrieve all parameters associated with a specific symbol, you can use `group`. As an example, if you have the
parameters `"x[1]"`, `"x[2]"`, and `"x[3]"`, calling `group(chn, :x)` or `group(chn, "x")` will return a new chain with only `"x[1]"`, `"x[2]"`, and `"x[3]"`.
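For instance, a sketch assuming a fitted chain `chn` whose parameters include `x[1]`, `x[2]`, and `x[3]`:

```julia
# Return a new chain containing only the parameters grouped under `x`.
x_chain = group(chn, :x)

# The same selection, using a string key instead of a Symbol.
x_chain = group(chn, "x")
```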


Turing does not have a declarative form. More generally, the order in which you place the lines of a `@model` macro matters. For example, the following example works:

2 comments on commit 34649c7

@cpfiffer

@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/20166

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

```
git tag -a v0.14.0 -m "<description of version>" 34649c776eb2f2474f5e8fb1f41dff273c5153f0
git push origin v0.14.0
```
