19 changes: 19 additions & 0 deletions HISTORY.md
@@ -1,3 +1,22 @@
# 0.39.9

Revert a buggy change introduced in 0.39.5 to the external sampler interface.
For Turing 0.39, external samplers should define

```
Turing.Inference.getparams(::DynamicPPL.Model, ::MySamplerTransition)
```

rather than

```
AbstractMCMC.getparams(::DynamicPPL.Model, ::MySamplerState)
```

to extract a vector of parameter values from the transition.

Note that this may change in future breaking releases.
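For illustration, a minimal sketch of what an external sampler package might define under this interface (the `MyTransition` type and its `params` field are hypothetical names):

```
# Hypothetical transition type, returned as the first value of AbstractMCMC.step.
struct MyTransition{V<:AbstractVector}
    params::V
end

# Turing extracts a flat parameter vector from each transition via this method.
Turing.Inference.getparams(::DynamicPPL.Model, t::MyTransition) = t.params
```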

# 0.39.8

MCMCChains.jl doesn't understand vector- or matrix-valued variables, and in Turing we split up such values into their individual components.
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,6 +1,6 @@
name = "Turing"
uuid = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"
version = "0.39.8"
version = "0.39.9"

[deps]
ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
18 changes: 16 additions & 2 deletions src/mcmc/Inference.jl
@@ -161,12 +161,26 @@ metadata(vi::AbstractVarInfo) = (lp=getlogp(vi),)
# Chain making utilities #
##########################

# TODO(penelopeysm): Separate Turing.Inference.getparams (should only be
# defined for AbstractVarInfo and Turing.Inference.Transition; returns varname
# => value maps) from AbstractMCMC.getparams (defined for any sampler transition,
# returns vector).
"""
-    getparams(model, t)
+    Turing.Inference.getparams(model::Any, t::Any)

-Return a named tuple of parameters.
+Return a vector of parameter values from the given sampler transition `t` (i.e.,
+the first return value of `AbstractMCMC.step`). By default, returns the `t.θ` field.

!!! note
    This method only needs to be implemented for external samplers. It will be
    removed in future releases and replaced with `AbstractMCMC.getparams`.
"""
getparams(model, t) = t.θ
"""
    Turing.Inference.getparams(model::DynamicPPL.Model, t::AbstractVarInfo)

Return a key-value map of parameters from the varinfo.
"""
function getparams(model::DynamicPPL.Model, vi::DynamicPPL.VarInfo)
    # NOTE: In the past, `invlink(vi, model)` + `values_as(vi, OrderedDict)` was used.
    # Unfortunately, using `invlink` can cause issues in scenarios where the constraints
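To make the distinction in the TODO above concrete, a hedged sketch of the two flavours (the one-line model and `ToyTransition` are invented for demonstration; the exact container returned by the `VarInfo` method is an implementation detail):

```
using Turing, DynamicPPL

@model demo() = x ~ Normal()
model = demo()

# VarInfo-based method: returns varname => value pairs (used when building chains).
vi = DynamicPPL.VarInfo(model)
param_map = Turing.Inference.getparams(model, vi)

# Transition-based method: returns a flat parameter vector. The generic fallback
# reads a `θ` field, so this toy transition works without a custom method.
struct ToyTransition
    θ::Vector{Float64}
end
θ_vec = Turing.Inference.getparams(model, ToyTransition([0.1]))  # == [0.1]
```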
11 changes: 9 additions & 2 deletions src/mcmc/external_sampler.jl
@@ -15,7 +15,11 @@ When implementing a new `MySampler <: AbstractSampler`,
In particular, it must implement:

- `AbstractMCMC.step` (the main function for taking a step in MCMC sampling; this is documented in AbstractMCMC.jl)
-- `AbstractMCMC.getparams(::DynamicPPL.Model, external_state)`: How to extract the parameters from the state returned by your sampler (i.e., the second return value of `step`).
+- `Turing.Inference.getparams(::DynamicPPL.Model, external_transition)`: How to extract the parameters from the transition returned by your sampler (i.e., the first return value of `step`).
+  There is a default implementation for this method, which is to return `external_transition.θ`.

!!! note
    In a future breaking release of Turing, this is likely to change to `AbstractMCMC.getparams(::DynamicPPL.Model, external_state)`, with no default method. `Turing.Inference.getparams` is technically an internal method, so the aim here is to unify the interface for samplers at a higher level.

There are a few more optional functions which you can implement to improve the integration with Turing.jl:

@@ -119,7 +123,10 @@ function make_updated_varinfo(
    f::DynamicPPL.LogDensityFunction, external_transition, external_state
)
    # Set the parameters.
-    new_parameters = getparams(f.model, external_state)
+    # NOTE: This is Turing.Inference.getparams, not AbstractMCMC.getparams (!!!!!)
+    # The latter uses the state rather than the transition.
+    # TODO(penelopeysm): Make this use AbstractMCMC.getparams instead
+    new_parameters = getparams(f.model, external_transition)
    new_varinfo = DynamicPPL.unflatten(f.varinfo, new_parameters)
    # Set (or recalculate, if needed) the log density.
    new_logp = getlogp_external(external_transition, external_state)
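For context, a hedged usage sketch: once a sampler implements the interface above, it is wrapped with `externalsampler` and passed to `sample` as usual (`MySampler` stands in for the hypothetical sampler from the docstring):

```
using Turing

@model function demo()
    a ~ Beta(2, 2)
    b ~ Normal(a)
end

# `externalsampler` wraps any AbstractMCMC.AbstractSampler; Turing then drives it
# via AbstractMCMC.step and extracts parameters with Turing.Inference.getparams.
chn = sample(demo(), externalsampler(MySampler()), 100; initial_params=[0.5, 0.0])
```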
82 changes: 81 additions & 1 deletion test/mcmc/external_sampler.jl
@@ -1,5 +1,6 @@
module ExternalSamplerTests

using ..Models: gdemo_default
using AbstractMCMC: AbstractMCMC
using AdvancedMH: AdvancedMH
using Distributions: sample
@@ -14,6 +15,85 @@ using Test: @test, @test_throws, @testset
using Turing
using Turing.Inference: AdvancedHMC

@testset "External sampler interface" begin
# Turing declares an interface for external samplers (see docstring for
# ExternalSampler). We should check that implementing this interface
# and only this interface allows us to use the sampler in Turing.
struct MyTransition{V<:AbstractVector}
params::V
end
# Samplers need to implement `Turing.Inference.getparams`.
Turing.Inference.getparams(::DynamicPPL.Model, t::MyTransition) = t.params
# State doesn't matter (but we need to carry the params through to the next
# iteration).
struct MyState{V<:AbstractVector}
params::V
end

# externalsamplers must accept LogDensityModel inside their step function.
# By default Turing gives the externalsampler a LDF constructed with
# adtype=ForwardDiff, so we should expect that inside the sampler we can
# call both `logdensity` and `logdensity_and_gradient`.
#
# The behaviour of this sampler is to simply calculate logp and its
# gradient, and then return the same values.
#
# TODO: Do we also want to run ADTypeCheckContext to make sure that it is
# indeed using the adtype provided from Turing?
struct MySampler <: AbstractMCMC.AbstractSampler end
function AbstractMCMC.step(
rng::Random.AbstractRNG,
model::AbstractMCMC.LogDensityModel,
sampler::MySampler;
initial_params::AbstractVector,
kwargs...,
)
# Step 1
ldf = model.logdensity
lp = LogDensityProblems.logdensity(ldf, initial_params)
@test lp isa Real
lp, grad = LogDensityProblems.logdensity_and_gradient(ldf, initial_params)
@test lp isa Real
@test grad isa AbstractVector{<:Real}
return MyTransition(initial_params), MyState(initial_params)
end
function AbstractMCMC.step(
rng::Random.AbstractRNG,
model::AbstractMCMC.LogDensityModel,
sampler::MySampler,
state::MyState;
kwargs...,
)
# Step >= 1
params = state.params
ldf = model.logdensity
lp = LogDensityProblems.logdensity(ldf, params)
@test lp isa Real
lp, grad = LogDensityProblems.logdensity_and_gradient(ldf, params)
@test lp isa Real
@test grad isa AbstractVector{<:Real}
return MyTransition(params), MyState(params)
end

@model function test_external_sampler()
a ~ Beta(2, 2)
return b ~ Normal(a)
end
model = test_external_sampler()
a, b = 0.5, 0.0

chn = sample(model, externalsampler(MySampler()), 10; initial_params=[a, b])
@test chn isa MCMCChains.Chains
@test all(chn[:a] .== a)
@test all(chn[:b] .== b)
# TODO: Uncomment this once Turing v0.40 is released. In that version, logpdf
# will be recalculated correctly for external samplers.
# expected_logpdf = logpdf(Beta(2, 2), a) + logpdf(Normal(a), b)
# @test all(chn[:lp] .== expected_logpdf)
# @test all(chn[:logprior] .== expected_logpdf)
# @test all(chn[:loglikelihood] .== 0.0)
end

function initialize_nuts(model::DynamicPPL.Model)
    # Create a linked varinfo
    vi = DynamicPPL.VarInfo(model)
@@ -107,7 +187,7 @@ function test_initial_params(
end
end

-@testset verbose = true "External samplers" begin
+@testset verbose = true "Implementation of externalsampler interface for known packages" begin
@testset "AdvancedHMC.jl" begin
@testset "$(model.f)" for model in DynamicPPL.TestUtils.DEMO_MODELS
adtype = Turing.DEFAULT_ADTYPE