Slice sampling as a Gibbs sampler #2300
Comments
I think I will release the changes to
In the long run, we should just move some of these interface functions to AbstractMCMC. For now, I think a package extension should work. @torfjelde, @yebai, @mhauru
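For concreteness, a package extension could be declared along these lines in `SliceSampling`'s `Project.toml`. This is only an illustrative sketch: the extension module name is an assumption, and the UUID should be checked against the registry.

```toml
# Hypothetical excerpt from SliceSampling.jl's Project.toml.
# A weak dependency is loaded only if the user also loads it;
# the matching extension module then activates automatically.
[weakdeps]
Turing = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"

[extensions]
# Module living in ext/SliceSamplingTuringExt.jl (name is an assumption)
SliceSamplingTuringExt = "Turing"
```

The extension module would then implement the glue methods (e.g. the `externalsampler` interface) without making `Turing` a hard dependency of `SliceSampling`.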
Awesome stuff @Red-Portal :) And yeah, extension for now, but we'll hopefully have this be part of the AbstractMCMC.jl interface soon-ish 👍
Sounds great. I'll keep this issue open until then to keep track!
The results are very promising. Consider the following example from Turing's website:

```julia
using Distributions
using Turing
using SliceSampling

@model function simple_choice(xs)
    p ~ Beta(2, 2)
    z ~ Bernoulli(p)
    for i in 1:length(xs)
        if z == 1
            xs[i] ~ Normal(0, 1)
        else
            xs[i] ~ Normal(2, 1)
        end
    end
end

model = simple_choice([1.5, 2.0, 0.3])
n_samples = 1000

sample(model, Gibbs(HMC(0.2, 3, :p), PG(20, :z)), 1000)
sample(model, Turing.Experimental.Gibbs((p = externalsampler(SliceSteppingOut(2.0)), z = PG(20, :z))), n_samples)
sample(model, Turing.Experimental.Gibbs((p = externalsampler(SliceDoublingOut(2.0)), z = PG(20, :z))), n_samples)
```

HMC:

Slice sampling with stepping-out:

Slice sampling with doubling-out:

Of course, the performance of HMC could be improved by tuning the step size, but the point is that slice sampling achieves very good performance with zero tuning.
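For reference, the stepping-out procedure behind `SliceSteppingOut` follows Neal (2003). Below is a minimal univariate sketch; the function names and the chain driver are my own illustration, not `SliceSampling.jl`'s actual API:

```julia
# Sketch of Neal's (2003) slice sampler with stepping-out for a
# univariate log-density `logp`. Names are illustrative only.
function slice_step_out(logp, x0, w; max_steps=100)
    # Draw the slice height: log(u * p(x0)) with u ~ Uniform(0, 1)
    logy = logp(x0) + log(rand())
    # Randomly position an interval of width w around x0
    l = x0 - w * rand()
    r = l + w
    # Split the step-out budget between the two directions
    j = floor(Int, max_steps * rand())
    k = max_steps - 1 - j
    # Step out until both ends fall outside the slice
    while j > 0 && logp(l) > logy
        l -= w; j -= 1
    end
    while k > 0 && logp(r) > logy
        r += w; k -= 1
    end
    # Shrinkage: sample uniformly, shrinking the interval on rejection
    while true
        x1 = l + (r - l) * rand()
        logp(x1) > logy && return x1
        x1 < x0 ? (l = x1) : (r = x1)
    end
end

# Run a short chain; each step depends only on logp and a width w,
# which is the "zero tuning" property noted above.
function run_chain(logp, x0, w, n)
    xs = Vector{Float64}(undef, n)
    x = x0
    for i in 1:n
        x = slice_step_out(logp, x, w)
        xs[i] = x
    end
    return xs
end

# Example: sample a standard normal via its log-density
xs = run_chain(x -> -0.5 * x^2, 0.0, 2.0, 10_000)
```

The initial width `w` plays the same role as the `2.0` passed to `SliceSteppingOut(2.0)` in the snippet above: a rough guess that the stepping-out loop corrects for automatically.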
Hi all,

I looked into using the samplers in `SliceSampling` as a component to `Experimental.Gibbs`. After a few patches to `SliceSampling`, it seems to be feasible. Here is a snippet that works with `SliceSampling#main`: