I'm trying to use a deterministic distribution in my model, as outlined here: https://discourse.julialang.org/t/is-there-a-turing-alternative-to-pm-deterministic-from-pymc3/38667/2, but so far my code fails:
using Turing, Distributions, Bijectors, Random

# Custom "deterministic" distribution, as in the linked Discourse post
struct Determin{T<:Real} <: ContinuousUnivariateDistribution
    val::T
end
Distributions.rand(rng::AbstractRNG, d::Determin) = d.val
Distributions.logpdf(d::Determin, x::T) where T<:Real = zero(x)
Bijectors.bijector(d::Determin) = Identity{0}() # <= `0` indicates 0-dimensional, i.e. univariate
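
For context, this is what those methods give the custom distribution (a quick sketch of my own, value chosen arbitrarily):

d = Determin(-exp(-2 * 0.1))
rand(d)          # always returns d.val, there is no randomness
logpdf(d, 1.0)   # 0.0 for any x: the log-density is flat everywhere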
@model model1(a, ::Type{T} = Float64) where {T} = begin
    ma ~ Normal(0.0, 0.5)
    for i in 1:length(a)
        a[i] ~ Normal(ma, 0.05)
    end
    b ~ Determin(-exp(-2 * ma))
    return ma, b
end
a1 = [0.05, 0.03, 0.02, 0.03, 0.04, 0.09, 0.13, 0.16]  # same data as in the PyMC3 model below

function test_model1()
    # sampler = HMC(0.001, 10)
    # sampler = HMCDA(1000, 0.75, 0.05)
    sampler = NUTS(0.65)
    c1 = sample(model1(a1), sampler, 2000)
    println(c1)
end

Typical results look like this:
julia> test_model1()
┌ Info: Found initial step size
└ ϵ = 0.025
Object of type Chains, with data of type 1000×14×1 Array{Float64,3}
Iterations = 1:1000
Thinning interval = 1
Chains = 1
Samples per chain = 1000
internals = acceptance_rate, hamiltonian_energy, hamiltonian_energy_error, is_accept, log_density, lp, max_hamiltonian_energy_error, n_steps, nom_step_size, numerical_error, step_size, tree_depth
parameters = b, ma
2-element Array{ChainDataFrame,1}
Summary Statistics
parameters mean std naive_se mcse ess r_hat
────────── ───────────────────────── ──────────────────────── ─────────────────────── ──────────────────────── ──────── ──────
b 13816403112283807744.0000 7133491167269512192.0000 225580797572648256.0000 2172001283591027712.0000 4.8969 1.3982
ma 0.0690 0.0179 0.0006 0.0007 830.2122 0.9998
Quantiles
parameters 2.5% 25.0% 50.0% 75.0% 97.5%
────────── ───────────────────────── ───────────────────────── ───────────────────────── ───────────────────────── ─────────────────────────
b -5026632495806943232.0000 12090906044107700224.0000 15653120955249981440.0000 18317459011682740224.0000 22252367610115112960.0000
ma 0.0349 0.0565 0.0694 0.0811 0.1047
The HMC and HMCDA samplers also give similarly bad results, but SMC actually works. Did I do something stupid here, or is this a bug?
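
For reference, the SMC run that does behave is invoked like this (a sketch; I'm assuming Turing's default SMC() constructor here):

sampler = SMC()
c2 = sample(model1(a1), sampler, 2000)  # this one gives sensible estimates for b

A similar model in PyMC3 works with the NUTS sampler: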
import numpy as np
import pymc3 as pm

a1 = [0.05, 0.03, 0.02, 0.03, 0.04, 0.09, 0.13, 0.16]

with pm.Model() as model3:
    mu_a = pm.Normal('mu_a', 0.0, 0.5)
    aim = pm.Normal('aim', mu=mu_a, sigma=0.05, observed=a1)
    b = pm.Deterministic('b', -np.exp(-2 * mu_a))
    tr = pm.sample(2000, tune=700)

pm.traceplot(tr)
display(pm.summary(tr))

|      | mean | sd | hdi_3% | hdi_97% | mcse_mean | mcse_sd | ess_mean | ess_sd | ess_bulk | ess_tail | r_hat |
| ---- | ---- | -- | ------ | ------- | --------- | ------- | -------- | ------ | -------- | -------- | ----- |
| mu_a | 0.069 | 0.017 | 0.037 | 0.101 | 0.000 | 0.0 | 1860.0 | 1860.0 | 1851.0 | 2777.0 | 1.0 |
| b | -0.872 | 0.030 | -0.923 | -0.811 | 0.001 | 0.0 | 1860.0 | 1860.0 | 1851.0 | 2777.0 | 1.0 |