bump compat of AdvancedHMC #2050
Conversation
Codecov Report: Patch and project coverage have no change.

Additional details and impacted files:

@@           Coverage Diff            @@
##           master    #2050    +/-   ##
========================================
  Coverage     0.00%    0.00%
========================================
  Files           22       22
  Lines         1458     1458
========================================
  Misses        1458     1458

☔ View full report in Codecov by Sentry.
Failing because of Optimizers.
Co-authored-by: David Widmann <devmotion@users.noreply.github.com>
Tests use AdvancedHMC 0.4 it seems.
Pull Request Test Coverage Report for Build 5700989848
💛 - Coveralls
TuringLang/AdvancedHMC.jl#342
Co-authored-by: David Widmann <devmotion@users.noreply.github.com>
src/inference/Inference.jl (Outdated)
@@ -189,10 +189,10 @@ function AbstractMCMC.sample(
         kwargs...
     )
     if resume_from === nothing
-        return AbstractMCMC.mcmcsample(rng, model, sampler, N;
-                                       chain_type=chain_type, progress=progress, kwargs...)
+        return AbstractMCMC.mcmcsample(rng, model, sampler, N + min(div(N, 10), 1_000);
This means users will get more than the requested `N` samples - I think this should be reverted?
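For reference, a quick check of the extra draws implied by the formula in the diff above (illustrative only):

```julia
# Extra draws the modified call performs on top of the requested N.
extra_draws(N) = min(div(N, 10), 1_000)

extra_draws(10)      # 1     -> 11 draws instead of 10
extra_draws(10_000)  # 1_000 -> 11_000 draws instead of 10_000
```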
There is a change in how the number of adaptation steps is handled in the external sampler interface. Instead of storing `n_adapts` in the sampling algorithm (see here), we now only pass it in the `AbstractMCMC.sample` call. This means there is no adaptation by default if the change is reverted.
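Roughly, the intended usage looks like the following sketch; the toy model, the `externalsampler` wrapper, and the concrete values are assumptions for illustration, and the exact keyword handling is what this thread is discussing:

```julia
using Turing, AdvancedHMC

# Toy model, for illustration only.
@model function demo()
    x ~ Normal(0, 1)
end

# The number of adaptation steps is no longer stored in the sampler object;
# it is passed as a keyword to `sample` instead (keyword name as used in
# this thread).
chain = sample(demo(), externalsampler(AdvancedHMC.NUTS(0.8)), 1_000; n_adapts=500)
```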
Maybe we can set `discard_initial = n_adapts` here so that we get adaptation by default but still return the requested number of MCMC samples. In addition, we can allow the user to pass an `n_adapts` argument to override the default adaptation settings.
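A minimal sketch of that suggestion, assuming the wrapper calls `AbstractMCMC.mcmcsample` as in the diff above (the function name and default value here are placeholders, not the actual implementation):

```julia
using AbstractMCMC
using Random

# Adapt by default, but return exactly N samples: draw the warm-up steps and
# drop them via `discard_initial` instead of inflating N.
function sample_with_adaptation(rng::Random.AbstractRNG, model, sampler, N;
                                n_adapts=min(div(N, 10), 1_000),  # hypothetical default
                                kwargs...)
    return AbstractMCMC.mcmcsample(rng, model, sampler, N;
                                   discard_initial=n_adapts, kwargs...)
end
```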
Sure, we have to take the adaptation steps into account - but changing this line means that a call such as `sample(model, NUTS(), 10)` will return 11 rather than the requested 10 samples? The discarded steps only have to be specified as a keyword argument, not added to the positional argument (the number of samples). AbstractMCMC adds `discard_initial` to the requested number of samples internally and automatically.
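In other words (illustrative call, assuming `model` and `sampler` are already defined):

```julia
# Request 10 samples and 1_000 warm-up draws: AbstractMCMC performs 1_010
# transitions internally but returns only the 10 post-warm-up samples, so the
# positional argument does not need to be inflated.
chain = sample(model, sampler, 10; discard_initial=1_000)
```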
It turns out we have a typo in the tests: we already explicitly pass the `n_adapts` argument to the `sample` function, but due to the typo AHMC complains about a missing `n_adapts`. After correcting this typo, these default values are no longer necessary. More generally, we should introduce default options for `n_adapts` in the AbstractMCMC package; see this PR.
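So the tests end up making an explicit call along these lines (the values are hypothetical; the point is only that `n_adapts` is spelled out rather than relying on an inflated default):

```julia
# Explicitly request the adaptation phase; no default inflation of N is needed.
chain = sample(model, externalsampler(AdvancedHMC.NUTS(0.8)), 1_000; n_adapts=1_000)
```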
Bump the AdvancedHMC compatibility bound to allow the latest version, which includes the interface for external samplers.
Closes #2054