This FAQ is curated by Luigi Acerbi and is constantly expanding.
For an IBS tutorial and example, see ibs_example.m (in MATLAB); other languages to be added.
If you have questions not covered here, please feel free to ask me at luigi.acerbi@helsinki.fi (putting 'IBS' in the subject of the email).
Acknowledgments: Most of the questions currently answered here originated in a live Q&A session with the Ma lab; thanks to Hsin-Hung Li for taking notes.
---
No. Doing so would essentially revert IBS to a fixed-sampling method, with all the associated problems discussed in the paper. A more principled approach is to put an early-stopping threshold on the log-likelihood, as described in the paper.
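As a rough illustration of the early-stopping idea, here is a minimal Python sketch of a single IBS "repeat" with a lower bound on the log-likelihood. The function and variable names are hypothetical, and this is not the toolbox implementation (see ibs_example.m for that); it only shows the mechanics: each trial is simulated until the model output matches the observed response, and if the running estimate falls below the threshold the evaluation stops early and returns the threshold.

```python
import math
import random

def ibs_loglik(simulate, responses, lower_bound=-math.inf):
    """One IBS repeat with an early-stopping threshold (illustrative sketch).

    simulate() draws one synthetic response from the model; responses is
    the observed data. Each trial contributes -sum_{j=1}^{K-1} 1/j, where
    K is the number of draws until the first match. If the running total
    drops below lower_bound, stop early and return lower_bound (a bounded,
    slightly biased estimate for very bad parameter values).
    """
    total = 0.0
    for obs in responses:
        k = 1
        while simulate() != obs:   # sample until we match the datum
            total -= 1.0 / k       # running sum of -1/k terms
            k += 1
            if total < lower_bound:
                return lower_bound # early stop: parameters are clearly bad
    return total
```

For example, for a toy Bernoulli model with success probability 0.8, `ibs_loglik(lambda: random.random() < 0.8, data, lower_bound=-50.0)` returns an unbiased estimate of the log-likelihood unless the estimate crosses the threshold, in which case it returns the threshold itself.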
---
Is it important to provide the standard deviation of the IBS estimator to the optimization/inference algorithm for every parameter combination evaluated?
It depends:
- If you are optimizing the target log-likelihood (e.g., for maximum-likelihood or maximum-a-posteriori estimation), it might help but it is not necessary, because the IBS estimator variance, somewhat surprisingly, is nearly constant across the parameter space. Moreover, the BADS optimizer (which we recommend using in combination with IBS; see also below) does not currently support user-provided, input-dependent noise, so in that case it is not even an option.
- If you are performing Bayesian inference, for example using the VBMC toolbox, then it is necessary to provide the standard deviation of the IBS estimate to the algorithm. Bayesian inference is very sensitive to noisy estimates of the log-likelihood (or log-posterior), so it is crucial to provide the inference algorithm with all available information about the magnitude of observation noise.
---
For computational reasons, we often cannot afford to evaluate the log-likelihood of every parameter combination with high precision while optimizing the parameters. However, once we have found the (supposedly) best parameter combination, we could increase the precision (e.g., the number of IBS repeats). Is this advisable?
Yes, absolutely. It should be considered standard practice, regardless of IBS. Whenever optimizing a noisy target function, after obtaining a candidate solution from an optimization method, one should evaluate the target function at the solution with higher precision.
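The two-phase workflow can be sketched as follows. This is a toy stand-in in Python, not BADS or the IBS toolbox: `noisy_loglik` is a hypothetical noisy objective whose noise shrinks as 1/sqrt(n_reps), the search over `candidates` stands in for an optimizer run, and the final call re-evaluates only the returned solution with many more repeats.

```python
import random
import statistics

def noisy_loglik(theta, n_reps):
    """Toy stand-in for an IBS-based log-likelihood estimate: unbiased,
    with noise shrinking as 1/sqrt(n_reps). Purely illustrative."""
    true_ll = -50.0 * (theta - 0.3) ** 2 - 10.0  # hypothetical smooth target
    noise = statistics.fmean(random.gauss(0.0, 1.0) for _ in range(n_reps))
    return true_ll + noise

random.seed(2)

# Phase 1: cheap, noisy evaluations while searching (few repeats per call).
candidates = [i / 20 for i in range(21)]
theta_hat = max(candidates, key=lambda t: noisy_loglik(t, 2))

# Phase 2: re-evaluate only the candidate solution with high precision.
final_ll = noisy_loglik(theta_hat, 200)
```

The expensive high-precision evaluation happens only once, at the candidate solution, so its cost is negligible compared to the optimization itself.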
---
In an ideal world, would you let the number of IBS repeats depend on how close the optimization algorithm thinks it is to the maximum — i.e. some form of adaptive precision?
Yes, this is a good idea and topic of ongoing research.
BADS is a robust optimizer that works well with stochastic target functions, and in particular with the noisy estimates produced by IBS. A general FAQ for BADS can be found at this link. The following questions tackle issues that are common when combining IBS and BADS.
---
Is there a way to tune BADS to optimize towards a higher precision result or to have it optimize for longer?
There are some input arguments in BADS that you can modify to make it search for longer. In particular, to make BADS search for longer at each iteration, you can modify two key options in the OPTIONS struct that you pass to the algorithm:
- Set `OPTIONS.CompletePoll = true` (default is `false`). This will force BADS to finish the "poll" step (more info in the paper) instead of skipping it when it thinks that it is not worth continuing. This is a public option of BADS.
- Change `OPTIONS.SearchNtry`. Be careful: this is a "secret" option of BADS, and I do not recommend changing it unless you have to. The default value is `max(D,floor(3+D/2))`, where `D` is the dimensionality of the target function. This quantity represents the minimum number of attempted searches (via local Bayesian optimization) per iteration. You can try increasing it to force BADS to search for longer in each iteration.
- Set