
Once default optimizer fails, try all optimizers but with increased number of iterations #385

Open
4 tasks
danielinteractive opened this issue Dec 13, 2023 · 2 comments
Labels
enhancement New feature or request SP3

Comments

@danielinteractive (Collaborator)

This is a first step towards improving convergence behavior on difficult data sets.
See #380 for a motivating example.

To do:

  • Define optimizer_control lists to use for the second optimization attempt
    • e.g. eval.max = 1000, iter.max = 1000 for nlminb
    • e.g. maxit = 50000 for optim
  • For refit_multiple_optimizers, default to the standard optimizer_control unless the user provides one
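The second-try control lists above could be sketched like this (a minimal sketch; the list structure and the name second_try_control are illustrative, only the control values come from the proposal above, and the arguments assume stats::nlminb() and stats::optim()):

```r
# Sketch of increased-limit control lists for the retry step.
# Values match those proposed above.
second_try_control <- list(
  nlminb = list(eval.max = 1000, iter.max = 1000),
  optim  = list(maxit = 50000)
)

# Example: run nlminb with the increased limits on a simple quadratic,
# which should converge close to c(2, 3).
fit <- stats::nlminb(
  start     = c(1, 1),
  objective = function(x) sum((x - c(2, 3))^2),
  control   = second_try_control$nlminb
)
fit$par
```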
@danielinteractive danielinteractive added enhancement New feature or request SP3 labels Dec 13, 2023
@clarkliming (Collaborator) commented Jan 15, 2024

Idea: provide additional default optimizers with a larger maxit or eval.max; if needed, they would be executed after the first failure. Alternatively, provide batches of optimizers.

This would be a bit cleaner, and we would not need to mix up the defaults with the user's control settings (e.g. what if a user already provides an optimizer with maxeval set to a larger number? Should we retry with an even larger value?)
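The batched-fallback idea could look roughly like this (a sketch with hypothetical names, not the package's API: fit_with_batches and optimizer_batches are illustrative, and a toy objective stands in for the real model fit):

```r
# Sketch: try optimizers in order, later entries with larger limits,
# stopping at the first one that converges.
optimizer_batches <- list(
  list(optimizer = "nlminb", control = list()),  # defaults first
  list(optimizer = "nlminb", control = list(eval.max = 1000, iter.max = 1000)),
  list(optimizer = "optim",  control = list(maxit = 50000))
)

fit_with_batches <- function(objective, start, batches) {
  for (b in batches) {
    fit <- switch(
      b$optimizer,
      nlminb = stats::nlminb(start, objective, control = b$control),
      optim  = stats::optim(start, objective, control = b$control)
    )
    # Both nlminb() and optim() report convergence == 0 on success.
    if (fit$convergence == 0) return(fit)
  }
  fit  # return the last attempt if none converged
}
```

Keeping the increased-limit attempts as separate batch entries means the user-supplied control list is never modified, which sidesteps the question of what to do when the user already set a large maxeval.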

@danielinteractive (Collaborator, Author)

Thanks @clarkliming, yes, additional optimizers with higher thresholds would also be fine.

Projects
Status: Issue To start