Suppress _utils warnings during tuning #198
base: main
Conversation
Force-pushed from 8d18511 to 04e894b.
Summary:
Pull Request resolved: facebookincubator#198
Suppress _utils.logger alongside methods.logger during tuning trials to prevent "Unshrink is not close to 1" warnings from appearing. These warnings are expected during hyperparameter exploration but can be noisy for users running tuning jobs.
Differential Revision: D91884585
Force-pushed from 04e894b to 672aa08.
Force-pushed from 672aa08 to 901d231.
Codecov Report
✅ All modified and coverable lines are covered by tests.

Additional details and impacted files:

@@ Coverage Diff @@
##           main     #198   +/-   ##
=======================================
  Coverage      ?   94.92%
=======================================
  Files         ?        9
  Lines         ?     1753
  Branches      ?        0
=======================================
  Hits          ?     1664
  Misses        ?       89
  Partials      ?        0
=======================================

Flags with carried forward coverage won't be shown.
☔ View full report in Codecov by Sentry.
Summary:
Add a `use_model_predictions` parameter to `tune_mcgrad_params` with `False` as the default. When False, returns the actual observed best trial parameters. When True, uses the Bayesian surrogate model's predicted best parameters. The default is False because with few tuning trials, the surrogate model may not be well-calibrated and could return suboptimal parameters that don't match the actual best observed trial. Users who want the model-predicted best can set this to True.
Differential Revision: D91889101
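For illustration, a minimal sketch of the selection behaviour this parameter controls, assuming a generic tuning backend; `study` and its attributes are hypothetical stand-ins, not the actual internals of `tune_mcgrad_params`:

```python
def select_best_params(study, use_model_predictions: bool = False):
    """Choose which parameters to return at the end of tuning (illustrative only)."""
    if use_model_predictions:
        # Opt-in: trust the Bayesian surrogate model's predicted optimum. With
        # only a handful of trials the surrogate may be poorly calibrated.
        return study.model_predicted_best_params
    # Default: return the parameters of the best trial actually observed.
    return study.observed_best_params
```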
Summary:
Use log_loss from sklearn instead of normalized_entropy as the tuning objective. This matches MCGrad's early stopping metric, providing directly comparable outputs between tuning and model training. The log_loss metric is a standard choice for probabilistic predictions and is widely understood.
Reviewed By: Lorenzo-Perini
Differential Revision: D91884587
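As a rough sketch of the objective swap (the function and argument names here are illustrative, not the actual tuning code):

```python
from sklearn.metrics import log_loss

def trial_objective(y_val, predicted_proba):
    # log_loss is the same metric MCGrad uses for early stopping, so a trial's
    # score is directly comparable to the loss reported during model training.
    # Lower is better; most tuning backends minimize by default.
    return log_loss(y_val, predicted_proba)
```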
Summary:
Suppress _utils.logger alongside methods.logger during tuning trials to prevent "Unshrink is not close to 1" warnings from appearing. These warnings are expected during hyperparameter exploration but can be noisy for users running tuning jobs.
Reviewed By: Lorenzo-Perini
Differential Revision: D91884585
Force-pushed from 901d231 to 94bc41c.
Summary:
Suppress _utils.logger alongside methods.logger during tuning trials to prevent
"Unshrink is not close to 1" warnings from appearing. These warnings are expected
during hyperparameter exploration but can be noisy for users running tuning jobs.
Differential Revision: D91884585
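For context, a minimal sketch of what suppressing these loggers around a tuning trial could look like; the logger names (`mcgrad.methods`, `mcgrad._utils`) are assumptions and may not match the actual module paths:

```python
import logging
from contextlib import contextmanager

# Assumed logger names for the two modules mentioned in the summary; the real
# module paths in the MCGrad codebase may differ.
_NOISY_LOGGERS = ("mcgrad.methods", "mcgrad._utils")

@contextmanager
def suppress_tuning_warnings():
    """Temporarily raise noisy loggers to ERROR so per-trial warnings such as
    "Unshrink is not close to 1" are hidden, then restore the original levels."""
    loggers = [logging.getLogger(name) for name in _NOISY_LOGGERS]
    previous_levels = [lg.level for lg in loggers]
    try:
        for lg in loggers:
            lg.setLevel(logging.ERROR)
        yield
    finally:
        for lg, level in zip(loggers, previous_levels):
            lg.setLevel(level)
```

A tuning loop would then wrap each trial, e.g. `with suppress_tuning_warnings(): fit_and_score(params)`, so the warnings remain visible during normal training runs.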