
Conversation

@flinder (Contributor) commented Jan 30, 2026

Summary:
Use log_loss from sklearn instead of normalized_entropy as the tuning objective.
This matches MCGrad's early stopping metric, providing directly comparable outputs
between tuning and model training. The log_loss metric is a standard choice for
probabilistic predictions and is widely understood.

Differential Revision: D91884587
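
For illustration, a minimal sketch of the new objective; the wrapper function and its name are hypothetical, only `sklearn.metrics.log_loss` itself comes from the summary above:

```python
# Sketch: score a tuning trial with sklearn's log_loss, the same metric
# MCGrad uses for early stopping. The wrapper function is hypothetical.
from sklearn.metrics import log_loss

def tuning_objective(y_true, y_pred_proba):
    # Mean negative log-likelihood of the labels under the predicted
    # probabilities; lower is better. Unlike normalized_entropy, this
    # number is directly comparable to MCGrad's early stopping curve.
    return log_loss(y_true, y_pred_proba)
```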

@meta-cla bot added the CLA Signed label (managed by the Meta Open Source bot) Jan 30, 2026
meta-codesync bot commented Jan 30, 2026

@flinder has exported this pull request. If you are a Meta employee, you can view the originating Diff in D91884587.

flinder added a commit to flinder/MCGrad-1 that referenced this pull request Jan 30, 2026 (Pull Request resolved: facebookincubator#197; Differential Revision: D91884587).
@flinder force-pushed the export-D91884587 branch 2 times, most recently from 0c9ea67 to 80f26fd, on January 30, 2026 at 12:29
flinder added a commit to flinder/MCGrad-1 that referenced this pull request Jan 30, 2026
Reviewed By: Lorenzo-Perini

Differential Revision: D91884587
Summary:
Add a `use_model_predictions` parameter to `tune_mcgrad_params`, defaulting to
`False`. When `False`, the function returns the parameters of the best observed
trial. When `True`, it returns the Bayesian surrogate model's predicted best
parameters.

The default is `False` because, with few tuning trials, the surrogate model may
not be well calibrated and could return suboptimal parameters that don't match
the best observed trial. Users who want the model-predicted best can set it to
`True`.

Differential Revision: D91889101
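
A hypothetical sketch of the branching this parameter controls; the `Trial` type, the `surrogate` object, and the function body are illustrative, only the parameter name and its default come from the summary above:

```python
# Hypothetical sketch of the decision behind `use_model_predictions`.
# Trial and surrogate are stand-ins, not MCGrad's actual API.
from collections import namedtuple

Trial = namedtuple("Trial", ["parameters", "objective_value"])

def tune_mcgrad_params(trials, surrogate, use_model_predictions=False):
    if use_model_predictions:
        # Trust the Bayesian surrogate's predicted optimum. It may
        # generalize beyond the observed trials, but with few trials it
        # can be poorly calibrated and return suboptimal parameters.
        return surrogate.predict_best_parameters()
    # Default: return the parameters of the best trial actually
    # observed (lowest log_loss, since lower is better).
    return min(trials, key=lambda t: t.objective_value).parameters
```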
@codecov-commenter commented:

⚠️ Please install the Codecov GitHub app to ensure uploads and comments are reliably processed by Codecov.

Codecov Report

✅ All modified and coverable lines are covered by tests.
⚠️ Please upload report for BASE (main@f22ddc8).

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #197   +/-   ##
=======================================
  Coverage        ?   94.92%           
=======================================
  Files           ?        9           
  Lines           ?     1753           
  Branches        ?        0           
=======================================
  Hits            ?     1664           
  Misses          ?       89           
  Partials        ?        0           
Flag        Coverage Δ
unittests   94.92% <100.00%> (?)

Flags with carried forward coverage won't be shown.

