
Conversation

@sdaulton
Contributor

@sdaulton sdaulton commented Mar 5, 2025

Summary:
This diff adds an incremental qLogNEI that addresses many cases where the first candidate in the batch has positive EI (and satisfies the constraints) but subsequent arms violate the constraints (often severely).

The issue appears to stem from optimizing the joint EI of the new candidate and the pending points w.r.t. the current incumbent(s). My hypothesis is that this makes the initialization strategy perform worse and choose bad starting points. Using sequential batch optimization and optimizing the incremental EI of the new arm relative to the pending points (and the current incumbent) avoids the issue by only quantifying the improvement of the arm currently being optimized.

TODO: add this for qNEI in a later diff, but that seems low priority since qLogNEI is widely used.

Reviewed By: esantorella

Differential Revision: D70288526
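
For readers following along in BoTorch, here is a minimal sketch of the setup the summary describes: sequential greedy batch optimization (`optimize_acqf(..., sequential=True)`) with qLogNEI, where each inner optimization treats the previously selected arms as pending points and therefore only scores the improvement of the arm currently being optimized. The `incremental` keyword in the comment below is an assumption about the flag this diff introduces and is left commented out; everything else is existing BoTorch API.

```python
# Sketch only; the `incremental` flag is an assumption based on this diff's
# summary, not a confirmed signature.
import torch
from botorch.acquisition.logei import qLogNoisyExpectedImprovement
from botorch.fit import fit_gpytorch_mll
from botorch.models import SingleTaskGP
from botorch.models.transforms import Standardize
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy data: maximize a negative quadratic on [0, 1]^2.
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = -(train_X - 0.5).pow(2).sum(dim=-1, keepdim=True)

model = SingleTaskGP(train_X, train_Y, outcome_transform=Standardize(m=1))
fit_gpytorch_mll(ExactMarginalLogLikelihood(model.likelihood, model))

acqf = qLogNoisyExpectedImprovement(
    model=model,
    X_baseline=train_X,
    prune_baseline=True,
    # Hypothetical flag from this diff: score each new arm by its incremental
    # improvement over the pending points and the incumbent, rather than the
    # joint improvement of the whole q-batch over the incumbent alone.
    # incremental=True,
)

# Sequential greedy optimization: arms are chosen one at a time, and arms
# selected so far are set as pending points for the next inner optimization.
candidates, _ = optimize_acqf(
    acq_function=acqf,
    bounds=torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double),
    q=4,
    num_restarts=10,
    raw_samples=128,
    sequential=True,
)
```

With joint (non-incremental) qLogNEI, the whole q-batch is instead scored against the current incumbent at once, which is the failure mode described above.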

@facebook-github-bot facebook-github-bot added the CLA Signed label Mar 5, 2025
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D70288526

@codecov

codecov bot commented Mar 5, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.99%. Comparing base (78c04e2) to head (e8c88cb).
Report is 1 commit behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2760   +/-   ##
=======================================
  Coverage   99.99%   99.99%           
=======================================
  Files         203      203           
  Lines       18685    18705   +20     
=======================================
+ Hits        18684    18704   +20     
  Misses          1        1           


sdaulton added a commit to sdaulton/botorch that referenced this pull request Mar 5, 2025
sdaulton added a commit to sdaulton/botorch that referenced this pull request Mar 5, 2025
sdaulton added a commit to sdaulton/botorch that referenced this pull request Mar 5, 2025
sdaulton added a commit to sdaulton/botorch that referenced this pull request Mar 5, 2025
sdaulton added a commit to sdaulton/botorch that referenced this pull request Mar 5, 2025
sdaulton added a commit to sdaulton/botorch that referenced this pull request Mar 5, 2025
@sdaulton sdaulton force-pushed the export-D70288526 branch 2 times, most recently from 0500e05 to 7c5117a on March 5, 2025 at 20:15
sdaulton added a commit to sdaulton/botorch that referenced this pull request Mar 5, 2025
@facebook-github-bot
Contributor

This pull request has been merged in 290c0ba.


Labels

CLA Signed · fb-exported · Merged
