incremental qLogNEI #2760
Conversation
This pull request was exported from Phabricator. Differential Revision: D70288526
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@           Coverage Diff           @@
##             main    #2760   +/-   ##
=======================================
  Coverage   99.99%   99.99%
=======================================
  Files         203      203
  Lines       18685    18705   +20
=======================================
+ Hits        18684    18704   +20
  Misses          1        1

☔ View full report in Codecov by Sentry.
This pull request has been merged in 290c0ba.
Summary:
This diff adds an incremental qLogNEI that addresses many cases where the first candidate in the batch has positive EI (and satisfies the constraints) while subsequent arms violate the constraints (often severely).
The issue appears to stem from optimizing the joint EI of the new candidate and the pending points w.r.t. the current incumbent(s). My hypothesis is that this makes the initialization strategy perform worse and choose bad starting points. Using sequential batch optimization and optimizing the incremental EI of the new arm relative to the pending points (and the current incumbent) avoids the issue by quantifying only the improvement of the arm currently being optimized.
TODO: add this for qNEI in a later diff, but that seems low priority since qLogNEI is widely used.
Reviewed By: esantorella
Differential Revision: D70288526
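
For context, here is a minimal sketch (not the code from this diff) of the sequential, incremental-EI batch construction described above, using standard BoTorch APIs: each arm in the batch is optimized one at a time, with previously selected arms treated as pending points, so the acquisition value reflects only the improvement contributed by the arm currently being optimized. The `incremental` flag name added by this diff is an assumption and is left commented out; the model, data, and optimizer settings are placeholders.

```python
# Minimal sketch, assuming standard BoTorch APIs (SingleTaskGP,
# qLogNoisyExpectedImprovement, optimize_acqf). The `incremental` argument
# introduced by this diff is shown commented out, since its exact name and
# default are assumptions here.
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition.logei import qLogNoisyExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

torch.manual_seed(0)
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = (train_X.sum(dim=-1, keepdim=True) - 1.0).sin()

model = SingleTaskGP(train_X, train_Y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(model.likelihood, model))

acqf = qLogNoisyExpectedImprovement(
    model=model,
    X_baseline=train_X,
    prune_baseline=True,
    # incremental=True,  # flag added by this diff (name assumed, not verified)
)

bounds = torch.stack([torch.zeros(2), torch.ones(2)]).to(torch.double)

# sequential=True optimizes the q=3 arms one at a time; arms selected earlier
# in the batch are set as pending points on the acquisition function, so each
# inner optimization scores only the incremental improvement of the arm
# currently being optimized rather than the joint EI of the whole batch.
candidates, acq_values = optimize_acqf(
    acq_function=acqf,
    bounds=bounds,
    q=3,
    num_restarts=10,
    raw_samples=128,
    sequential=True,
)
print(candidates)
```

This mirrors the reasoning in the summary: joint optimization of the batch w.r.t. the incumbent can hand the initializer poor starting points for later arms, whereas the sequential, incremental formulation only has to account for the marginal value of one new arm at a time.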