Use Standardize by default for SingleTaskGP #2458
Conversation
This pull request was exported from Phabricator. Differential Revision: D60492937
Summary:
X-link: facebook/Ax#2630
Pull Request resolved: meta-pytorch#2458

D60080819 recently updated the default `SingleTaskGP` BoTorch priors. One significant change was to remove the use of an outputscale, which may not work well if the outputs aren't standardized. This diff changes the `SingleTaskGP` to use `Standardize` and `Normalize` by default if no input/outcome transforms are specified (this allows users to explicitly pass in `None` if they don't want to use any transforms).

Differential Revision: D60492937
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2458   +/-   ##
=======================================
  Coverage   99.98%   99.98%
=======================================
  Files         191      191
  Lines       16814    16823    +9
=======================================
+ Hits        16812    16821    +9
  Misses          2        2

☔ View full report in Codecov by Sentry.
Summary:
Pull Request resolved: #2630
X-link: meta-pytorch/botorch#2458

D60080819 recently updated the default `SingleTaskGP` BoTorch priors. One significant change was to remove the use of an outputscale, which may not work well if the outputs aren't standardized. This diff changes the `SingleTaskGP` to use `Standardize` by default if no outcome transforms are specified (this allows users to explicitly pass in `None` if they don't want to use any transforms).

Reviewed By: esantorella
Differential Revision: D60492937
fbshipit-source-id: 833ff6a2e617e93f1495d978552e29a7ee943e74
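For illustration, here is a minimal, hypothetical sketch of the resulting behavior (the data and variable names below are made up and are not part of the PR): omitting `outcome_transform` now standardizes the outcomes internally, so the updated priors see roughly zero-mean, unit-variance targets; passing `None` explicitly opts out; and passing a transform explicitly still works as before.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.models.transforms.outcome import Standardize

# Hypothetical toy data; any 2-d inputs / 1-d outputs would do.
train_X = torch.rand(20, 2, dtype=torch.double)
train_Y = 10.0 + 5.0 * torch.sin(train_X).sum(dim=-1, keepdim=True)

# New default: no outcome_transform argument -> Standardize is applied.
model_default = SingleTaskGP(train_X, train_Y)

# Explicitly passing None opts out of any outcome transform (previous behavior).
model_untransformed = SingleTaskGP(train_X, train_Y, outcome_transform=None)

# Passing a transform explicitly works exactly as before.
model_explicit = SingleTaskGP(
    train_X, train_Y, outcome_transform=Standardize(m=train_Y.shape[-1])
)
```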
This pull request has been merged in bcdea09.
…omparing against other methods (#2462)

Summary:
Pull Request resolved: #2462

Context: This tutorial has been taking too long to run. Also, a tutorial doesn't need to serve both as a demonstration that the method works better than other methods (in a statistically significant way) and as a demonstration of how to use it.

This PR:
* Runs only one replication, rather than 3. (Putting a CI on 3 data points is a little silly anyway.)
* Removes the comparison methods, Sobol and qNEI with a non-warped GP.
* Uses qLogNEI instead of qNEI.
* Uses SingleTaskGP instead of the deprecated FixedNoiseGP.
* No longer manually specifies the outcome transform (building on #2458).
* Makes copy edits.

Reviewed By: Balandat
Differential Revision: D61054473
fbshipit-source-id: 05c97e6e908a1411b68c0af8b5175317421daf87
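As a rough sketch of the modeling changes in that list (hypothetical data; this is not code from the tutorial): `SingleTaskGP` with `train_Yvar` takes over the role of the deprecated `FixedNoiseGP`, the outcome transform is left to the new `Standardize` default, and `qLogNoisyExpectedImprovement` (qLogNEI) stands in for qNEI.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.acquisition.logei import qLogNoisyExpectedImprovement
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood

# Hypothetical toy data with known observation noise.
train_X = torch.rand(10, 3, dtype=torch.double)
train_Y = (train_X ** 2).sum(dim=-1, keepdim=True)
train_Yvar = torch.full_like(train_Y, 1e-4)

# FixedNoiseGP's role is covered by passing train_Yvar to SingleTaskGP;
# no outcome transform is specified, so Standardize is applied by default.
model = SingleTaskGP(train_X, train_Y, train_Yvar=train_Yvar)
fit_gpytorch_mll(ExactMarginalLogLikelihood(model.likelihood, model))

# qLogNEI replaces qNEI.
acqf = qLogNoisyExpectedImprovement(model=model, X_baseline=train_X)
```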
Summary:
Pull Request resolved: #2532

Makes models which had their priors updated in #2507 use the `Standardize` outcome transform by default, mimicking #2458.

Also removes some deprecated functionality in the process, namely the `data_fidelity` argument to `SingleTaskMultiFidelityGP`, as well as the `FixedNoiseMultiFidelityGP` and `FixedNoiseLCEMGP` models.

Reviewed By: saitcakmak, esantorella
Differential Revision: D62552307
fbshipit-source-id: fac80b577b312e0462a669821ab2290ac87fb849
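A hypothetical sketch of what that change means for callers (the data and values below are illustrative, not taken from the PR): `SingleTaskMultiFidelityGP` takes `data_fidelities` in place of the removed `data_fidelity` argument, gets `Standardize` applied by default, and covers the removed `FixedNoiseMultiFidelityGP` use case via `train_Yvar`.

```python
import torch
from botorch.models import SingleTaskMultiFidelityGP

# Hypothetical data: 2 design dimensions plus a fidelity parameter in the
# last column (all in [0, 1]).
train_X = torch.rand(15, 3, dtype=torch.double)
train_Y = torch.sin(train_X[:, :2].sum(dim=-1, keepdim=True)) + 0.1 * train_X[:, 2:]

# data_fidelities (a list of column indices) replaces the removed
# data_fidelity argument; Standardize is now applied by default.
model = SingleTaskMultiFidelityGP(
    train_X=train_X,
    train_Y=train_Y,
    data_fidelities=[2],
)

# Known-noise modeling (formerly FixedNoiseMultiFidelityGP) is handled by
# passing observation variances via train_Yvar.
train_Yvar = torch.full_like(train_Y, 1e-4)
noisy_model = SingleTaskMultiFidelityGP(
    train_X=train_X, train_Y=train_Y, train_Yvar=train_Yvar, data_fidelities=[2]
)
```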