Allow batched fixed features in gen_candidates_scipy and gen_candidates_torch
#2893
Conversation
This pull request was exported from Phabricator. Differential Revision: D77043260

Allow batched fixed features in `gen_candidates_scipy` and `gen_candidates_torch` (meta-pytorch#2893)
Summary: This PR should enable batching for mixed optimization later on. To enable it, we need to allow setting different fixed features for different initial conditions during optimization. We do this by allowing tensors of shape [b] or [b, q] to be passed to `gen_candidates_scipy` and, for compatibility, to `gen_candidates_torch`.
Differential Revision: D77043260
Force-pushed from 890115c to f273a2e
Force-pushed from f273a2e to d5e40db
Force-pushed from d5e40db to 996d52a
Force-pushed from 996d52a to 952720b
Codecov Report: All modified and coverable lines are covered by tests ✅

@@            Coverage Diff            @@
##               main     #2893   +/-   ##
==========================================
  Coverage    100.00%   100.00%
==========================================
  Files           212       212
  Lines         19778     19794    +16
==========================================
+ Hits          19778     19794    +16

View full report in Codecov by Sentry.
Allow batched fixed features in `gen_candidates_scipy` and `gen_candidates_torch` (meta-pytorch#2893)
Summary: This PR should enable batching for mixed optimization later on. To enable it, we need to allow setting different fixed features for different initial conditions during optimization. We do this by allowing tensors of shape [b] or [b, q] to be passed to `gen_candidates_scipy` and, for compatibility, to `gen_candidates_torch`.
Reviewed By: saitcakmak
Differential Revision: D77043260
Force-pushed from 952720b to d2a188d
Force-pushed from d2a188d to 5fa64d4
Force-pushed from 5fa64d4 to 1038467
Force-pushed from 1038467 to 75623ed
Force-pushed from 75623ed to 47e6034
Force-pushed from 47e6034 to f029193
This pull request has been merged in 78de86d.
Summary:
This PR should enable batching for mixed optimization later on. To enable it, we need to allow setting different fixed features for different initial conditions during optimization.
We do this by allowing tensors of shape [b] or [b, q] to be passed to `gen_candidates_scipy` and, for compatibility, to `gen_candidates_torch`.

Reviewed By: saitcakmak
Differential Revision: D77043260
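To illustrate the change, below is a minimal sketch of how batched `fixed_features` might be passed to `gen_candidates_scipy` after this PR. The toy model, acquisition function, bounds, and the specific fixed values are illustrative assumptions, not part of the PR; the reading of a `[b]`-shaped value as one fixed value per batch of initial conditions (and `[b, q]` as per-candidate values) follows the shapes named in the summary.

```python
import torch
from botorch.acquisition import ExpectedImprovement
from botorch.generation.gen import gen_candidates_scipy
from botorch.models import SingleTaskGP

# Toy model on a 3-dimensional problem (illustrative data only).
train_X = torch.rand(10, 3, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True)
model = SingleTaskGP(train_X, train_Y)
acqf = ExpectedImprovement(model, best_f=train_Y.max())

b, q, d = 4, 1, 3
initial_conditions = torch.rand(b, q, d, dtype=torch.double)

# Per this PR, values in `fixed_features` may be tensors of shape [b]
# (one value per batch of initial conditions) or [b, q] (per-candidate
# values) instead of plain floats. Here feature 2 is pinned to a different
# value for each of the b restarts; the specific values are arbitrary.
batched_fixed_features = {
    2: torch.tensor([0.0, 0.25, 0.5, 1.0], dtype=torch.double)
}

candidates, acq_values = gen_candidates_scipy(
    initial_conditions=initial_conditions,
    acquisition_function=acqf,
    lower_bounds=0.0,
    upper_bounds=1.0,
    fixed_features=batched_fixed_features,
)
print(candidates.shape)  # torch.Size([4, 1, 3]); column 2 holds the fixed values
print(acq_values.shape)
```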