IOManager Interface #10418
base: main

Conversation
Helpful links: see artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/10418 (links to docs will display an error until the docs builds have completed).

Dr. CI: ✅ No failures as of commit eb2e374 with merge base de98d25. ❗ 1 active SEV; if your PR is affected, view it at the link above. (This comment was automatically generated by Dr. CI and updates every 15 minutes.)
This pull request was exported from Phabricator. Differential Revision: D73450877
Force-pushed from 934cad4 to ecc2362
Summary: Hopefully this is sufficient for the contract. Two follow-ups planned: add a basic CPU implementation, and add a static attention implementation. Differential Revision: D73450877
Force-pushed from ecc2362 to bf3ac0b
Force-pushed from bf3ac0b to f5b9dbd
Force-pushed from f5b9dbd to 37d4964
Force-pushed from 37d4964 to aea5220
Force-pushed from aea5220 to 13a543a
Force-pushed from 13a543a to 1b3b148
Force-pushed from 1b3b148 to afca5e8
Force-pushed from afca5e8 to 2d2ba9c
Force-pushed from 2d2ba9c to 7abb2d8
Force-pushed from 7abb2d8 to eb2e374
Summary:
Pull Request resolved: pytorch#10418
Hopefully this is sufficient for the contract. Two follow-ups planned:
- add a basic CPU implementation
- add a static attention implementation
Differential Revision: D73450877
cc @larryliu0820 @mergennachin @cccclai @helunwencser @jackzhxng