Migrate ExecuTorch's use of pt2e from torch.ao to torchao #10294
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/10294
Note: Links to docs will display an error until the docs builds have been completed.
❌ 1 New Failure, 1 Unrelated Failure
As of commit 5f9da89 with merge base 9aaea31
NEW FAILURE - The following job has failed:
FLAKY - The following job failed but was likely due to flakiness present on trunk:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@@ -34,25 +34,25 @@
     is_ethosu,
 ) # usort: skip
 from executorch.exir.backend.compile_spec_schema import CompileSpec
-from torch.ao.quantization.fake_quantize import (
 from torch.fx import GraphModule, Node
+from torchao.quantization.pt2e import _ObserverOrFakeQuantizeConstructor
Correct me if I am wrong, but torchao isn't a mandatory dep today; now it is?
How do we define mandatory dependencies? Is it whatever is installed by the install_requirements script?
Seems like we pull it in from source:
Line 60 in 647e1f1
url = https://github.com/pytorch/ao.git
So this submodule is already updated, since the tests are passing here.
Check (1) whether we run tests on ET wheels with anything quant-related, and (2) if we do, whether they pass for this diff.
@@ -16,25 +16,25 @@
     propagate_annotation,
     QuantizationConfig,
 )
-from torch.ao.quantization.fake_quantize import (
+from torchao.quantization.pt2e.fake_quantize import (
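This PR switches the imports outright, but during a migration window like this, downstream code sometimes guards the import instead. A minimal sketch (not code from this PR; it assumes a torchao new enough to contain the pt2e copy):

```python
# Hedged sketch: prefer the new torchao location for FakeQuantize and fall
# back to the legacy torch.ao path, so the code runs on either side of the
# migration. FakeQuantize is None only if neither library is installed.
try:
    from torchao.quantization.pt2e.fake_quantize import FakeQuantize
except ImportError:
    try:
        from torch.ao.quantization.fake_quantize import FakeQuantize
    except ImportError:
        FakeQuantize = None  # neither torchao nor torch available here
```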
Adding partners for visibility
-from torch.ao.quantization.pt2e.graph_utils import find_sequential_partitions
-from torch.ao.quantization.quantizer import QuantizationSpec, Quantizer
+from torchao.quantization.pt2e import find_sequential_partitions
+from torchao.quantization.pt2e.observer import HistogramObserver, MinMaxObserver
We can remove observer from the import path here.
+from torchao.quantization.pt2e.fake_quantize import FakeQuantize
+from torchao.quantization.pt2e.observer import MinMaxObserver, PerChannelMinMaxObserver
We can remove observer and fake_quantize from the import path.
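The suggestion above works because these symbols are re-exported at the torchao.quantization.pt2e top level (as the earlier hunk's top-level imports of _ObserverOrFakeQuantizeConstructor and find_sequential_partitions suggest). A hedged sketch of the mechanical cleanup; the helper name and the exact set of flattened submodules are assumptions for illustration:

```python
# Hedged sketch: drop a trailing submodule segment from a pt2e import path
# when its symbols are assumed to be re-exported at the package top level.
PKG = "torchao.quantization.pt2e"
FLATTENED = {"observer", "fake_quantize"}  # assumed flattened submodules

def shorten(module_path: str) -> str:
    """Return the top-level package for a flattened pt2e submodule path."""
    head, _, tail = module_path.rpartition(".")
    if head == PKG and tail in FLATTENED:
        return PKG
    return module_path
```

For example, `shorten("torchao.quantization.pt2e.observer")` yields the top-level `torchao.quantization.pt2e`, while unrelated paths pass through unchanged.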
Looks reasonable to me. Let's just use trunk to trigger more CI jobs.
trunk is already triggered. Thanks!
@metascroy has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Vulkan changes LGTM!
This pull request was exported from Phabricator. Differential Revision: D74694311
Summary: Most code related to PT2E quantization is migrating from torch.ao.quantization to torchao.quantization.pt2e. torchao.quantization.pt2e contains an exact copy of the PT2E code in torch.ao.quantization. The torchao pin in ExecuTorch has already been bumped to pick up these changes.
Pull Request resolved: #10294
Reviewed By: SS-JIA
Differential Revision: D74694311
Pulled By: metascroy
Closing this. The migration was instead done in pieces. |
Most code related to PT2E quantization is migrating from torch.ao.quantization to torchao.quantization.pt2e.
torchao.quantization.pt2e contains an exact copy of PT2E code in torch.ao.quantization.
The torchao pin in ExecuTorch has already been bumped to pick up these changes.
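Since the new namespace is an exact copy, the bulk of a migration like this is a mechanical prefix swap on import paths. A hedged sketch of that common case; note that per the diffs above some symbols do not map one-to-one (e.g. what lived under the old pt2e submodule is re-exported at the new top level), so a real migration still needs per-symbol special cases:

```python
# Hedged sketch: rewrite the common case of a pt2e import module path from
# the old torch.ao location to the new torchao location. Paths under the old
# pt2e submodule (and other flattened symbols) would need special-casing.
OLD = "torch.ao.quantization"
NEW = "torchao.quantization.pt2e"

def migrate_path(path: str) -> str:
    """Swap the old quantization prefix for the new one; pass others through."""
    if path == OLD or path.startswith(OLD + "."):
        return NEW + path[len(OLD):]
    return path
```

For example, `migrate_path("torch.ao.quantization.fake_quantize")` becomes `torchao.quantization.pt2e.fake_quantize`, while non-quantization paths like `torch.fx` are left alone.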