
Updates to use torchao's updated choose_qparams_affine and quantize/dequantize_affine #11070


Open · wants to merge 4 commits into main

Conversation

@jainapurva (Contributor) commented May 22, 2025

Summary: Updates to use torchao's updated choose_qparams_affine and quantize/dequantize_affine, and removes the zero_point_domain dependency.

Differential Revision: D75228037
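For context, the torchao primitives named above implement affine quantization: pick a scale and zero point from a tensor's value range, then map floats to integers and back. The sketch below is a minimal pure-Python illustration of that mapping, assuming asymmetric int8; it is not the torchao API itself, which operates on torch tensors with block-wise granularity and a `MappingType` argument.

```python
# Illustrative sketch of the affine quantization scheme behind torchao's
# choose_qparams_affine / quantize_affine / dequantize_affine.
# Pure Python, asymmetric int8 assumed; function names here are hypothetical
# simplifications, not the real torchao signatures.

def choose_qparams(values, quant_min=-128, quant_max=127):
    """Pick a scale and integer zero point so the value range maps onto
    [quant_min, quant_max], with 0.0 exactly representable."""
    lo, hi = min(values), max(values)
    lo, hi = min(lo, 0.0), max(hi, 0.0)  # ensure zero is in range
    scale = (hi - lo) / (quant_max - quant_min)
    if scale == 0.0:
        scale = 1.0  # degenerate all-zero input
    zero_point = round(quant_min - lo / scale)
    zero_point = max(quant_min, min(quant_max, zero_point))
    return scale, zero_point

def quantize(values, scale, zero_point, quant_min=-128, quant_max=127):
    # q = clamp(round(x / scale) + zero_point)
    return [max(quant_min, min(quant_max, round(v / scale) + zero_point))
            for v in values]

def dequantize(qvalues, scale, zero_point):
    # x ≈ (q - zero_point) * scale
    return [(q - zero_point) * scale for q in qvalues]
```

The zero_point_domain argument this PR drops controlled whether the zero point lived in the integer domain (as above) or the float domain; the sketch shows only the integer-domain case.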

pytorch-bot commented May 22, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/11070

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (2 Unrelated Failures)

As of commit b205f2f with merge base a6e2961:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) May 22, 2025
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D75228037

@mcr229 added the "release notes: none" label (do not include this in the release notes) May 22, 2025
@metascroy (Contributor)
@jainapurva I think this needs to be accompanied by a torchao pin bump in OSS executorch (checkout the torchao submodule in ET and bump the commit hash)

@metascroy (Contributor) left a review comment:

Needs a pin bump

jainapurva added a commit to jainapurva/executorch that referenced this pull request May 22, 2025
…equantize_affine (pytorch#11070)

Summary:
Pull Request resolved: pytorch#11070

Updates to use torchao's updated choose_qparams_affine and quantize/dequantize_affine without the zero_point_domain arg

Differential Revision: D75228037
@jainapurva requested a review from GregoryComer as a code owner May 22, 2025 18:00
@facebook-github-bot (Contributor)
@jainapurva has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@jerryzh168 requested a review from metascroy May 22, 2025 22:56
@jerryzh168 (Contributor)
@metascroy please take a look again

Labels: CLA Signed, fb-exported, release notes: none
5 participants