Conversation

@armandsauzay

Summary:
X-link: pytorch/FBGEMM#5154

X-link: https://github.com/facebookresearch/FBGEMM/pull/2154

Adds an fp8_output_dtype parameter to the qcomms config, allowing FP8 tensors to be dequantized into float formats other than FP32 (previously the only option).
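A minimal sketch of the config change this PR describes: a knob on the qcomms config that selects which float format FP8 values are dequantized into. The class and function names below (`QCommsConfigSketch`, `dequantize_fp8`) are illustrative assumptions, not the actual torchrec/FBGEMM API.

```python
from dataclasses import dataclass

# Hypothetical sketch of the qcomms config change: an fp8_output_dtype
# field controlling the dequantization output format (default FP32, as
# before this PR). Names here are assumptions for illustration only.

@dataclass
class QCommsConfigSketch:
    forward_precision: str = "fp8"
    fp8_output_dtype: str = "fp32"  # new knob: e.g. "fp32", "fp16", "bf16"

def dequantize_fp8(scaled_values, scale, config):
    """Dequantize FP8-encoded values into the configured output format."""
    out = [v * scale for v in scaled_values]
    if config.fp8_output_dtype == "fp32":
        return out, "fp32"
    if config.fp8_output_dtype in ("fp16", "bf16"):
        # A real kernel would cast the output tensor; here we just tag
        # the result with the requested dtype to show the plumbing.
        return out, config.fp8_output_dtype
    raise ValueError(f"unsupported fp8_output_dtype: {config.fp8_output_dtype}")

cfg = QCommsConfigSketch(fp8_output_dtype="bf16")
values, dtype = dequantize_fp8([1.0, 2.0], 0.5, cfg)
```

The point of the change is that callers pick the output dtype once, in config, rather than being forced through an FP32 round-trip and a separate cast after every dequantize.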

Reviewed By: spcyppt

Differential Revision: D86890315

@meta-codesync
Contributor

meta-codesync bot commented Nov 20, 2025

@armandsauzay has exported this pull request. If you are a Meta employee, you can view the originating Diff in D86890315.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Nov 20, 2025
armandsauzay pushed a commit to armandsauzay/FBGEMM-1 that referenced this pull request Nov 20, 2025
…ch#5154)

Summary:
X-link: meta-pytorch/torchrec#3568


X-link: facebookresearch/FBGEMM#2154

Adds an fp8_output_dtype parameter to the qcomms config, allowing FP8 tensors to be dequantized into float formats other than FP32 (previously the only option).

Reviewed By: spcyppt

Differential Revision: D86890315
armandsauzay pushed a commit to armandsauzay/FBGEMM-1 that referenced this pull request Nov 21, 2025

Labels

CLA Signed (managed by the Facebook bot; authors must sign the CLA before a PR can be reviewed), fb-exported, meta-exported
