[fp8] Support fp8e4m3 in torch_xla #8005

Open
@miladm

Description

🚀 Feature

Please enable fp8e4m3 in torch_xla. This feature is in flight in openxla: https://github.com/openxla/xla/pull/16585/files

Today, PyTorch doesn't support fp8e4m3 yet; only the fnuz variants are supported. @amithrm wants to see this feature as an alternative to fp8e4m3fn.
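For context on why the variants differ, here is a minimal sketch of the e4m3fn encoding (1 sign bit, 4 exponent bits, 3 mantissa bits, bias 7). Unlike an IEEE-style e4m3, the fn ("finite") variant has no infinities and reserves only the all-ones pattern for NaN, which extends the finite range to ±448; the fnuz variants instead use bias 8 and encode NaN solely as 0x80. The decoder below is an illustration written for this issue, not torch_xla code:

```python
def decode_e4m3fn(byte: int) -> float:
    """Decode one float8_e4m3fn byte: 1 sign, 4 exponent, 3 mantissa bits, bias 7."""
    s = (byte >> 7) & 0x1
    e = (byte >> 3) & 0xF
    m = byte & 0x7
    sign = -1.0 if s else 1.0
    if e == 0xF and m == 0x7:
        # e4m3fn has no infinities; only s.1111.111 is NaN
        return float("nan")
    if e == 0:
        # subnormal: 2^-6 * (m / 8)
        return sign * (m / 8.0) * 2.0 ** -6
    # normal: 2^(e - 7) * (1 + m / 8)
    return sign * (1.0 + m / 8.0) * 2.0 ** (e - 7)

print(decode_e4m3fn(0x38))  # 1.0
print(decode_e4m3fn(0x7E))  # 448.0, the largest finite e4m3fn value
```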

cc @amithrm @JackCaoG
