Added supported architecture information to HFONNX_TOOL_GUIDE file #3
base: main
Conversation
Pull Request Overview
This PR adds documentation about supported architectures for the convert command in the HFONNX tool guide. The update provides users with information about which model architectures are supported by the Optimum library's ONNX export functionality and how to contribute support for unsupported architectures.
- Added a new "Supported Architectures for the Convert Command" section
- Included reference links to Optimum ONNX documentation for supported architectures and contribution guidelines
### Supported Architectures for the Convert Command

Optimum library handles the export of PyTorch models to ONNX in the exporters.onnx module. It provides classes, functions, and a command line interface to perform the export easily.
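For context, the command line interface mentioned in the added documentation can be invoked roughly as follows. This is a sketch, assuming the `optimum` package is installed with its exporter extras; the model ID and output directory are illustrative placeholders, not values taken from the PR:

```shell
# Assumed prerequisite (not part of the PR):
#   pip install "optimum[exporters]"

# Export a Hugging Face model to ONNX via the Optimum CLI.
# "distilbert-base-uncased" is an illustrative model ID and
# onnx_out/ is an arbitrary output directory.
optimum-cli export onnx --model distilbert-base-uncased onnx_out/
```

If the chosen architecture is unsupported, the export fails with an error naming the architecture, which is where the contribution guide linked below comes in.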
Copilot AI (Oct 21, 2025):
[nitpick] The description references 'exporters.onnx module' but this technical detail may be outdated or unnecessary for user-facing documentation. Consider simplifying to focus on what users need to know: 'The Optimum library handles the export of PyTorch models to ONNX format, providing tools to perform the export easily.'
Suggested change:
- Optimum library handles the export of PyTorch models to ONNX in the exporters.onnx module. It provides classes, functions, and a command line interface to perform the export easily.
+ The Optimum library handles the export of PyTorch models to ONNX format, providing tools to perform the export easily.
The complete list of supported architectures from Transformers, Diffusers, Timm & Sentence Transformers is available at the following link:

https://huggingface.co/docs/optimum-onnx/onnx/overview
Copilot AI (Oct 21, 2025):
[nitpick] Consider using markdown link syntax for better readability: [Optimum ONNX supported architectures](https://huggingface.co/docs/optimum-onnx/onnx/overview) instead of plain URLs.
Suggested change:
- https://huggingface.co/docs/optimum-onnx/onnx/overview
+ [Optimum ONNX supported architectures](https://huggingface.co/docs/optimum-onnx/onnx/overview)
For any unsupported architecture, detailed instructions for adding support for such architectures can be found here:

https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute
Copilot AI (Oct 21, 2025):
[nitpick] Consider using markdown link syntax for better readability: [Contributing support for new architectures](https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute) instead of plain URLs.
Suggested change:
- https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute
+ [Contributing support for new architectures](https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute)
No description provided.