
Conversation

@aemmatty
Collaborator

No description provided.

@aemmatty aemmatty requested a review from Copilot October 21, 2025 13:47

Copilot AI left a comment


Pull Request Overview

This PR adds documentation about supported architectures for the convert command in the HFONNX tool guide. The update provides users with information about which model architectures are supported by the Optimum library's ONNX export functionality and how to contribute support for unsupported architectures.

  • Added a new "Supported Architectures for the Convert Command" section
  • Included reference links to Optimum ONNX documentation for supported architectures and contribution guidelines



### Supported Architectures for the Convert Command

Optimum library handles the export of PyTorch models to ONNX in the exporters.onnx module. It provides classes, functions, and a command line interface to perform the export easily.

Copilot AI Oct 21, 2025


[nitpick] The description references 'exporters.onnx module' but this technical detail may be outdated or unnecessary for user-facing documentation. Consider simplifying to focus on what users need to know: 'The Optimum library handles the export of PyTorch models to ONNX format, providing tools to perform the export easily.'

Suggested change
Optimum library handles the export of PyTorch models to ONNX in the exporters.onnx module. It provides classes, functions, and a command line interface to perform the export easily.
The Optimum library handles the export of PyTorch models to ONNX format, providing tools to perform the export easily.

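For orientation, the export that this section documents can be driven from Optimum's command line interface. A minimal sketch, assuming `optimum` is installed with its exporter extras (`pip install optimum[exporters]`); the model ID and output directory below are illustrative placeholders, not part of this PR:

```shell
# Export a Hugging Face model to ONNX via the Optimum CLI.
# "distilbert-base-uncased" and "onnx_out/" are placeholder values.
optimum-cli export onnx --model distilbert-base-uncased onnx_out/
```

The exported `model.onnx` in the output directory can then be loaded with an ONNX runtime of your choice.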

The complete list of supported architectures from Transformers, Diffusers, Timm, and Sentence Transformers is available at the following link:

https://huggingface.co/docs/optimum-onnx/onnx/overview

Copilot AI Oct 21, 2025


[nitpick] Consider using markdown link syntax for better readability: [Optimum ONNX supported architectures](https://huggingface.co/docs/optimum-onnx/onnx/overview) instead of plain URLs.

Suggested change
https://huggingface.co/docs/optimum-onnx/onnx/overview
[Optimum ONNX supported architectures](https://huggingface.co/docs/optimum-onnx/onnx/overview)


For an unsupported architecture, detailed instructions for adding support can be found here:

https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute

Copilot AI Oct 21, 2025


[nitpick] Consider using markdown link syntax for better readability: [Contributing support for new architectures](https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute) instead of plain URLs.

Suggested change
https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute
[Contributing support for new architectures](https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute)

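As context for the linked contribution guide: adding support for a new architecture in Optimum typically involves writing an ONNX export configuration that declares the model's inputs and their dynamic axes. The sketch below only illustrates that pattern; the class name, opset value, and axis names are assumptions, and in Optimum the class would subclass `OnnxConfig` from `optimum.exporters.onnx`, so consult the guide for the actual requirements:

```python
from typing import Dict


class MyArchOnnxConfig:
    # Illustrative sketch of the pattern described in the contribution
    # guide; names and values here are assumptions, not taken from the
    # Optimum source. In Optimum this would subclass OnnxConfig.
    DEFAULT_ONNX_OPSET = 14

    @property
    def inputs(self) -> Dict[str, Dict[int, str]]:
        # Map each model input name to its dynamic axes
        # (axis index -> symbolic axis name).
        return {
            "input_ids": {0: "batch_size", 1: "sequence_length"},
            "attention_mask": {0: "batch_size", 1: "sequence_length"},
        }
```

The export machinery uses such a declaration to know which tensor dimensions (batch size, sequence length) may vary at inference time.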