12 changes: 12 additions & 0 deletions docs/HFONNX_TOOL_GUIDE.md
@@ -45,6 +45,18 @@ pip install "optimum[exporters]" onnx onnxruntime transformers

**Note:** The `inspect` and `fetch` commands do NOT require Python. Only use the `convert` command if you need to export a model that doesn't already have ONNX files in its repository.
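
The decision rule in the note above can be sketched in a few lines of Python. This is an illustrative helper, not part of the tool: it takes a plain list of repository filenames (such as one returned by `huggingface_hub`'s `list_repo_files`) and reports whether a conversion would be needed.

```python
# Hypothetical helper illustrating the note above: `convert` is only
# needed when a model repository contains no ONNX files already.
def needs_convert(repo_files):
    """Return True if no .onnx file appears anywhere in the repo listing."""
    return not any(name.endswith(".onnx") for name in repo_files)

print(needs_convert(["model.onnx", "config.json"]))         # False: ONNX present
print(needs_convert(["pytorch_model.bin", "config.json"]))  # True: export needed
```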

### Supported Architectures for the Convert Command

The Optimum library handles the export of PyTorch models to ONNX format, providing classes, functions, and a command-line interface to perform the export easily.

The complete list of supported architectures from Transformers, Diffusers, Timm & Sentence Transformers is available at [Optimum ONNX supported architectures](https://huggingface.co/docs/optimum-onnx/onnx/overview).

For any unsupported architecture, detailed instructions for adding support can be found in [Contributing support for new architectures](https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute).

## Installation

### Option 1: Build from Source