Added supported architecture information to HFONNX_TOOL_GUIDE file #3
base: main
```diff
@@ -45,6 +45,18 @@ pip install "optimum[exporters]" onnx onnxruntime transformers
 
 **Note:** The `inspect` and `fetch` commands do NOT require Python. Only use the `convert` command if you need to export a model that doesn't already have ONNX files in its repository.
 
+### Supported Architectures for the Convert Command
+
+Optimum library handles the export of PyTorch models to ONNX in the exporters.onnx module. It provides classes, functions, and a command line interface to perform the export easily.
+
+The complete list of supported architectures from Transformers, Diffusers, Timm & Sentence Transformers is available at the following link:
+
+https://huggingface.co/docs/optimum-onnx/onnx/overview
```
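For context, the `convert` step this section documents is normally driven by Optimum's export CLI. A minimal sketch, assuming the dependencies from the guide are installed; the model ID `distilbert-base-uncased` and the output directory `distilbert_onnx/` are illustrative, not part of the guide:

```shell
# Install the export dependencies listed in the guide.
pip install "optimum[exporters]" onnx onnxruntime transformers

# Export a PyTorch model from the Hub to ONNX. Any architecture in the
# linked overview should work in place of the example model ID.
optimum-cli export onnx --model distilbert-base-uncased distilbert_onnx/
```

The exported `model.onnx` (plus config and tokenizer files) lands in the output directory, ready for ONNX Runtime.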
**Copilot** (AI) suggested a change on Oct 21, 2025:

```diff
-https://huggingface.co/docs/optimum-onnx/onnx/overview
+[Optimum ONNX supported architectures](https://huggingface.co/docs/optimum-onnx/onnx/overview)
```
  
[nitpick] Consider using markdown link syntax for better readability: [Contributing support for new architectures](https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute) instead of plain URLs.
```diff
-https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute
+[Contributing support for new architectures](https://huggingface.co/docs/optimum-onnx/onnx/usage_guides/contribute)
```
[nitpick] The description references 'exporters.onnx module' but this technical detail may be outdated or unnecessary for user-facing documentation. Consider simplifying to focus on what users need to know: 'The Optimum library handles the export of PyTorch models to ONNX format, providing tools to perform the export easily.'