
Tracking issue for supported ONNX operators #14

Open
robertknight opened this issue Sep 17, 2023 · 4 comments
robertknight commented Sep 17, 2023

This is a tracking issue listing which ONNX operators are currently supported.

Some things to note:

  • Support for an operator does not mean that all attributes or data types listed in the current spec are supported.
  • Some operators require additional dependencies, which are gated behind optional crate features. For example, the Random* ops require enabling the random crate feature.
  • Some operators are deprecated in the spec. Their non-deprecated replacements are implemented. This includes: Scatter, Upsample.
  • Operators are usually implemented after finding a model that needs them. These models then serve as an initial test case. If you need an operator which is not currently listed as supported, it is helpful (but not essential) if you can point to an open source ONNX model which needs it.
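As a sketch of the crate-feature point above (the crate name and version are placeholders, not taken from this issue), enabling the random feature in a downstream project might look like:

```toml
# Hypothetical Cargo.toml entry; crate name and version are illustrative.
# Enabling the "random" feature pulls in the extra dependency needed by
# the Random* operators.
[dependencies]
rten = { version = "0.x", features = ["random"] }
```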
Script used to generate list
from bs4 import BeautifulSoup
import requests

# URL of the ONNX operators page
url = "https://onnx.ai/onnx/operators/"

# Path to FlatBuffers schema listing supported ops.
schema_path = "src/schema.fbs"

# Fetch ONNX operators page and extract the list of operators.
#
# This assumes the operators are listed in the first table on the page and
# that the operator name is the first column.
response = requests.get(url)
response.raise_for_status()
soup = BeautifulSoup(response.content, 'html.parser')
operator_names = []

table = soup.find('table')
rows = table.find_all('tr')
for row in rows:
    cols = row.find_all('td')
    if len(cols) >= 1:
        operator_name = cols[0].text.strip()
        operator_names.append(operator_name)

# Scan FlatBuffers schema and extract supported operator names
supported_ops = set()
with open(schema_path) as fp:
    in_operator_type_enum = False
    for line in fp:
        if line.startswith('enum OperatorType'):
            in_operator_type_enum = True
            continue
        if line.startswith('}') and in_operator_type_enum:
            break

        if in_operator_type_enum:
            # Skip blank lines and comments inside the enum body.
            op_name = line.strip().rstrip(',')
            if op_name and not op_name.startswith('//'):
                supported_ops.add(op_name)

# List operators and support status
for operator in operator_names:
    if operator in supported_ops:
        print(f"- [x] {operator}")
    else:
        print(f"- [ ] {operator}")

Operator list

  • Abs
  • Acos
  • Acosh
  • Add
  • AffineGrid
  • And
  • ArgMax
  • ArgMin
  • Asin
  • Asinh
  • Atan
  • Atanh
  • AveragePool
  • BatchNormalization
  • Bernoulli
  • BitShift
  • BitwiseAnd
  • BitwiseNot
  • BitwiseOr
  • BitwiseXor
  • BlackmanWindow
  • Cast
  • CastLike
  • Ceil
  • Celu
  • CenterCropPad
  • Clip
  • Col2Im
  • Compress
  • Concat
  • ConcatFromSequence
  • Constant
  • ConstantOfShape
  • Conv
  • ConvInteger
  • ConvTranspose
  • Cos
  • Cosh
  • CumSum
  • DFT
  • DeformConv
  • DepthToSpace
  • DequantizeLinear
  • Det
  • Div
  • Dropout
  • DynamicQuantizeLinear
  • Einsum
  • Elu
  • Equal
  • Erf
  • Exp
  • Expand
  • EyeLike
  • Flatten
  • Floor
  • GRU
  • Gather
  • GatherElements
  • GatherND
  • Gelu
  • Gemm
  • GlobalAveragePool
  • GlobalLpPool
  • GlobalMaxPool
  • Greater
  • GreaterOrEqual
  • GridSample
  • GroupNormalization
  • HammingWindow
  • HannWindow
  • HardSigmoid
  • HardSwish
  • Hardmax
  • Identity
  • If
  • ImageDecoder
  • InstanceNormalization
  • IsInf
  • IsNaN
  • LRN
  • LSTM
  • LayerNormalization
  • LeakyRelu
  • Less
  • LessOrEqual
  • Log
  • LogSoftmax
  • Loop
  • LpNormalization
  • LpPool
  • MatMul
  • MatMulInteger
  • Max
  • MaxPool
  • MaxRoiPool
  • MaxUnpool
  • Mean
  • MeanVarianceNormalization
  • MelWeightMatrix
  • Min
  • Mish
  • Mod
  • Mul
  • Multinomial
  • Neg
  • NegativeLogLikelihoodLoss
  • NonMaxSuppression
  • NonZero
  • Not
  • OneHot
  • Optional
  • OptionalGetElement
  • OptionalHasElement
  • Or
  • PRelu
  • Pad
  • Pow
  • QLinearConv
  • QLinearMatMul
  • QuantizeLinear
  • RNN
  • RandomNormal
  • RandomNormalLike
  • RandomUniform
  • RandomUniformLike
  • Range
  • Reciprocal
  • ReduceL1
  • ReduceL2
  • ReduceLogSum
  • ReduceLogSumExp
  • ReduceMax
  • ReduceMean
  • ReduceMin
  • ReduceProd
  • ReduceSum
  • ReduceSumSquare
  • RegexFullMatch
  • Relu
  • Reshape
  • Resize
  • ReverseSequence
  • RoiAlign
  • Round
  • STFT
  • Scan
  • Scatter
  • ScatterElements
  • ScatterND
  • Selu
  • SequenceAt
  • SequenceConstruct
  • SequenceEmpty
  • SequenceErase
  • SequenceInsert
  • SequenceLength
  • SequenceMap
  • Shape
  • Shrink
  • Sigmoid
  • Sign
  • Sin
  • Sinh
  • Size
  • Slice
  • Softmax
  • SoftmaxCrossEntropyLoss
  • Softplus
  • Softsign
  • SpaceToDepth
  • Split
  • SplitToSequence
  • Sqrt
  • Squeeze
  • StringConcat
  • StringNormalizer
  • StringSplit
  • Sub
  • Sum
  • Tan
  • Tanh
  • TfIdfVectorizer
  • ThresholdedRelu
  • Tile
  • TopK
  • Transpose
  • Trilu
  • Unique
  • Unsqueeze
  • Upsample
  • Where
  • Xor
robertknight (Owner, Author) commented:

Strings that look like ONNX operator names (matching the regex `"[A-Z][A-Za-z]+"`) grepped from torch/onnx/symbolic_opset*.py. This gives a rough idea of which ONNX operators models exported from PyTorch might actually use.
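The extraction described above can be sketched roughly as follows; the regex and file glob come from the description, while the directory path and function name are placeholders:

```python
import re
from pathlib import Path

# CamelCase strings in double quotes, per the regex mentioned above.
OP_NAME = re.compile(r'"([A-Z][A-Za-z]+)"')

def candidate_onnx_ops(torch_onnx_dir):
    """Collect quoted CamelCase names from torch/onnx/symbolic_opset*.py."""
    names = set()
    for path in Path(torch_onnx_dir).glob("symbolic_opset*.py"):
        names.update(OP_NAME.findall(path.read_text()))
    return sorted(names)
```

Note that this over-matches: entries such as LEFT, RIGHT, VALID, Bool, and Tensor in the list below are quoted constants rather than real ONNX operators.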

Abs
Acos
Add
Affine
And
ArgMax
ArgMin
Asin
Atan
AveragePool
BatchNormalization
Bernoulli
BitShift
Bool
CRD
Cast
Ceil
Celu
Clip
Concat
ConcatFromSequence
Constant
ConstantFill
ConstantOfShape
Conv
ConvTranspose
Cos
CumSum
Delete
DepthToSpace
DequantizeLinear
Det
Div
Dropout
DynamicSlice
Einsum
Elu
Equal
Erf
Exp
Expand
EyeLike
Flatten
Floor
GRU
Gather
GatherElements
GatherND
Gemm
GlobalAveragePool
GlobalMaxPool
Greater
GreaterOrEqual
GridSample
HardSigmoid
HardSwish
Identity
If
InstanceNormalization
IsInf
IsNaN
LEFT
LSTM
LayerNormalization
LeakyRelu
Less
LessOrEqual
Log
LogSoftmax
Loop
MatMul
Max
MaxPool
Min
Mod
Mul
Multinomial
Neg
NegativeLogLikelihoodLoss
NonZero
Not
OneHot
OptionalGetElement
OptionalHasElement
Or
PRelu
Pad
Pow
QuantizeLinear
RIGHT
RNN
RandomNormal
RandomNormalLike
RandomUniform
RandomUniformLike
Range
Reciprocal
ReduceLogSumExp
ReduceMax
ReduceMean
ReduceMin
ReduceProd
ReduceSum
Relu
Reshape
Resize
Round
STFT
ScaledTanh
Scatter
ScatterElements
ScatterND
Selu
SequenceAt
SequenceConstruct
SequenceEmpty
SequenceErase
SequenceInsert
SequenceLength
Shape
Sigmoid
Sign
Sin
Size
Slice
Softmax
SoftmaxCrossEntropyLoss
Softplus
Softsign
Sort
Split
SplitToSequence
Sqrt
Squeeze
Sub
Tan
Tanh
Tensor
Tensordot
ThresholdedRelu
Tile
TopK
Transpose
Trilu
Unfold
Unique
Unsqueeze
Upsample
VALID
Where
Xor


Caellian commented Oct 9, 2024

It would be cool if you could pin this issue and link to it from the README. For a newcomer, it was easier to gauge the state of the project (and to realize I hadn't enabled a feature) from this list than by going through the schema currently linked in the README.


robertknight commented Oct 10, 2024

Hello @Caellian, which operators did you run into issues with? Given that you mentioned enabling a feature, I'm guessing it was one of the Random* ops? There is an issue about producing a more helpful error when an operator is supported but not enabled under the current crate features.

@robertknight robertknight pinned this issue Oct 10, 2024
robertknight added a commit that referenced this issue Oct 10, 2024
Caellian commented:

I'm guessing it was one of the Random* ops?

Yup, it was RandomNormalLike. I figured it out by searching the code for it and seeing the cfg attribute on one use, after seeing it marked as supported here.

@robertknight robertknight changed the title Support for missing ONNX operators Tracking issue for supported ONNX operators Oct 13, 2024