A tool to show an ONNX model summary, similar to torchinfo.
`git clone` this repository, then:

```shell
cd onnxinfo
cmake -S . -B build/
cmake --build build/ [--parallel <thread number>]
```

to build the dependencies and onnxinfo.
Your Python file should be in the same directory as the `.so` file built by CMake. Then:

```python
import onnxinfo
onnxinfo.summary('<onnx model path>')
```
Supported node types so far: Conv, Relu, MaxPool, AveragePool, Add, GlobalAveragePool, Flatten, Gemm
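Before summarizing a model, it can be useful to know whether it contains node types outside this set. The helper below is a hypothetical sketch (not part of onnxinfo's API) that checks a list of op type strings against the supported set listed above:

```python
# Op types listed as supported in the README.
SUPPORTED_OPS = {
    "Conv", "Relu", "MaxPool", "AveragePool",
    "Add", "GlobalAveragePool", "Flatten", "Gemm",
}

def unsupported_ops(op_types):
    """Return the op types not in the supported set,
    preserving first-seen order and dropping duplicates."""
    seen = []
    for op in op_types:
        if op not in SUPPORTED_OPS and op not in seen:
            seen.append(op)
    return seen
```

With the `onnx` package installed, you could feed it `[n.op_type for n in onnx.load(path).graph.node]`.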
Shape inference is run by default.
Doesn't count MACs for non-MAC operations such as Relu, MaxPool, and so on.
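For the operations that do contribute MACs, the standard counting rule for a Conv node is one multiply-accumulate per kernel element per (grouped) input channel per output value. A small sketch of that formula (an illustration, not onnxinfo's actual implementation):

```python
def conv_macs(out_h, out_w, out_c, in_c, k_h, k_w, groups=1):
    """MACs for a Conv node: each output element needs
    k_h * k_w * (in_c / groups) multiply-accumulates."""
    return out_h * out_w * out_c * (in_c // groups) * k_h * k_w
```

For example, a 3x3 Conv from 3 to 16 channels producing a 32x32 output contributes 32 * 32 * 16 * 3 * 3 * 3 = 442,368 MACs, while Relu and MaxPool contribute none.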
Calculates trainable parameters for each node.
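As an illustration of what "trainable parameters" means per node, here is a hypothetical sketch (not onnxinfo's code) of the counts for the two parameterized node types in the supported set, Conv and Gemm:

```python
def conv_params(out_c, in_c, k_h, k_w, bias=True):
    # Weight tensor of shape (out_c, in_c, k_h, k_w),
    # plus an optional per-output-channel bias.
    return out_c * in_c * k_h * k_w + (out_c if bias else 0)

def gemm_params(in_features, out_features, bias=True):
    # Weight matrix plus an optional bias vector.
    return in_features * out_features + (out_features if bias else 0)
```

A 3x3 Conv from 3 to 16 channels with bias has 16 * 3 * 3 * 3 + 16 = 448 parameters; nodes like Relu or MaxPool have none.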
Aims to calculate the memory usage (in bytes) of each node's input and output.
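The byte count for a tensor follows directly from its inferred shape and element type: element count times element size. A minimal sketch of that calculation (assumed sizes for a few common ONNX dtypes; not onnxinfo's actual code):

```python
from math import prod

# Byte widths of a few common ONNX tensor element types.
DTYPE_BYTES = {"float32": 4, "float16": 2, "int64": 8, "int8": 1}

def tensor_bytes(shape, dtype="float32"):
    """Memory footprint of one tensor: element count * element size."""
    return prod(shape) * DTYPE_BYTES[dtype]
```

For instance, a float32 input of shape (1, 3, 224, 224) occupies 1 * 3 * 224 * 224 * 4 = 602,112 bytes.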
Run

```shell
python3 -m pytest -v
```

The tests use the model `resnet18_Opset16.onnx` from the ONNX Model Zoo.
- Run

  ```shell
  docker build -t onnxinfo -f docker/Dockerfile .
  ```

  first.
- You can type

  ```shell
  docker run onnxinfo
  ```

  to run tests.
- Or type

  ```shell
  docker run -it onnxinfo bash
  ```

  to enter the environment which has onnxinfo.
There are many node types in ONNX, but I only support some of them now.