[MLIR][TORCH] Add OnnxToTorch support for BlackmanWindow function #3181
Conversation
Force-pushed from cfcf831 to fcfa8cd
@vivekkhandelwal1 gentle ping.
Also needs an e2e test.
This can be done either in torch-mlir/projects/pt1 or in SHARK-TestSuite.
SHARK-TestSuite is what we're moving towards, and in the past, for ONNX models, it has given me less trouble.
All you need to do is:
- Make a new folder named after your op in https://github.com/nod-ai/SHARK-TestSuite/tree/main/e2eshark/onnx/operators
- Add a file named 'model.py' inside that folder. Reference https://github.com/nod-ai/SHARK-TestSuite/blob/main/e2eshark/onnx/operators/If/model.py for an example I just pushed today (a rough sketch for this op follows below).
- Run it with
PYTHONPATH="/home/azureuser/torch-mlir/build/tools/torch-mlir/python_packages/torch_mlir:$PYTHONPATH" python ./run.py -c ~/torch-mlir/build -i ~/iree-build --tests onnx/operators/If --verbose --cachedir /tmp/ --torchtolinalg
Note that you have to
cd SHARK-TestSuite/e2eshark
first and modify the string after --tests to match the directory of the operator you're testing. Here, I'm testing onnx.If.
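For this PR, a minimal model.py for a BlackmanWindow operator test might look like the sketch below. It only builds and saves the one-node ONNX model with the onnx helper API; the e2eshark-specific harness hooks (input registration, expected-output checks) are not shown and should be copied from the If example above, so treat this as a starting point rather than the exact committed test.

```python
# Hypothetical model.py for onnx/operators/BlackmanWindow (a sketch, not the
# committed test). Builds a one-node ONNX model and saves it as model.onnx;
# the e2eshark input/output registration from the If example still needs to
# be added on top of this.
import onnx
from onnx import TensorProto, helper

# BlackmanWindow takes a scalar integer `size` and returns a 1-D float window.
node = helper.make_node(
    "BlackmanWindow",
    inputs=["size"],
    outputs=["window"],
    periodic=1,  # attribute: 1 = periodic window, 0 = symmetric window
)

graph = helper.make_graph(
    [node],
    "blackman_window_test",
    inputs=[helper.make_tensor_value_info("size", TensorProto.INT32, [])],
    outputs=[helper.make_tensor_value_info("window", TensorProto.FLOAT, [None])],
)

# BlackmanWindow was introduced in ONNX opset 17.
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 17)])
onnx.checker.check_model(model)
onnx.save(model, "model.onnx")
```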
I have written the test, but would it need to be pushed to the SHARK-TestSuite repository?
Hi @vinayakdsci, I think you have adopted the PyTorch Blackman window implementation instead of the ONNX Blackman window implementation for this lowering. Although they may well be the same, I think it's better for an ONNX op to be lowered according to its own specification, and it also makes the algorithm easier to compare against the spec.
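For context, here is a rough NumPy sketch of the Blackman window as I read the ONNX operator spec (coefficients 0.42, 0.5, 0.08, with the `periodic` attribute selecting the denominator). This is only a reference for comparing the two algorithms, not the lowering code itself, and the exact spec wording should be double-checked.

```python
# Reference-only sketch of the ONNX BlackmanWindow formula (assumed from the
# ONNX operator spec); useful for comparing against torch.blackman_window.
import numpy as np

def onnx_blackman_window(size: int, periodic: bool = True) -> np.ndarray:
    # Periodic windows divide by `size`; symmetric windows divide by `size - 1`.
    n = np.arange(size, dtype=np.float64)
    denom = size if periodic else size - 1
    return (0.42
            - 0.50 * np.cos(2.0 * np.pi * n / denom)
            + 0.08 * np.cos(4.0 * np.pi * n / denom))
```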
Force-pushed from 72ee53a to aedb725
Force-pushed from c1bd761 to fb027f6
Force-pushed from fb027f6 to 1f29ffa
Apart from a small nit, everything else looks fine.
Force-pushed from 1f29ffa to 9cf87f7
Force-pushed from 9cf87f7 to 531a2ee
@vinayakdsci, you need an approval from @renxida too to merge this.
…vm#3181) Implements OnnxToTorch lowering for the BlackmanWindow Function.