Add aten.unflatten.int support and its torch-to-tosa lowering #2509
Conversation
LGTM
I only see a unit test for TorchToTosa. What about other tests? Are any e2e tests expected to pass? Can you confirm that we aren't missing anything?
Can you also add an e2e test similar to the flatten ones:
class FlattenStaticModule(torch.nn.Module):
Just added an e2e test. Note that the aten.unflatten.int op only has a TOSA lowering for now, so the e2e test will fail on the Linalg-on-Tensors backend. @ramiro050 Please let me know if you think this is OK.
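For reference, the shape transformation that aten.unflatten.int performs can be sketched in plain Python (this helper is illustrative and not part of the PR; it mirrors the semantics of torch.Tensor.unflatten, including inferring a single -1 entry):

```python
# Illustrative sketch (not part of the PR): compute the output shape of
# aten.unflatten.int, which splits one dimension into several, the way
# torch.Tensor.unflatten(dim, sizes) does.

def unflatten_shape(shape, dim, sizes):
    """Return the shape produced by unflattening `dim` of `shape` into `sizes`.

    At most one entry of `sizes` may be -1, in which case it is inferred
    so that the total element count is preserved.
    """
    dim = dim % len(shape)            # normalize a negative dim index
    sizes = list(sizes)
    if sizes.count(-1) > 1:
        raise ValueError("only one dimension may be inferred")
    known = 1
    for s in sizes:
        if s != -1:
            known *= s
    if -1 in sizes:
        if shape[dim] % known:
            raise ValueError("cannot infer size: dim not divisible")
        sizes[sizes.index(-1)] = shape[dim] // known
    elif known != shape[dim]:
        raise ValueError("sizes do not multiply to the dim size")
    return shape[:dim] + tuple(sizes) + shape[dim + 1:]

print(unflatten_shape((2, 6, 4), 1, (2, 3)))   # (2, 2, 3, 4)
print(unflatten_shape((2, 6, 4), 1, (-1, 2)))  # (2, 3, 2, 4)
```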
Thanks! Other than a small change request, LGTM
Add aten.unflatten.int op
Add its torch-to-tosa lowering
Update the TorchToTosa/basic.mlir tests
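Conceptually, because the op has static shapes, the lowering can map unflatten onto a single TOSA reshape. A hedged sketch of what the lowered IR might look like (the operand name and exact assembly syntax are illustrative and vary across torch-mlir/TOSA versions):

```mlir
// Hypothetical example: unflatten dim 1 of a 2x6xf32 tensor into 2x3.
// After the torch-to-tosa conversion this becomes one static reshape:
%0 = tosa.reshape %arg0 {new_shape = array<i64: 2, 2, 3>}
       : (tensor<2x6xf32>) -> tensor<2x2x3xf32>
```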
To test the e2e TOSA lowering:
python -m e2e_testing.main -v -c=tosa