🚀 The feature, motivation and pitch
Currently, the XNNPACK backend only supports batchnorm when it follows a convolution, because the batchnorm can be fused into the preceding convolution operation. However, the same is possible for Linear layers by fusing the batchnorm into the preceding linear. Take a look at the code for fusing batchnorm with conv, and try to add a pass that does the same for linear.
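For reference, here is a minimal sketch of the fusion math, assuming an eval-mode `BatchNorm1d` with affine parameters following a `Linear`. The helper name and signature are illustrative assumptions, not the ExecuTorch pass itself: for `y = W x + b` followed by `BN(y) = gamma * (y - mean) / sqrt(var + eps) + beta`, the fused layer has `W' = diag(gamma / sqrt(var + eps)) @ W` and `b' = gamma * (b - mean) / sqrt(var + eps) + beta`.

```python
import torch

def fuse_batch_norm_into_linear(linear: torch.nn.Linear,
                                bn: torch.nn.BatchNorm1d) -> torch.nn.Linear:
    """Fold an eval-mode BatchNorm1d into the preceding Linear (illustrative sketch)."""
    # Per-output-channel scale: gamma / sqrt(running_var + eps)
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)

    fused = torch.nn.Linear(linear.in_features, linear.out_features, bias=True)
    with torch.no_grad():
        # Scale each row of the weight matrix by the corresponding channel scale
        fused.weight.copy_(linear.weight * scale.unsqueeze(1))
        # Fold the batchnorm shift into the bias
        bias = linear.bias if linear.bias is not None else torch.zeros_like(bn.running_mean)
        fused.bias.copy_((bias - bn.running_mean) * scale + bn.bias)
    return fused

# Quick numerical sanity check
lin, bn = torch.nn.Linear(8, 4), torch.nn.BatchNorm1d(4).eval()
x = torch.randn(2, 8)
assert torch.allclose(bn(lin(x)), fuse_batch_norm_into_linear(lin, bn)(x), atol=1e-6)
```

An actual graph pass would pattern-match a linear node followed by a batchnorm node, rewrite the linear's weight and bias as above, and remove the batchnorm node, mirroring what the existing conv + batchnorm fusion pass does.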
Alternatives
No response
Additional context
No response
RFC (Optional)
No response