Support Linear Fused Batchnorm #11587

Open
@mcr229

Description

🚀 The feature, motivation and pitch

Currently, the XNNPACK backend only supports batchnorm when it immediately follows a convolution, because in that case the batchnorm can be fused into the preceding convolution's weights and bias. The same transformation applies to Linear layers: a batchnorm that follows a linear op can be fused into that linear. Take a look at the existing pass that fuses batchnorm with conv, and add an analogous pass for linear:

https://github.com/pytorch/executorch/blob/main/backends/xnnpack/_passes/fuse_batch_norm_with_conv.py
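As a sketch of the arithmetic such a pass would perform (function name and layout are illustrative, not the actual pass API): for `y = BN(Wx + b)` with per-channel batchnorm statistics, the batchnorm reduces to a per-output-channel scale and shift, which can be folded into the linear weight and bias ahead of time.

```python
import numpy as np

def fuse_linear_bn(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold a batchnorm (gamma, beta, running mean/var) into a
    preceding linear layer's weight W (out_features, in_features)
    and bias b (out_features,). Returns the fused weight and bias."""
    # Batchnorm applies: gamma * (z - mean) / sqrt(var + eps) + beta
    # per output channel, so it is a per-row scale plus a shift.
    scale = gamma / np.sqrt(var + eps)
    W_fused = W * scale[:, None]          # scale each output row
    b_fused = (b - mean) * scale + beta   # fold mean/shift into bias
    return W_fused, b_fused
```

After fusion, `x @ W_fused.T + b_fused` matches linear-then-batchnorm exactly (in eval mode, using running statistics), so the batchnorm node can be removed from the graph.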

Alternatives

No response

Additional context

No response

RFC (Optional)

No response
