
Fix pytorch_wrap regex to catch both nn.X and torch.nn.X#140

Open
nataliakokoromyti wants to merge 1 commit into main from fix-pytorch-wrap-regex

Conversation

@nataliakokoromyti
Collaborator

The previous pattern `torch\.nn\.(?!...)` only matched fully qualified
`torch.nn.Linear` style calls.  Generated kernels commonly use
`from torch import nn` and then `nn.Conv2d`, `nn.Linear`, etc., which
slipped through the check.

Changed the pattern to `(?<!\w)nn\.(?!...)` which matches both import
styles while avoiding false positives on identifiers like `cnn.layer`
via the negative lookbehind.
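
The behavior described above can be sketched in a few lines. The real allowlist inside `(?!...)` is elided in this PR, so a hypothetical placeholder lookahead (`functional`) stands in for it here; the point is only the `(?<!\w)` lookbehind and the unqualified-`nn.` matching.

```python
import re

# Old pattern: only fully qualified torch.nn.X calls.
# (?!functional) is a hypothetical stand-in for the PR's elided allowlist.
OLD = re.compile(r"torch\.nn\.(?!functional)")
# New pattern: word-boundary lookbehind catches both import styles
# while rejecting identifiers that merely end in "nn", e.g. cnn.layer.
NEW = re.compile(r"(?<!\w)nn\.(?!functional)")

samples = {
    "torch.nn.Linear(4, 4)": (True, True),   # qualified: both match
    "nn.Conv2d(3, 16, 3)":   (False, True),  # bare nn.: only new matches
    "cnn.layer(x)":          (False, False), # lookbehind blocks false positive
}

for code, (old_hit, new_hit) in samples.items():
    assert bool(OLD.search(code)) == old_hit, code
    assert bool(NEW.search(code)) == new_hit, code
```

Note that `(?<!\w)` still matches `torch.nn.Linear`, since the `nn` there is preceded by a `.` (a non-word character), so the new pattern strictly widens the old one.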
