🐛 Describe the bug
I'm not sure if this is machine-specific, but I think the fast rotation implementation for 90-degree multiples (#8295) produces non-contiguous tensors:
```python
import torch

x = torch.rand((1, 1, 512, 512))  # binary segmentation target
print(x.is_contiguous())  # True
x = torch.rot90(x, k=1, dims=(1, 2))
print(x.is_contiguous())  # False
```

My understanding is that this happens because `rot90` uses `transpose` internally, and this behavior is already documented for `transpose`.
This is an issue because it slows down training due to cache misses, and it makes certain subsequent operations, such as `view`, impossible. So we should either require the output to be contiguous or document this behavior and let users handle it.
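As a possible user-side workaround until this is settled, the rotated tensor can be made contiguous explicitly. A minimal sketch; whether `rot90` actually returns a non-contiguous tensor may depend on the PyTorch version:

```python
import torch

x = torch.rand((1, 1, 512, 512))
x = torch.rot90(x, k=1, dims=(1, 2))  # may be non-contiguous, depending on version

# Force a contiguous copy so later view()-style ops succeed
# and memory access stays cache-friendly during training.
x = x.contiguous()
print(x.is_contiguous())  # True
flat = x.view(-1)         # view() now works on the contiguous tensor
print(flat.shape)         # torch.Size([262144])
```

The cost is one extra copy per rotated sample, which is usually cheaper than the cache misses a strided tensor causes downstream.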
Versions
0.25.0