🐛 Describe the bug
The symbol `_ZN3c105ErrorC2ENS_14SourceLocationESs` (which demangles to `c10::Error::Error(c10::SourceLocation, std::string)`, with the trailing `Ss` denoting the pre-C++11 libstdc++ `std::string` ABI) is exported by the cu124 build but missing from the cu126 build; see the `nm` output posted in Dao-AILab/flash-attention#1644.
I understand that these missing symbols are why flash_attention has stopped working with torch 2.7, but it was surprising that the exported symbols differ between the cu124 and cu126 builds of the same release. Two related questions: why does torch export `_ZN3c105ErrorC2ENS_14SourceLocationESs` in the first place, and why does flash_attention depend on it?
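For anyone who wants to check which wheel exports the constructor without pulling out `nm`, here is a minimal sketch using `ctypes` (the `libtorch_cpu.so` path is illustrative; adjust it to your site-packages layout):

```python
import ctypes


def has_symbol(lib_path: str, symbol: str) -> bool:
    """Return True if the shared library at lib_path exports `symbol`.

    Attribute access on a CDLL handle performs a dlsym() lookup, so a
    missing symbol surfaces as an AttributeError rather than a load error.
    """
    lib = ctypes.CDLL(lib_path)
    return hasattr(lib, symbol)


# Hypothetical path -- point this at the library shipped in the wheel
# you want to inspect, e.g. .../site-packages/torch/lib/libtorch_cpu.so
# print(has_symbol("torch/lib/libtorch_cpu.so",
#                  "_ZN3c105ErrorC2ENS_14SourceLocationESs"))
```

This is roughly equivalent to `nm -D libtorch_cpu.so | grep _ZN3c105Error`, but runs inside the same Python environment that installed the wheel.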
Versions
torch 2.6.0 (cu126 and cu124 wheels)
cc @ezyang @gchanan @zou3519 @kadeng @msaroufim @seemethere @malfet @osalpekar @atalman @ptrblck @eqy @jerryzh168