
[CXX11ABI] torch 2.6.0-cu126 and cu124 have different exported symbols #152790

Open
@vadimkantorov

Description


🐛 Describe the bug

The symbol `_ZN3c105ErrorC2ENS_14SourceLocationESs` is exported by the cu124 build but missing from the cu126 build: some `nm` outputs are in Dao-AILab/flash-attention#1644

I understand that flash_attention has stopped working with torch 2.7 because of missing symbols. But it was a bit surprising that the exported symbols differ between the cu124 and cu126 builds of the same release...

Also, a question is why torch exported `_ZN3c105ErrorC2ENS_14SourceLocationESs` at all, and why flash_attention depends on it...
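The mangled name itself hints at the [CXX11ABI] angle: the trailing `Ss` is the Itanium-ABI abbreviation for the pre-CXX11 `std::string`, whereas a CXX11-ABI build would mangle the argument as `NSt7__cxx1112basic_string...`. The name can be checked with `c++filt`, or, as a minimal sketch, via libstdc++'s `__cxa_demangle` through ctypes (Linux/libstdc++ assumption; the library lookup and fallback name are assumptions):

```python
import ctypes
import ctypes.util

# Load the C++ runtime, which exports the Itanium-ABI demangler.
# Assumption: Linux with libstdc++ (on macOS this lives in libc++abi instead).
libcxx = ctypes.CDLL(ctypes.util.find_library("stdc++") or "libstdc++.so.6")
libc = ctypes.CDLL(None)

# char* __cxa_demangle(const char* mangled, char* buf, size_t* n, int* status)
libcxx.__cxa_demangle.restype = ctypes.c_void_p  # malloc'd buffer we must free
libcxx.__cxa_demangle.argtypes = [
    ctypes.c_char_p, ctypes.c_char_p,
    ctypes.POINTER(ctypes.c_size_t), ctypes.POINTER(ctypes.c_int),
]
libc.free.argtypes = [ctypes.c_void_p]

def demangle(mangled: str) -> str:
    status = ctypes.c_int()
    buf = libcxx.__cxa_demangle(mangled.encode(), None, None,
                                ctypes.byref(status))
    if status.value != 0 or not buf:
        raise ValueError(f"cannot demangle {mangled!r} (status={status.value})")
    try:
        return ctypes.cast(buf, ctypes.c_char_p).value.decode()
    finally:
        libc.free(buf)  # __cxa_demangle mallocs the result when buf is NULL

print(demangle("_ZN3c105ErrorC2ENS_14SourceLocationESs"))
```

This prints the `c10::Error` base-object constructor (`C2`) taking a `c10::SourceLocation` and a `std::string`, i.e. exactly the kind of symbol whose presence depends on which ABI variant the wheel was built with.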


Versions

torch 2.6.0-cu126 and cu124

cc @ezyang @gchanan @zou3519 @kadeng @msaroufim @seemethere @malfet @osalpekar @atalman @ptrblck @eqy @jerryzh168


Labels

- module: binaries (Anything related to official binaaries that we release to users)
- module: cuda (Related to torch.cuda, and CUDA support in general)
- needs reproduction (Someone else needs to try reproducing the issue given the instructions. No action needed from user)
- triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
