
[Bug]: Add corresponding CI to cover the CUDA + FlashAttention tricky bug #19355

Open

Description

@houseroad

Your current environment

This is a follow-up of #19321.

We should introduce an appropriate test so that this logic is covered by CI.

🐛 Describe the bug

See some repro shapes in https://gist.github.com/yinghai/4d72cb67a056a033a7a86e59d8051d90
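A minimal sketch of what such a CI test could look like: a shape-parametrized check that compares an attention backend's output against a reference implementation. The shapes, function names, and tolerance below are placeholders (the actual repro shapes live in the gist above, and the real test would exercise vLLM's FlashAttention path on CUDA rather than this NumPy stand-in).

```python
# Hypothetical sketch, not vLLM's actual test harness.
# A real CI test would run the FlashAttention CUDA backend over the
# repro shapes from the gist and compare against a reference kernel.
import numpy as np

def naive_attention(q, k, v):
    """Reference attention: softmax(q @ k^T / sqrt(d)) @ v."""
    d = q.shape[-1]
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Placeholder (batch, seq_len, head_dim) shapes -- the real list
# would be the tricky shapes from the gist (e.g. odd seq lengths).
SHAPES = [(1, 17, 64), (2, 1, 128), (4, 33, 64)]

def check_shape(batch, seq_len, head_dim, rng):
    q = rng.standard_normal((batch, seq_len, head_dim))
    k = rng.standard_normal((batch, seq_len, head_dim))
    v = rng.standard_normal((batch, seq_len, head_dim))
    out = naive_attention(q, k, v)
    # Output shape must match the query shape.
    assert out.shape == (batch, seq_len, head_dim)
    # Attention rows sum to 1, so each output element is a convex
    # combination of v along seq and is bounded by max |v|.
    assert np.all(np.abs(out) <= np.abs(v).max() + 1e-6)

rng = np.random.default_rng(0)
for shape in SHAPES:
    check_shape(*shape, rng)
```

In a real pytest suite each shape would be a separate `@pytest.mark.parametrize` case, so a failure reports exactly which shape triggered the bug.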

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.

Metadata

Labels

bug (Something isn't working)
