Tags: leizhao1234/flash-attention

v2.0.0

FlashAttention-2 release

v1.0.9

Bump to v1.0.9

v1.0.8

Bump to v1.0.8

v1.0.7

Bump version to 1.0.7

v1.0.6

Merge pull request Dao-AILab#243 from ksivaman/bump_version_to_v1_0_6

bump to v1.0.6

v1.0.5

Add ninja to pyproject.toml build-system, bump to v1.0.5

v1.0.4

[Docs] Clearer error message for bwd d > 64, bump to v1.0.4

v1.0.3.post0

Bump version to v1.0.3.post0

v1.0.3

Bump version to 1.0.3

v1.0.2

Bump to v1.0.2