Tags: flash-algo/flash-sparse-attention

v1.2.3

Fix documentation and references for Flash Sparse Attention

v1.2.2

Bump package version to 1.2.2

v1.2.1

Bump version to 1.2.1

v1.2.0

Bump version to 1.2.0

v1.1.9

Bump version to 1.1.9

v1.1.8

Merge pull request #176 from SmallDoges:auto-workflow

Bump version to 1.1.8

v1.1.7

Merge pull request #175 from SmallDoges:auto-workflow

Increase GitHub Actions build timeout to 6 hours

v1.1.6

Merge pull request #174 from SmallDoges:auto-workflow

Remove CUDA architecture '120' for compatibility

v1.1.5

Merge pull request #173 from SmallDoges:auto-workflow

Expand build matrix for ARM64 and additional CUDA architectures

v1.1.4

Merge pull request #172 from SmallDoges/auto-workflow

Refine build matrix and CUDA architecture specifications