forked from Dao-AILab/flash-attention
Insights: ROCm/flash-attention
Overview
- 0 Active issues
- 1 Merged pull request
- 0 Open pull requests
- 0 Closed issues
- 0 New issues
1 Release published by 1 person
- v2.7.3-cktile, published Jan 13, 2025
1 Pull request merged by 1 person
- update CK (#118), merged Jan 8, 2025
4 Unresolved conversations
Conversations sometimes continue on older items that are not yet closed. Below is a list of all Issues and Pull Requests with unresolved conversations.
- Add `fp8` support to `fwd_prefill` kernel (#115), commented on Jan 10, 2025 • 14 new comments
- MI100 Support (#24), commented on Jan 9, 2025 • 0 new comments
- [Issue]: gfx1100 is invalid or not supported by Flash-Attention (#93), commented on Jan 10, 2025 • 0 new comments
- fp8 (#116), commented on Jan 13, 2025 • 0 new comments