Comment out Flash-DMA banner in README and README_zh
Commented out the top banner image markdown in both the English and Chinese README files so the banner no longer renders (e.g., when the asset is missing or its display is undesired).
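A minimal sketch of the change, assuming a hypothetical banner path and alt text (the actual README content is not shown here): wrapping the image markdown in an HTML comment prevents GitHub from rendering it while keeping it easy to restore later.

```markdown
<!-- Banner temporarily disabled; path and alt text below are illustrative.
![Flash-DMA](assets/flash_dma_banner.png)
-->
```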
Flash-DMA is a high-performance attention implementation that integrates Flash Attention's memory efficiency with Dynamic Mask Attention's sparse computation capabilities for processing extremely long sequences in transformer models.