Commit 89241a6 (parent: c49dead)

Comment out Flash-DMA banner in README and README_zh

Commented out the top banner image markdown in both the English and Chinese README files so the banner no longer renders (for example, when the asset is missing or its display is unwanted).

2 files changed: +2 −2 lines

README.md (1 addition, 1 deletion)

@@ -10,7 +10,7 @@
 </div>
 
 
-![Flash-DMA Banner](assets/flash_dmattn_banner.png)
+<!-- ![Flash-DMA Banner](assets/flash_dmattn_banner.png) -->
 
 Flash-DMA is a high-performance attention implementation that integrates Flash Attention's memory efficiency with Dynamic Mask Attention's sparse computation capabilities for processing extremely long sequences in transformer models.
 
README_zh.md (1 addition, 1 deletion)

@@ -10,7 +10,7 @@
 </div>
 
 
-![Flash-DMA Banner](assets/flash_dmattn_banner.png)
+<!-- ![Flash-DMA Banner](assets/flash_dmattn_banner.png) -->
 
 Flash-DMA 是一个高性能的注意力实现,将 Flash Attention 的内存效率与动态掩码注意力的稀疏计算能力相结合,用于在 Transformer 模型中处理超长序列。
 

(Translation of the context line: Flash-DMA is a high-performance attention implementation that combines Flash Attention's memory efficiency with Dynamic Mask Attention's sparse computation to handle extremely long sequences in Transformer models.)
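For context on the tagline quoted in both diffs, here is a minimal PyTorch sketch of the masking semantics it describes: attention where a dynamically computed mask limits which keys each query attends to. This is not the flash_dmattn API; all names, shapes, and the top-k mask heuristic are illustrative assumptions, and the real kernel would fuse the mask into Flash Attention style tiling rather than materialize full score and mask matrices as done here.

import torch
import torch.nn.functional as F

def dynamic_mask_attention(q, k, v, mask):
    # q, k, v: (batch, heads, seq, dim); mask: (batch, heads, seq, seq), True = may attend.
    scale = q.size(-1) ** -0.5
    scores = torch.matmul(q, k.transpose(-2, -1)) * scale
    # Masked-out positions get -inf so softmax assigns them zero weight.
    scores = scores.masked_fill(~mask, float("-inf"))
    attn = F.softmax(scores, dim=-1)
    return torch.matmul(attn, v)

# Example: a "dynamic" mask keeping only the top-k most relevant keys per query,
# derived from the scores themselves rather than from a fixed sparsity pattern.
b, h, n, d, topk = 1, 2, 16, 8, 4
q, k, v = (torch.randn(b, h, n, d) for _ in range(3))
idx = torch.matmul(q, k.transpose(-2, -1)).topk(topk, dim=-1).indices
mask = torch.zeros(b, h, n, n, dtype=torch.bool).scatter_(-1, idx, True)
out = dynamic_mask_attention(q, k, v, mask)
print(out.shape)  # torch.Size([1, 2, 16, 8])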