Releases: lucidrains/memory-efficient-attention-pytorch
0.1.6
0.1.5
add new trick from flash attention 2 that saves on division
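The trick referenced here is, to my understanding, the Flash Attention 2 change of keeping the output accumulator unnormalized across key/value blocks and dividing by the softmax denominator only once at the end, rather than renormalizing every block. A minimal single-query sketch in pure Python (names and block size are illustrative, not the repo's API):

```python
import math

def attend_deferred_div(q, k, v, block_size=2):
    """Online-softmax attention for one query row over blocked keys/values.

    Keeps a running max `m`, running denominator `l`, and an UNNORMALIZED
    accumulator `acc`; the division by `l` happens once at the end
    (the Flash Attention 2 trick), not once per block.
    q: list of d floats; k, v: lists of n rows of d floats each.
    """
    d = len(q)
    scale = d ** -0.5
    m = -math.inf            # running max of attention scores
    l = 0.0                  # running softmax denominator
    acc = [0.0] * d          # unnormalized weighted sum of values
    for start in range(0, len(k), block_size):
        kb = k[start:start + block_size]
        vb = v[start:start + block_size]
        s = [scale * sum(qi * ki for qi, ki in zip(q, kr)) for kr in kb]
        m_new = max(m, max(s))
        corr = math.exp(m - m_new)            # rescale previous partials
        p = [math.exp(si - m_new) for si in s]
        l = l * corr + sum(p)
        acc = [a * corr + sum(pi * vr[j] for pi, vr in zip(p, vb))
               for j, a in enumerate(acc)]
        m = m_new
    return [a / l for a in acc]               # single division at the end
```

This matches plain softmax attention exactly; the saving is that the per-block work is two multiplies and adds on the accumulator instead of a division.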
0.1.4
bump version
0.1.2
0.1.1
fix tests
0.1.0
0.0.27
bring in the further simplification to flash attention that @tridao d…
0.0.26
fix cosine sim flash attention as well
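Cosine-sim attention replaces the usual `1/sqrt(d)` scaled dot product with cosine similarity: query and key rows are l2-normalized so scores are bounded in `[-1, 1]`, then multiplied by a fixed scale. A hedged sketch of the score computation (the `scale=10.0` default is an assumption for illustration; the repo's actual scale is configurable):

```python
import math

def cosine_sim_scores(q, k, scale=10.0):
    """Attention scores as scaled cosine similarity for one query row.

    l2-normalizing q and each key row bounds each raw score in [-1, 1],
    which keeps the softmax input range fixed regardless of head dim.
    q: list of d floats; k: list of n rows of d floats.
    """
    def l2norm(x):
        n = math.sqrt(sum(xi * xi for xi in x)) + 1e-12  # eps avoids div by zero
        return [xi / n for xi in x]

    qn = l2norm(q)
    return [scale * sum(a * b for a, b in zip(qn, l2norm(kr))) for kr in k]
```

Because every score is bounded by `scale` in absolute value, the running-max bookkeeping in the flash-attention inner loop stays numerically tame.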
0.0.25
test out flash attention in GPT
0.0.24
test out flash attention in GPT