Releases: lucidrains/memory-efficient-attention-pytorch

0.1.6

18 Jul 02:41
further simplification

0.1.5

18 Jul 00:52
add new trick from flash attention 2 that saves on division
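For context: the FlashAttention-2 change referenced here keeps the blockwise output accumulator unnormalized and divides by the softmax row sum only once, after the last key/value block, instead of renormalizing on every iteration. Below is a minimal single-head sketch of that idea, assuming `(seq_len, dim)` tensors; it is illustrative only, not this repository's implementation.

```python
import torch

def blockwise_attention(q, k, v, block_size = 64):
    # sketch of the FlashAttention-2 deferred-normalization trick:
    # accumulate an unnormalized output plus a running row sum,
    # dividing once at the end rather than once per block
    scale = q.shape[-1] ** -0.5
    q = q * scale

    out = torch.zeros_like(q)  # unnormalized accumulator
    row_max = torch.full((q.shape[0], 1), float('-inf'), device = q.device, dtype = q.dtype)
    row_sum = torch.zeros((q.shape[0], 1), device = q.device, dtype = q.dtype)

    for k_blk, v_blk in zip(k.split(block_size), v.split(block_size)):
        scores = q @ k_blk.t()

        new_max = torch.maximum(row_max, scores.amax(dim = -1, keepdim = True))
        exp_scores = (scores - new_max).exp()
        correction = (row_max - new_max).exp()  # rescale contribution of previous blocks

        out = out * correction + exp_scores @ v_blk
        row_sum = row_sum * correction + exp_scores.sum(dim = -1, keepdim = True)
        row_max = new_max

    return out / row_sum  # the single division, saved from the inner loop
```

The result should match `(q @ k.t() * q.shape[-1] ** -0.5).softmax(dim = -1) @ v` up to floating point error, while the inner loop itself performs no divisions.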

0.1.4

16 Jul 16:08
bump version

0.1.2

05 Mar 17:52
0.1.2

0.1.1

30 Dec 18:56
fix tests

0.1.0

30 Dec 18:31
d76fba8
0.1.0

0.0.27

01 Nov 18:02
bring in the further simplification to flash attention that @tridao d…

0.0.26

23 Jul 21:55
fix cosine sim flash attention as well
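For readers unfamiliar with the variant: cosine-sim attention l2-normalizes queries and keys before the dot product, so each attention logit is bounded by a fixed scale, which simplifies the numerics of the flash-style blockwise pass. A minimal non-flash sketch of the scoring, with an illustrative `scale = 8` (not necessarily this repository's default):

```python
import torch
import torch.nn.functional as F

def cosine_sim_attention(q, k, v, scale = 8):
    # l2-normalize q and k so every logit lies in [-scale, scale]
    q, k = map(lambda t: F.normalize(t, dim = -1), (q, k))
    sim = (q @ k.transpose(-2, -1)) * scale
    return sim.softmax(dim = -1) @ v
```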

0.0.25

23 Jul 21:45
test out flash attention in GPT

0.0.24

23 Jul 21:44
test out flash attention in GPT