Releases: Dao-AILab/flash-attention

v2.7.0.post2 (13 Nov 04:02)
[CI] PyTorch 2.5.1 does not support Python 3.8

v2.7.0.post1 (12 Nov 22:29)
[CI] Switch back to CUDA 12.4

v2.7.0 (12 Nov 22:12)
Bump to v2.7.0

v2.6.3 (25 Jul 08:33)
Bump to v2.6.3

v2.6.2 (23 Jul 09:30)
Bump to v2.6.2

v2.6.1 (11 Jul 15:29)
Bump to v2.6.1

v2.6.0.post1 (11 Jul 09:55)
[CI] Compile with PyTorch 2.4.0.dev20240514

v2.6.0 (11 Jul 04:35)
Bump to v2.6.0

v2.5.9.post1 (26 May 22:36)
Limit to MAX_JOBS=1 with CUDA 12.2
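As background (not part of the release note itself): MAX_JOBS is the environment variable the PyTorch extension build reads to cap the number of parallel compile jobs, which keeps memory use down during compilation. A minimal sketch of a from-source install with that cap, assuming pip and the flash-attn source package are available:

    # Hypothetical helper: install flash-attn from source with compile
    # parallelism capped at one job, mirroring this release's CI setting.
    import os
    import subprocess
    import sys

    env = dict(os.environ, MAX_JOBS="1")  # cap parallel compile jobs
    subprocess.run(
        [sys.executable, "-m", "pip", "install",
         "flash-attn", "--no-build-isolation"],
        env=env,
        check=True,
    )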

v2.5.9 (26 May 21:02)
Bump to v2.5.9