Tags: Dao-AILab/flash-attention

v2.7.0.post2: [CI] Pytorch 2.5.1 does not support python 3.8
v2.7.0: Bump to v2.7.0
v2.7.0.post1: [CI] Switch back to CUDA 12.4
v2.6.3: Bump to v2.6.3
v2.6.2: Bump to v2.6.2
v2.6.1: Bump to v2.6.1
v2.6.0: Bump v2.6.0
v2.6.0.post1: [CI] Compile with pytorch 2.4.0.dev20240514
v2.5.9: Bump to 2.5.9
v2.5.9.post1: Limit to MAX_JOBS=1 with CUDA 12.2