lzyhha committed on
Commit 175c8d0 · 1 Parent(s): 161ae6e

flashattention

Files changed (1):
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -13,4 +13,4 @@ numba
 scipy
 tqdm
 einops
-flash-attn
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.2.post1/flash_attn-2.7.2.post1+cu11torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
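The change swaps the bare `flash-attn` requirement for a direct wheel URL, which pip accepts in a requirements file; installing the prebuilt binary avoids compiling flash-attn from source. The wheel's filename encodes its compatibility constraints per the PEP 427 naming convention (`{distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl`): here CPython 3.10 (`cp310`), CUDA 11 and PyTorch 2.1 (from the `+cu11torch2.1…` local version segment), and x86-64 Linux. A minimal stdlib-only sketch of splitting those fields apart (simplified: it assumes the five-field form with no optional build tag, and the helper name is my own, not part of pip):

```python
def split_wheel_filename(filename: str) -> dict:
    """Split a PEP 427 wheel filename into its fields.

    Simplified sketch: assumes the five-field form
    {distribution}-{version}-{python}-{abi}-{platform}.whl
    with no optional build tag.
    """
    stem = filename.removesuffix(".whl")
    # Tags never contain "-", so split off the last three fields from the right;
    # whatever remains is distribution name + version.
    name, version, python_tag, abi_tag, platform_tag = stem.rsplit("-", 4)
    return {
        "name": name,
        "version": version,        # may carry a local segment after "+"
        "python": python_tag,      # e.g. cp310 -> CPython 3.10
        "abi": abi_tag,
        "platform": platform_tag,  # e.g. linux_x86_64
    }

info = split_wheel_filename(
    "flash_attn-2.7.2.post1+cu11torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"
)
```

Pinning the exact wheel this way only works for environments matching all three tags; installs on another Python version, CUDA/PyTorch build, or platform would need a different wheel (or a source build).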