Releases · OpenProteinAI/flash-attention
v2.6.3-alibi-as-bh-bias-15
alibi.h: use the non-max-seqlen formula, which seems more correct actua…
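Most of the entries below revolve around where the zero of the ALiBi bias sits when the query and key lengths differ. As a rough reference-only sketch (this is not the fork's actual alibi.h kernel code; the helper name and the PyTorch framing are illustrative assumptions), the plain formula anchors the zero on the top-left diagonal, while the max_seqlen-offset variant anchors it on the bottom-right-aligned diagonal:

```python
# Illustrative reference only -- NOT this fork's alibi.h.
import torch

def alibi_bias(slopes, seqlen_q, seqlen_k, use_max_seqlen_offset=False):
    """slopes: (nheads,) tensor of per-head ALiBi slopes."""
    i = torch.arange(seqlen_q)[:, None]   # query (row) index
    j = torch.arange(seqlen_k)[None, :]   # key (column) index
    if use_max_seqlen_offset:
        # Bottom-right alignment: zero bias where j == i + seqlen_k - seqlen_q
        dist = (j - (i + seqlen_k - seqlen_q)).abs()
    else:
        # Plain formula: zero bias on the top-left diagonal j == i
        dist = (j - i).abs()
    # Result shape: (nheads, seqlen_q, seqlen_k)
    return -slopes[:, None, None] * dist

slopes = torch.tensor([1 / 2**k for k in range(1, 5)])  # e.g. 4 heads
bias = alibi_bias(slopes, seqlen_q=3, seqlen_k=5, use_max_seqlen_offset=True)
```

Per this release note, the fork settled on the non-max-seqlen form as the more correct one.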
v2.6.3-alibi-as-bh-bias-12
also modify alibi.h
v2.6.3-alibi-as-bh-bias-11
follow the original expression more closely
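For reference, the expression in the ALiBi paper (Press et al.) adds a fixed, head-specific linear penalty to each query's causal attention scores; whether that paper formula or upstream flash-attention's alibi.h is the "original expression" meant here is an assumption:

$$
\mathrm{softmax}\!\big(q_i K^\top + m \cdot [-(i-1), \ldots, -2, -1, 0]\big),
\qquad \text{i.e. } \mathrm{bias}(i, j) = -m\,(i - j) \ \text{for } j \le i,
$$

where $m$ is the per-head slope.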
v2.6.3-alibi-as-bh-bias-10
diagonal noncausal: try accounting for max_seqlens too
v2.6.3-alibi-as-bh-bias-8
redo causal and noncausal alibi for diagonal
v2.6.3-alibi-as-bh-bias-7
...and only compile for py310 and torch24
v2.6.3-alibi-as-bh-bias-6
compile for torch 2.4.1 too
v2.6.3-alibi-as-bh-bias-5
fix TORCH_CUDA_VERSION env var for pytorch 2.5
v2.6.3-alibi-as-bh-bias-4
compile for pytorch 2.5.1
v2.6.3-alibi-as-bh-bias-3
fix bh bias diagonal handling
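For context on the "bh bias" naming: the upstream flash-attn Python API accepts `alibi_slopes` either per head or per (batch, head) pair. A minimal usage sketch, with shapes and dtypes following the upstream `flash_attn_func` signature and placeholder random inputs:

```python
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Shape (nheads,): one slope per head, shared across the batch.
slopes_h = torch.rand(nheads, device="cuda", dtype=torch.float32)
# Shape (batch, nheads): a distinct slope per (batch, head) pair -- the "bh" case.
slopes_bh = torch.rand(batch, nheads, device="cuda", dtype=torch.float32)

out = flash_attn_func(q, k, v, causal=True, alibi_slopes=slopes_bh)
```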