
Releases: OpenProteinAI/flash-attention

v2.6.3-alibi-as-bh-bias-15 (08 Feb 14:39, 4bda2db)
alibi.h: use the non-max-seqlen formula, which seems more correct actua…

v2.6.3-alibi-as-bh-bias-12 (08 Feb 14:23, efbbaf4)
also modify alibi.h

v2.6.3-alibi-as-bh-bias-11 (08 Feb 13:32, f5ce6ee)
follow the original expression more closely

v2.6.3-alibi-as-bh-bias-10 (08 Feb 02:38, a645edd)
diagonal non-causal: try accounting for max_seqlens too

v2.6.3-alibi-as-bh-bias-8 (07 Feb 14:47, a8d7fcc)
redo causal and noncausal alibi for diagonal

v2.6.3-alibi-as-bh-bias-7 (07 Feb 14:30, c42bf6b)
...and only compile for py310 and torch24

v2.6.3-alibi-as-bh-bias-6 (07 Feb 14:13, d2dd0fd)
compile for torch 2.4.1 too

v2.6.3-alibi-as-bh-bias-5 (05 Feb 23:59, f124d98)
fix TORCH_CUDA_VERSION env var for pytorch 2.5

v2.6.3-alibi-as-bh-bias-4 (05 Feb 23:45, 42a895e)
compile for pytorch 2.5.1

v2.6.3-alibi-as-bh-bias-3 (08 Sep 01:39)
fix bh bias diagonal handling
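
Most of the entries above touch the ALiBi bias path (alibi.h): how the bias is anchored on the attention diagonal when query and key lengths differ, and whether the per-sequence lengths or the padded max_seqlens enter that offset. As a point of reference only, the sketch below builds the conventional ALiBi bias in plain PyTorch with the seqlen_k - seqlen_q diagonal offset used by upstream flash-attention; the function name, shapes, and the (batch, nheads) slope layout implied by the "bh bias" naming are assumptions, not code taken from this fork.

```python
import torch

def alibi_bias_reference(slopes: torch.Tensor, seqlen_q: int, seqlen_k: int,
                         causal: bool) -> torch.Tensor:
    """Conventional ALiBi bias of shape (batch, nheads, seqlen_q, seqlen_k).

    slopes: (batch, nheads) per-batch, per-head ALiBi slopes ("bh bias").
    Query row i is aligned with key column i + (seqlen_k - seqlen_q), so the
    bias is zero on that shifted diagonal and grows more negative with
    distance from it.
    """
    row = torch.arange(seqlen_q).view(seqlen_q, 1)
    col = torch.arange(seqlen_k).view(1, seqlen_k)
    # Signed distance of each key column from the bottom-right-aligned diagonal.
    rel = col - (row + seqlen_k - seqlen_q)            # (seqlen_q, seqlen_k)
    slopes = slopes.view(*slopes.shape, 1, 1).float()  # (batch, nheads, 1, 1)
    if causal:
        # On and below the diagonal rel <= 0, so slope * rel is a non-positive
        # penalty; positions above the diagonal are masked out entirely.
        bias = slopes * rel
        bias = bias.masked_fill(rel > 0, float("-inf"))
    else:
        # Non-causal: penalize absolute distance from the diagonal.
        bias = -slopes * rel.abs()
    return bias

# Example: two sequences, four heads, queries shorter than keys.
slopes = torch.rand(2, 4)
bias = alibi_bias_reference(slopes, seqlen_q=3, seqlen_k=5, causal=True)
print(bias.shape)  # torch.Size([2, 4, 3, 5])
```

For comparison, upstream flash-attention's flash_attn_func accepts alibi_slopes of shape (nheads,) or (batch_size, nheads), which matches the per-batch, per-head slope layout these release names refer to. Because softmax is shift-invariant within a row, a fused kernel may add a per-row constant in the causal case rather than materializing exactly this matrix, so agreement should be checked on attention outputs rather than on the bias values themselves.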