Hi!
The current requirements.txt pins fla==0.1 and transformers==4.40, but fla==0.1 requires transformers>=4.45, as stated here:
https://pypi.org/project/flash-linear-attention/0.1/
This prevents me from building the project: if I bump to transformers 4.45 instead, further dependencies also need to be updated, and overall it leads to an invalid combination of versions.
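For reference, a minimal sketch of the conflicting pins (assuming fla in requirements.txt refers to the flash-linear-attention package linked above; the comments are my reading of the conflict, not the file's actual contents):

```
# requirements.txt (relevant pins, as described in this issue)
fla==0.1            # flash-linear-attention 0.1, which itself declares transformers>=4.45
transformers==4.40  # too old for fla==0.1, so no resolver can satisfy both pins
```

Running `pip install -r requirements.txt` against pins like these should fail dependency resolution, since transformers cannot simultaneously be ==4.40 and >=4.45.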