vllm-project/flash-attention

Fast and memory-efficient exact attention

Stars: 120
Forks: 134
Open issues: 23
Watchers: 120
Size: 19.7 MB
Language: Python
License: BSD 3-Clause "New" or "Revised" License
Created: Mar 28, 2024
Updated: Apr 11, 2026
Last push: Apr 14, 2026