Dao-AILab/flash-attention
Fast and memory-efficient exact attention
★ 23,054 · Python
Branch: main · 23 branches · 102 tags · 1,431 commits
Latest commit: 29e40cf "Add to varlen (#2346)" by Driss Guessous, 13h ago
Directories:
.github/
AI/
assets/
benchmarks/
csrc/
examples/
flash_attn/
hopper/
tests/
third_party/
tools/
training/

Files:
.gitignore (383 B)
.gitmodules (334 B)
.pre-commit-config.yaml (476 B)
AUTHORS (29 B)
CLAUDE.md (6.5 KB)
LICENSE (1.5 KB)
Makefile (126 B)
MANIFEST.in (343 B)
README.md (23.9 KB)
setup.py (29.6 KB)
usage.md (6.8 KB)