abetlen/llama-cpp-python
Python bindings for llama.cpp
Python
feat: Add option to enable `flash_attn` to Llama params and ModelSettings
Andrei Betlen committed 1y ago
22d77eefd2edaf0148f53374d0cac74d0e25d06e
Parent: 8c2b24d