COMMITS
February 14, 2024
- Bump version (Andrei Betlen)
- Merge branch 'main' of https://github.com/abetlen/llama-cpp-python into main (Andrei Betlen)
- Update llama.cpp (Andrei Betlen)
- feat: Support batch embeddings (#1186) (Douglas Hanley)
- misc: fix makefile build commands (Andrei Betlen)
- Update llama.cpp (Andrei Betlen)
- fix: destructor exception where internal classes are missing some uninitialized attributes (Andrei Betlen)
- fix: Update openbuddy prompt format. Closes #1155 (Andrei Betlen)
- Update llama.cpp (Andrei Betlen)
- fix: submodule kompute is not included in sdist. Closes #1165 (Andrei Betlen)
- fix: more chatml-function-calling fixes (Andrei Betlen)
February 13, 2024
- Bump version (Andrei Betlen)
- fix: sample idx off-by-one error for logit_processors (#1179) (Andrew Lapp)
- Update llama.cpp (Andrei Betlen)
- fix: missing generation_prompt in chatml-function-calling (Andrei Betlen)
- fix: minor formatting bugs for chatml-function-calling (Andrei Betlen)
- Bump version (Andrei Betlen)
- Update llama.cpp (Andrei Betlen)
- docs: Fix typo (Andrei Betlen)
February 12, 2024
- Bump version (Andrei Betlen)
- docs: Temporarily revert function calling docs (Andrei Betlen)
- fix: Always set logits_all = True when using speculative decoding (Andrei Betlen)
- feat: Generic chatml Function Calling (#957) (Andrei)
- Update llama.cpp (Andrei Betlen)
February 11, 2024
- Update llama.cpp (Andrei Betlen)
- fix: Circular dependancy preventing early Llama object free (#1176) (Connor)
- docs: Set the correct command for compiling with syscl support (#1172) (Akarshan Biswas)
- feat: use gpu backend for clip if available (#1175) (Douglas Hanley)
February 9, 2024
- Update llama.cpp (Andrei Betlen)