COMMITS
May 14, 2023
Bump version
Andrei Betlen committed
Update llama.cpp
Andrei Betlen committed
May 12, 2023
Allow model to tokenize strings longer than context length and set add_bos. Closes #92
Andrei Betlen committed
Only support generating one prompt at a time.
Andrei Betlen committed
Revert "llama_cpp server: prompt is a string". Closes #187
Andrei Betlen committed
Fix docker command
Andrei Betlen committed
Merge branch 'main' of github.com:abetlen/llama_cpp_python into main
Andrei Betlen committed
Bump version
Andrei Betlen committed
Add missing tfs_z parameter
Andrei Betlen committed
Update llama.cpp
Andrei Betlen committed
May 11, 2023
Update llama.cpp
Andrei Betlen committed
May 10, 2023
Bugfix: Ensure logs are printed when streaming
Andrei Betlen committed
Merge pull request #177 from joelkurian/main
Andrei committed
May 9, 2023
llama_cpp server: document presence_penalty and frequency_penalty, mark as supported
Lucas Doyle committed
Updated installation instructions for BLAS backends
Joel Kurian committed
Implement sampling as in llama.cpp main example
Andrei Betlen committed
May 8, 2023
Merge branch 'main' of github.com:abetlen/llama_cpp_python into Maximilian-Winter/main
Andrei Betlen committed
Revert changes to llama.cpp and setup.py
Andrei Betlen committed
Merge pull request #126 from Stonelinks/deprecate-example-server
Andrei committed
Merge branch 'main' of github.com:abetlen/llama_cpp_python into main
Andrei Betlen committed
Bump version
Andrei Betlen committed
Fix: default repeat_penalty
Andrei Betlen committed
Bump mkdocs-material from 9.1.9 to 9.1.11
dependabot[bot] committed
Merge pull request #153 from SagsMug/main
Andrei committed
Bump version
Andrei Betlen committed
Bugfix: not falling back to environment variables when default value is set.
Andrei Betlen committed
Bump version
Andrei Betlen committed
Show default value when --help is called
Andrei Betlen committed