COMMITS · llama_cpp/llama.py
All commits authored by Andrei Betlen.

April 4, 2023
- Update to more sensible return signature

April 2, 2023
- Bugfix: Stop sequences and missing max_tokens check
- Move workaround to new sample method
- Update api to allow for easier interactive mode

April 1, 2023
- Fix example documentation
- Add documentation for generate method
- Add static methods for beginning and end of sequence tokens.
- Update high-level api

March 28, 2023
- Add support to get embeddings from high-level api. Closes #4
- Add support for stream parameter. Closes #1
- Extract generate method
- Refactor Llama class and add tokenize / detokenize methods. Closes #3

March 25, 2023
- Update Llama to add params
- Update docstring

March 24, 2023
- Add mkdocs
- Handle errors returned by llama.cpp
- Black formatting
- Use n_ctx provided from actual context not params
- Black formatting
- Implement prompt batch processing as in main.cpp
- Remove model_name param

March 23, 2023
- Bugfix: avoid decoding partial utf-8 characters
- Add support for logprobs
- Initial commit