Commits: llama_cpp/llama.py (branch: batch-processing)
January 18, 2024
January 17, 2024
January 15, 2024
January 10, 2024
  Use sampling context (Andrei Betlen)
January 5, 2024
  Fix #1038 (Andrei Betlen)
December 22, 2023
December 18, 2023
December 16, 2023
December 12, 2023
December 11, 2023
  Remove f16_kv (Andrei Betlen)
November 30, 2023
  Refactor _create_completion (Andrei Betlen)
November 29, 2023
  Fix #891 (#952) (kddubey)
November 28, 2023
  Add sampling context (Andrei Betlen)
November 26, 2023
  docs: Update Llama docs (Andrei Betlen)
November 24, 2023
November 23, 2023
  docs: Add Llama class example (Andrei Betlen)
November 21, 2023
  Format (Andrei Betlen)
  Added support for min_p (#921) (TK-Master)
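The min_p commit above (#921) adds min-p sampling: a token survives only if its probability is at least `min_p` times the probability of the most likely token. As an illustration only, here is a minimal pure-Python sketch of that filtering rule; the function name and shape are hypothetical and this is not the library's actual implementation:

```python
import math

def min_p_filter(logits, min_p=0.05):
    """Sketch of the min-p rule: keep token indices whose probability is
    at least min_p * (probability of the top token).
    Hypothetical helper, not llama-cpp-python's real code."""
    # Numerically stable softmax over the logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Threshold scales with the top token's probability, so the filter
    # adapts to how peaked the distribution is.
    threshold = min_p * max(probs)
    return [i for i, p in enumerate(probs) if p >= threshold]
```

With a peaked distribution, only tokens close to the top token survive; with a flat distribution, everything survives, which is the main appeal of min-p over a fixed top-p cutoff.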
  Fix #929 (Andrei Betlen)
November 10, 2023
November 8, 2023
November 6, 2023
  Fix type bug (Andrei Betlen)
  Refactor Llama class internals (Andrei Betlen)
November 3, 2023
  Clean up stdout / stderr suppression (Andrei Betlen)