BerriAI / litellm

Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]


fix(ci): add missing provider docs, fix deprecated model refs in cost tests

- Add black_forest_labs and charity_engine to provider_endpoints_support.json
  (fixes check_code_and_doc_quality job)
- Replace o1-mini with o1 in test_reasoning_tokens_no_price_set (model removed
  from cost map)
- Replace gemini-2.5-pro-exp-03-25 with gemini-2.5-pro in
  test_generic_cost_per_token_above_200k_tokens (model removed from cost map)
- Fix test_get_cost_for_anthropic_web_search to use claude-3-7-sonnet-20250219
  with custom_llm_provider='anthropic' so web search cost is computed correctly
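Two of the bullets above replace model names that were removed from the cost map. A minimal sketch of why those tests broke — the cost map below is a hypothetical miniature (real litellm reads prices from its model cost map JSON; the per-token prices here are placeholders, only the surviving model names come from the commit):

```python
# Hypothetical miniature of a model cost map; prices are placeholders.
COST_MAP = {
    "o1": {"input_cost_per_token": 1e-5, "output_cost_per_token": 4e-5},
    "gemini-2.5-pro": {"input_cost_per_token": 1e-6, "output_cost_per_token": 5e-6},
}

def lookup_cost(model: str) -> dict:
    """Fail loudly when a test references a model no longer in the cost map."""
    try:
        return COST_MAP[model]
    except KeyError:
        raise KeyError(f"model '{model}' was removed from the cost map") from None
```

A test still pinned to a removed name (e.g. `o1-mini` or `gemini-2.5-pro-exp-03-25`) raises `KeyError` on lookup, which is why the commit swaps in the model names that remain in the map.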

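The last bullet fixes a test where web search cost is only computed when the provider is set explicitly. A sketch of that provider-gated surcharge pattern, with hypothetical helper and price (not litellm's actual implementation):

```python
# Hypothetical per-request web-search surcharge table; the price is a placeholder.
WEB_SEARCH_COST_PER_REQUEST = {"anthropic": 0.01}

def request_cost(base_cost: float, custom_llm_provider: str,
                 web_search_requests: int = 0) -> float:
    """Add the web-search surcharge only for providers that define one."""
    fee = WEB_SEARCH_COST_PER_REQUEST.get(custom_llm_provider, 0.0)
    return base_cost + fee * web_search_requests
```

Without `custom_llm_provider='anthropic'`, the surcharge lookup misses and the computed cost silently omits web search — the failure mode the test fix addresses.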
Co-authored-by: yuneng-jiang <yuneng-jiang@users.noreply.github.com>
Cursor Agent committed
aacc7b18f865e80722f0eb3c8c06f3e83ddf34c3
Parent: 5d7bfe6