BerriAI / litellm

Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, loadbalancing and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]

# Data Privacy and Security
## Security Measures
### LiteLLM Github
- All commits run through GitHub's CodeQL analysis
### Self-hosted LiteLLM Instances
- **No data or telemetry is stored on LiteLLM servers when you self-host**
- For installation and configuration, see the [self-hosting guide](https://docs.litellm.ai/docs/proxy/deploy)
- **Telemetry**: We run no telemetry when you self-host LiteLLM
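For reference, a minimal self-hosted deployment might look like the sketch below. The container image name and config-file shape follow the self-hosting guide linked above, but treat the model names and keys here as illustrative placeholders rather than a canonical setup:

```shell
# Minimal proxy config: route one model name to OpenAI.
# (Sketch only -- see the deploy guide for the full set of options.)
cat > litellm_config.yaml <<'EOF'
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
EOF

# Run the proxy on your own infrastructure; no data leaves your servers.
docker run \
  -v "$(pwd)/litellm_config.yaml:/app/config.yaml" \
  -e OPENAI_API_KEY="$OPENAI_API_KEY" \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml --port 4000
```

Once running, the proxy exposes an OpenAI-compatible endpoint on `http://localhost:4000`, so existing OpenAI SDK clients can point at it by changing only the base URL.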
:::info
✨ SSO is free for up to 5 users. After that, an enterprise license is required. [Get Started with Enterprise here](https://www.litellm.ai/enterprise)
:::
### LiteLLM Cloud
- We encrypt all data stored using your `LITELLM_MASTER_KEY` and in transit using TLS.
- Our database and application run on GCP and AWS infrastructure, partly managed by NeonDB.
- US data region: Northern California (AWS/GCP `us-west-1`) & Virginia (AWS `us-east-1`)
- EU data region: Frankfurt, Germany (AWS/GCP `eu-central-1`)
- All users have access to SSO (Single Sign-On) through OAuth 2.0 with Google, Okta, Microsoft, and Keycloak.
- Audit Logs with retention policy
- Control Allowed IP Addresses that can access your Cloud LiteLLM Instance
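As an illustration, IP restrictions of this kind are typically expressed in the proxy's `general_settings`. The `allowed_ips` key below reflects how the LiteLLM docs describe the feature, but verify the exact key name against the current documentation before relying on it:

```yaml
general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
  # Hypothetical example addresses -- requests from any other IP are rejected
  allowed_ips: ["203.0.113.10", "203.0.113.11"]
```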
For security inquiries, please contact us at support@berri.ai.
#### Supported data regions for LiteLLM Cloud
LiteLLM supports the following data regions:
- US, Northern California (AWS/GCP `us-west-1`)
- Europe, Frankfurt, Germany (AWS/GCP `eu-central-1`)
All data, user accounts, and infrastructure are completely separated between these two regions.
### Security Vulnerability Reporting Guidelines
We value the security community's role in protecting our systems and users. To report a security vulnerability:
- Email support@berri.ai with details
- Include steps to reproduce the issue
- Provide any relevant additional information
We'll review all reports promptly. Note that we don't currently offer a bug bounty program.