LiteLLM is a leading open-source LLM gateway and proxy server that provides a unified API interface for 100+ large language model providers. With over 38,000 GitHub stars and adoption by companies like Netflix, LiteLLM simplifies LLM integration by standardizing API calls across OpenAI, Anthropic, Azure, Google, AWS Bedrock, and many more providers.
License: MIT (dual-licensed with LiteLLM Commercial License for enterprise features)
GitHub: BerriAI/litellm
- 100+ LLM Providers - Unified API for OpenAI, Anthropic, Azure, Google, AWS Bedrock, and more
- Proxy Server - Centralized LLM gateway with authentication, rate limiting, and budgeting
- Cost Tracking - Track token usage and costs across all providers
- Caching - Redis caching for reduced latency and costs
- Observability - Logging, tracing, and analytics with Langfuse, Helicone, and more
- Enterprise Features - SSO, custom SLAs, dedicated support (commercial license)
- Low Latency - P95 latency of 8 ms at 1k RPS
- Security - API key management, request validation, and audit logs
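As a hedged sketch of how several of these features fit together, a minimal proxy `config.yaml` along these lines enables model routing, a master key, budgets, and Redis caching (the values are placeholders to adapt):

```yaml
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  cache: true               # Redis-backed response caching
  cache_params:
    type: redis
    host: os.environ/REDIS_HOST

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY  # admin key for the gateway
  max_budget: 100           # example spend cap (USD)
  budget_duration: 30d      # budget reset window
```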
- Multi-Provider LLM Applications - Abstract away provider-specific APIs
- Cost Optimization - Route requests to cheapest available provider
- Enterprise LLM Gateway - Centralized access control and monitoring
- LLM Fallback - Automatic failover between providers
- Usage Analytics - Track and analyze LLM usage across teams
| Component | Technology |
|---|---|
| Language | Python (82.4%), TypeScript (15.9%) |
| Package Manager | pip, Poetry |
| Deployment | Docker, Docker Compose, Kubernetes, Helm |
| Caching | Redis |
| Database | PostgreSQL, MySQL, SQLite |
| Docker Image | docker.litellm.ai/berriai/litellm:main-stable |
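As a hedged deployment sketch using the image above, a minimal `docker-compose.yml` could look like the following; the port, environment variable names, and Postgres wiring are assumptions to adapt:

```yaml
services:
  litellm:
    image: docker.litellm.ai/berriai/litellm:main-stable
    ports:
      - "4000:4000"   # default proxy port
    environment:
      LITELLM_MASTER_KEY: ${LITELLM_MASTER_KEY}  # admin key for the gateway
      DATABASE_URL: postgresql://llmproxy:${PG_PASSWORD}@db:5432/litellm
    depends_on:
      - db

  db:
    image: postgres:16
    environment:
      POSTGRES_USER: llmproxy
      POSTGRES_PASSWORD: ${PG_PASSWORD}
      POSTGRES_DB: litellm
```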
| Component | Minimum | Recommended (Production) |
|---|---|---|
| OS | Linux, macOS, Windows | Linux (Ubuntu 20.04+) |
| Python | 3.9+ | 3.11+ |
| CPU | 1 core | 4+ cores |
| RAM | 512 MB | 8+ GB |
| Disk | 1 GB | 10+ GB |
- ✅ Open-source (MIT License)
- ✅ Active development (v1.82.0 - March 1, 2026)
- ✅ 38.2k+ GitHub stars, 6.3k forks
- ✅ 1,238+ releases, 34,676+ commits
- ✅ PyPI package: `litellm`
- ✅ Official Docker images available
- ✅ Used by 18.6k+ repositories
- ✅ Helm chart available (BETA)