BerriAI/litellm
litellm
Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
Builder: BerriAI (individual)
Stars: 42,004 (upstream count)
Forks: 6,952 (upstream count)
Open Issues: 0
Activity Score: 0/100 (1482 commits in 30d)
Created: Jul 27, 2023
README Summary
LiteLLM is a Python SDK and proxy server that provides a unified interface to call 100+ different LLM APIs using OpenAI format. It includes features like cost tracking, guardrails, load balancing, and comprehensive logging across major AI providers including Bedrock, Azure, OpenAI, VertexAI, Cohere, and Anthropic.
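The summary above describes a single OpenAI-format entrypoint that routes to many providers. A minimal sketch of what that looks like from the SDK side, assuming `litellm` is installed and a provider API key (e.g. `OPENAI_API_KEY`) is exported; the model name and the `ask` helper are illustrative, not from this page:

```python
# Minimal sketch of LiteLLM's unified interface (assumes `litellm` is
# installed; model names are examples, not prescribed by this page).
import os

# OpenAI-format chat messages: the same payload shape is reused for every
# provider LiteLLM routes to (OpenAI, Anthropic, Bedrock, VertexAI, ...).
messages = [{"role": "user", "content": "Summarize LiteLLM in one sentence."}]

def ask(model: str) -> str:
    """Send the same OpenAI-format request to any supported model."""
    from litellm import completion  # deferred import so the sketch loads without the package
    response = completion(model=model, messages=messages)
    return response.choices[0].message.content

# Guarded live call: LiteLLM infers the provider from the model string.
if os.environ.get("OPENAI_API_KEY"):
    print(ask("gpt-4o-mini"))
```

Swapping the model string (e.g. to an Anthropic or Bedrock model identifier) is the only change needed to target a different provider, which is the "unified interface" claim in practice.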
AI Dev Skills: Unmapped
Recent Activity
Updated 1 month ago
7 Days: 90 commits
30 Days: 1482 commits
90 Days: 7817 commits
Quality
- Quality: high
- Maturity: production
Timeline
- Project created: Jul 27, 2023
- Forked: Mar 13, 2026
- Your last push: 1 month ago
- Upstream last push: 6 days ago
- Tracked since: Mar 13, 2026