LiteLLM Proxy Projects

Technology

LiteLLM Proxy

A high-performance middleware that unifies 100+ LLM APIs into a single OpenAI-compatible endpoint.

LiteLLM Proxy acts as a centralized gateway for managing diverse AI models (including GPT-4, Claude 3.5, and Llama 3) through one standardized interface. It handles the heavy lifting of production deployments: automatic retries, load balancing across multiple keys, and granular spend tracking for over 100 providers. By mapping custom headers to specific model groups, teams can switch backend providers (like moving from Azure to Bedrock) without changing a single line of application code. The proxy also integrates directly with Prometheus for monitoring and provides built-in virtual keys to enforce per-user rate limits and budgets.
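The provider-switching described above is driven by the proxy's model list, which maps a stable public model name to provider-specific parameters. As a rough sketch (the deployment names, endpoints, and environment variable names here are placeholders, not values from this page), a config routing the same `gpt-4` alias to either Azure or Bedrock might look like:

```yaml
# Sketch of a LiteLLM Proxy config.yaml — deployment names,
# endpoints, and env var names below are illustrative placeholders.
model_list:
  - model_name: gpt-4                     # name the application requests
    litellm_params:
      model: azure/my-gpt4-deployment     # hypothetical Azure deployment
      api_base: https://example.openai.azure.com
      api_key: os.environ/AZURE_API_KEY   # read from environment
  - model_name: gpt-4                     # same alias, alternate backend
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
      aws_region_name: us-east-1

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY  # used to mint virtual keys
```

Because the application only ever references the `model_name` alias, swapping or load-balancing backends is a config change on the proxy, not a code change in the client.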

https://github.com/BerriAI/litellm



No public projects found for this technology yet.