# litellm-ai-gateway
Here are 3 public repositories matching this topic...
Connect any LLM-powered client app, such as a coding agent, to any supported inference backend/model.
Topics: proxy gemini-api openai-api llm claude-ai claude-api openrouter llm-proxy litellm deepseek google-vertex-api gemini-cli anthropic-api llm-agentic-ai grok-api qwen-coder claude-code qwen3 litellm-ai-gateway claude-proxy
Updated Sep 13, 2025 - Python
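Because gateways like the one above expose an OpenAI-compatible endpoint, any client that speaks the OpenAI chat-completions wire format can target them simply by swapping the base URL. A minimal sketch of that request shape, assuming a hypothetical gateway at `http://localhost:4000` (the URL, API key, and model name are placeholders, not details from this listing):

```python
import json
from urllib import request

# Hypothetical gateway endpoint and key; a real deployment supplies its own.
GATEWAY_URL = "http://localhost:4000/v1/chat/completions"
API_KEY = "sk-placeholder"

def build_chat_request(model: str, user_message: str) -> request.Request:
    """Build (but do not send) an OpenAI-style chat-completions request
    aimed at the gateway instead of api.openai.com."""
    payload = {
        "model": model,  # the gateway maps this name to a backend provider
        "messages": [{"role": "user", "content": user_message}],
    }
    return request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_chat_request("claude-sonnet", "hello")
print(req.full_url)  # → http://localhost:4000/v1/chat/completions
```

The only change a client app needs is the base URL and key; the message payload is identical to what it would send to the upstream provider directly.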
High-performance LLM Gateway built in Go - OpenAI compatible proxy with multi-provider support, adaptive routing, and enterprise features
Topics: kubernetes golang enterprise microservices ai azure api-gateway proxy gateway load-balancer bedrock openai cloud-native llm anthropic openrouter litellm openrouter-api openrouter-go litellm-ai-gateway
Updated Sep 13, 2025 - Go
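The core of the multi-provider support this gateway describes is a routing step that maps an incoming model name to an upstream provider. A minimal sketch of prefix-based routing (the prefix table and fallback are illustrative assumptions, not the repository's actual configuration; real gateways also weigh latency, cost, and health checks — the "adaptive routing" in the description):

```python
# Hypothetical prefix-to-provider table; illustrative only.
ROUTES = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "gemini-": "google-vertex",
}
FALLBACK = "openrouter"  # assumed catch-all provider

def route_model(model: str) -> str:
    """Map an incoming OpenAI-style model name to an upstream provider
    by matching the model-name prefix against the routing table."""
    for prefix, provider in ROUTES.items():
        if model.startswith(prefix):
            return provider
    return FALLBACK

print(route_model("claude-3-opus"))  # → anthropic
print(route_model("qwen3-coder"))    # → openrouter
```

Keeping the routing table in the gateway means clients stay provider-agnostic: they send one API shape, and provider selection becomes a server-side configuration concern.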