Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
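The entry above (litellm) advertises one OpenAI-style calling convention across providers. As a rough illustration, here is a minimal sketch assuming litellm's `completion()` function and its provider-prefixed model names; the specific model identifiers are placeholders, and credentials are expected in the usual environment variables.

```python
# Minimal sketch: calling different providers through one OpenAI-style
# interface with litellm. Model names below are illustrative placeholders;
# provider credentials are read from environment variables such as
# OPENAI_API_KEY and ANTHROPIC_API_KEY.
from litellm import completion

messages = [{"role": "user", "content": "Summarize what an LLM gateway does."}]

# Same call shape regardless of the backing provider.
openai_resp = completion(model="gpt-4o-mini", messages=messages)
claude_resp = completion(model="anthropic/claude-3-haiku-20240307", messages=messages)

# Responses follow the OpenAI chat-completion structure.
print(openai_resp.choices[0].message.content)
print(claude_resp.choices[0].message.content)
```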
Customize and extend Claude Code with ccproxy: route requests to OpenAI, Gemini, Qwen, OpenRouter, and Ollama, and gain full control of your Claude Max/Pro subscription with your own router.
A personal LLM gateway with fault tolerance for calls to models from any provider exposing an OpenAI-compatible API. Advanced features such as retry, model sequencing, and body parameter injection are also available. Especially useful with AI coding assistants like Cline and RooCode and providers like OpenRouter.
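The retry and model-sequencing behavior described above is a common gateway pattern. The sketch below is a generic illustration of that pattern, not code from the repository itself; it reuses litellm's `completion()` for the underlying OpenAI-format calls, and the model names, retry counts, and backoff values are arbitrary.

```python
# Generic sketch of retry + model sequencing (fallback), not taken from the
# repository above. Uses litellm's completion() for the underlying calls;
# model names, retry counts, and backoff values are arbitrary examples.
import time
from litellm import completion

FALLBACK_MODELS = ["gpt-4o-mini", "anthropic/claude-3-haiku-20240307"]  # tried in order

def call_with_fallback(messages, retries_per_model=2, backoff_seconds=1.0):
    last_error = None
    for model in FALLBACK_MODELS:                 # model sequencing: next model on failure
        for attempt in range(retries_per_model):  # retry the same model before moving on
            try:
                return completion(model=model, messages=messages)
            except Exception as exc:
                last_error = exc
                time.sleep(backoff_seconds * (attempt + 1))
    raise RuntimeError("All models in the fallback sequence failed") from last_error
```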
An open-source, security-first LLM Gateway designed to provide a unified, secure, and observable entry point to any Large Language Model.
Lightweight AI inference gateway - local model registry & parameter transformer (Python SDK) - with optional Envoy proxy processor and FastAPI registry server deployment options.
MCP Connection Hub - A unified Model Context Protocol Gateway
A secure, governable AI gateway for Splunk with operational guardrails. An alternative to Splunk AI Assistant focused on safety, compliance, and predictable results using a 'Configuration as Code' approach.
Enterprise LLM Orchestration Platform - Intelligently route requests across multiple AI providers with cost optimisation and real-time monitoring
This repository contains my development branches for contributing to llmcord 2β