Open Source LLM proxy that transparently captures and logs all interactions with LLM API
Updated Jun 15, 2025 - HTML
This is a robust, configurable LLM proxy server built with Node.js, Express, and PostgreSQL. It acts as an intermediary between your applications and various Large Language Model (LLM) providers.
A personal LLM gateway that adds fault tolerance to calls against any provider with an OpenAI-compatible API. Advanced features such as retries, model sequencing, and request-body parameter injection are also available. Especially useful when working with AI coding assistants like Cline and RooCode and providers like OpenRouter.
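The retry and model-sequencing idea described above can be sketched roughly as follows. This is a minimal illustration, not the gateway's actual code: `complete_with_fallback` and `call_model` are hypothetical names, and `call_model` stands in for any OpenAI-compatible chat call.

```python
import time

def complete_with_fallback(prompt, models, call_model, retries=2, backoff=0.0):
    """Try each model in sequence, retrying failures, before raising.

    models: list of model names to attempt, in priority order.
    call_model: callable(model=..., prompt=...) that performs one request.
    retries: extra attempts per model after the first one fails.
    """
    last_error = None
    for model in models:
        for attempt in range(retries + 1):
            try:
                return call_model(model=model, prompt=prompt)
            except Exception as exc:  # a real gateway would retry only transient errors
                last_error = exc
                time.sleep(backoff * (2 ** attempt))  # exponential backoff between attempts
    raise RuntimeError(f"all models failed: {last_error}")
```

A real gateway would distinguish retryable errors (timeouts, 429s, 5xx) from permanent ones (invalid key, bad request) and skip the retry loop for the latter.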
Allows any BYOK AI editor or extension, such as Cursor or Continue, to connect to any OpenAI-compatible LLM by aliasing it as a different model.
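The aliasing trick works because OpenAI-compatible requests carry the model name in the JSON body, so a proxy can rewrite it before forwarding. A minimal sketch, assuming a hypothetical alias map (the mapping and function names are illustrative, not any particular project's API):

```python
# Map the model id the editor asks for to the backend's real model id.
ALIASES = {"gpt-4o": "my-backend/llama-3.1-70b"}  # illustrative mapping

def rewrite_request(body: dict, aliases: dict = ALIASES) -> dict:
    """Return a copy of an OpenAI-style request body with the model aliased."""
    rewritten = dict(body)
    rewritten["model"] = aliases.get(body["model"], body["model"])
    return rewritten
```

The editor believes it is talking to `gpt-4o`; the proxy transparently forwards the request to whatever backend the alias points at.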
Store your knowledge (privately), guide LLMs with it, and curb hallucinations.
A TypeScript wrapper that routes across multiple Vercel AI providers by model name, offering flexible, extensible management of diverse AI services. Inspired by litellm's provider-routing architecture but optimized for TypeScript/Vercel workflows.
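Provider routing by model name, as litellm popularized, typically splits a `"provider/model"` string and dispatches on the prefix. A hypothetical sketch of that pattern (names here are illustrative, not the wrapper's actual API):

```python
def route(model: str, providers: dict):
    """Resolve 'provider/model' (e.g. 'openai/gpt-4o') to (provider, model name)."""
    prefix, _, name = model.partition("/")
    if prefix in providers and name:
        return providers[prefix], name
    raise KeyError(f"no provider registered for {prefix!r}")
```

Keeping the provider registry as plain data makes the router easy to extend: adding a provider is a dictionary entry, not a code change.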
[WIP] Sorai is a lightweight, high-performance, and open-source LLM proxy gateway.