Production-ready Python library for multi-provider LLM orchestration
Updated Oct 10, 2025 - Python
Multi-cloud control of VM instances across AWS, Azure, GCP, and AliCloud - unified instance management
Your Digital Companion. Self-hosted Telegram bot orchestrating multiple AI providers (OpenAI, Anthropic, Google, xAI, DeepSeek, Mistral, Alibaba, MiniMax) with autonomous agent capabilities, MCP integrations, and async task execution. Not a tool. A partner.
Python for logic. English for intelligence.
A flexible agent framework for building AI agents with MCP (Model Context Protocol) integration. Provides core abstractions for LLM and embedding models built on the MCP architecture, designed specifically to make AI agents easier to build.
Production-ready Python template for LLM development. Multi-provider support, enforcement tests, proper logging. The template is the lesson.
A modular Python library providing a unified interface for multiple LLM providers (OpenAI ChatGPT and Google Gemini) using Factory and Facade design patterns. Features conversation history management and easy provider switching.
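The Factory and Facade combination described above can be sketched roughly as follows. This is a hypothetical illustration, not the library's actual API: the class names, the `provider_factory` function, and the `ChatFacade` interface are all assumptions made for the example.

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Common interface that every provider adapter implements."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class OpenAIProvider(LLMProvider):
    def complete(self, prompt: str) -> str:
        # A real implementation would call the OpenAI API here.
        return f"[openai] {prompt}"


class GeminiProvider(LLMProvider):
    def complete(self, prompt: str) -> str:
        # A real implementation would call the Gemini API here.
        return f"[gemini] {prompt}"


def provider_factory(name: str) -> LLMProvider:
    """Factory: map a provider name to a concrete adapter instance."""
    registry = {"openai": OpenAIProvider, "gemini": GeminiProvider}
    return registry[name]()


class ChatFacade:
    """Facade: one entry point hiding provider setup and history management."""

    def __init__(self, provider: str):
        self._llm = provider_factory(provider)
        self.history: list[tuple[str, str]] = []

    def ask(self, prompt: str) -> str:
        reply = self._llm.complete(prompt)
        self.history.append((prompt, reply))
        return reply

    def switch(self, provider: str) -> None:
        """Swap providers while keeping the conversation history intact."""
        self._llm = provider_factory(provider)
```

The Factory isolates provider construction in one place, so switching providers is a single call that leaves the accumulated conversation history untouched.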
Unified API for multiple LLM providers. Use as a Python library or HTTP API server.
Standalone, hardware-agnostic, zero-cost AI memory orchestrator
Enterprise LLM Orchestration Platform - Intelligently route requests across multiple AI providers with cost optimisation and real-time monitoring
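Cost-optimised routing of the kind described above can be sketched as picking the cheapest currently-healthy provider for an estimated request size. The provider names, prices, and function signature below are illustrative assumptions, not the platform's real configuration.

```python
# Assumed example prices in USD per 1K tokens; real values would come from config.
PRICES_PER_1K_TOKENS = {
    "provider_a": 0.0005,
    "provider_b": 0.0015,
    "provider_c": 0.0030,
}


def route_request(est_tokens: int, healthy: set[str]) -> str:
    """Pick the cheapest healthy provider for an estimated token count."""
    candidates = {p: c for p, c in PRICES_PER_1K_TOKENS.items() if p in healthy}
    if not candidates:
        raise RuntimeError("no healthy providers available")
    # Estimated cost = price per 1K tokens * (tokens / 1000); minimise it.
    return min(candidates, key=lambda p: candidates[p] * est_tokens / 1000)
```

Real-time monitoring would feed the `healthy` set, so providers failing health checks drop out of the candidate pool automatically.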
Secure personal AI assistant with encrypted memory, multi-provider LLM routing, and privacy-first design. Inspired by OpenClaw.
Infer AI-friendly environmental and geographic metadata about biosamples from multiple authoritative sources
Multi-provider image generation skill showcasing how to build extensible AI image generation systems