🦍 The API and AI Gateway
Updated Jan 19, 2026 · Lua
Run any open-source LLM, such as DeepSeek or Llama, as an OpenAI-compatible API endpoint in the cloud.
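"OpenAI-compatible" means the server accepts the same request shape as OpenAI's `/v1/chat/completions` route, so existing clients work after swapping the base URL. A minimal stdlib sketch of that request shape (the host, model name, and API key below are placeholders, not from any specific project):

```python
import json
import urllib.request

def build_chat_request(model, prompt):
    """Build the OpenAI-style chat-completions payload that any
    compatible server is expected to accept."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(base_url, model, prompt, api_key="sk-placeholder"):
    """POST the payload to an OpenAI-compatible /v1/chat/completions route
    and pull the first choice's message text out of the response."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_chat_request(model, prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]
```

Because only the base URL changes, the same client code can point at OpenAI, a self-hosted gateway, or a serverless deployment.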
Deploy serverless AI workflows at scale. Firebase for AI agents
AutoRAG: An Open-Source Framework for Retrieval-Augmented Generation (RAG) Evaluation & Optimization with AutoML-Style Automation
RAG (Retrieval-Augmented Generation) framework for building modular, open-source applications for production, by TrueFoundry
The platform for LLM evaluations and AI agent testing
Your autonomous engineering team in a CLI. Point Zeroshot at an issue, walk away, and return to production-grade code. Supports Claude Code, OpenAI Codex, OpenCode, and Gemini CLI.
The collaborative spreadsheet for AI. Chain cells into powerful pipelines, experiment with prompts and models, and evaluate LLM responses in real-time. Work together seamlessly to build and iterate on AI applications.
AIConfig is a config-based framework for building generative AI applications.
Python SDK for running evaluations on LLM generated responses
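Evaluation SDKs like this typically boil down to scoring functions applied over (expected, actual) pairs. A generic sketch of that pattern, not mirroring any particular SDK's API (all function names here are illustrative):

```python
def exact_match(expected: str, actual: str) -> float:
    """Score 1.0 if the normalized strings match, else 0.0."""
    return float(expected.strip().lower() == actual.strip().lower())

def keyword_coverage(keywords: list[str], actual: str) -> float:
    """Fraction of required keywords that appear in the response."""
    text = actual.lower()
    hits = sum(1 for k in keywords if k.lower() in text)
    return hits / len(keywords) if keywords else 1.0

def evaluate(cases, metric):
    """Average a metric over a list of (expected, actual) pairs."""
    scores = [metric(expected, actual) for expected, actual in cases]
    return sum(scores) / len(scores)
```

Real SDKs add batching, model-graded metrics, and report generation on top, but the core loop is this simple.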
An end-to-end LLM reference implementation providing a Q&A interface for Airflow and Astronomer
The context API for AI agents
Operations toolkit and methodology for persistent LLM agents — watchdogs, memory reset, cost monitoring, tool proxy, Docker Compose stacks, and a cookbook of practical recipes. Framework-agnostic patterns from running 3 AI agents 24/7.
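One of the patterns named above, cost monitoring, can be sketched as a budget watchdog that accumulates per-call token spend for a long-running agent. This is a generic illustration, not the toolkit's actual implementation, and the prices passed in are made-up examples:

```python
class CostMonitor:
    """Track cumulative LLM spend for a persistent agent and flag
    when a configured budget is exceeded."""

    def __init__(self, budget_usd, price_per_1k_in, price_per_1k_out):
        self.budget = budget_usd
        self.p_in = price_per_1k_in    # USD per 1k input tokens
        self.p_out = price_per_1k_out  # USD per 1k output tokens
        self.spent = 0.0

    def record(self, tokens_in, tokens_out):
        """Add one call's cost; returns the running total in USD."""
        self.spent += (tokens_in / 1000) * self.p_in
        self.spent += (tokens_out / 1000) * self.p_out
        return self.spent

    @property
    def over_budget(self):
        return self.spent >= self.budget
```

A watchdog loop would check `over_budget` between tool calls and pause or reset the agent before costs run away, which is exactly the failure mode 24/7 agents hit.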
[⛔️ DEPRECATED] Friendli: the fastest serving engine for generative AI
Quality Control for AI Artifact Management
Production operations framework for AI-powered SaaS. The architectural patterns, failure modes, and operational playbooks that determine whether your AI systems scale profitably or fail expensively.
🔍 AI observability skill for Claude Code. Debug LangChain/LangGraph agents by fetching execution traces from LangSmith Studio directly in your terminal.
Miscellaneous code and writings for MLOps