Fully-featured, beautiful web interface for vLLM - built with NextJS.
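A web interface like this typically talks to vLLM through its OpenAI-compatible HTTP API rather than any custom protocol. A minimal sketch, assuming a vLLM server at http://localhost:8000 and a hypothetical model id (neither is taken from this project):

```ts
// Minimal sketch: request a chat completion from a vLLM server's
// OpenAI-compatible endpoint (assumed to be http://localhost:8000/v1).
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function chat(messages: ChatMessage[]): Promise<string> {
  const res = await fetch("http://localhost:8000/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "meta-llama/Llama-3.1-8B-Instruct", // hypothetical model id
      messages,
      temperature: 0.7,
    }),
  });
  if (!res.ok) throw new Error(`vLLM request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

// Usage, e.g. inside a Next.js route handler:
// const reply = await chat([{ role: "user", content: "Hello!" }]);
```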
vLLM Documentation in Simplified Chinese / vLLM 中文文档
Enterprise-grade automated LLM deployment tool that makes AI servers truly "plug-and-play".
AI-based search done right
LLM Call Center / AI Call Center (大模型呼叫中心 / 大模型客服系统): a large-model customer-service system that can connect to mainstream and private models, including OpenAI, LLaMA, Kimi, Tongyi Qianwen (通义千问), Zhipu AI (智谱AI), iFlytek Spark (讯飞星火), Gemini, Xorbits Inference, Amazon Bedrock, Volcano Engine (火山引擎), Tencent Hunyuan (腾讯混元), Claude, Bard, DeepSeek, Azure OpenAI, Baidu Qianfan (千帆大模型), Ollama, Qwen, and vLLM.
Enterprise-grade LLM routing microservice with multi-provider support, intelligent failover, and cost optimization.
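The failover part of such a router can be sketched as trying OpenAI-compatible backends in priority order and falling back on error or timeout. The provider shape, endpoints, and 10-second timeout below are illustrative assumptions, not this project's actual API:

```ts
// Sketch of multi-provider failover: try each OpenAI-compatible backend
// in priority order and return the first successful response.
interface Provider {
  name: string;
  baseUrl: string; // e.g. a vLLM instance or a hosted OpenAI-compatible API
  apiKey?: string;
}

async function completeWithFailover(
  providers: Provider[],
  body: unknown,
): Promise<unknown> {
  let lastError: unknown;
  for (const p of providers) {
    try {
      const res = await fetch(`${p.baseUrl}/chat/completions`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          ...(p.apiKey ? { Authorization: `Bearer ${p.apiKey}` } : {}),
        },
        body: JSON.stringify(body),
        signal: AbortSignal.timeout(10_000), // assumed timeout: fail fast, move on
      });
      if (res.ok) return await res.json();
      lastError = new Error(`${p.name} returned ${res.status}`);
    } catch (err) {
      lastError = err; // network error or timeout: try the next provider
    }
  }
  throw lastError;
}
```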
Visual Query Bot: an interactive chat application that lets users upload images, draw bounding boxes around specific regions, and ask targeted questions about those areas. It uses LangChain and LangGraph to build information-retrieval agents for context-aware visual querying.
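A region-targeted question can be expressed as a single multimodal chat request: include the selected bounding box in the prompt and attach the image using the OpenAI-style image_url content format. A minimal sketch against a vision-capable model behind an OpenAI-compatible endpoint; the endpoint, model id, and prompt format are assumptions, not this project's code:

```ts
// Sketch: ask a vision-capable model about a user-selected image region.
// The bounding box goes into the prompt text, the image as a data URL.
type Box = { x: number; y: number; width: number; height: number };

async function askAboutRegion(
  imageDataUrl: string, // e.g. "data:image/png;base64,..."
  box: Box,
  question: string,
): Promise<string> {
  const res = await fetch("http://localhost:8000/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "Qwen/Qwen2-VL-7B-Instruct", // assumed vision model
      messages: [
        {
          role: "user",
          content: [
            {
              type: "text",
              text:
                `Within the region x=${box.x}, y=${box.y}, ` +
                `w=${box.width}, h=${box.height}: ${question}`,
            },
            { type: "image_url", image_url: { url: imageDataUrl } },
          ],
        },
      ],
    }),
  });
  if (!res.ok) throw new Error(`Vision query failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```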