Pinned repositories
- vllm-project/vllm: A high-throughput and memory-efficient inference and serving engine for LLMs
- oobabooga/text-generation-webui: The definitive Web UI for local AI, with powerful features and easy setup.
- OpenBMB/MiniCPM-V: MiniCPM-V 4.5, a GPT-4o level MLLM for single-image, multi-image, and high-FPS video understanding on your phone
- huggingface/llm-vscode: LLM-powered development for VSCode
- llm-vscode-inference-server: An endpoint server for efficiently serving quantized open-source LLMs for code.