End-to-end documentation for setting up your own local, fully private LLM server on Debian, equipped with chat, web search, RAG, model management, MCP servers, image generation, and TTS.
A robust, production-ready Python toolkit that automates synchronization between a directory of .gguf model files and a llama-swap config.yaml.
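The idea behind such a sync tool can be illustrated with a minimal sketch: scan a model directory for .gguf files and regenerate the config from what is found. The paths, the model-naming scheme, and the simplified schema (a `models:` map whose entries carry a `cmd` field) are assumptions here; consult the llama-swap documentation for the authoritative config format.

```python
# Minimal sketch: regenerate a llama-swap config from a directory of .gguf files.
# Assumes PyYAML is installed and a simplified schema (models: {name: {cmd: ...}});
# check the llama-swap docs for the exact fields it expects.
from pathlib import Path
import yaml

MODELS_DIR = Path("/srv/models")                     # hypothetical model directory
CONFIG_PATH = Path("/srv/llama-swap/config.yaml")    # hypothetical config location


def build_config(models_dir: Path) -> dict:
    """Map every .gguf file under models_dir to a llama-server command entry."""
    models = {}
    for gguf in sorted(models_dir.rglob("*.gguf")):
        name = gguf.stem  # use the file name (without extension) as the model id
        models[name] = {"cmd": f"llama-server --port ${{PORT}} -m {gguf}"}
    return {"models": models}


def sync(models_dir: Path, config_path: Path) -> None:
    """Rewrite the config only if the generated content differs from what is on disk."""
    rendered = yaml.safe_dump(build_config(models_dir), sort_keys=True)
    if not config_path.exists() or config_path.read_text() != rendered:
        config_path.write_text(rendered)


if __name__ == "__main__":
    sync(MODELS_DIR, CONFIG_PATH)
```

Run on a schedule or from a file-watcher, this keeps the config in step with the model directory without manual edits; writing only on change avoids needlessly touching the file.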
Custom Llama Swap Container Image