Open Chat Playground (OCP) is a web UI that can connect to virtually any LLM from any platform.
Saraf AI is a fully local assistant built with Next.js and Docker. It connects seamlessly to LLMs via Docker Model Runner using Docker Compose.
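Several of the projects in this list wire a service to Docker Model Runner through Docker Compose. As a rough illustration of that pattern (not taken from the repo itself), here is a minimal `docker-compose.yml` sketch using Compose's `models` element; the model name `ai/smollm2` and the service layout are placeholders, and the `models` element requires a recent Compose version with Docker Model Runner enabled.

```yaml
# Minimal sketch — assumes a Compose version that supports the
# top-level `models` element and Docker Model Runner enabled.
services:
  app:
    build: .
    models:
      - llm   # Compose injects this model's connection details into the service

models:
  llm:
    model: ai/smollm2   # placeholder; any model available to Model Runner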
Sample code using Microsoft.Extensions.AI to run LLMs locally through Docker Model Runner, Foundry Local, Hugging Face, and Ollama.
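The projects above all talk to Docker Model Runner's OpenAI-compatible HTTP API. As a language-neutral sketch of that interaction (the repo itself uses Microsoft.Extensions.AI in C#), here is a minimal Python example; the base URL `http://localhost:12434/engines/v1` and the model name `ai/smollm2` are assumptions based on typical Model Runner defaults and may differ in your setup.

```python
import json
import urllib.request

# Assumed defaults — Docker Model Runner typically exposes an
# OpenAI-compatible API on the host; adjust URL/model for your setup.
BASE_URL = "http://localhost:12434/engines/v1"  # assumption
MODEL = "ai/smollm2"                            # assumption

def build_chat_request(prompt, base_url=BASE_URL, model=MODEL):
    """Build the URL and JSON body for an OpenAI-style chat completion."""
    url = f"{base_url}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload

def chat(prompt):
    """Send the request; requires a running Docker Model Runner instance."""
    url, payload = build_chat_request(prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the endpoint speaks the OpenAI wire format, the same request shape works unchanged against Ollama, Foundry Local, or any other OpenAI-compatible local runtime by swapping the base URL.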
A flexible, extensible AI agent backend built with NestJS, designed for running local, open-source LLMs (Llama, Gemma, Qwen, DeepSeek, etc.) via Docker Model Runner. Real-time streaming, Redis messaging, web search, and Postgres memory out of the box; no cloud APIs required.
Offline Kiwix ZIM-to-vector RAG system for local LLM knowledge retrieval
A streamlined chat application that leverages Docker Model Runner to serve Large Language Models (LLMs) through a modern Streamlit interface. This project demonstrates containerized LLM deployment with a user-friendly web interface.
Docker Desktop extension that deploys Open WebUI with Docker Model Runner integration in one click
A Personal Knowledge Base (PKB) tool for organizing, storing, and searching links effectively.
Demo of Docker Model Runner in both development and production environments.
This project demonstrates how to configure Spring AI to interact with Ollama and Docker Model Runner.
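One common way to point Spring AI at Docker Model Runner is to reuse its OpenAI starter with a local base URL, since Model Runner exposes an OpenAI-compatible API. A hedged `application.properties` sketch follows; the port, path, model name, and placeholder API key are assumptions, not values from the repo, so check them against your Model Runner settings.

```properties
# Assumes Model Runner's host TCP access is enabled on its default port
# and that Spring AI appends the OpenAI /v1/... paths to the base URL.
spring.ai.openai.base-url=http://localhost:12434/engines
spring.ai.openai.api-key=not-needed-locally
spring.ai.openai.chat.options.model=ai/gemma3
```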
Maven plugin for AI security scanning using local LLMs to detect secrets, API keys & passwords in your code