User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
Free, high-quality text-to-speech API endpoint to replace OpenAI, Azure, or ElevenLabs
Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, easy-to-use package.
This repository provides resources and guidelines to facilitate the integration of Open-WebUI and Langfuse, enabling seamless monitoring and management of AI model usage statistics.
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support planned for the next version.
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
A Docker Compose to run a local ChatGPT-like application using Ollama, Ollama Web UI, Mistral NeMo & DeepSeek R1.
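Setups like this are typically wired together with a short compose file. A minimal sketch, assuming the standard `ollama/ollama` and Open WebUI images (service names, ports, and volume names here are illustrative, not this repository's actual file):

```yaml
services:
  ollama:
    image: ollama/ollama                        # serves local models such as Mistral NeMo or DeepSeek R1
    volumes:
      - ollama-data:/root/.ollama               # persist downloaded model weights across restarts

  webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                             # chat UI reachable at http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434     # point the UI at the ollama service on the compose network
    depends_on:
      - ollama

volumes:
  ollama-data:
```

With this layout the UI talks to Ollama over the internal compose network, so only the web port needs to be published on the host.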
PuPu is a lightweight, cross-platform desktop AI client that works with both local and cloud-hosted models. Whether you prefer running models on your own machine or connecting to providers like OpenAI and Anthropic, PuPu gives you a unified, elegant interface — your AI, your rules.
Ollama with Let's Encrypt Using Docker Compose
Persian Ollama Project: Enhance Persian (Farsi) prompts when chatting with Ollama LLMs.
A better, open-source LLM wrapper (user management, text processing, file processing, image generation, web search, local models)
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
A minimal interface in pure HTML/CSS for talking with Ollama, focused on keeping the code readable.
Simple web UI for Ollama
This Docker Compose setup provides an isolated application with Ollama, Open-WebUI, and Nginx reverse proxy to enable secure HTTPS access. Since Open-WebUI does not support SSL natively, Nginx acts as a reverse proxy, handling SSL termination.
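The SSL-termination pattern described here amounts to an Nginx server block that accepts HTTPS and forwards plain HTTP to the Open-WebUI container. A minimal sketch, assuming a hypothetical hostname, certificate paths, and the container's default internal port (all illustrative, not this repository's actual config):

```nginx
server {
    listen 443 ssl;
    server_name chat.example.com;                 # hypothetical hostname

    # Certificates mounted into the Nginx container (paths are assumptions)
    ssl_certificate     /etc/nginx/certs/fullchain.pem;
    ssl_certificate_key /etc/nginx/certs/privkey.pem;

    location / {
        proxy_pass http://open-webui:8080;        # container name on the compose network
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;

        # WebSocket upgrade headers so streaming chat responses work through the proxy
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

Because Nginx terminates TLS, Open-WebUI itself never needs certificates; it only sees plain HTTP traffic arriving from inside the compose network.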
Minimal Ollama chat UI - no login, no heavy features.