This repository provides a local development environment for Open WebUI using Docker Compose, which includes the following services:
- PostgreSQL (with pgvector extension)
- Redis (with custom configuration)
- Ollama (for language model inference)
- Node (for Svelte frontend development via Vite)
- Open WebUI Python backend (FastAPI / Uvicorn)
Below is an overview of how to set up, run, and use this environment.
You will need the following installed:

- Docker
- Docker Compose (often included in modern Docker Desktop installations)
- Make (optional, but recommended)
There are two main files of interest:

- `docker-compose.yaml`: describes how to build and run the services.
- `Makefile`: provides convenient targets to set up and run the Docker Compose environment.
If you do not already have the open-webui repository downloaded, the Makefile will handle that for you:
```shell
make open-webui
```

This target clones the Open WebUI repository into the `./open-webui` folder if it is not already present.
Once you have everything in place, simply run:
```shell
make up
```

This command will:

- Clone the Open WebUI repository if necessary.
- Start all containers in detached mode (`-d`).

Docker Compose will build and run the needed services: `postgres`, `redis`, `ollama`, `npm` (for the frontend), and `open-webui` (the Python backend).
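Once the stack is up, a quick way to confirm that every service started (run it from the directory containing `docker-compose.yaml`):

```shell
# List the services defined in the compose file and their current state;
# all of them should show a "running" (or "healthy") status.
docker compose ps
```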
- Node-based Vite Dev Server (Frontend): accessible at http://127.0.0.1:5173. It provides hot reloading for Svelte changes.
- Python Backend (FastAPI): accessible at http://127.0.0.1:8080.
- Postgres
  - By default, listens on `127.0.0.1:5432`.
  - Database credentials come from environment variables (see `docker-compose.yaml`).
- Redis
  - Exposed internally to containers at `redis://redis:6379`.
  - Not exposed to the host by default.
  - Password: `open-webui`.
- Ollama
  - By default, runs on port `11434` inside the container.
  - Not exposed to the host by default.
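With the default Postgres settings you can connect from the host. For example, assuming `psql` is installed locally and the default credentials are unchanged, this lists the installed extensions (pgvector should be among them):

```shell
# Connect with the default credentials from docker-compose.yaml and
# list installed extensions; "vector" should appear in the output.
PGPASSWORD=openwebui psql -h 127.0.0.1 -p 5432 -U openwebui -d openwebui -c '\dx'
```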
To test locally with Ollama, you need to pull at least one language model into the `ollama` container. For example, you might pull the `qwen2.5:0.5b` model.
- Ensure the containers are up and running:

  ```shell
  make up
  ```

- Exec into the `ollama` container:

  ```shell
  docker compose exec ollama /bin/sh
  ```

- Inside the container, run:

  ```shell
  ollama pull qwen2.5:0.5b
  ```

  (Or replace `qwen2.5:0.5b` with any other model you want.)
The pulled model files will be stored in the volume mounted at `./open-webui/backend/data/ollama`.
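After the pull finishes, you can sanity-check the model from inside the `ollama` container (`ollama list` and `ollama run` are standard Ollama CLI commands):

```shell
# Still inside the ollama container:
ollama list                        # the pulled model should appear here
ollama run qwen2.5:0.5b "Say hi."  # quick one-shot inference test
```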
The Python backend container (`open-webui`) is configured to open a debugger on port 5678. To attach remotely from VS Code, you can configure your project's `.vscode/launch.json` with the following snippet:
```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Python Debugger: Remote Attach",
      "type": "debugpy",
      "request": "attach",
      "connect": {
        "host": "localhost",
        "port": 5678
      },
      "pathMappings": [
        {
          "localRoot": "${workspaceFolder}",
          "remoteRoot": "${workspaceFolder}"
        }
      ]
    }
  ]
}
```

Since we're running inside a development container, it's important that `localRoot` and `remoteRoot` are set to the same path. Make sure your `${workspaceFolder}` is the same path both inside and outside the container so that breakpoints will map correctly.
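For reference, a backend started under debugpy typically looks along these lines; the exact command and module path (`open_webui.main:app` here is an assumption) live in `docker-compose.yaml` and may differ in this repo:

```shell
# Listen for a debugger on all interfaces so VS Code can attach from the host.
# Adding --wait-for-client would pause startup until the debugger connects.
python -m debugpy --listen 0.0.0.0:5678 \
  -m uvicorn open_webui.main:app --host 0.0.0.0 --port 8080 --reload
```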
A .devcontainer/devcontainer.json file is already present in this repository. This file helps configure and launch a VS Code Dev Container based on the Docker Compose setup.
- `dockerComposeFile` references your Docker Compose config file (in this case, `../docker-compose.devcontainer.yaml`).
- `service` specifies which service should be your "container workspace" (here it's `open-webui`).
- `workspaceFolder` sets the container's working directory to `/app`, matching how your code volume is mounted.
- `customizations.vscode.extensions` defines recommended extensions to automatically install into the dev container.
- `customizations.vscode.settings` configures debugging, interpreter path, formatting, etc.
- `postStartCommand` adjusts Git permissions for the container workspace.
To use this existing setup, open the Command Palette in VS Code and select "Remote-Containers: Open Folder in Container…" (or "Reopen in Container" if you already have the folder open).
This will spin up the specified Docker services, attach VS Code to the open-webui container, and ensure all extensions and Python debugging are ready to go.
- View Logs: you can view the logs of a specific service (e.g., `open-webui`) with:

  ```shell
  docker compose logs -f open-webui
  ```

  Replace `open-webui` with any service name to view its logs.

- Bring Everything Down: to stop all containers and remove them:

  ```shell
  make down
  ```

  or

  ```shell
  docker compose -f docker-compose.devcontainer.yaml down
  ```

- Cleaning Up: if you need to remove volumes or clean up data:

  ```shell
  docker compose down -v
  ```

  This will remove all containers and associated volumes. Use with caution.
Below are some useful environment variables (defaults shown in parentheses):
- `POSTGRES_DB` (openwebui)
- `POSTGRES_USER` (openwebui)
- `POSTGRES_PASSWORD` (openwebui)
- `POSTGRES_PORT` (5432)
- `OPEN_WEBUI_PORT` (8080)
- `OLLAMA_NUM_THREADS` (2)
- `WEBUI_SECRET_KEY` (tQD5RHiU42ubYJ1SeRn1)
They can be overridden by setting them in a .env file or exporting them in your shell before running docker compose.
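For example, a minimal `.env` placed next to `docker-compose.yaml` could look like this (the values are illustrative, not recommendations):

```shell
# .env: picked up automatically by docker compose
POSTGRES_PASSWORD=change-me-locally
OPEN_WEBUI_PORT=3000
OLLAMA_NUM_THREADS=4
```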
- The `npm` container runs the Svelte frontend in development mode using Vite on port 5173.
- The `open-webui` container runs the Python backend via Uvicorn on port 8080 and has a debug port open at 5678.
- By default, CORS is set to `*` to allow local cross-origin requests from the frontend.
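Once everything is running, a quick smoke test from the host (the `/health` endpoint path is an assumption; adjust it to whatever the backend actually serves):

```shell
# Frontend dev server and backend should both respond:
curl -i http://127.0.0.1:5173/
curl -i http://127.0.0.1:8080/health
```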
If you have additions or improvements, feel free to open pull requests or issues in the respective repositories: