rovertChat is a powerful, self-hosted AI platform that prioritizes privacy and user experience. It seamlessly integrates with LLM runners like Ollama and supports OpenAI-compatible APIs, giving you complete control over your AI interactions.
- Rich markdown support for enhanced conversations
- Easily switch between different AI models
- Smooth loading experience during AI responses
- Self-hosted & Privacy-focused: Keep your data on your own infrastructure
- Real-time AI chat: Smooth, responsive conversations with AI models
- Model flexibility: Connect to various LLM backends (Ollama, OpenAI API compatible services)
- Multi-user support: Role-based access control (guest, user, admin)
- Complete chat history: All conversations securely stored and easily accessible
- Responsive design: Works on desktop and mobile devices
- Authentication: JWT-based secure authentication system
- User profiles: Personalize your experience with custom settings and preferences
- Markdown support: Write and render Markdown in chat
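To illustrate the JWT-based authentication idea from the feature list, here is a minimal sketch of issuing and verifying an HMAC-signed token using only the standard library. This is a stand-in for illustration, not rovertChat's actual implementation (which presumably uses a dedicated JWT library); the secret value and claim names are examples.

```python
import base64
import hashlib
import hmac
import json
import time
from typing import Optional

# Illustrative secret; in practice load JWT_SECRET_KEY from the .env file
SECRET = b"your_super_secret_jwt_key_here"


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs do."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def issue_token(username: str, ttl_seconds: int = 3600) -> str:
    """Create a compact JWT-style token: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(
        json.dumps({"sub": username, "exp": int(time.time()) + ttl_seconds}).encode()
    )
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


def verify_token(token: str) -> Optional[dict]:
    """Return the claims if the signature is valid and unexpired, else None."""
    try:
        header, payload, sig = token.split(".")
    except ValueError:
        return None
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None
    padded = payload + "=" * (-len(payload) % 4)
    claims = json.loads(base64.urlsafe_b64decode(padded))
    if claims["exp"] < time.time():
        return None
    return claims
```

The refresh-token flow works the same way with a second secret (`JWT_REFRESH_SECRET_KEY`) and a longer expiry.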
- Backend: Python with FastAPI for API endpoints
- Frontend: Vue.js with Tailwind CSS 4 for a modern, responsive UI with Markdown support; Bun as the build tool
- Database: PostgreSQL/SQLite database for storing user auth, user profiles, chat history, and configurations
- Reverse Proxy: Caddy with automatic HTTPS (Let's Encrypt) and modern security headers
- Deployment: Docker containers with docker-compose orchestration
- Migrated from nginx to Caddy: Automatic HTTPS with Let's Encrypt, simplified configuration, no manual certificate management required
- Simplified deployment: One-command deployment with automatic SSL certificates
- Enhanced security: Modern TLS configuration and security headers included by default
See CADDY_MIGRATION.md for detailed migration information.
Before getting started, ensure you have the following installed:
- Git: To clone the repository
- Docker & Docker Compose: For containerized deployment
- Ollama: For local LLM running
- Python 3.9+: For backend development (if running locally)
- Node.js: For frontend development (if running locally)
- Bun: For frontend package management and build (if running locally)
- UV: For backend dependency management (if running locally)
- PostgreSQL: For the database (SQLite can be used instead when running locally)
- Clone the repository

  ```bash
  git clone https://github.com/R-udren/rovertAIChat.git
  cd rovertAIChat
  ```

- Set up environment variables

  ```bash
  cp .env.example .env
  ```

- Configure your environment

  Edit the `.env` file with your settings:

  ```bash
  # Database Configuration
  DATABASE_PASSWORD=your_secure_password

  # Domain Configuration
  DOMAIN=localhost # or your domain name
  FRONTEND_ORIGINS=https://localhost # or https://yourdomain.com
  VITE_API_BASE_URL=https://localhost # or https://yourdomain.com

  # JWT Configuration (generate secure random strings)
  JWT_SECRET_KEY=your_super_secret_jwt_key_here
  JWT_REFRESH_SECRET_KEY=your_super_secret_refresh_key_here
  ```

- Set up and run the application

  With Caddy (recommended, automatic HTTPS):

  ```bash
  # Build and start all services
  docker-compose up --build -d
  ```

  The application will be available at:
- Frontend: http://localhost (HTTP) or https://localhost (HTTPS with self-signed cert)
- Backend API: http://localhost/docs or https://localhost/docs
- Database: localhost:5432 (PostgreSQL)
- PGAdmin: http://localhost:8080 (if enabled)
Note: Caddy automatically handles HTTPS with self-signed certificates for localhost development and Let's Encrypt certificates for production domains.
- Production deployment

  For production with automatic Let's Encrypt certificates:

  ```bash
  # 1. Update your .env file
  DOMAIN=yourdomain.com
  VITE_API_BASE_URL=https://yourdomain.com
  FRONTEND_ORIGINS=https://yourdomain.com

  # 2. Copy and modify production Caddyfile
  cp Caddyfile.prod Caddyfile
  # Edit Caddyfile and replace yourdomain.com with your actual domain

  # 3. Ensure DNS points to your server

  # 4. Deploy
  docker-compose up --build -d
  ```
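For orientation, a production Caddyfile along these lines might look like the sketch below. The service names `backend` and `frontend` and the routed paths are assumptions for illustration; consult the shipped `Caddyfile.prod` for the actual service names, ports, and security headers.

```caddyfile
yourdomain.com {
    # Caddy obtains Let's Encrypt certificates automatically for real domains

    # API and interactive docs go to the FastAPI container (assumed name: backend)
    handle /api/* {
        reverse_proxy backend:8000
    }
    handle /docs* {
        reverse_proxy backend:8000
    }

    # Everything else is served by the frontend container (assumed name: frontend)
    handle {
        reverse_proxy frontend:80
    }
}
```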
If you prefer to run the services individually for development:
- Install Python dependencies

  ```bash
  cd backend
  uv sync
  ```

- Set up the database (or skip this step to use a local SQLite database)

  ```bash
  # Start PostgreSQL with Docker
  docker run -d --name postgres \
    -e POSTGRES_PASSWORD=your_password \
    -e POSTGRES_DB=postgres \
    -p 5432:5432 postgres:17-alpine
  ```

- Run the backend

  ```bash
  fastapi run src/main.py
  ```
- Install Bun (if not already installed)

  ```bash
  # Windows (PowerShell)
  powershell -c "irm bun.sh/install.ps1 | iex"

  # macOS/Linux
  curl -fsSL https://bun.sh/install | bash
  ```

- Install dependencies and run

  ```bash
  cd frontend
  bun install
  bun run dev
  ```
To use local LLMs with Ollama:
- Install Ollama

  Visit https://ollama.com and download the installer.

- Pull some models (optional; you can also do the same from the admin panel)

  ```bash
  ollama pull qwen3:8b
  ```

- Configure the backend

  The backend is already configured to connect to Ollama at `http://localhost:11434`.
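As a quick connectivity check, Ollama's HTTP API can be exercised directly. The sketch below builds a non-streaming request to the standard `/api/generate` endpoint and degrades gracefully when Ollama is not running; the model name matches the pull example above.

```python
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"


def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_generate_request("qwen3:8b", "Say hello in one word.")
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        # The full (non-streamed) completion is in the "response" field
        print(json.loads(resp.read())["response"])
except (urllib.error.URLError, OSError):
    print("Ollama is not reachable at", OLLAMA_URL)
```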
| Variable | Description | Example |
|---|---|---|
| `DATABASE_PASSWORD` | PostgreSQL password | `your_secure_password` |
| `DOMAIN` | Your domain name | `localhost` or `yourdomain.com` |
| `FRONTEND_ORIGINS` | CORS origins for frontend | `https://localhost` |
| `VITE_API_BASE_URL` | API base URL for frontend | `https://localhost` |
| `JWT_SECRET_KEY` | JWT signing key | `your_super_secret_key` |
| `JWT_REFRESH_SECRET_KEY` | JWT refresh token key | `your_refresh_secret_key` |
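The two JWT keys should be long, unpredictable random strings. Python's `secrets` module is one convenient way to generate them (the 64-byte length here is a suggestion, not a project requirement):

```python
import secrets

# URL-safe random strings suitable for JWT_SECRET_KEY / JWT_REFRESH_SECRET_KEY
jwt_secret = secrets.token_urlsafe(64)
jwt_refresh_secret = secrets.token_urlsafe(64)

print(f"JWT_SECRET_KEY={jwt_secret}")
print(f"JWT_REFRESH_SECRET_KEY={jwt_refresh_secret}")
```

Paste the printed lines into your `.env` file, using a different value for each key.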
Common Issues:
- SSL certificate errors: Make sure your certificates are properly placed in `frontend/certs/`
- Database connection issues: Verify PostgreSQL is running and credentials are correct
- Ollama not accessible: Ensure Ollama is running on the host machine
- CORS errors: Check that `FRONTEND_ORIGINS` matches your domain
Useful Commands:
```bash
# View logs
docker-compose logs -f

# Restart services
docker-compose restart

# Rebuild after code changes
docker-compose up --build

# Stop all services
docker-compose down

# Remove all containers and volumes
docker-compose down -v

# Reset database and start fresh
docker-compose down -v && docker-compose up -d --build
```

rovertChat implements three user roles:
- Guest: Can only browse the fancy home page
- User: Has persistent chat history and personalized settings
- Admin: Can upload and manage LLM models, configure system settings, and manage users (first user is automatically created as admin)
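The role hierarchy above can be modeled as a simple ordered comparison. This is a sketch of the idea, not rovertChat's actual implementation, and the action names are illustrative:

```python
from enum import IntEnum


class Role(IntEnum):
    """Roles ordered by privilege, so >= comparisons express the hierarchy."""
    GUEST = 0
    USER = 1
    ADMIN = 2


# Minimum role required for each action (action names are illustrative)
REQUIRED_ROLE = {
    "view_home": Role.GUEST,
    "chat": Role.USER,
    "view_history": Role.USER,
    "manage_models": Role.ADMIN,
    "manage_users": Role.ADMIN,
}


def is_allowed(role: Role, action: str) -> bool:
    """A role may perform an action if it meets the minimum required role."""
    return role >= REQUIRED_ROLE[action]
```

In a FastAPI backend, a check like this would typically live in a dependency that reads the role from the authenticated user's JWT claims.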
These features will be implemented in future releases:
- Real-time Text-to-Speech and Speech-to-Text: Voice interactions with AI
- Advanced Markdown support: Real-time rendering and code highlighting
- Mermaid Diagramming: Create and visualize diagrams directly in chat
- Function Calling: Execute external functions through chat commands
- Image Generation: Create images using text prompts
rovertChat is designed to be flexible in deployment:
- Local development: Run on your personal machine
- Self-hosted server: Deploy on your own server infrastructure
- Docker containers: Simple containerized deployment
- HTTPS: SSL/TLS encryption is enabled by default
- JWT Authentication: Secure token-based authentication
- Password Hashing: Bcrypt for secure password storage
- CORS Protection: Configurable origin restrictions
- Rate Limiting: API rate limiting to prevent abuse
- SQL Injection Protection: SQLAlchemy ORM with parameterized queries
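API rate limiting like the above is commonly implemented as a token bucket per client. Here is a minimal in-memory sketch (the burst size and refill rate are illustrative, not the project's configuration):

```python
import time
from dataclasses import dataclass, field


@dataclass
class TokenBucket:
    """Allow bursts up to `capacity`, refilling at `rate` tokens per second."""
    capacity: float
    rate: float
    tokens: float = 0.0
    updated: float = field(default_factory=time.monotonic)

    def __post_init__(self) -> None:
        self.tokens = self.capacity  # start full

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False


# One bucket per client IP (illustrative limits: burst of 5, 2 requests/second)
buckets: dict = {}


def allow_request(client_ip: str) -> bool:
    bucket = buckets.setdefault(client_ip, TokenBucket(capacity=5, rate=2))
    return bucket.allow()
```

A production setup would evict stale buckets (or use a shared store such as Redis) so memory does not grow with the number of distinct clients.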
Once the backend is running, you can access the interactive API documentation:
- Swagger UI: `https://yourdomain.com/docs`
- ReDoc: `https://yourdomain.com/redoc`
- OpenAPI Schema: `https://yourdomain.com/openapi.json`
Run the test suite:
```bash
# Backend tests
cd backend
uv run pytest

# Frontend tests (when implemented)
cd frontend
bun test
```

We welcome contributions! Please follow these steps:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Add tests for new functionality
- Ensure all tests pass
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request on GitHub
- Code Style: Follow PEP 8 for Python, ESLint for JavaScript/Vue
- Commit Messages: Use conventional commit format
- Testing: Write tests for new features
- Documentation: Update documentation for API changes
- GitHub Repository: R-udren/rovertAIChat
- Documentation: Check this README and API docs
- Issues: Report bugs and request features
- Discussions: Join community discussions
- FastAPI: For the excellent Python web framework
- Vue.js: For the reactive frontend framework
- Ollama: For making local LLM hosting accessible
- PostgreSQL: For reliable data storage
- Tailwind CSS: For beautiful, responsive styling
Made with ❤️ for the self-hosting and privacy-conscious AI community
rovertChat - Where your AI conversations stay yours ( ͡° ͜ʖ ͡°)


