Self-hosted AI Chat. Relies on locally installed Ollama.

rovertChat ( ͡° ͜ʖ ͡°)


rovertChat is a powerful, self-hosted AI platform that prioritizes privacy and user experience. It seamlessly integrates with LLM runners like Ollama and supports OpenAI-compatible APIs, giving you complete control over your AI interactions.

📸 Preview

Chat with markdown

Rich markdown support for enhanced conversations

Chat Model Select

Easily switch between different AI models

Chat Loading

Smooth loading experience during AI responses

🚀 Key Features

  • Self-hosted & Privacy-focused: Keep your data on your own infrastructure
  • Real-time AI chat: Smooth, responsive conversations with AI models
  • Model flexibility: Connect to various LLM backends (Ollama, OpenAI API compatible services)
  • Multi-user support: Role-based access control (guest, user, admin)
  • Complete chat history: All conversations securely stored and easily accessible
  • Responsive design: Works on desktop and mobile devices
  • Authentication: JWT-based secure authentication system
  • User profiles: Personalize your experience with custom settings and preferences
  • Markdown support: Write and render Markdown in chat

🔧 Technical Stack

  • Backend: Python with FastAPI for API endpoints
  • Frontend: Vue.js with Tailwind CSS 4 for a modern, responsive UI with Markdown support; Bun serves as the package manager and build tool
  • Database: PostgreSQL/SQLite database for storing user auth, user profiles, chat history, and configurations
  • Reverse Proxy: Caddy with automatic HTTPS (Let's Encrypt) and modern security headers
  • Deployment: Docker containers with docker-compose orchestration

🆕 Recent Updates

  • Migrated from nginx to Caddy: Automatic HTTPS with Let's Encrypt, simplified configuration, no manual certificate management required
  • Simplified deployment: One-command deployment with automatic SSL certificates
  • Enhanced security: Modern TLS configuration and security headers included by default

See CADDY_MIGRATION.md for detailed migration information.

🛠️ Installation

Prerequisites

Before getting started, ensure you have the following installed:

  • Git: To clone the repository
  • Docker & Docker Compose: For containerized deployment
  • Ollama: For local LLM running
  • Python 3.9+: For backend development (if running locally)
  • Node.js: For frontend development (if running locally)
  • Bun: For frontend package management and build (if running locally)
  • UV: For backend dependency management (if running locally)
  • PostgreSQL: For the database (SQLite can be used instead when running locally)

Quick Start with Docker 🐳

  1. Clone the repository

    git clone https://github.com/R-udren/rovertAIChat.git
    cd rovertAIChat
  2. Set up environment variables

    cp .env.example .env
  3. Configure your environment. Edit the .env file with your settings:

    # Database Configuration
    DATABASE_PASSWORD=your_secure_password
    
    # Domain Configuration
    DOMAIN=localhost  # or your domain name
    FRONTEND_ORIGINS=https://localhost  # or https://yourdomain.com
    VITE_API_BASE_URL=https://localhost  # or https://yourdomain.com
    
    # JWT Configuration (generate secure random strings)
    JWT_SECRET_KEY=your_super_secret_jwt_key_here
    JWT_REFRESH_SECRET_KEY=your_super_secret_refresh_key_here
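
    The JWT secrets should be long, independent random strings. One way to generate them (using Python's standard library; any equivalent source of cryptographic randomness works) is:

    ```python
    import secrets

    # Generate two independent 64-character hex secrets, one per key.
    jwt_secret = secrets.token_hex(32)          # for JWT_SECRET_KEY
    jwt_refresh_secret = secrets.token_hex(32)  # for JWT_REFRESH_SECRET_KEY

    print(f"JWT_SECRET_KEY={jwt_secret}")
    print(f"JWT_REFRESH_SECRET_KEY={jwt_refresh_secret}")
    ```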
  4. Set up and run the application

    With Caddy (recommended - automatic HTTPS):

    # Build and start all services
    docker-compose up --build -d

    The application will be available at https://localhost (or your configured domain).

    Note: Caddy automatically handles HTTPS with self-signed certificates for localhost development and Let's Encrypt certificates for production domains.

  5. Production deployment

    For production with automatic Let's Encrypt certificates:

    # 1. Update your .env file
    DOMAIN=yourdomain.com
    VITE_API_BASE_URL=https://yourdomain.com
    FRONTEND_ORIGINS=https://yourdomain.com
    
    # 2. Copy and modify production Caddyfile
    cp Caddyfile.prod Caddyfile
    # Edit Caddyfile and replace yourdomain.com with your actual domain
    
    # 3. Ensure DNS points to your server
    # 4. Deploy
    docker-compose up --build -d
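
    Before deploying, it helps to confirm DNS actually resolves to your server, since Let's Encrypt issuance will fail otherwise. A quick illustrative check (substitute your real domain; "localhost" here is only a placeholder):

    ```python
    import socket

    def resolve(domain: str) -> str:
        """Return the IPv4 address a domain currently resolves to."""
        return socket.gethostbyname(domain)

    # Compare this against your server's public IP before running docker-compose.
    print(resolve("localhost"))  # replace with yourdomain.com
    ```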

Local Development Setup 💻

If you prefer to run the services individually for development:

Backend Setup

  1. Install Python dependencies

    cd backend
    uv sync
  2. Set up the database (or skip this step to use a local SQLite database)

    # Start PostgreSQL with Docker
    docker run -d --name postgres \
      -e POSTGRES_PASSWORD=your_password \
      -e POSTGRES_DB=postgres \
      -p 5432:5432 postgres:17-alpine
  3. Run the backend

    fastapi run src/main.py

Frontend Setup

  1. Install Bun (if not already installed)

    # Windows (PowerShell)
    powershell -c "irm bun.sh/install.ps1 | iex"
    
    # macOS/Linux
    curl -fsSL https://bun.sh/install | bash
  2. Install dependencies and run

    cd frontend
    bun install
    bun run dev

Setting up Ollama Integration 🦙

To use local LLMs with Ollama:

  1. Install Ollama

    Visit https://ollama.com and download the installer

  2. Pull some models (optional; you can also do this from the admin panel)

    ollama pull qwen3:8b
  3. Configure the backend. By default, the backend connects to Ollama at http://localhost:11434
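
To verify the backend host can reach Ollama, you can query its tags endpoint (GET /api/tags lists the installed models). A stdlib-only sketch that returns an empty list when Ollama is unreachable:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

OLLAMA_BASE = "http://localhost:11434"

def list_models(base: str = OLLAMA_BASE) -> list[str]:
    """Return the names of models Ollama has installed, or [] if unreachable."""
    try:
        with urlopen(f"{base}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (URLError, OSError):
        return []

print(list_models())
```

Once a model such as qwen3:8b has been pulled, it should appear in this list.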

Environment Variables Reference 📋

| Variable | Description | Example |
| --- | --- | --- |
| DATABASE_PASSWORD | PostgreSQL password | your_secure_password |
| DOMAIN | Your domain name | localhost or yourdomain.com |
| FRONTEND_ORIGINS | CORS origins for frontend | https://localhost |
| VITE_API_BASE_URL | API base URL for frontend | https://localhost |
| JWT_SECRET_KEY | JWT signing key | your_super_secret_key |
| JWT_REFRESH_SECRET_KEY | JWT refresh token key | your_refresh_secret_key |
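
All of these variables must be set for the stack to start cleanly. A minimal validation sketch (illustrative only, not the project's actual configuration code):

```python
import os

REQUIRED = [
    "DATABASE_PASSWORD",
    "DOMAIN",
    "FRONTEND_ORIGINS",
    "VITE_API_BASE_URL",
    "JWT_SECRET_KEY",
    "JWT_REFRESH_SECRET_KEY",
]

def missing_vars(env=os.environ) -> list[str]:
    """Return which required variables are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

missing = missing_vars()
if missing:
    print("Missing required environment variables:", ", ".join(missing))
```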

Troubleshooting 🔧

Common Issues:

  • SSL certificate errors: Caddy provisions certificates automatically; if issuance fails, check the Caddy container logs and confirm DNS points to your server (manual certificate placement is no longer required since the nginx-to-Caddy migration)
  • Database connection issues: Verify PostgreSQL is running and credentials are correct
  • Ollama not accessible: Ensure Ollama is running on the host machine
  • CORS errors: Check that FRONTEND_ORIGINS matches your domain

Useful Commands:

# View logs
docker-compose logs -f

# Restart services
docker-compose restart

# Rebuild after code changes
docker-compose up --build

# Stop all services
docker-compose down

# Remove all containers and volumes
docker-compose down -v

# Reset database and start fresh
docker-compose down -v && docker-compose up -d --build

🔐 Role-Based Access

rovertChat implements three user roles:

  • Guest: Can only browse the public landing page
  • User: Has persistent chat history and personalized settings
  • Admin: Can upload and manage LLM models, configure system settings, and manage users (first user is automatically created as admin)
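
The three roles form a simple privilege hierarchy. An illustrative sketch of how such a check might look (not the project's actual implementation):

```python
from enum import IntEnum

class Role(IntEnum):
    """Roles ordered by privilege, mirroring the guest/user/admin tiers."""
    GUEST = 0
    USER = 1
    ADMIN = 2

def can_access(role: Role, required: Role) -> bool:
    """A role may access anything at or below its own privilege level."""
    return role >= required

# Guests only reach the public landing page; chat history needs USER.
print(can_access(Role.GUEST, Role.USER))   # False
print(can_access(Role.ADMIN, Role.USER))   # True
```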

🌟 Upcoming Features

These features will be implemented in future releases:

  • Real-time Text-to-Speech and Speech-to-Text: Voice interactions with AI
  • Advanced Markdown support: Real-time rendering and code highlighting
  • Mermaid Diagramming: Create and visualize diagrams directly in chat
  • Function Calling: Execute external functions through chat commands
  • Image Generation: Create images using text prompts

🖥️ Deployment Options

rovertChat is designed to be flexible in deployment:

  • Local development: Run on your personal machine
  • Self-hosted server: Deploy on your own server infrastructure
  • Docker containers: Simple containerized deployment

🔒 Security Considerations

  • HTTPS: SSL/TLS encryption is enabled by default
  • JWT Authentication: Secure token-based authentication
  • Password Hashing: Bcrypt for secure password storage
  • CORS Protection: Configurable origin restrictions
  • Rate Limiting: API rate limiting to prevent abuse
  • SQL Injection Protection: SQLAlchemy ORM with parameterized queries
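
The JWT authentication above rests on HMAC signing: the server signs a header and payload with its secret, and later verifies the signature without any database lookup. The core idea can be illustrated with the standard library alone (a sketch only; it omits expiry and other claims, and a real deployment should use a maintained JWT library):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs do."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict, secret: str) -> str:
    """Produce a compact HS256-style token: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    msg = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret.encode(), msg, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify(token: str, secret: str) -> bool:
    """Recompute the signature and compare in constant time."""
    header, body, sig = token.split(".")
    msg = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret.encode(), msg, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign({"sub": "alice", "role": "user"}, "example-secret")
print(verify(token, "example-secret"))   # True
print(verify(token, "wrong-secret"))     # False
```

Tampering with either the payload or the signature makes verification fail, which is why leaked or weak JWT_SECRET_KEY values compromise the whole scheme.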

📖 API Documentation

Once the backend is running, you can access the interactive API documentation:

  • Swagger UI: https://yourdomain.com/docs
  • ReDoc: https://yourdomain.com/redoc
  • OpenAPI Schema: https://yourdomain.com/openapi.json

🧪 Testing

Run the test suite:

# Backend tests
cd backend
uv run pytest

# Frontend tests (when implemented)
cd frontend
bun test

🤝 Contributing

We welcome contributions! Please follow these steps:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass
  6. Commit your changes (git commit -m 'Add amazing feature')
  7. Push to the branch (git push origin feature/amazing-feature)
  8. Open a Pull Request on GitHub

📝 Development Guidelines

  • Code Style: Follow PEP 8 for Python, ESLint for JavaScript/Vue
  • Commit Messages: Use conventional commit format
  • Testing: Write tests for new features
  • Documentation: Update documentation for API changes

🆘 Support

🙏 Acknowledgments

  • FastAPI: For the excellent Python web framework
  • Vue.js: For the reactive frontend framework
  • Ollama: For making local LLM hosting accessible
  • PostgreSQL: For reliable data storage
  • Tailwind CSS: For beautiful, responsive styling

Made with ❤️ for the self-hosting and privacy-conscious AI community

rovertChat - Where your AI conversations stay yours ( ͡° ͜ʖ ͡°)
