Ollama Chat 🚀

Lightweight, fast web interface for Ollama - your local AI companion

A minimal, feature-rich chat interface that brings the power of local AI models to your browser. Built with vanilla JavaScript for maximum performance and simplicity.

Just 3 small files - index.html, code.js, styles.css - that's all you need for a full-featured Ollama chat interface!

✨ Features

  • Real-time streaming - Watch AI responses generate live
  • Markdown support - Full Markdown rendering with syntax highlighting
  • Code highlighting - Automatic language detection and syntax highlighting
  • Responsive design - Works perfectly on desktop, tablet, and mobile
  • Dark/Light themes - Automatic theme switching with persistent settings
  • Model management - Easy switching between different Ollama models
  • System prompts - Customizable system prompts for different use cases
  • Context management - Configurable context window sizes
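The streaming feature talks to Ollama's `/api/chat` endpoint, which returns newline-delimited JSON chunks. A minimal sketch of that flow in vanilla JavaScript (function names are illustrative, not the actual `code.js` implementation, and it assumes each chunk contains whole lines - a robust version would buffer partial lines):

```javascript
// Extract the text content from one NDJSON chunk of an Ollama
// /api/chat streaming response. Each line is a JSON object like
// {"message":{"content":"..."},"done":false}.
function extractContent(ndjsonChunk) {
  return ndjsonChunk
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line).message?.content ?? "")
    .join("");
}

// Stream a chat completion and pass each text fragment to onToken.
// (Async iteration over res.body works in Node 18+ and recent
// browsers; older browsers need a ReadableStream reader loop.)
async function streamChat(model, messages, onToken) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: true }),
  });
  const decoder = new TextDecoder();
  for await (const chunk of res.body) {
    onToken(extractContent(decoder.decode(chunk, { stream: true })));
  }
}
```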

🛠 Quick Start

Prerequisites

  • Ollama installed and running
  • A modern web browser

Installation

  1. Clone the repository

    git clone https://github.com/jenissimo/ollama-js-template.git
    cd ollama-js-template
  2. Start Ollama server

    ollama serve
  3. Launch the web interface

    python3 -m http.server 8000
    # or
    npx http-server -p 8000
    # or
    php -S localhost:8000
  4. Open your browser - Navigate to http://localhost:8000
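If the browser console shows CORS errors when the page calls the local Ollama API, you can allow the page's origin with Ollama's `OLLAMA_ORIGINS` environment variable (shown here for the port used above):

```shell
# Allow the locally served page to call the Ollama API.
OLLAMA_ORIGINS="http://localhost:8000" ollama serve
```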

📖 Usage

  1. Start a conversation - Type your message in the input field
  2. Send messages - Press Enter to send or Shift+Enter for new lines
  3. Switch modes - Toggle between streaming and waiting modes in settings
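The Enter/Shift+Enter behavior comes down to a single keydown check. A sketch of that logic (the element id and send function are illustrative, not taken from `code.js`):

```javascript
// Decide whether a keydown event should send the message:
// plain Enter sends, Shift+Enter inserts a newline instead.
function shouldSend(event) {
  return event.key === "Enter" && !event.shiftKey;
}

// Wire it to the input field (element id is illustrative).
if (typeof document !== "undefined") {
  document.querySelector("#message-input")?.addEventListener("keydown", (event) => {
    if (shouldSend(event)) {
      event.preventDefault(); // keep the newline out of the textarea
      // sendMessage(); // hypothetical send function
    }
  });
}
```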

⚙️ Configuration

Access settings via the gear icon in the header:

  • Model Selection - Choose from available Ollama models
  • System Prompt - Customize the AI's behavior
  • Context Size - Adjust memory window (512-32768 tokens)
  • Temperature - Control response creativity (0-2)
  • Streaming - Enable/disable real-time responses
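Under the hood, these settings map onto Ollama's request fields: `num_ctx` for context size and `temperature` are real Ollama `options`, the system prompt becomes a leading `system` message, and streaming toggles the `stream` flag. A sketch of how a request body might be assembled (the shape of the settings object is illustrative, not the actual `code.js` one):

```javascript
// Build an Ollama /api/chat request body from saved UI settings.
// num_ctx and temperature are real Ollama options; the settings
// object's field names here are assumptions for illustration.
function buildRequestBody(settings, messages) {
  return {
    model: settings.model,
    messages: settings.systemPrompt
      ? [{ role: "system", content: settings.systemPrompt }, ...messages]
      : messages,
    stream: settings.streaming,
    options: {
      num_ctx: settings.contextSize,     // 512-32768 tokens
      temperature: settings.temperature, // 0-2
    },
  };
}
```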

🏗 Architecture

File Structure

ollama-js-template/
├── index.html          # Main HTML file
├── styles.css          # Styles and themes
├── code.js             # Core JavaScript logic
├── README.md           # Documentation
└── LICENSE             # MIT License

Technology Stack

Plain HTML, CSS, and vanilla JavaScript - no framework and no build step required.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


Made with ❤️ for the AI community
