A lightweight, fast, and beautiful terminal chat interface for Ollama with MCP (Model Context Protocol) integration.
- 🚀 Fast: Starts in milliseconds, not seconds
- 💡 Smart: Optional integration with system-prompt-composer for enhanced prompts
- 🔧 Extensible: MCP integration for dynamic tool discovery and execution
- 🎨 Beautiful: Rich terminal interface with syntax highlighting and native terminal appearance
- ⌨️ Keyboard-first: Efficient navigation designed for developers
- 📦 Lightweight: No Electron overhead - pure Python performance
- 🔄 Cross-platform: Works on Linux, macOS, and Windows terminals
## Installation

```bash
# Install from PyPI
pip install lit-tui

# Or install from source
git clone https://github.com/Positronic-AI/lit-tui.git
cd lit-tui
python3 -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
pip install -e ".[dev]"
```

### Prerequisites

- Python 3.8+
- Ollama running locally
- A terminal with Unicode support
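Since lit-tui talks to a local Ollama daemon, it can help to confirm the daemon is reachable before launching. A minimal standard-library sketch (the `/api/tags` endpoint is Ollama's model-listing route; the helper name `ollama_models` is ours, not part of lit-tui):

```python
import json
import urllib.request
from typing import List


def ollama_models(host: str = "http://localhost:11434") -> List[str]:
    """Return the names of locally available Ollama models, or [] if unreachable."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except OSError:
        return []  # daemon not running, or wrong host


if __name__ == "__main__":
    models = ollama_models()
    print("Ollama models:", models or "none found (is `ollama serve` running?)")
```

If this prints an empty list, start the daemon with `ollama serve` (or check the `ollama.host` setting described below).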
## Usage

```bash
# Start lit-tui
lit-tui

# Or with a specific model
lit-tui --model llama2

# With debug logging
lit-tui --debug
```

## Configuration

lit-tui stores its configuration in `~/.lit-tui/config.json`. On first run, it creates a default configuration:
```json
{
  "ollama": {
    "host": "http://localhost:11434",
    "default_model": "llama2"
  },
  "ui": {
    "font_size": "medium",
    "show_token_count": true
  },
  "storage": {
    "max_sessions": 100,
    "auto_save": true
  },
  "mcp": {
    "enabled": true,
    "servers": []
  }
}
```

## MCP Integration

lit-tui supports the Model Context Protocol for dynamic tool integration:
```json
{
  "mcp": {
    "enabled": true,
    "servers": [
      {
        "name": "filesystem",
        "command": "mcp-server-filesystem",
        "args": ["--root", "/home/user/projects"]
      },
      {
        "name": "git",
        "command": "mcp-server-git"
      }
    ]
  }
}
```

## Keyboard Shortcuts

| Key | Action |
|---|---|
| Ctrl+N | New chat session |
| Ctrl+O | Open session |
| Ctrl+Q | Quit |
| ESC | Quit with confirmation |
| Enter | Send message |
| Shift+Enter | New line in message |
| Ctrl+/ or F1 | Show help |
## Theming

lit-tui uses your terminal's default theme and colorscheme for a native look and feel. The interface features transparent backgrounds that blend seamlessly with your terminal environment, respecting your personal terminal configuration and color preferences.
## Architecture

lit-tui is designed as a reference implementation showcasing:
- Clean async architecture using Python's asyncio
- Direct protocol integration with Ollama and MCP
- Terminal-native UI with Textual framework
- Modular design with clear separation of concerns
- Performance optimization for responsive chat experience
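The streaming, non-blocking design above can be sketched with plain asyncio (hypothetical names, not lit-tui's actual internals; a stand-in generator replaces the real Ollama response stream):

```python
import asyncio
from typing import AsyncIterator, List


async def token_stream(prompt: str) -> AsyncIterator[str]:
    """Stand-in for a streaming Ollama response (hypothetical)."""
    for token in prompt.split():
        await asyncio.sleep(0)  # yield to the event loop, like a network read would
        yield token + " "


async def render_reply(prompt: str) -> str:
    """Append tokens as they arrive instead of blocking on the full completion."""
    parts: List[str] = []
    async for token in token_stream(prompt):
        parts.append(token)  # a real TUI would update the chat widget here
    return "".join(parts)


if __name__ == "__main__":
    print(asyncio.run(render_reply("streaming keeps the UI responsive")))
```

Rendering each token as it arrives, rather than waiting for the full completion, is what keeps the chat view responsive while a model generates.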
## Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.
### Development Setup

```bash
git clone https://github.com/Positronic-AI/lit-tui.git
cd lit-tui
python3 -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
pip install -e ".[dev]"
pre-commit install

# Or use the development script for easy setup
./dev.sh
```

Run the test suite with:

```bash
pytest
```

## License

MIT License - see LICENSE for details.
## Acknowledgments

- Built with Textual - an amazing Python TUI framework
- Inspired by lazygit, k9s, and other excellent TUI applications
- MCP integration follows the Model Context Protocol specification
Made with ❤️ by LIT - Advancing the field of AI through open-source innovation
