A modern, feature-rich terminal UI chat application with OpenAI integration, built with OpenTUI and React.
New here? Start with the Quick Start Guide | Read our Code of Conduct before contributing
- OpenAI/Azure Integration - Streaming responses with OpenAI and Azure OpenAI support
- LiteLLM-Compatible Providers - Use `openai/...`-style model IDs via LiteLLM proxies
- Session Management - Multiple conversations with persistent history
- Command System - 10+ built-in commands + custom command support
- Smart Mentions - Context, file, code, and docs references
- Autocomplete - Fuzzy suggestions with keyboard navigation
- Themes - Light/Dark toggle
- Settings Persistence - Preferences saved across sessions
- Claude Code-Inspired Design - Clean, professional interface with warm accents
- Fully Responsive - Adapts to any terminal size
- Keyboard-Driven - Efficient workflows without leaving the keyboard
- Smart Scrolling - Auto-scroll with visual feedback
- Overlay Menus - Settings and session list overlays
`/clear`, `/help`, `/model`, `/endpoint`, `/api-key`, `/status`, `/settings`, `/sessions`, `/commands`, `/export`, `/theme`, `/terminal-setup`, and more
Install from npm:
```bash
npm install -g qlaw-cli
```

Or using other package managers:
```bash
# Using yarn
yarn global add qlaw-cli

# Using pnpm
pnpm add -g qlaw-cli

# Using bun
bun add -g qlaw-cli
```

Then run from anywhere:

```bash
qlaw
```

If you want to contribute or develop locally:
```bash
# Clone the repository
git clone https://github.com/Qredence/qlaw-cli.git
cd qlaw-cli

# Install dependencies
bun install

# Copy environment template
cp .env.example .env

# Add your OpenAI API key to .env
# OPENAI_API_KEY=your-key-here

# LiteLLM (default provider) example
# LITELLM_BASE_URL=http://localhost:4000/v1
# LITELLM_API_KEY=your-key-here
# LITELLM_MODEL=openai/gpt-4o-mini
# LITELLM_MODELS=openai/gpt-4o-mini,openai/gpt-4o

# Run locally
bun run start

# Or with auto-reload during development
bun run dev
```

- Type your message in the input field
- Press Enter to send
- AI responds with streaming support
- Use commands by typing `/` for quick actions
- Add mentions by typing `@` for context

Type `/` to see available commands:
- `/clear` - Clear chat history (with confirmation)
- `/help` - Show help information
- `/provider` - Set provider (openai/azure/litellm/custom)
- `/model` - Set the model name
- `/endpoint` - Set the API endpoint base URL
- `/api-key` - Set the API key (masked in status)
- `/tools` - Toggle tool execution (read/list/write/run)
- `/tools perm` - Set tool permissions (allow/ask/deny)
- `/status` - Show current configuration
- `/settings` - Print current settings summary
- `/settings panel` - Open the interactive settings menu
- `/sessions` - View recent sessions
- `/commands` - List custom commands
- `/export` - Export current chat to JSON
- `/theme` - Toggle light/dark theme
- `/terminal-setup` - Terminal keybinding tips
- `/keybindings` - Inspect or edit suggestion navigation shortcuts
- `/mode` - Switch between standard/workflow modes
- `/workflow` - Workflow controls reference
- `/agents` - Show current agent-fleet roles
- `/run` - Kick off the workflow in the active mode
- `/continue` - Continue a workflow handoff
- `/judge` - Invoke the judge agent for a decision
- `/af-bridge` - Configure the Agent Framework bridge base URL
- `/af-model` - Configure the Agent Framework model identifier
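As a quick sketch, the LiteLLM setup from the `.env` example above can also be applied interactively; this assumes `/provider`, `/endpoint`, `/model`, and `/api-key` accept their new value as an inline argument:

```
/provider litellm
/endpoint http://localhost:4000/v1
/model openai/gpt-4o-mini
/api-key your-key-here
/status
```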
- Run `/settings panel` to open the interactive settings overlay (Core API, UI, Agent Framework sections)
- Run `/settings` alone to print the current configuration in the transcript
- Use `↑`/`↓` or `Tab` to highlight a row, `Enter` to edit/toggle, and `Esc` to close
- Text fields launch inline prompts; updates persist to `~/.qlaw-cli/qlaw_settings.json`
- Agent Framework rows mirror `/af-bridge` + `/af-model`, and workflow mode can stay enabled by default
- Update suggestion navigation shortcuts with `/keybindings set <action> <binding>` or `/keybindings reset` (see the sketch after this list)
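A minimal keybindings sketch; the action and binding names used here are hypothetical placeholders, so run `/keybindings` first to see the actual action names:

```
/keybindings                    # inspect current suggestion shortcuts
/keybindings set next ctrl+n    # hypothetical action and binding names
/keybindings reset              # restore the defaults
```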
Type `@` for contextual references:

- `@context <text>` - Add contextual information to your message
- `@file <path>` - Reference a file in your message
- `@code <snippet>` - Include a code snippet in your message
- `@docs <topic>` - Reference documentation in your message

Mentions are automatically formatted to provide structured context to the AI. For example:

- `@code function example() { return true; }` will format as a code snippet
- `@docs API authentication` will format as a documentation reference
- `@file src/index.ts` will inline the file contents (truncated if needed)
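Mentions can be mixed with ordinary prose in a single message; here is a hypothetical example (the question and the focus text are placeholders):

```
@file src/index.ts @context focus on error handling
Can you review how this entry point validates the API key?
```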
When `/tools` is enabled, the assistant can request tool execution using fenced tool blocks.
Permissions follow `allow` | `ask` | `deny` and can be configured via `/tools perm`.
`run_command` executes shell commands and should only be enabled when you trust the model.
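For example, assuming `/tools perm` takes a tool name followed by a permission level (the argument order here is an assumption, not confirmed syntax), shell execution can be locked down while read access stays available:

```
/tools                  # toggle tool execution on or off
/tools perm read allow  # assumed syntax: <tool> <permission>
/tools perm run deny    # keep shell execution disabled unless you trust the model
```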
- `↑`/`↓` - Navigate suggestions
- `Tab` - Autocomplete suggestion
- `Enter` - Send message / Select suggestion
- `Esc` - Cancel input / Close overlays / Exit
- `Ctrl+C` - Force exit
- Quick Start Guide - Get up and running in 3 minutes
- Architecture - Technical design and structure
- UI Reference - Visual interface guide
- Design System - Colors, typography, and components
- Changelog - Version history
- API Integration - Configure OpenAI/Azure/custom backends
- Agent Bridge Example - Python Agent Framework bridge used by AF modes
The next release will focus on the following key areas:
- Improved command autocomplete and suggestions
- Interactive prompts and confirmations
- Enhanced keyboard navigation and shortcuts
- Real-time feedback and visual indicators
- Comprehensive settings panel
- User preferences persistence
- Configurable themes and colors
- API configuration management
- Custom keybindings
- Native integration of agent-framework as core framework
- Native integration of agentic-fleet as core framework
- Seamless agent orchestration and management
- Multi-agent conversation support
- Agent capability discovery
- Streamlined onboarding flow
- Enhanced error messages and help system
- Performance optimizations
- Accessibility enhancements
- Improved session management UI
- Multi-model support (Claude, Gemini, etc.)
- Local LLM integration (Ollama)
- Voice input support
- Image analysis capabilities
- Advanced RAG with vector search
- Plugin marketplace
- Collaborative sessions
- Cloud sync capabilities
We welcome contributions! Please see:
- Contributing Guide - How to contribute
- Code of Conduct - Community standards
- Security Policy - Reporting vulnerabilities
- Publishing Guide - For maintainers: How to publish releases to npm
MIT License - see LICENSE file for details.
Inspiration:
- Claude Code - Design inspiration for the clean, minimal interface and warm accent color scheme
- Cursor - Terminal integration patterns and workflow concepts
Built With:
- OpenTUI - The excellent terminal UI framework that powers this application
- React - Component library for building the interface
- Bun - Fast JavaScript runtime and package manager
- TypeScript - Type-safe development
Special thanks to the OpenTUI team for creating such a powerful and elegant framework for building terminal UIs.
Made with ❤️ by Qredence