ai-cli is a command-line interface (CLI) tool that lets you go from zero to AI-powered in seconds in a safe, automated, and tailored way.
✨ Features | 🚀 Getting Started | 🎥 Demos
- Policies: set rules for AI interactions and tool usage
- Discovery: find and use inference providers and tools automatically
- Configure AI-powered editors*: help set up standard AI-powered editors with discovered tools
- Setup environment: set up your environment with the necessary credentials, etc.
- Extensible: add plugins to extend functionality
- Multi-model support: support for different LLM inference providers (Google Gemini, LMStudio, Ollama, Ramalama)
The quickest way to get started is by exporting a Google Gemini API key (`export GEMINI_API_KEY=$YOUR_KEY`) or by pulling one of the supported Ollama models.
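Either option can be prepared from the shell. A minimal sketch (the Ollama model name below is only an illustration, not necessarily one of the supported models):

```shell
# Option 1: Google Gemini -- export your API key for the current session
export GEMINI_API_KEY="$YOUR_KEY"

# Option 2: local inference -- pull an Ollama model
# (the model name here is just an example)
ollama pull llama3.2
```

Exported variables only last for the current shell session; add the `export` line to your shell profile to make it permanent.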
If you have Node.js installed, you can run the CLI directly by using `npx`:
```shell
# Show the available commands
npx npm-ai-cli@latest help
# Start a TUI-based chat session
npx npm-ai-cli@latest chat
# Discover available tools and providers
npx npm-ai-cli@latest discover
```
If you have Python installed, you can run the CLI directly by using `uvx`:
```shell
# Show the available commands
uvx python-ai-cli@latest help
# Start a TUI-based chat session
uvx python-ai-cli@latest chat
# Discover available tools and providers
uvx python-ai-cli@latest discover
```
If you have Go installed, you can install the CLI by running:
```shell
go install github.com/manusa/ai-cli/cmd/ai-cli@latest
```
After installation, make sure `$GOPATH/bin` is in your system's `PATH` so that you can run the `ai-cli` command directly from your terminal.
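If the command isn't found, a typical way to add the directory (assuming a default Go setup) is:

```shell
# Append Go's binary directory to PATH for the current session;
# add this line to your shell profile (e.g. ~/.bashrc or ~/.zshrc)
# to make it permanent
export PATH="$PATH:$(go env GOPATH)/bin"
```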
```shell
# Show the available commands
ai-cli help
# Start a TUI-based chat session
ai-cli chat
# Discover available tools and providers
ai-cli discover
```
You can also install the CLI manually by downloading a binary compatible with your OS from the latest release.
Note
For macOS users: you might need to run `xattr -rc /path/to/ai-cli` to remove the quarantine attribute.
We don't sign the binaries yet, but binary signing is on our roadmap.