Don is an AI agent that connects Large Language Models (LLMs) directly to command-line tools, enabling autonomous task execution without requiring a separate MCP client.
- Direct LLM connectivity: Connect to OpenAI, Anthropic, Ollama, and other LLM providers
- Multi-agent architecture: Uses orchestrator and tool-runner agents for complex task execution
- RAG support: Retrieval-Augmented Generation for document-based Q&A
- Flexible configuration: YAML-based configuration with environment variable substitution
- Multiple retrieval strategies: Chunked embeddings, BM25, and hybrid search
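Hybrid search in the feature list above typically blends a dense (embedding-similarity) ranking with a sparse (BM25) ranking. The Go sketch below is only an illustration of that general technique, not don's actual implementation; the min-max normalization and the `alpha` blend weight are assumptions.

```go
package main

import "fmt"

// normalize rescales a score slice to [0, 1] via min-max normalization,
// so dense and sparse scores become comparable before blending.
func normalize(scores []float64) []float64 {
	lo, hi := scores[0], scores[0]
	for _, s := range scores {
		if s < lo {
			lo = s
		}
		if s > hi {
			hi = s
		}
	}
	out := make([]float64, len(scores))
	if hi == lo {
		return out // all scores equal: no ranking signal from this scorer
	}
	for i, s := range scores {
		out[i] = (s - lo) / (hi - lo)
	}
	return out
}

// hybrid blends normalized dense (embedding-similarity) and sparse (BM25)
// scores, giving weight alpha to the dense side.
func hybrid(dense, sparse []float64, alpha float64) []float64 {
	d, s := normalize(dense), normalize(sparse)
	out := make([]float64, len(d))
	for i := range out {
		out[i] = alpha*d[i] + (1-alpha)*s[i]
	}
	return out
}

func main() {
	// Hypothetical scores for three document chunks.
	dense := []float64{0.82, 0.61, 0.40} // cosine similarities
	sparse := []float64{1.2, 4.7, 0.3}   // raw BM25 scores
	fmt.Println(hybrid(dense, sparse, 0.5))
}
```

With `alpha = 0.5` both rankings contribute equally; a higher `alpha` favors semantic similarity over exact term matches.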
- Create an agent configuration file at `~/.don/agent.yaml`:

  ```yaml
  agent:
    models:
      - model: "gpt-4o"
        class: "openai"
        name: "gpt-4o"
        default: true
        api-key: "${OPENAI_API_KEY}"
        api-url: "https://api.openai.com/v1"
  ```
- Set your API key:

  ```shell
  export OPENAI_API_KEY="sk-..."
  ```
- Run the agent with a tools configuration:

  ```shell
  don --tools=examples/tools.yaml "Help me debug this issue"
  ```
Install with Go:

```shell
go install github.com/inercia/don@latest
```

Or build from source:

```shell
git clone https://github.com/inercia/don
cd don
make build
```

Common invocations:

```shell
# Run with a specific model
don --tools=tools.yaml --model gpt-4o "Your question here"

# One-shot mode (exit after response)
don --tools=tools.yaml --once "What's the disk usage?"

# Interactive mode
don --tools=tools.yaml

# Enable RAG sources from config
don --tools=tools.yaml --rag=docs "What does the documentation say about X?"

# Show agent configuration
don info

# Create default configuration
don config create

# Show current configuration
don config show
```

See the Configuration Guide for detailed configuration options.
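The `--tools` flag points at a YAML file describing the command-line tools the agent may invoke. The authoritative schema is in the Configuration Guide; the fragment below is only a hypothetical sketch, and every key in it (`tools`, `name`, `description`, `command`) is an assumption rather than a confirmed field:

```yaml
# Hypothetical tools file; field names are illustrative only.
tools:
  - name: "disk-usage"
    description: "Report disk usage for the current directory"
    command: "du -sh ."
```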
- Usage Guide - Getting started and basic usage
- Configuration - Agent and model configuration
- RAG Guide - Using Retrieval-Augmented Generation
- Architecture - Technical architecture and design
This project is licensed under the MIT License - see the LICENSE file for details.