This is an enhanced fork of the original Task Master project by @eyaltoledano & @RalphEcom. We extend our sincere thanks to the original authors for their excellent work in creating the foundation for this project.
This fork enhances Task Master with a robust, OpenAI-compatible AI client system that supports both cloud providers and local LLM endpoints. Key improvements include:
- Flexible AI Provider Support: Works with OpenAI, local endpoints (Ollama, LocalAI), and other OpenAI-compatible APIs
- Advanced Configuration Management: Environment-based configuration with support for multiple providers
- Robust Error Handling: Implements retry mechanisms, circuit breakers, and detailed error reporting
- Token Usage Tracking: Built-in monitoring and reporting of token usage across providers
- Streaming Support: Efficient handling of streaming responses with cancellation capabilities
- Provider-Agnostic Interface: Unified API for seamless switching between different AI providers
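Concretely, the provider-agnostic design means any OpenAI-compatible endpoint is addressed the same way: a base URL plus a model name. A minimal sketch of the idea (the `makeClient` helper and its shape are hypothetical illustrations, not the fork's actual API):

```javascript
// Hypothetical sketch of a provider-agnostic client shape.
// Any OpenAI-compatible endpoint (OpenAI, Ollama, LocalAI) only
// differs in its base URL and model name; the chat route is shared.
function makeClient({ provider, baseURL, model }) {
  return {
    provider,
    model,
    // Standard OpenAI-compatible chat completions route
    endpoint: `${baseURL}/v1/chat/completions`,
  };
}

// A local Ollama endpoint and a cloud endpoint look identical to callers:
const local = makeClient({
  provider: "ollama",
  baseURL: "http://localhost:11434",
  model: "qwen3",
});
```

Switching providers then becomes a configuration change rather than a code change, which is what the environment variables below control.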
- Ollama installed locally (for running Qwen and other models)
- Optional: API keys for cloud providers (Anthropic, OpenAI, etc.)
- Node.js 18 or higher
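Before installing, you can sanity-check the Node.js requirement from a terminal (this snippet is a convenience sketch, not part of the tool):

```shell
# Check that the installed Node.js major version is 18 or higher.
major=$(node --version | sed 's/^v\([0-9]*\)\..*/\1/')
if [ "$major" -ge 18 ] 2>/dev/null; then
  echo "Node.js $major detected: OK"
else
  echo "Node.js 18 or higher required"
fi
```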
MCP (Model Context Protocol) provides the easiest way to get started with Local Task Master directly in your editor.
- Install and Start Ollama
```shell
# Install Ollama (if not already installed)
# Visit https://ollama.ai for installation instructions

# Pull the Qwen model
ollama pull qwen3

# Start Ollama in a separate terminal
ollama serve
```
- Add the MCP config to your editor (Cursor is recommended, but it works with other editors):
```json
{
  "mcpServers": {
    "taskmaster": {
      "command": "npx",
      "args": ["-y", "--package=local-task-master", "local-task-master"],
      "env": {
        "AI_PROVIDER": "ollama",
        "AI_BASE_URL": "http://localhost:11434",
        "MODEL": "qwen3",
        "MAX_TOKENS": "40960",
        "TEMPERATURE": "0.7",
        "DEFAULT_SUBTASKS": "5",
        "DEFAULT_PRIORITY": "medium",
        "DEBUG": "false",
        "LOG_LEVEL": "info",
        "TRACK_USAGE": "true",
        "MAX_RETRIES": "3",
        "RETRY_DELAY": "1000",
        "MAX_CONCURRENT": "4"
      }
    }
  }
}
```
- Enable the MCP in your editor
- Prompt the AI to initialize Local Task Master:

```
Can you please initialize taskmaster into my project?
```
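The same MCP entry can point at a cloud provider instead of Ollama. A hypothetical variant using OpenAI (the provider value, model name, and placeholder key below are illustrative assumptions, not tested defaults):

```json
{
  "mcpServers": {
    "taskmaster": {
      "command": "npx",
      "args": ["-y", "--package=local-task-master", "local-task-master"],
      "env": {
        "AI_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_MODEL": "gpt-4o",
        "MAX_TOKENS": "8192",
        "TEMPERATURE": "0.7"
      }
    }
  }
}
```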
- Use common commands directly through your AI assistant:

```
Can you parse my PRD at scripts/prd.txt?
What's the next task I should work on?
Can you help me implement task 3?
Can you help me expand task 4?
```

```shell
# Install globally
npm install -g local-task-master

# OR install locally within your project
npm install local-task-master
```

```shell
# If installed globally
task-master init

# If installed locally
npx task-master-init
```

This will prompt you for project details and set up a new project with the necessary files and structure.
```shell
# Initialize a new project
task-master init

# Parse a PRD and generate tasks
task-master parse-prd your-prd.txt

# List all tasks
task-master list

# Show the next task to work on
task-master next

# Generate task files
task-master generate
```

In addition to the standard Task Master environment variables, this fork supports the following AI-related configuration options:
- AI Provider Configuration:
  - `ANTHROPIC_API_KEY`: Your Anthropic API key
  - `OPENAI_API_KEY`: Your OpenAI API key
  - `AI_BASE_URL`: Base URL for custom OpenAI-compatible endpoints
  - `OPENAI_MODEL`: Default model for OpenAI requests
  - `LOCAL_MODEL`: Model name for local LLM endpoints
- Performance Settings:
  - `MAX_RETRIES`: Maximum retry attempts for failed API calls
  - `RETRY_DELAY`: Initial delay between retries (ms)
  - `MAX_CONCURRENT`: Maximum concurrent API requests
  - `CIRCUIT_BREAKER_THRESHOLD`: Failure threshold for the circuit breaker
- Monitoring:
  - `TRACK_USAGE`: Enable/disable token usage tracking
  - `USAGE_LOG_PATH`: Path for token usage logs
  - `DEBUG`: Enable detailed debug logging
  - `LOG_LEVEL`: Set logging verbosity
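To make the retry settings concrete, here is a minimal sketch, not the fork's actual implementation, of how `MAX_RETRIES` and `RETRY_DELAY` commonly combine into exponential backoff:

```javascript
// Illustrative retry helper with exponential backoff (assumption:
// this mirrors what MAX_RETRIES and RETRY_DELAY configure; it is
// not the fork's actual code). maxRetries retries follow the first
// attempt, and the delay doubles after each failure.
async function withRetries(fn, maxRetries = 3, retryDelay = 1000) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === maxRetries) throw err; // retries exhausted
      const delay = retryDelay * 2 ** attempt; // 1000, 2000, 4000 ms, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Under this scheme, with `MAX_RETRIES=3` and `RETRY_DELAY=1000`, a persistently failing call would be retried after roughly 1000, 2000, and 4000 ms before the error surfaces to the caller.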
For more detailed information, check out the documentation in the docs directory:
- Configuration Guide - Set up environment variables and customize Local Task Master
- Tutorial - Step-by-step guide to getting started with Local Task Master
- Command Reference - Complete list of all available commands
- Task Structure - Understanding the task format and features
- Example Interactions - Common Cursor AI interaction examples
- AI Client Guide - Detailed guide for the enhanced AI client system
- Provider Integration - Adding support for new AI providers
Try running it with Node directly:

```shell
node node_modules/local-task-master/scripts/init.js
```

Or clone the repository and run:

```shell
git clone https://github.com/eyaltoledano/claude-task-master.git
cd claude-task-master
node scripts/init.js
```

Local Task Master is licensed under the MIT License with Commons Clause. This means you can:
✅ Allowed:
- Use Local Task Master for any purpose (personal, commercial, academic)
- Modify the code
- Distribute copies
- Create and sell products built using Local Task Master
❌ Not Allowed:
- Sell Local Task Master itself
- Offer Local Task Master as a hosted service
- Create competing products based on Local Task Master
See the LICENSE file for the complete license text and further details.