Built with the Unix philosophy at its core, TCP lets you pipe any content straight to a model.
(To be honest, this README is a little showy and plenty of TCP features still need to be added or improved 🥺, but I think the project is worth a try.)
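For example, any command's output can become context for a prompt (the `tcp` alias is set up in the installation step below; the command and prompt here are purely illustrative):

git diff | tcp "Write a concise commit message for these changes"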
- 🧠 Multi-Model Mastery: `-m qwen2.5` or `--model deepseek-r1:32b` switches between cutting-edge LLMs like changing shells
- 🕵️ Reasoning Mode (`-r`): two-stage critical thinking, Analyze → Execute, with different temperature settings for each stage
- 💼 PPT Gen Mode (`-p`): create presentation-ready markdown right in the terminal
- 🎭 Dual Model Dialogues: `--model2 llama3.1:8b` for model-vs-model debates (see the example after this list)
- 🛠️ Add any other features on your own: build your personalized AI assistant on top of TCP
- 🛠️ Pipeline Power: `ls -la | tcp -m qwen2.5 "Explain these file permissions"` or `man grep | tcp -r "Summarize key flags"`
- 🔄 Conversation Mode (`-t`): `tcp -t "Debug this Python script" 5`
- 📜 Context-Aware: maintains session history like your favorite shell
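For instance, a model-vs-model debate could be kicked off like this (the model names and prompt are just illustrative):

tcp -m qwen2.5 --model2 llama3.1:8b "Debate: tabs or spaces for indentation?"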
Prerequisites: Python 3.9+, Ollama running
# Clone with speed
git clone https://github.com/bingyang-lei/TerminalCopilot.git && cd TerminalCopilot
# Install dependencies (virtualenv recommended)
pip install -r requirements.txt
# Start your AI engine
ollama serve & # Keep running in background
# Pull models you like
ollama pull qwen2.5
ollama pull deepseek-r1:32b
# (Optional) set alias
echo 'alias tcp="python /yourpath/main.py"' >> ~/.bashrc
source ~/.bashrc

Command Parameters
usage: main.py [-h] [-m MODEL] [--model2 MODEL2] [-r] [-p] [-t PROMPT ROUNDS]

options:
  -h, --help            show this help message and exit
  -m MODEL, --model MODEL
                        Designate a model to use; default is qwen2.5
  --model2 MODEL2       (Optional) Choose another model to talk with the main model
  -r, --reasoning       Enable reasoning mode (more powerful but slower)
  -p, --ppt             Enter PPT generation mode
  -t PROMPT ROUNDS, --talk PROMPT ROUNDS
                        Start dialogue mode with an initial prompt and max rounds
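These flags can also be mixed with piped input; for example, PPT generation from a notes file or a reasoning-mode question might look like this (the file name, model, and prompts are illustrative):

cat meeting_notes.txt | tcp -p "Turn these notes into a slide outline"
tcp -r -m deepseek-r1:32b "Compare SQLite and PostgreSQL for a small CLI tool"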
# Chain with classic tools
find . -name "*.py" | tcp -r "Analyze code patterns"
netstat -tulpn | tcp "Explain these network connections"
man sh | tcp "Give me a more friendly manual"
