An intelligent Python debugging assistant powered by AI agents built with LangChain and a locally hosted Llama 3.2 model served by Ollama. Fixie automatically detects bugs, analyzes code logic, and suggests fixes through a coordinated multi-agent workflow.
- 🤖 Multi-Agent Architecture: Specialized agents for syntax checking, logic reasoning, and fix suggestions
- 🔍 Intelligent Bug Detection: Identifies syntax errors, runtime issues, and logic problems
- 📍 Line-by-Line Analysis: Pinpoints exact locations of bugs with line numbers
- 🎯 Confidence Scoring: Provides confidence levels for suggested fixes
- 🔄 LangGraph Workflow: Orchestrated agent coordination using LangGraph
- 🦙 Local AI: Powered by Ollama Llama 3.2 for privacy and offline usage
- Python 3.8+ (3.10+ preferred)
- Ollama installed and running
- Git (for cloning)
1. Clone the repository:

   ```shell
   git clone https://github.com/kawish918/Fixie-AI-Agent-Debugger.git
   cd fixie-ai-debugger
   ```

2. Create and activate a virtual environment:

   ```shell
   python -m venv fixie
   fixie\Scripts\activate
   ```

3. Install dependencies:

   ```shell
   pip install langchain langgraph
   ```

4. Install and set up Ollama. Visit Ollama's official website for installation instructions, or on Windows/macOS/Linux:

   ```shell
   # Download from: https://ollama.ai/download
   # Then pull the Llama model:
   ollama pull llama3.2
   ```

5. Verify Ollama is running:

   ```shell
   ollama list  # Should show llama3.2 in the list
   ```

6. Run Fixie on the example buggy code:

   ```shell
   python main.py
   ```

7. Add your own buggy Python files:

   ```python
   # Place your .py file in the examples/ directory,
   # then update main.py to point to your file:
   code = read_python_file('examples/your_buggy_file.py')
   ```
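The `read_python_file` helper referenced above lives in `core/input_handler.py`. Its exact implementation isn't shown here, but a minimal sketch of such a file-reading utility (assumed behavior: return the file's text as a string) might look like:

```python
from pathlib import Path

def read_python_file(path: str) -> str:
    """Read a Python source file and return its contents as a string.

    Hypothetical sketch: the real helper in core/input_handler.py may
    differ (encoding handling, validation, error messages, etc.).
    """
    return Path(path).read_text(encoding="utf-8")
```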
```
fixie-ai-debugger/
├── agents/
│   ├── fix_suggester.py     # AI agent for generating fixes
│   ├── logic_reasoner.py    # AI agent for understanding code logic
│   └── syntax_checker.py    # AI agent for detecting syntax errors
├── core/
│   ├── agent_runner.py      # Simple agent orchestration
│   ├── input_handler.py     # File reading utilities
│   ├── langgraph_runner.py  # LangGraph workflow management
│   └── llama_interface.py   # Ollama API interface
├── examples/
│   └── buggy_code.py        # Sample buggy code for testing
├── main.py                  # Main application entry point
└── README.md
```
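`core/llama_interface.py` wraps the Ollama API, but its internals aren't reproduced here. A minimal sketch, assuming Ollama's default local REST endpoint at `localhost:11434` (the project may instead use a LangChain wrapper; the function names below are hypothetical):

```python
import json
import urllib.request

# Ollama's default local /api/generate endpoint (an assumption about
# this setup; adjust if your Ollama server runs elsewhere).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # Request body for a single non-streaming completion.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_llama(prompt: str, model: str = "llama3.2") -> str:
    # POST the prompt to the local Ollama server and return the
    # generated text from the "response" field of the reply.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Keeping the HTTP details behind one small interface like this is what lets every agent share the same local model.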
Fixie uses a coordinated multi-agent workflow:
1. Syntax Checker Agent 🔍
   - Analyzes code for syntax and runtime errors
   - Identifies problematic lines and severity levels
2. Logic Reasoner Agent 🧠
   - Understands the intended purpose of the code
   - Provides context for fix generation
3. Fix Suggester Agent 🛠️
   - Combines the bug report and logic analysis
   - Generates complete, executable fix suggestions
   - Provides confidence scores
4. LangGraph Orchestration 🔄
   - Manages agent workflow and data flow
   - Ensures proper sequencing and state management
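The data flow above can be sketched as a plain sequential pipeline. This is an illustrative stand-in, not Fixie's actual LLM-backed agents: the stub functions below mimic what each agent consumes and produces, while in the real project the sequencing and state passing are handled by LangGraph in `core/langgraph_runner.py`.

```python
def syntax_checker(code: str) -> dict:
    # Stand-in for the Syntax Checker Agent: try to compile the code
    # and report the first syntax error found, with its line number.
    try:
        compile(code, "<input>", "exec")
        return {"bug": None}
    except SyntaxError as e:
        return {"bug": f"SyntaxError: {e.msg}", "line": e.lineno}

def logic_reasoner(code: str) -> dict:
    # Stand-in for the Logic Reasoner Agent: in Fixie, this step asks
    # the LLM what the code is *trying* to do.
    return {"intent": "unknown (the LLM would infer this)"}

def fix_suggester(bug_report: dict, logic: dict) -> dict:
    # Stand-in for the Fix Suggester Agent: combines both analyses
    # into a single result with a (here trivial) confidence score.
    return {
        "bug_report": bug_report,
        "intent": logic["intent"],
        "confidence": 0.0 if bug_report["bug"] else 1.0,
    }

def run_pipeline(code: str) -> dict:
    # The strict ordering enforced by LangGraph in the real workflow:
    # syntax check -> logic reasoning -> fix suggestion.
    bug_report = syntax_checker(code)
    logic = logic_reasoner(code)
    return fix_suggester(bug_report, logic)
```

Modeling each agent as a function over a shared state dict is also why the pipeline translates naturally into a LangGraph graph: each node reads and extends the state.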
```
--- Fixie AI Debugger ---

Original Code:

def add_nums(a, b):
    return a + b + c

🔍 Debug Results:
==================================================
🐛 Bug Found: NameError - variable 'c' is not defined
📍 Line Number: 2
⚠️ Severity: HIGH
--------------------------------------------------
📊 Fix Confidence: 0.95
💡 Explanation: Variable 'c' is undefined in the function
🔧 Suggested Fix:

def add_nums(a, b):
    return a + b
```
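A report like the one above is easy to assemble from a structured agent result. The field names below are an assumption for illustration, not the project's actual schema:

```python
def format_report(report: dict) -> str:
    # Render a structured debug result in the style of Fixie's output.
    # Keys ("bug", "line", "severity", "confidence") are hypothetical.
    lines = [
        f"🐛 Bug Found: {report['bug']}",
        f"📍 Line Number: {report['line']}",
        f"⚠️ Severity: {report['severity']}",
        f"📊 Fix Confidence: {report['confidence']:.2f}",
    ]
    return "\n".join(lines)
```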
Update the model in any agent:

```python
# In agents/fix_suggester.py
class FixSuggester:
    def __init__(self, model="llama3.2"):  # Change model here
        self.model = model
```

Modify the prompts in each agent class to suit your debugging needs:

```python
# In agents/syntax_checker.py
def check(self, code: str) -> dict:
    prompt = f"""Your custom prompt here..."""
```

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Support for multiple programming languages
- Integration with popular IDEs
- Web interface for easier usage
- Code execution validation
- Integration with static analysis tools
- Custom rule definitions
- Batch processing of multiple files
Ollama not found:

```shell
# Make sure Ollama is installed and in PATH
ollama --version
```

Model not available:

```shell
# Pull the required model
ollama pull llama3.2
```

LangChain import errors:

```shell
# Install missing dependencies
pip install langchain langgraph
```

Permission errors on Windows: run as administrator or check your antivirus settings.

- LangChain Documentation
- LangGraph Documentation
- Ollama Documentation
- Llama 3.2 Model Card
- Get LangSmith API Key
- LangGraph Local Server
This project is licensed under the MIT License - see the LICENSE file for details.
Star ⭐ this repo if you find it helpful!