
🚀 LangGraph Agent System

A production-ready, scalable multi-agent system built with LangGraph Platform, featuring specialized agents, multi-provider LLM support, and complete LangGraph Studio compatibility.

✨ Key Features

🎯 Multi-Agent Architecture

  • Orchestrator Agent: Intelligent routing to specialized sub-agents
  • Multipurpose Bot: Math calculations, chitchat, and headphones expertise
  • Hungry Services: Food search, recipes, and restaurant recommendations
  • Embedding Service: Vector storage and knowledge management
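For intuition, the orchestrator's job can be sketched as a dispatch function that maps an incoming message to a sub-agent. The keyword table below is a made-up stand-in purely for illustration; the actual orchestrator routes with an LLM:

```python
# Toy illustration of orchestrator-style routing (NOT the repo's actual
# implementation, which routes with an LLM rather than keywords).
def route(message: str) -> str:
    """Pick a sub-agent name for an incoming message."""
    keyword_map = {
        "recipe": "hungry",
        "restaurant": "hungry",
        "calculate": "multipurpose",
        "headphones": "multipurpose",
        "embed": "embedding",
    }
    lowered = message.lower()
    for keyword, agent in keyword_map.items():
        if keyword in lowered:
            return agent
    return "general"  # fall back to the general assistant

print(route("Find me a pasta recipe"))  # hungry
print(route("Calculate 15% of 85"))     # multipurpose
```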

🔧 Multi-Provider LLM Support

  • OpenAI (GPT-4.1, GPT-5)
  • Anthropic (Claude 4)
  • Google (Gemini)
  • Ollama (Local models)
  • Groq, Cohere, Mistral

🏗️ Production Features

  • LangGraph Studio compatibility
  • Persistent checkpointing (Memory, SQLite, PostgreSQL)
  • Vector storage (Chroma, Pinecone, Weaviate)
  • Streaming support
  • Human-in-the-loop capabilities
  • Time-travel debugging
  • Docker deployment ready

🚀 Quick Start

Prerequisites

  • Python 3.11+
  • LangSmith API key (free at smith.langchain.com)
  • At least one LLM provider API key

1. Clone & Setup

# Clone the repository
git clone https://github.com/shamspias/langgraph-agent-system.git
cd langgraph-agent-system

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -e .

2. Configure Environment

# Copy example environment
cp .env.example .env

# Edit .env with your API keys
nano .env  # or use your preferred editor

Required settings:

LANGSMITH_API_KEY=your-langsmith-key-here
PRIMARY_PROVIDER=openai  # or anthropic, ollama, etc.
OPENAI_API_KEY=your-openai-key-here  # or other provider key
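As a quick sanity check before starting the server, you can verify that the required variables are present. This helper is hypothetical (not part of the repo), shown only to make the requirements explicit:

```python
import os

# Required settings from above; at least one provider key is also needed.
REQUIRED = ["LANGSMITH_API_KEY", "PRIMARY_PROVIDER"]

def missing_settings(env=None):
    """Return the names of required settings that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

print(missing_settings({"LANGSMITH_API_KEY": "ls-...", "PRIMARY_PROVIDER": "openai"}))  # []
```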

3. Run with LangGraph Studio

# Run with LangGraph Studio (recommended)
langgraph dev --tunnel

# This will output:
# Ready!
# - API: http://localhost:2024
# - Docs: http://localhost:2024/docs
# - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=...

Click the Studio URL to open the visual interface!

🎮 Usage Modes

1. LangGraph Studio (Visual Interface)

langgraph dev --tunnel

Then open the provided Studio URL in your browser.

2. CLI Mode (Interactive Terminal)

python -m src.main cli

Example session:

[orchestrator]> What's 234 * 567?
[Multipurpose Response]
Let me calculate that for you:
Expression: 234*567
Result: **132678**

[orchestrator]> /switch hungry
✓ Switched to hungry agent

[hungry]> Find me Italian recipes
[Hungry Response]
Here are some delicious Italian recipes...

3. API Server Mode

python -m src.main api
# Or with uvicorn
uvicorn src.api.server:create_app --factory --reload --host 0.0.0.0 --port 8000

API endpoints:

  • POST /chat - Chat with agents
  • POST /embed - Embed content
  • POST /search - Search vector store
  • GET /health - Health check
  • GET /agents - List available agents

4. Docker Deployment

# Build and run with Docker Compose
docker-compose up -d

# Run specific services
docker-compose up -d langgraph-app postgres redis

# With Ollama for local models
docker-compose --profile ollama up -d

🏗️ Architecture

┌─────────────────────┐
│   LangGraph Studio  │
│    (Visual UI)      │
└──────────┬──────────┘
           │
┌──────────▼──────────┐
│   Orchestrator      │
│   (Main Router)     │
└──────────┬──────────┘
           │
    ┌──────┴──────┬──────────┬────────────┐
    │             │          │            │
┌───▼───┐    ┌────▼───┐  ┌───▼───┐   ┌────▼─────┐
│ Multi │    │ Hungry │  │ Embed │   │ General  │
│purpose│    │Services│  │Service│   │Assistant │
└───┬───┘    └────────┘  └───────┘   └──────────┘
    │
┌───┴───┬─────────┬──────────┐
│ Math  │Chitchat │Headphones│
│Agent  │ Agent   │  Agent   │
└───────┴─────────┴──────────┘

🔧 Configuration

LLM Providers

Configure your preferred provider in .env:

OpenAI

PRIMARY_PROVIDER=openai
OPENAI_API_KEY=sk-...
DEFAULT_MODEL=gpt-4o-mini

Anthropic

PRIMARY_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-...
DEFAULT_MODEL=claude-3-5-sonnet-latest

Ollama (Local)

PRIMARY_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3.2

Memory & Storage

PostgreSQL (Production)

MEMORY_TYPE=postgres
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=langgraph_memory
POSTGRES_USER=postgres
POSTGRES_PASSWORD=secure-password

SQLite (Development)

MEMORY_TYPE=sqlite
MEMORY_DB_PATH=./data/memory.db

Vector Storage

VECTOR_STORE_TYPE=chroma
VECTOR_DB_PATH=./data/vector_store
CHUNK_SIZE=500
CHUNK_OVERLAP=50
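CHUNK_SIZE and CHUNK_OVERLAP control how documents are split before embedding: each chunk holds at most CHUNK_SIZE characters, and consecutive chunks share CHUNK_OVERLAP characters so context at chunk boundaries is not lost. A minimal character-based sketch of that splitting (the repo may use a smarter, token-aware splitter):

```python
def split_text(text: str, chunk_size: int = 500, chunk_overlap: int = 50) -> list:
    """Split text into overlapping fixed-size character chunks."""
    if chunk_overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk size")
    step = chunk_size - chunk_overlap  # advance 450 chars per chunk by default
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

chunks = split_text("a" * 1200)
print(len(chunks))     # 3
print(len(chunks[0]))  # 500
```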

📚 API Examples

Python Client

import asyncio
import httpx

async def main():
    async with httpx.AsyncClient() as client:
        # Chat with orchestrator
        response = await client.post(
            "http://localhost:8000/chat",
            json={
                "message": "What's the best Italian restaurant nearby?",
                "agent": "orchestrator"
            }
        )
        print(response.json())

        # Embed content
        response = await client.post(
            "http://localhost:8000/embed",
            json={
                "content": "The Sony WH-1000XM5 are premium headphones...",
                "collection_name": "headphones_knowledge"
            }
        )
        print(response.json())

asyncio.run(main())

JavaScript/TypeScript

// Chat with agent
const response = await fetch('http://localhost:8000/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
        message: 'Calculate 15% tip on $85',
        agent: 'multipurpose'
    })
});

const data = await response.json();
console.log(data.response);

cURL

# Chat request
curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Find me a pasta recipe", "agent": "hungry"}'

# Embed content
curl -X POST http://localhost:8000/embed \
  -H "Content-Type: application/json" \
  -d '{"content": "https://example.com/article", "collection_name": "articles"}'

🧪 Testing

# Test all graphs
python -m src.main test

# Run unit tests
pytest tests/

# Test with coverage
pytest --cov=src tests/

🔍 Debugging with LangGraph Studio

  1. Visual Graph Inspection: See your agent's decision flow in real-time
  2. Time Travel: Step through past executions and modify state
  3. State Inspection: View complete state at any point
  4. Tool Visualization: See tool calls and results
  5. Human-in-the-Loop: Pause and modify execution
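The time-travel feature rests on checkpointing: every step's state is snapshotted, so you can list past snapshots, rewind to one, edit it, and resume. A toy, pure-Python sketch of that idea (not the LangGraph API, which exposes this through checkpointers and state history):

```python
# Toy checkpoint/replay illustration of the time-travel concept.
class CheckpointLog:
    def __init__(self):
        self.snapshots = []  # ordered list of (step, state) pairs

    def record(self, step: str, state: dict) -> None:
        self.snapshots.append((step, dict(state)))  # copy to freeze it

    def rewind(self, index: int) -> dict:
        """Return an editable copy of the state at a past step."""
        _, state = self.snapshots[index]
        return dict(state)

log = CheckpointLog()
log.record("route", {"agent": "multipurpose", "input": "234 * 567"})
log.record("calculate", {"agent": "multipurpose", "result": 132678})

# Step back to before the calculation and modify the state
past = log.rewind(0)
past["input"] = "234 * 568"
print(past["input"])  # 234 * 568
```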

📊 Monitoring

LangSmith Integration

LANGSMITH_TRACING=true
LANGSMITH_PROJECT=my-agent-project

View traces at smith.langchain.com

Structured Logging

from src.utils.logging import get_logger

logger = get_logger("my_module", user_id="123")
logger.info("Processing request", agent="multipurpose", action="calculate")

🚢 Production Deployment

1. LangGraph Platform (Recommended)

# Deploy to LangGraph Platform
# 1. Push to GitHub
# 2. Connect repo in LangSmith Deployments
# 3. Click Deploy

2. Docker Deployment

# Production build
docker build -t langgraph-agent-system:prod .

# Run with environment
docker run -d \
  --name langgraph-prod \
  -p 8000:8000 \
  -p 2024:2024 \
  --env-file .env.prod \
  langgraph-agent-system:prod

3. Kubernetes

apiVersion: apps/v1
kind: Deployment
metadata:
  name: langgraph-agents
spec:
  replicas: 3
  selector:
    matchLabels:
      app: langgraph
  template:
    metadata:
      labels:
        app: langgraph
    spec:
      containers:
      - name: agent-system
        image: langgraph-agent-system:prod
        ports:
        - containerPort: 8000
        - containerPort: 2024
        envFrom:
        - configMapRef:
            name: langgraph-config

📝 Adding Custom Agents

  1. Create agent graph in src/graphs/:
# src/graphs/custom_agent.py
from typing import TypedDict
from langgraph.graph import StateGraph, START, END
from src.core.providers import get_model, get_checkpointer

class YourState(TypedDict):
    messages: list

def respond(state: YourState) -> YourState:
    # Replace with your agent's logic (e.g. call get_model())
    return {"messages": state["messages"]}

def build_custom_graph() -> StateGraph:
    workflow = StateGraph(YourState)
    workflow.add_node("respond", respond)
    workflow.add_edge(START, "respond")
    workflow.add_edge("respond", END)
    return workflow

# Required for langgraph.json
graph = build_custom_graph().compile(checkpointer=get_checkpointer())
  2. Register in langgraph.json:
{
  "graphs": {
    "custom": "./src/graphs/custom_agent:graph"
  }
}

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit changes (git commit -m 'Add amazing feature')
  4. Push to branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📄 License

MIT License - see LICENSE file for details.

🙏 Acknowledgments

📧 Support
