GetsEclectic/langgraph-agent-template
LangGraph Agent Template

A complete starter template for building AI agents with LangGraph, featuring MCP integration, chat UI, and LangSmith observability.

✨ Features

  • πŸ€– LangGraph ReAct Agent - Ready-to-use agent with hardcoded prompts
  • πŸ”Œ MCP Integration - Model Context Protocol server support (filesystem included)
  • πŸ’¬ Agent Chat UI - Next.js-based web interface for agent interaction
  • πŸ“Š LangSmith Tracing - Built-in observability and monitoring
  • 🐳 Docker Ready - Containerized development with hot reload
  • ⚑ Fast Setup - One command to start everything

πŸš€ Quick Start

1. Clone and Configure

```shell
git clone <your-repo>
cd <your-repo>

# Create environment file
cp .env.example .env
# Edit .env and add your API keys:
# ANTHROPIC_API_KEY=your_anthropic_api_key_here
# LANGSMITH_API_KEY=your_langsmith_api_key_here
```

2. Start with Docker (Recommended)

```shell
# Start both agent and chat UI
docker compose up -d

# View logs
docker compose logs -f

# Stop services
docker compose down
```

Access the chat UI at: http://localhost:40004 πŸŽ‰

πŸ› οΈ Customization Guide

Adding Your Own MCP Servers

  1. MCP Configuration Options

    The system uses agent/mcp_integration/servers.json by default. To customize without affecting the template:

    ```shell
    # Create your own servers.json at project root (gitignored)
    cp agent/mcp_integration/servers.json servers.json
    # Edit servers.json with your configuration
    ```

    Configuration priority:

    • servers.json at project root (if exists) - your custom config
    • agent/mcp_integration/servers.json - default template config
  2. Edit MCP Configuration

    ```json
    {
      "servers": {
        "filesystem": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
          "transport": "stdio"
        },
        "your_server": {
          "command": "your-command",
          "args": ["your", "args"],
          "transport": "stdio"
        }
      }
    }
    ```
  3. Add Environment Variables (if needed)

    ```shell
    # .env
    YOUR_SERVER_TOKEN=your_token_here
    ```
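The lookup order described in step 1 can be sketched as a small loader. The function name below is illustrative, not the template's actual loading code:

```python
import json
from pathlib import Path

def load_mcp_config() -> dict:
    """Return the first servers.json found; the project-root copy wins."""
    for candidate in (Path("servers.json"), Path("agent/mcp_integration/servers.json")):
        if candidate.exists():
            return json.loads(candidate.read_text())
    raise FileNotFoundError("No MCP servers.json found")
```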

Customizing the Agent

  1. Agent Behavior - Edit agent/prompts.py
  2. Agent State - Modify agent/state.py
  3. Agent Logic - Update agent/graph.py
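As an example of modifying agent state, LangGraph state is typically a TypedDict whose fields carry reducers. A minimal sketch of what agent/state.py might look like; the `step_count` field is a hypothetical addition, not part of the template:

```python
import operator
from typing import Annotated, TypedDict

class AgentState(TypedDict):
    # LangGraph merges each node's update into a key using the key's reducer;
    # operator.add appends new messages to the running list.
    messages: Annotated[list, operator.add]
    # Hypothetical custom field: number of reasoning steps taken so far.
    step_count: int
```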

Modifying the Chat UI

The chat UI lives in agent-chat-ui/. For development and builds, use the provided containers.

Adding Approval Workflow (Optional)

The template uses direct tool execution for simplicity. If you need approval gates for write operations:

  1. Create an approval wrapper class in agent/graph.py
  2. Wrap tools during initialization based on operation type
  3. Use LangGraph interrupts to pause execution for approval
  4. Add approval handling in your UI or CLI

Example approval wrapper:

```python
from langchain_core.tools import BaseTool
from langgraph.types import interrupt

class ApprovalTool(BaseTool):
    """Wraps another tool and pauses the graph for approval before writes."""

    wrapped_tool: BaseTool

    def _run(self, **kwargs):
        if self._is_write_operation():  # classify the call, e.g. by tool name
            approval = interrupt({"type": "approval", "tool": self.name})
            if not approval.get("approved"):
                return "Operation cancelled"
        return self.wrapped_tool.run(kwargs)
```

πŸ“ Project Structure

```text
/
β”œβ”€β”€ agent/                   # Core agent implementation
β”‚   β”œβ”€β”€ graph.py             # LangGraph agent definition
β”‚   β”œβ”€β”€ prompts.py           # System prompts (hardcoded)
β”‚   β”œβ”€β”€ config.py            # Agent configuration options
β”‚   └── mcp_integration/     # MCP server configuration
β”œβ”€β”€ agent-chat-ui/           # Next.js chat interface
β”‚   β”œβ”€β”€ Dockerfile           # Chat UI container
β”‚   └── .env                 # Pre-configured for localhost
β”œβ”€β”€ infra/                   # Infrastructure (LangSmith, etc.)
β”œβ”€β”€ Dockerfile               # Agent container
β”œβ”€β”€ docker-compose.yml       # Development with hot reload (default)
β”œβ”€β”€ docker-compose.prod.yml  # Production Docker setup
└── langgraph.json           # LangGraph deployment config
```

πŸ§ͺ Development

Code Quality

```shell
docker compose exec agent black . && \
docker compose exec agent ruff check . && \
docker compose exec agent mypy .
```

Evaluations

  1. Create a YAML dataset (see sample at infra/langsmith/examples/sample_dataset.yaml).
  2. Run the evaluation inside the agent container:
```shell
docker compose exec agent python scripts/run_evaluation.py \
  --dataset-file infra/langsmith/examples/sample_dataset.yaml \
  --json
```

YAML schema:

```yaml
dataset:
  name: my-eval-dataset            # required
  description: Optional description
  judge_model: anthropic:claude-3-5-sonnet-latest
  examples:
    - inputs:
        question: "What is 2 + 2?"
      outputs:
        answer: "4"
```
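If you want to fail fast on malformed datasets, a small validator over the parsed YAML can enforce the schema above. This is a sketch; the function name is illustrative and not part of the template:

```python
def validate_dataset(doc: dict) -> list[str]:
    """Return a list of schema problems for a parsed dataset document."""
    errors = []
    dataset = doc.get("dataset")
    if not isinstance(dataset, dict):
        return ["top-level 'dataset' mapping is missing"]
    if not dataset.get("name"):
        errors.append("dataset.name is required")
    for i, example in enumerate(dataset.get("examples", [])):
        if "inputs" not in example:
            errors.append(f"examples[{i}] is missing 'inputs'")
    return errors
```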

πŸ“Š Observability

This template includes LangSmith integration for:

  • Tracing - Every agent run is automatically traced
  • Datasets - Manage test cases and evaluations
  • Monitoring - Track performance and costs

View your traces at: https://smith.langchain.com
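Tracing is switched on through environment variables in .env. Assuming the standard LangSmith variable names (verify against the LangSmith docs for your SDK version):

```shell
# .env — assumed standard LangSmith tracing variables
LANGSMITH_API_KEY=your_langsmith_api_key_here
LANGCHAIN_TRACING_V2=true
LANGCHAIN_PROJECT=langgraph-agent-template
```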

πŸš€ Deployment

LangGraph Cloud

```shell
# Deploy to LangGraph Cloud
langgraph deploy

# Or use the included configuration
langgraph deploy --config langgraph.json
```

Docker Production

```shell
# Production build and run
docker compose -f docker-compose.prod.yml up -d

# Scale services
docker compose -f docker-compose.prod.yml up -d --scale agent=3
```

Manual Docker Build

```shell
# Build agent image
docker build -t my-agent .

# Build chat UI image
docker build -t my-chat-ui ./agent-chat-ui

# Run with custom configuration
docker run -p 40003:40003 --env-file .env my-agent
docker run -p 40004:40004 my-chat-ui
```

🀝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass
  6. Submit a pull request

πŸ“„ License

MIT License - see LICENSE file for details.

πŸ†˜ Support

  • πŸ“– Documentation: Check the CLAUDE.md file for development context
  • πŸ› Issues: Report bugs via GitHub issues
  • πŸ’¬ Discussions: Use GitHub discussions for questions
