A LangGraph-based order management system that can handle order placement and cancellation using AI agents. This system demonstrates how to build complex, multi-step workflows with Large Language Models (LLMs) using LangGraph.
Based on Kshitij Kutumbe's article *LangGraph AI Agents: Building a Dynamic Order Management System: A Step-by-Step Tutorial*.
- Intelligent query categorization (Place Order vs Cancel Order)
- Inventory availability checking
- Dynamic shipping cost calculation based on location and weight
- Payment processing simulation
- Order cancellation handling
- State management across the workflow
- Conditional branching based on user intent
- Python 3.8+
- OpenAI API key (GPT-4 Turbo access required)
```
ai-driven-order-management/
├── data/
│   ├── inventory.csv       # Sample inventory data
│   └── customers.csv       # Sample customer data
├── src/
│   ├── __init__.py
│   ├── config.py           # Shared configuration and LLM setup
│   ├── main.py             # Entry point
│   ├── tools.py            # LangChain tools
│   ├── nodes.py            # Workflow nodes
│   ├── state.py            # State definitions
│   └── workflow.py         # Workflow graph definition
├── requirements.txt
├── setup.py
└── README.md
```
- Clone the repository:
```bash
git clone https://github.com/schmitech/ai-driven-order-management.git
cd ai-driven-order-management
```
- Create and activate a virtual environment:
```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```
- Install dependencies:
```bash
pip install -r requirements.txt
```
- Create a `.env` file in the root directory and add your OpenAI API key (copy the template from `.env.example`):
```bash
OPENAI_API_KEY=your_api_key_here
```
Note: Make sure you have access to GPT-4 Turbo, as the system uses the `gpt-4-turbo-preview` model.
Run the main script from the project root directory:
```bash
python -m src.main
```
This will run two test cases:
- Canceling an order: "I wish to cancel order_id 223"
- Placing a new order: "customer_id: customer_14 : I wish to place order for item_51 with order quantity as 4 and domestic"
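The entry point is a thin wrapper that feeds these queries into the compiled graph. A minimal sketch of `src/main.py` (the exported graph name `app` and the `query` state key are assumptions, not the repository's exact code):

```python
# src/main.py (illustrative sketch -- actual state keys may differ)
from src.workflow import app  # assumed: workflow.py exposes the compiled graph as `app`


def run(query: str) -> None:
    # Each invocation starts a fresh workflow state keyed by the raw user query.
    result = app.invoke({"query": query})
    print(result)


if __name__ == "__main__":
    run("I wish to cancel order_id 223")
    run("customer_id: customer_14 : I wish to place order for item_51 "
        "with order quantity as 4 and domestic")
```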
The system uses a graph-based workflow with several key components:
- Centralized configuration in `config.py`
- Environment variable management
- Shared LLM instance using GPT-4 Turbo
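A minimal sketch of what this configuration might look like, assuming a standard python-dotenv plus langchain-openai setup:

```python
# src/config.py (sketch -- assumes python-dotenv and langchain-openai are installed)
import os

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI

# Load OPENAI_API_KEY from the .env file in the project root.
load_dotenv()

if not os.getenv("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; add it to your .env file.")

# Single LLM instance shared by every node in the workflow.
llm = ChatOpenAI(model="gpt-4-turbo-preview", temperature=0)
```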
- Tracks order details, inventory status, shipping costs, and payment status
- Maintains conversation history and workflow progress
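In LangGraph the shared state is typically a `TypedDict` that every node reads from and writes to. A hedged sketch of `src/state.py` (field names are illustrative; the real definition may carry additional fields):

```python
# src/state.py (sketch -- field names are assumptions)
from typing import List, Optional, TypedDict


class OrderState(TypedDict, total=False):
    query: str                       # raw user request
    category: str                    # "PlaceOrder" or "CancelOrder"
    item_id: Optional[str]
    quantity: Optional[int]
    inventory_status: Optional[str]  # result of the availability check
    shipping_cost: Optional[float]   # computed from location and weight
    payment_status: Optional[str]
    history: List[str]               # conversation / workflow progress log
```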
- `categorize_query`: Determines user intent (place/cancel order)
- `check_inventory`: Verifies item availability
- `compute_shipping`: Calculates shipping costs based on location and weight
- `process_payment`: Simulates payment processing
- `cancel_order`: Handles order cancellation requests
- Routes requests to appropriate handlers based on user intent
- Manages workflow branching for different scenarios
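A hedged sketch of how `src/workflow.py` might wire the nodes together and branch on the detected category (node names and routing labels are assumptions, not the repository's exact code):

```python
# src/workflow.py (sketch -- node/edge names are illustrative)
from langgraph.graph import END, StateGraph

from src.nodes import (
    cancel_order,
    categorize_query,
    check_inventory,
    compute_shipping,
    process_payment,
)
from src.state import OrderState

graph = StateGraph(OrderState)
graph.add_node("categorize_query", categorize_query)
graph.add_node("check_inventory", check_inventory)
graph.add_node("compute_shipping", compute_shipping)
graph.add_node("process_payment", process_payment)
graph.add_node("cancel_order", cancel_order)

graph.set_entry_point("categorize_query")

# Branch on the category written by categorize_query.
graph.add_conditional_edges(
    "categorize_query",
    lambda state: state["category"],
    {"PlaceOrder": "check_inventory", "CancelOrder": "cancel_order"},
)
graph.add_edge("check_inventory", "compute_shipping")
graph.add_edge("compute_shipping", "process_payment")
graph.add_edge("process_payment", END)
graph.add_edge("cancel_order", END)

app = graph.compile()
```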
The system includes sample inventory data with:
- Item IDs
- Stock levels
- Item weights
- Prices
Sample customer data includes:
- Customer IDs
- Locations (local/domestic/international)
- Contact information
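The tools read these CSVs at runtime. For illustration only, lookups along these lines could back the inventory and shipping tools, assuming pandas is available (the column names are assumptions about the sample data):

```python
# Illustrative lookups only -- column names ("item_id", "stock",
# "customer_id", "location") are assumptions about the sample CSVs.
import pandas as pd

inventory = pd.read_csv("data/inventory.csv")
customers = pd.read_csv("data/customers.csv")


def item_in_stock(item_id: str, quantity: int) -> bool:
    row = inventory.loc[inventory["item_id"] == item_id]
    return (not row.empty) and int(row.iloc[0]["stock"]) >= quantity


def customer_location(customer_id: str) -> str:
    row = customers.loc[customers["customer_id"] == customer_id]
    return row.iloc[0]["location"] if not row.empty else "unknown"
```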
This script ensures a clean installation by removing all compiled Python files, build artifacts, and the virtual environment before creating a fresh setup.
```bash
# Make the script executable
chmod +x clean.sh

# Run the cleaning script
./clean.sh
```
Feel free to submit issues, fork the repository, and create pull requests for any improvements.
This project uses: