The Virtual Banking Agent demonstrates how semantic routing can intelligently route queries to the right tools based on the meaning of the user's query, rather than relying on expensive models, which in turn saves token costs and reduces latency.

The goals of this demo are to:
- Demonstrate semantic intent routing using RedisVL
- Showcase Redis message history for contextual chat
- Show agentic orchestration with LangGraph
- Illustrate tool execution using LangChain tools
You will need:

- Python 3.11+
- Node.js 18+
- Docker (for Redis Stack)
To set up the project:

- Clone the repository:
```bash
git clone <repository-url>
cd banking-agent-semantic-routing-demo
```
- Create a .env file in the project root:
```env
OPENAI_API_KEY=your_openai_api_key_here
REDIS_URL=redis://localhost:6380
HISTORY_INDEX=bank:msg:index
HISTORY_NAMESPACE=bank:chat
HISTORY_TOPK_RECENT=8
HISTORY_TOPK_RELEVANT=6
HISTORY_DISTANCE_THRESHOLD=0.35
```
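The HISTORY_* settings configure the RedisVL message-history component. A minimal sketch of how they might be wired up, assuming RedisVL's `SemanticMessageHistory` extension (class and method names vary across RedisVL versions, and the demo's own wiring may differ):

```python
# Sketch: wiring HISTORY_* settings to RedisVL semantic message history.
# Class/constructor names are assumptions based on recent RedisVL releases.
import os

from redisvl.extensions.message_history import SemanticMessageHistory

history = SemanticMessageHistory(
    name=os.getenv("HISTORY_INDEX", "bank:msg:index"),
    redis_url=os.getenv("REDIS_URL", "redis://localhost:6380"),
    distance_threshold=float(os.getenv("HISTORY_DISTANCE_THRESHOLD", "0.35")),
)

# Store a turn, then pull both recent and semantically relevant context.
history.add_message({"role": "user", "content": "I need a personal loan"})
recent = history.get_recent(top_k=int(os.getenv("HISTORY_TOPK_RECENT", "8")))
relevant = history.get_relevant(
    "loan options",
    top_k=int(os.getenv("HISTORY_TOPK_RELEVANT", "6")),
)
```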
The quickest way to run everything is with Docker Compose:
```bash
# Start all services with Docker
docker-compose up --build

# Access the application
# Frontend: http://localhost:3000
# Backend: http://localhost:8000
# RedisInsight: http://localhost:8001
```

To set up and run the services manually instead:
```bash
# Create virtual environment with Python 3.11
python3.11 -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies
pip install -r requirements.txt
```

Next, start Redis Stack:
```bash
# Option A: Docker (Recommended)
docker run -d --name redis-stack -p 6380:6379 -p 8001:8001 redis/redis-stack:latest
# Option B: Homebrew (macOS)
brew tap redis-stack/redis-stack
brew install redis-stack
redis-stack-server --daemonize yes
```
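Before starting the backend, you can confirm that Redis Stack is reachable on port 6380 with a quick redis-py check (a throwaway sketch, not part of the project code):

```python
# Quick connectivity check against the Redis Stack container.
import redis

client = redis.Redis.from_url("redis://localhost:6380")
print(client.ping())  # True if Redis Stack is up and accepting connections
```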
Then start the backend:
```bash
# Make sure virtual environment is activated
source .venv/bin/activate
# Start FastAPI server
python3 -m uvicorn main:app --reload --port 8000
```
The backend will be available at http://localhost:8000.

Finally, install and run the frontend:
```bash
cd nextjs-app
npm install
npm run dev
```
The frontend will be available at http://localhost:3000.
The demo combines the following components:

- Semantic Routing (RedisVL): Routes queries to appropriate banking intents (loans, cards, FD, forex, etc.); a minimal routing sketch follows this list
- Slot-Filling Orchestration (LangGraph): Manages conversation state and collects required information
- Tool Execution (LangChain): Executes banking operations (EMI calculation, card recommendations, etc.)
- Modern Frontend (Next.js 14 + TypeScript + Tailwind): Responsive banking UI with chat interface
- Conversation Memory (RedisVL MessageHistory): Structured conversation tracking
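A minimal sketch of the semantic-routing idea using RedisVL's `SemanticRouter`; the route names, reference phrases, and thresholds below are illustrative rather than the demo's actual configuration:

```python
# Illustrative semantic router: queries are matched to the closest route
# by vector similarity, so no LLM call is needed to pick an intent.
from redisvl.extensions.router import Route, SemanticRouter

loan = Route(
    name="loan",
    references=["I need a personal loan", "what is the EMI for 5 lakhs"],
    distance_threshold=0.7,
)
cards = Route(
    name="cards",
    references=["recommend a credit card", "best card for travel rewards"],
    distance_threshold=0.7,
)

router = SemanticRouter(
    name="bank-router",                    # index name in Redis
    routes=[loan, cards],
    redis_url="redis://localhost:6380",
)

match = router("I want to borrow money for a car")
print(match.name, match.distance)          # e.g. "loan" and a vector distance
```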
The end-to-end request flow:
```
User Query
    ↓
[Semantic Router] → Intent + Confidence + Required Slots
    ↓
[Parse Slots] → Extract values from text using LLM
    ↓
[Decide Next]
    ├→ Missing slots? → Ask follow-up question
    └→ All slots filled? → Call Tool
    ↓
[Tool Execution] → Calculate/Recommend/Search
    ↓
[Summarize] → Format response with bullets
    ↓
Response to User
    ↓
[Feedback System] → User rates helpfulness
    ↓
[Memory Management] → Clear conversation if helpful
```
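A minimal LangGraph sketch of this flow, using the standard `StateGraph` API; the node bodies are placeholders rather than the demo's actual `orchestrator.py` logic, and names such as `AgentState` are illustrative:

```python
# Sketch of the slot-filling orchestration graph (placeholder node logic).
from typing import List, Optional, TypedDict

from langgraph.graph import END, START, StateGraph

class AgentState(TypedDict):
    text: str
    intent: Optional[str]
    slots: dict
    pending: List[str]
    reply: Optional[str]

def route(state: AgentState) -> AgentState:
    # The semantic router would run here; hard-coded for illustration.
    return {**state, "intent": "loan", "pending": ["loan_amount", "tenure_months"]}

def parse_slots(state: AgentState) -> AgentState:
    # An LLM would extract slot values from state["text"] here.
    return state

def decide_next(state: AgentState) -> str:
    return "ask_followup" if state["pending"] else "call_tool"

def ask_followup(state: AgentState) -> AgentState:
    return {**state, "reply": f"Please provide: {', '.join(state['pending'])}"}

def call_tool(state: AgentState) -> AgentState:
    # Run the matching LangChain tool (EMI calculator, card recommender, ...).
    return state

def summarize(state: AgentState) -> AgentState:
    return {**state, "reply": state.get("reply") or "Here is your summary."}

graph = StateGraph(AgentState)
for name, fn in [("route", route), ("parse_slots", parse_slots),
                 ("ask_followup", ask_followup), ("call_tool", call_tool),
                 ("summarize", summarize)]:
    graph.add_node(name, fn)

graph.add_edge(START, "route")
graph.add_edge("route", "parse_slots")
graph.add_conditional_edges("parse_slots", decide_next,
                            {"ask_followup": "ask_followup", "call_tool": "call_tool"})
graph.add_edge("ask_followup", END)
graph.add_edge("call_tool", "summarize")
graph.add_edge("summarize", END)

app = graph.compile()
print(app.invoke({"text": "I need a loan", "intent": None,
                  "slots": {}, "pending": [], "reply": None}))
```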
The backend exposes POST /chat, an intelligent chat endpoint with semantic routing and slot-filling.
Request Body:
```json
{
  "userId": "optional_user_id",
  "sessionId": "optional_session_id",
  "text": "I need a loan",
  "meta": {}
}
```
Response:
```json
{
  "reply": "What loan amount are you looking for?",
  "pending": ["loan_amount", "tenure_months"],
  "router": {
    "intent": "loan",
    "confidence": "high",
    "score": 0.92
  },
  "proposal": null,
  "showFeedback": false,
  "model": "gpt-3.5-turbo"
}
```
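A quick way to exercise the chat endpoint from Python once the backend is running locally (a sketch using the `requests` library; the session id is arbitrary):

```python
# Send one chat turn and inspect the routed intent and pending slots.
import requests

resp = requests.post(
    "http://localhost:8000/chat",
    json={"sessionId": "demo-session", "text": "I need a loan"},
    timeout=30,
)
resp.raise_for_status()
data = resp.json()

print(data["reply"])                                  # follow-up question or answer
print(data["pending"])                                # slots still to be collected
print(data["router"]["intent"], data["router"]["score"])
```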
User feedback endpoint for conversation management.

Request Body:
```json
{
  "sessionId": "session_xyz",
  "helpful": true
}
```
Response:
```json
{
  "ok": true,
  "message": "Thank you! Conversation cleared for a fresh start.",
  "cleared": true
}
```
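When the user answers the "Was this helpful?" prompt, the frontend posts the rating back to this endpoint. The feedback route's path is not shown in this README, so the `/feedback` URL in the sketch below is an assumption; check the backend code for the real route:

```python
# Send a helpfulness rating; "/feedback" is an assumed path.
import requests

resp = requests.post(
    "http://localhost:8000/feedback",  # assumption: actual path may differ
    json={"sessionId": "demo-session", "helpful": True},
    timeout=10,
)
print(resp.json())  # expected shape: {"ok": True, "message": "...", "cleared": True}
```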
Example conversation: loan inquiry

User: I need a personal loan
Assistant: What loan amount are you looking for?
User: 5 lakhs for 3 years
Assistant: Your EMI will be ₹16,134 per month for 36 months.
- Monthly EMI: ₹16,134
- Total Amount Payable: ₹5,80,832
- Total Interest: ₹80,832
- Principal: ₹5,00,000
Was this helpful? [Yes] [No]
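The figures above follow the standard reducing-balance EMI formula. The interest rate used by the demo's EMI tool is not stated in this README; assuming roughly 10% annual interest reproduces the ₹16,134 monthly figure (totals then differ from the README's by a few rupees of rounding):

```python
# Reducing-balance EMI: EMI = P * r * (1 + r)**n / ((1 + r)**n - 1)
# P = principal, r = monthly interest rate, n = tenure in months.
# The 10% annual rate is an assumption, not a documented value.
principal = 500_000
annual_rate = 0.10
n_months = 36

r = annual_rate / 12
emi = principal * r * (1 + r) ** n_months / ((1 + r) ** n_months - 1)

print(f"Monthly EMI:    ₹{emi:,.0f}")                          # ≈ ₹16,134
print(f"Total payable:  ₹{emi * n_months:,.0f}")
print(f"Total interest: ₹{emi * n_months - principal:,.0f}")
```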
Example conversation: credit card recommendation

User: I want a credit card
Assistant: What is your annual income?
User: 8 lakhs per year
Assistant: Based on your income of ₹8,00,000, we recommend the DemoBank Travel Elite.
- Annual Fee: ₹2,999
- 5X rewards on travel
- Airport lounge access
Test the router directly:
```bash
python3 router_bank.py
```
Test the orchestrator:
```bash
python3 orchestrator.py
```
Test the chat API:
```bash
curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"text": "I need a personal loan"}'
```
Run the end-to-end system tests:
```bash
python3 test_system.py
```
- Bhavana Giri — bhavanagiri
This project is licensed under the MIT License.