# AI-Powered Real-time Emotion Detection for E-commerce Personalization
```bash
# One-command setup
git clone https://github.com/your-username/Customer-emotion-recognition.git
cd Customer-emotion-recognition
chmod +x integrated-startup.sh
./integrated-startup.sh start
```

Then visit: http://localhost:3000 🎉
- Overview
- Usage
- Testing
## Overview

Customer-emotion-recognition is an all-in-one developer toolset designed to facilitate the creation of emotion-aware e-commerce experiences. It combines real-time emotion detection, personalized product recommendations, and streamlined deployment workflows into a cohesive architecture. The core features include:
- ⚡ System Health Checks: Quickly verify the readiness of the frontend, API, and AI services to ensure smooth operation.
- 📦 Automated Service Orchestration: Simplify startup, build, and deployment processes across multiple components with scripts and configuration files.
- 🧠 AI-Driven Emotion Analysis: Leverage facial emotion detection and user preference prediction models to enhance personalization.
- 🔗 Shared Types & UI Components: Maintain consistency and reusability with shared data structures and UI elements across the project (see the sketch after this list).
- 🔧 Dependency & Workflow Management: Use monorepo tools like pnpm and Turbo for reliable builds, caching, and dependency control.
- 💻 API Endpoints & Integration: Connect the frontend, backend, and AI services into a scalable, real-time customer engagement platform.
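For illustration only, here is a minimal sketch of what a shared emotion type in `packages/shared-types` might look like. The names `EmotionLabel`, `EmotionResult`, and `ProductRecommendation` are assumptions made for this example, not the package's actual exports.

```typescript
// Hypothetical shapes that packages/shared-types could expose so the
// frontend, NestJS API, and FastAPI service agree on a single contract.

export type EmotionLabel =
  | "happy"
  | "sad"
  | "angry"
  | "surprised"
  | "neutral";

/** One detection result as the AI service might report it. */
export interface EmotionResult {
  emotion: EmotionLabel;
  confidence: number; // classifier score in the range 0..1
  detectedAt: string; // ISO-8601 timestamp
}

/** A recommendation the API service could derive from recent emotions. */
export interface ProductRecommendation {
  productId: string;
  title: string;
  reason: string; // e.g. "matches recent 'happy' sessions"
}
```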
## Usage

Before running this project, ensure you have the following installed:

- Node.js 18+
- pnpm 8+ (`npm install -g pnpm`)
- Python 3.9+ (for the AI service)
- Git for version control
1. Clone the repository:

   ```bash
   git clone https://github.com/JenniferZero/Customer-emotion-recognition.git
   cd customer-emotion-recognition
   ```

2. Install dependencies:

   ```bash
   pnpm install
   ```

3. Build shared packages:

   ```bash
   pnpm build
   ```

4. Set up environment variables:

   - Copy `.env.example` to `.env` (or use the existing `.env` file)
   - Update the variables according to your setup

5. Set up the Python environment for the AI service:

   ```bash
   cd apps/ai-service/fastapi-service
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   pip install -r requirements.txt
   ```

6. Start all services in development mode:

   ```bash
   pnpm dev
   ```

   This will start:

   - Frontend: http://localhost:3000
   - API Service: http://localhost:3001
   - AI Service: http://localhost:5000

   (A quick health-check sketch follows these steps.)

7. Alternatively, run individual services:

   ```bash
   # Frontend
   pnpm --filter="frontend" dev

   # API Service
   pnpm --filter="api-service" dev

   # AI Service
   pnpm --filter="ai-service" dev
   ```

8. Build all services for production:

   ```bash
   # Windows PowerShell
   ./deploy.ps1

   # Unix/Linux/Mac
   chmod +x ./deploy.sh && ./deploy.sh
   ```

9. Start services in production mode:

   ```bash
   pnpm start
   ```
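If you just want to confirm all three services are up before opening the app, here is a minimal sketch that probes the ports listed above. Hitting the root URL is an assumption; the project's own health-check tooling (see the System Health Checks feature) is the authoritative way to verify readiness.

```typescript
// Hypothetical readiness probe for the three local services.
// Ports come from this README; probing "/" is an assumption, since the
// actual health endpoints are not documented here.
const services = [
  { name: "Frontend", url: "http://localhost:3000" },
  { name: "API Service", url: "http://localhost:3001" },
  { name: "AI Service", url: "http://localhost:5000" },
];

async function checkHealth(): Promise<void> {
  for (const { name, url } of services) {
    try {
      const res = await fetch(url); // Node.js 18+ ships a global fetch
      console.log(`${name}: ${res.ok ? "up" : `responded with ${res.status}`}`);
    } catch {
      console.log(`${name}: unreachable at ${url}`);
    }
  }
}

checkHealth();
```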
## Testing

Run comprehensive tests across all services:

```bash
# Run all tests
pnpm test

# Run frontend tests
pnpm --filter="frontend" test

# Run API service tests
pnpm --filter="api-service" test
```

The project is structured as a monorepo using Turborepo and pnpm workspaces, consisting of:
Frontend (Next.js):

- Emotion detection UI with webcam integration
- Product recommendations display
- Emotion history visualization
- Responsive layout with dark mode support
API Service (NestJS):

- Product recommendation generation
- Emotion history storage and retrieval
- User preference tracking
- RESTful API with Swagger documentation
AI Service (FastAPI):

- Real-time emotion recognition
- Face detection using YOLO models
- Emotion classification and confidence scoring
- API for emotion data processing
API documentation:

- API Service Swagger: http://localhost:3001/api
- AI Service API docs: http://localhost:5000/docs
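As a rough illustration of how the frontend might call these services together, here is a TypeScript sketch. The endpoint paths (`/emotion/detect`, `/recommendations`) and payload shapes are assumptions for the example, not the documented API; consult the Swagger/OpenAPI docs above for the real contracts.

```typescript
// Hypothetical client-side calls tying the AI and API services together.
// Endpoint paths and response shapes are assumed for illustration only.
const AI_SERVICE_URL = "http://localhost:5000";
const API_SERVICE_URL = "http://localhost:3001";

// Send a captured webcam frame to the emotion-detection service.
async function detectEmotion(
  frame: Blob
): Promise<{ emotion: string; confidence: number }> {
  const form = new FormData();
  form.append("image", frame, "frame.jpg");

  const res = await fetch(`${AI_SERVICE_URL}/emotion/detect`, {
    method: "POST",
    body: form,
  });
  if (!res.ok) throw new Error(`AI service error: ${res.status}`);
  return res.json();
}

// Ask the recommendation API for products matching the detected emotion.
async function fetchRecommendations(emotion: string): Promise<unknown[]> {
  const res = await fetch(
    `${API_SERVICE_URL}/recommendations?emotion=${encodeURIComponent(emotion)}`
  );
  if (!res.ok) throw new Error(`API service error: ${res.status}`);
  return res.json();
}
```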
```
customer-emotion-recognition/
├── apps/                     # Application services
│   ├── frontend/             # Next.js frontend application
│   ├── api-service/          # NestJS recommendation API service
│   └── ai-service/           # FastAPI emotion detection service
├── packages/                 # Shared packages
│   ├── shared-types/         # TypeScript types used across services
│   ├── ui/                   # Shared UI components
│   ├── emotion-recognition/  # Emotion detection algorithms
│   └── ai-core/              # Core AI utilities
└── README.md                 # Project documentation
```
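Apps typically consume the shared packages by their workspace name. The package name `@customer-emotion-recognition/shared-types` below is an assumption for illustration; the actual name is whatever `packages/shared-types/package.json` declares.

```typescript
// apps/frontend — hypothetical import of a shared type from the
// packages/shared-types workspace (package name assumed for the example).
import type { EmotionResult } from "@customer-emotion-recognition/shared-types";

// The same type can describe data flowing from the AI service through the
// NestJS API into React components, keeping all three layers in sync.
export function latestEmotion(history: EmotionResult[]): EmotionResult | undefined {
  return history.at(-1);
}
```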
Built with the following technologies:

- Frontend: Next.js 15 with App Router, SSR/SSG/ISR capabilities, TailwindCSS v4
- Backend (API): NestJS, TypeScript
- Backend (AI): FastAPI, Python, YOLO, LangChain, LangGraph
- Dashboard: Medusa.js
- Database: PostgreSQL with vector DB capabilities
- Monorepo Management: Turborepo, pnpm
Completed:

- Set up monorepo structure with Turborepo and pnpm
- Implemented FastAPI backend service for emotion detection with YOLO
- Created shared UI components and types
- Implemented frontend components for emotion detection
- Implemented NestJS backend service for recommendations
- Integrated frontend with both FastAPI and NestJS backends
- Created main page with integrated components
- Added proper error handling and fallback mechanisms
- Set up admin dashboard with Medusa.js
Planned:

- Implement CI/CD deployment
- Add database integration with PostgreSQL and vector capabilities
- Set up authentication and user management
- Add product details pages and browsing
- Deploy services to cloud providers
- Enhanced emotion detection using more sophisticated models
- A/B testing framework for recommendation algorithms
- Integration with popular e-commerce platforms
- User behavior analytics dashboard
- Multi-language support
- Mobile application
- Fork the repository
- Create a feature branch: `git checkout -b feature-name`
- Commit changes: `git commit -m 'Add feature'`
- Push to the branch: `git push origin feature-name`
- Submit a pull request
This project is licensed under the MIT License.