A SaaS application built on a hybrid microservices architecture that generates personalized learning paths from trending skills and market demand. Built with Rust, Python, C++, and Next.js.
The application consists of 5 main services:
- Rust Backend (Actix-web): Main API gateway and request orchestrator
- Python AI Service (FastAPI): Generates learning paths using trending data
- Python Trends Ingestion: Collects and processes trend data from various sources
- C++ Optimization Module: Optimizes popular learning paths for performance
- Next.js Frontend: Modern web interface for users
Prerequisites:
- Docker and Docker Compose
- Node.js 18+ (for local frontend development)
- Rust 1.75+ (for local backend development)
- Python 3.11+ (for local service development)
Clone and navigate to the project:
git clone https://github.com/guicybercode/hype-learning
cd saas_learning
Copy environment configuration:
cp docker/env.example docker/.env
Start all services:
cd docker
docker-compose up -d
Access the application:
- Frontend: http://localhost:3000
- Rust Backend API: http://localhost:8080
- AI Service API: http://localhost:8001
# Start PostgreSQL
docker run -d \
  --name learning-path-db \
  -e POSTGRES_DB=learning_paths \
  -e POSTGRES_USER=user \
  -e POSTGRES_PASSWORD=password \
  -p 5432:5432 \
  postgres:15
# Initialize schema
psql -h localhost -U user -d learning_paths -f database/schema.sql
Rust Backend:
cd backend/rust-backend
export DATABASE_URL="postgresql://user:password@localhost:5432/learning_paths"
export AI_SERVICE_URL="http://localhost:8001"
cargo run
Python AI Service:
cd backend/ai-service
pip install -r requirements.txt
export DATABASE_URL="postgresql://user:password@localhost:5432/learning_paths"
python main.py
Trends Ingestion Service:
cd backend/trends-ingestion
pip install -r requirements.txt
export DATABASE_URL="postgresql://user:password@localhost:5432/learning_paths"
python main.py
C++ Optimizer:
cd backend/cpp-optimizer
mkdir build && cd build
cmake ..
make
./optimizer
Frontend:
cd frontend
npm install
npm run dev
Rust Backend API:
GET /api/v1/health
GET /api/v1/skills
POST /api/v1/generate-path
Content-Type: application/json
{
"skill_id": "uuid",
"difficulty_level": "beginner|intermediate|advanced",
"learning_objective": "optional string",
"user_id": "optional uuid"
}
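As an illustration, the path-generation endpoint can be called with Python's standard library alone; the UUID and objective below are placeholders, and the helper names are not part of the project:

```python
import json
import urllib.request

def build_generate_path_request(skill_id, difficulty_level,
                                learning_objective=None, user_id=None):
    """Assemble the JSON body expected by POST /api/v1/generate-path."""
    body = {"skill_id": skill_id, "difficulty_level": difficulty_level}
    # Optional fields are omitted entirely rather than sent as null
    if learning_objective:
        body["learning_objective"] = learning_objective
    if user_id:
        body["user_id"] = user_id
    return json.dumps(body).encode("utf-8")

def generate_path(base_url="http://localhost:8080"):
    """Send the request to a locally running Rust backend."""
    data = build_generate_path_request(
        "123e4567-e89b-12d3-a456-426614174000", "beginner",
        learning_objective="Land a junior role")
    req = urllib.request.Request(
        f"{base_url}/api/v1/generate-path", data=data,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```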
AI Service API:
POST /generate-path
Content-Type: application/json
{
"skill_id": "uuid",
"difficulty_level": "beginner|intermediate|advanced",
"learning_objective": "optional string",
"user_id": "optional uuid"
}
GET /trends/{skill_id}
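The trends endpoint is a plain GET; a minimal sketch (helper names are illustrative, and the call assumes the AI service is running locally):

```python
import json
import urllib.request

def trends_url(skill_id, base_url="http://localhost:8001"):
    """Build the AI-service trends URL for a skill."""
    return f"{base_url}/trends/{skill_id}"

def fetch_trends(skill_id):
    # Requires the AI service to be up on localhost:8001
    with urllib.request.urlopen(trends_url(skill_id)) as resp:
        return json.loads(resp.read())
```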
The application uses PostgreSQL with the following main tables:
- users: User information
- skills: Available skills for learning paths
- trends_data: Collected trend data from various sources
- learning_paths: Generated learning paths
- learning_steps: Individual steps within learning paths
- user_requests: User requests for learning paths
- optimized_paths: Pre-computed optimized paths
- User Request: User selects skill and difficulty level in frontend
- Backend Processing: Rust backend validates request and forwards to AI service
- Trend Analysis: AI service queries trend data and generates personalized path
- Path Generation: AI service creates structured learning path with 6-10 steps
- Response: Generated path is returned to user through backend
- Optimization: C++ optimizer processes popular paths for future requests
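Under stated assumptions, the flow above can be sketched as follows. The real gateway is the Rust backend; method names such as `find_optimized_path` are illustrative, not the actual API:

```python
import asyncio

async def handle_generate_request(req, db, ai_service):
    """Sketch of the request flow: check for a pre-computed path,
    otherwise delegate to the AI service and persist the result.
    `db` and `ai_service` are any objects exposing these async methods."""
    # Reuse a path pre-computed by the C++ optimizer when one exists
    cached = await db.find_optimized_path(
        req["skill_id"], req["difficulty_level"])
    if cached is not None:
        return cached
    # Otherwise ask the AI service for a fresh path and store it
    path = await ai_service.generate_path(req)
    await db.save_learning_path(path)
    return path
```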
Frontend:
- Modern, responsive UI with Tailwind CSS
- Real-time form validation
- Interactive learning path visualization
- Resource links and progress tracking
Rust Backend:
- JWT-based authentication (ready for implementation)
- Comprehensive error handling
- Health check endpoints
- Structured logging
AI Service:
- Trend-based learning path generation
- Difficulty-appropriate content
- Resource type diversity
- Prerequisite management
Trends Ingestion:
- Google Trends integration
- Daily automated collection
- Trend score calculation
- Keyword extraction
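The trend-score calculation can be illustrated with a simple growth-over-baseline formula. This is purely a sketch; the actual formula lives in the trends-ingestion service:

```python
def trend_score(current_interest, baseline_interest):
    """Illustrative scoring: growth of current interest relative to a
    baseline, mapped onto a 0-100 scale (50 = flat, 100 = doubled+)."""
    if baseline_interest <= 0:
        # No baseline to compare against: any interest maxes out the score
        return 100.0 if current_interest > 0 else 0.0
    growth = (current_interest - baseline_interest) / baseline_interest
    return max(0.0, min(100.0, 50.0 + 50.0 * growth))
```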
C++ Optimizer:
- Performance-critical path optimization
- Batch processing of popular skills
- Learning flow optimization
- Resource diversity balancing
saas_learning/
├── backend/
│   ├── rust-backend/     # Main API service
│   ├── ai-service/       # Learning path generation
│   ├── trends-ingestion/ # Trend data collection
│   └── cpp-optimizer/    # Performance optimization
├── frontend/             # Next.js web application
├── database/             # Schema and migrations
├── docker/               # Docker configurations
└── docs/                 # Documentation
- Insert skill into database:
INSERT INTO skills (name, description, category)
VALUES ('New Skill', 'Description', 'Category');
- The trends ingestion service will automatically collect data for the new skill
To add new trend sources, modify backend/trends-ingestion/services.py:
class NewTrendSource:
    async def get_trends_data(self, keyword: str):
        # Implement trend collection logic
        pass
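A stub like the one above could be fleshed out as follows. The source, scoring, and field names here are hypothetical, and the fetcher is injected so the class can be exercised without network access:

```python
import asyncio

class RedditTrendSource:
    """Hypothetical additional trend source; not part of services.py."""

    def __init__(self, fetch_mentions):
        # fetch_mentions: async callable returning a raw mention count
        self._fetch_mentions = fetch_mentions

    async def get_trends_data(self, keyword: str):
        raw = await self._fetch_mentions(keyword)
        # Clamp onto a 0-100 trend-score scale
        score = min(100.0, raw / 10.0)
        return {"keyword": keyword, "source": "reddit", "trend_score": score}
```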
Modify backend/ai-service/services.py to customize:
- Step templates for different difficulty levels
- Resource type preferences
- Learning objective integration
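One way the difficulty-keyed step templates might be shaped; this is a sketch under assumed names, not the actual contents of services.py:

```python
# Illustrative template table: each difficulty maps to an ordered list
# of step templates with a title pattern and preferred resource type.
STEP_TEMPLATES = {
    "beginner": [
        {"title": "Fundamentals of {skill}", "resource_type": "video"},
        {"title": "Guided {skill} tutorial", "resource_type": "course"},
        {"title": "First small {skill} project", "resource_type": "project"},
    ],
    "advanced": [
        {"title": "{skill} internals deep dive", "resource_type": "article"},
        {"title": "Production-grade {skill} project", "resource_type": "project"},
    ],
}

def render_steps(skill: str, difficulty: str):
    """Expand the templates for one skill/difficulty pair."""
    return [
        {**template, "title": template["title"].format(skill=skill)}
        for template in STEP_TEMPLATES[difficulty]
    ]
```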
Performance:
- Response Time: < 2 seconds for cached paths
- Database: Optimized queries with proper indexing
- Caching: Redis integration recommended for production
- Load Balancing: Multiple service instances for high availability
Security:
- Environment variable configuration
- Database connection encryption
- Input validation and sanitization
- Rate limiting (recommended for production)
- HTTPS enforcement (production)
Production deployment:
- Environment Variables: Update all secrets in production
- Database: Use managed PostgreSQL service
- Monitoring: Implement logging and metrics collection
- Scaling: Use container orchestration (Kubernetes)
- Backup: Regular database backups
# Build production images
docker-compose -f docker-compose.yml -f docker-compose.prod.yml build
# Deploy with production configuration
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d
Contributing:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
For support and questions:
- Check the documentation in /docs
- Review the API endpoints
- Check service logs for debugging
- Open an issue for bugs or feature requests
Planned features:
- User authentication and profiles
- Progress tracking and analytics
- Social features (sharing paths)
- Mobile application
- Advanced AI recommendations
- Integration with learning platforms
- Real-time trend notifications