πŸ† 1st Place @ Vultr Track, RAISE YOUR HACK 2025 - Unmask is an AI-powered hiring verification tool that helps HR teams catch CV inconsistencies, auto-call references, and detect real-time lies during interviews.


Le Commit

Unmask - AI-Powered Hiring Verification Platform

Trust your hiring process again.

Unmask is an intelligent hiring verification platform that helps you verify candidate authenticity through comprehensive analysis of CVs, LinkedIn profiles, GitHub accounts, and automated reference calls. Built for RAISE YOUR HACK 2025 β€’ Vultr Track.

🌐 Live Demo: unmask.click


✨ Features

πŸ” Multi-Source Profile Analysis

  • CV Processing: Extracts and analyzes professional experience, education, skills, and credentials
  • LinkedIn Integration: Cross-references LinkedIn data with CV information for consistency
  • GitHub Analysis: Evaluates coding activity, repository quality, and technical skills
  • Credibility Scoring: AI-powered authenticity assessment with detailed flags and recommendations

πŸ“ž Automated Reference Calling

  • AI-Powered Calls: Automatically calls references using ElevenLabs Conversational AI
  • Natural Conversations: Professional, human-like interactions with references
  • Transcript Analysis: Real-time transcription and AI-powered summarization
  • Reference Validation: Cross-checks reference feedback with candidate claims

🎯 Real-Time Interview Support

  • Live Feedback: Get real-time prompts during candidate interviews
  • Inconsistency Detection: Flags discrepancies between sources on the fly
  • Suggested Questions: AI-generated follow-up questions based on analysis
  • Interview Transcripts: Live transcription with highlighted concerns

πŸ“Š Comprehensive Dashboard

  • Candidate Profiles: Unified view of all candidate information
  • Processing Pipeline: Real-time status tracking from upload to analysis
  • Flag Management: Visual indicators for potential concerns
  • Export Reports: Detailed hiring decision support documents
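The credibility scoring and flag management described above imply a shared record shape. The sketch below is illustrative only; the field names are assumptions for this README, not the repository's actual schema.

```typescript
// Illustrative shape of a credibility assessment; field names are
// assumptions, not the repo's actual types.
type FlagSeverity = "info" | "warning" | "critical";

interface CredibilityFlag {
  severity: FlagSeverity;
  source: "cv" | "linkedin" | "github" | "reference";
  message: string; // e.g. "CV lists 5 years at Acme; LinkedIn shows 2"
}

interface CredibilityAssessment {
  score: number; // 1-100, higher = more credible
  flags: CredibilityFlag[];
  recommendation: string;
}

// A dashboard would typically surface the most severe flags first.
function sortFlags(flags: CredibilityFlag[]): CredibilityFlag[] {
  const rank: Record<FlagSeverity, number> = { critical: 0, warning: 1, info: 2 };
  return [...flags].sort((a, b) => rank[a.severity] - rank[b.severity]);
}
```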

πŸ› οΈ Technology Stack

Frontend

  • Next.js 15 - React framework with App Router
  • TypeScript - Type-safe development
  • Tailwind CSS - Modern styling framework
  • Radix UI - Accessible component primitives
  • Framer Motion - Smooth animations

Backend & Storage

  • Supabase - PostgreSQL database with real-time capabilities
  • Supabase Storage - Secure file storage for CVs and documents
  • Ashby ATS Integration - Seamless candidate import and sync

AI & Analysis

  • Groq API - Fast AI inference for document analysis
  • OpenAI GPT-4 - Advanced reasoning and summarization
  • ElevenLabs - Natural voice AI for reference calls
  • PDF Processing - Automated document parsing and extraction

Infrastructure

  • Docker - Containerized deployment
  • Vultr - Cloud hosting platform
  • Real-time Processing - Async job processing

πŸš€ Quick Start

Prerequisites

  • Node.js 18+
  • Docker (for production deployment)
  • API keys for external services

Development Setup

  1. Clone the repository

    git clone https://github.com/le-commit/unmask.git
    cd unmask
  2. Install dependencies

    cd frontend
    pnpm install
  3. Configure environment variables

    cp .env.example .env.local

    Required environment variables:

    # Supabase
    NEXT_PUBLIC_SUPABASE_URL=your_supabase_url
    NEXT_PUBLIC_SUPABASE_ANON_KEY=your_anon_key
    SUPABASE_SERVICE_ROLE_KEY=your_service_role_key
    
    # AI Services
    GROQ_API_KEY=your_groq_api_key
    OPENAI_API_KEY=your_openai_api_key
    
    # Reference Calling (ElevenLabs)
    ELEVENLABS_API_KEY=your_elevenlabs_api_key
    ELEVENLABS_AGENT_ID=your_agent_id
    ELEVENLABS_AGENT_PHONE_ID=your_phone_id
    
    # Twilio (via ElevenLabs)
    TWILIO_ACCOUNT_SID=your_twilio_sid
    TWILIO_AUTH_TOKEN=your_twilio_token
    TWILIO_PHONE_NUMBER=your_twilio_number
    
    # Ashby ATS Integration
    ASHBY_API_KEY=your_ashby_api_key
  4. Start local Supabase

    supabase start
    supabase db reset --local
  5. Start development server

    pnpm dev
  6. Open your browser

    http://localhost:3000
    

πŸ”„ Webhook Queue Processing with pg_cron

The application uses pg_cron for automated webhook queue processing in both local development and production environments.

Automatic Processing Everywhere

βœ… Local Development: pg_cron extension automatically installed
βœ… Production: pg_cron pre-available in Supabase Cloud
βœ… Queue Processing: Automatic every 2 minutes in both environments

Zero Configuration Required

# Just start development - queue processing works automatically!
pnpm dev

How it works:

  1. Database migrations auto-install pg_cron extension
  2. Cron job created automatically: processes queue every 2 minutes
  3. Local development uses host.docker.internal:3000
  4. Production uses configured webhook base URL
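Each cron tick drains the queue in priority order (highest priority first, oldest first within equal priority). That selection logic can be sketched as a plain function; the `WebhookJob` shape below is an assumption for illustration, not the actual table mapping.

```typescript
// Hypothetical in-memory mirror of a webhook_queue row.
interface WebhookJob {
  id: string;
  webhookType: "score_push" | "note_push";
  priority: number;
  createdAt: Date;
  status: "pending" | "failed" | "completed";
}

// Pick the next batch a cron tick would process: pending/failed jobs,
// highest priority first, oldest first within equal priority.
function selectNextBatch(queue: WebhookJob[], limit = 10): WebhookJob[] {
  return queue
    .filter((j) => j.status === "pending" || j.status === "failed")
    .sort(
      (a, b) =>
        b.priority - a.priority ||
        a.createdAt.getTime() - b.createdAt.getTime()
    )
    .slice(0, limit);
}
```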

Production Setup

Optional: Configure production webhook URL for external deployments:

-- Set production webhook base URL (optional)
ALTER DATABASE your_production_db_name 
SET app.webhook_base_url = 'https://your-domain.com';

Queue Monitoring

Monitor queue status and cron job health:

-- View pending webhooks
SELECT webhook_type, status, priority, created_at, payload->'applicantId' as applicant_id 
FROM webhook_queue 
WHERE status IN ('pending', 'failed') 
ORDER BY priority DESC, created_at ASC;

-- Check pg_cron job status
SELECT jobid, schedule, active, jobname FROM cron.job 
WHERE jobname = 'process-webhook-queue';

-- Use helper function for detailed status
SELECT * FROM check_webhook_queue_cron_status();

Manual Processing (Optional)

For debugging or immediate processing:

# Local development
curl -X POST "http://localhost:3000/api/webhooks/process-queue" \
  -H "Authorization: Bearer webhook-secret-dev" \
  -H "Content-Type: application/json"

# Production  
curl -X POST "https://your-domain.com/api/webhooks/process-queue" \
  -H "Authorization: Bearer your-webhook-secret" \
  -H "Content-Type: application/json"

Queue Types & Priority

| Webhook Type | Priority | Purpose |
| --- | --- | --- |
| score_push | Based on AI score (1-100) | Push updated credibility scores to Ashby ATS |
| note_push | 90 (high priority) | Push analysis notes and red flags to Ashby |

Priority Processing: Higher scores processed first (score 85 = priority 85)
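The priority scheme above can be captured in a small helper. This is a hypothetical function mirroring the table, not code from the repo: `score_push` jobs take the AI score itself as priority, while `note_push` jobs use a fixed high priority of 90.

```typescript
// Hypothetical helper mirroring the priority table: score_push jobs use the
// AI score (clamped to 1-100) as priority; note_push jobs are fixed at 90.
function webhookPriority(
  type: "score_push" | "note_push",
  aiScore?: number
): number {
  if (type === "note_push") return 90;
  return Math.min(100, Math.max(1, aiScore ?? 1));
}
```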


🌐 Production Deployment

Vultr Deployment (Recommended)

We provide automated deployment scripts for seamless production deployment:

  1. Make scripts executable

    chmod +x deploy.sh rollback.sh check-status.sh
  2. Deploy to production

    ./deploy.sh
  3. Check deployment status

    ./check-status.sh
  4. Emergency rollback (if needed)

    ./rollback.sh

Docker Deployment

  1. Build the Docker image

    docker build -t unmask:latest .
  2. Configure webhook base URL for production

    -- Required: Set webhook base URL to your production domain
    ALTER DATABASE your_production_db_name SET app.webhook_base_url = 'https://your-domain.com';
  3. Run the container

    docker run -d \
      --name unmask-app \
      -p 3000:3000 \
      --env-file .env.local \
      unmask:latest

Manual Deployment

  1. Build the application

    cd frontend
    npm run build
  2. Configure webhook base URL for production

    -- Required: Set webhook base URL to your production domain
    ALTER DATABASE your_production_db_name SET app.webhook_base_url = 'https://your-domain.com';
  3. Start production server

    npm start

πŸ”Œ API Endpoints

Applicant Management

  • GET /api/applicants - List all applicants
  • POST /api/applicants - Create new applicant with CV/LinkedIn/GitHub
  • GET /api/applicants/[id] - Get specific applicant
  • PUT /api/applicants/[id] - Update applicant information
  • DELETE /api/applicants/[id] - Delete applicant
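A minimal client sketch for the applicant endpoints above. The response shapes are assumptions, and the fetch function is injectable so the helpers can be exercised without a running server; this is not the repo's actual client code.

```typescript
// Injectable fetch so the helpers can be tested without a live server.
type FetchLike = (url: string, init?: RequestInit) => Promise<Response>;

// GET /api/applicants - list all applicants (response shape assumed).
async function listApplicants(baseUrl: string, fetchFn: FetchLike = fetch) {
  const res = await fetchFn(`${baseUrl}/api/applicants`);
  if (!res.ok) throw new Error(`GET /api/applicants failed: ${res.status}`);
  return res.json();
}

// DELETE /api/applicants/[id] - delete a single applicant.
async function deleteApplicant(
  baseUrl: string,
  id: string,
  fetchFn: FetchLike = fetch
): Promise<void> {
  const res = await fetchFn(`${baseUrl}/api/applicants/${id}`, {
    method: "DELETE",
  });
  if (!res.ok) throw new Error(`DELETE /api/applicants/${id} failed: ${res.status}`);
}
```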

Ashby ATS Integration

  • GET /api/ashby/candidates - List cached candidates from database with auto-sync
  • POST /api/ashby/candidates - Force refresh candidates from Ashby API
  • POST /api/ashby/files - Download and store CV in Supabase Storage (webhook endpoint)
  • POST /api/ashby/push-score - Send AI analysis score to Ashby custom field

File Management

  • GET /api/files/[fileId] - Get signed URL for file download from storage

Reference Calling

  • POST /api/reference-call - Initiate automated reference call
  • GET /api/get-transcript?conversationId= - Retrieve call transcript
  • POST /api/summarize-transcript - AI analysis of reference call

Processing Pipeline

  • File upload β†’ CV/LinkedIn parsing β†’ GitHub analysis β†’ AI credibility assessment β†’ Reference verification
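The stages above run sequentially, each enriching the candidate profile before the next begins. A minimal sketch, with stub stage functions standing in for the real Groq/OpenAI/ElevenLabs calls:

```typescript
// Accumulating profile; fields are illustrative, not the repo's schema.
interface Profile {
  cvText?: string;
  linkedin?: unknown;
  github?: unknown;
  credibility?: { score: number };
  referencesVerified?: boolean;
}

type Stage = (p: Profile) => Promise<Profile>;

// Run stages strictly in order; each receives the output of the previous.
async function runPipeline(initial: Profile, stages: Stage[]): Promise<Profile> {
  let profile = initial;
  for (const stage of stages) {
    profile = await stage(profile);
  }
  return profile;
}
```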

πŸ“š Documentation

Core Documentation

Development Guides

  • Setup Guides - Historical setup and integration documentation

πŸ“– Usage Guide

Adding a New Candidate

  1. Navigate to the dashboard: /board
  2. Click "Add New Applicant"
  3. Upload required documents:
    • CV (PDF, DOC, DOCX) - Required
    • LinkedIn Profile (PDF, HTML, TXT) - Optional
    • GitHub Profile URL - Optional
  4. Submit and wait for processing

Automated Reference Calling

  1. Open the reference call interface: /call
  2. Fill in reference details:
    • Phone number (with country code)
    • Candidate name
    • Reference name
    • Company context (optional)
    • Role and duration (optional)
  3. Initiate the call
  4. Review transcript and AI summary

Interview Support

  1. Navigate to candidate profile
  2. Click "Start Interview"
  3. Use real-time suggestions during the call
  4. Review flagged inconsistencies

πŸ”§ Configuration

AI Model Configuration

  • Primary Analysis: Groq Llama models for speed
  • Summarization: GPT-4o-mini for cost efficiency
  • Voice AI: ElevenLabs for natural conversations

Processing Limits

  • GitHub Repositories: 50 per analysis
  • Content Analysis: 3 repositories max
  • File Size: 10MB per document
  • Concurrent Processing: 3 applicants
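The "3 concurrent applicants" cap can be pictured as a small semaphore. This is an assumption about how such a limit could be enforced, not the repo's actual implementation:

```typescript
// Tiny semaphore: at most `max` tasks hold a permit at once.
class Semaphore {
  private waiting: Array<() => void> = [];
  private available: number;
  constructor(max: number) {
    this.available = max;
  }
  async acquire(): Promise<void> {
    if (this.available > 0) {
      this.available--;
      return;
    }
    await new Promise<void>((resolve) => this.waiting.push(resolve));
  }
  release(): void {
    const next = this.waiting.shift();
    if (next) next(); // hand the permit directly to the next waiter
    else this.available++;
  }
}

// Wrap any async task so it respects the concurrency limit.
async function withLimit<T>(sem: Semaphore, task: () => Promise<T>): Promise<T> {
  await sem.acquire();
  try {
    return await task();
  } finally {
    sem.release();
  }
}
```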

Security Features

  • Environment variable validation
  • File type restrictions
  • Input sanitization
  • Rate limiting on API endpoints
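Environment variable validation might look like the sketch below, a minimal check at startup. The variable names come from the setup section of this README; the function itself is illustrative, not the repo's actual validator.

```typescript
// Required variables, taken from this README's setup section.
const REQUIRED_ENV = [
  "NEXT_PUBLIC_SUPABASE_URL",
  "NEXT_PUBLIC_SUPABASE_ANON_KEY",
  "SUPABASE_SERVICE_ROLE_KEY",
  "GROQ_API_KEY",
  "OPENAI_API_KEY",
] as const;

// Return the names of any required variables that are unset or empty,
// so startup can fail fast with a clear message.
function missingEnvVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_ENV.filter((name) => !env[name]);
}
```

In practice this would run once against `process.env` before the server starts, throwing if the returned list is non-empty.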

🚨 Troubleshooting

Common Issues

"Permission denied" when running deployment scripts

chmod +x deploy.sh rollback.sh check-status.sh

CV processing fails

  • Ensure PDF is not password protected
  • Check file size is under 10MB
  • Verify GROQ_API_KEY is set correctly

Reference calls not working

  • Verify ElevenLabs agent is configured
  • Check Twilio phone number permissions
  • Ensure all environment variables are set

Webhook queue not processing

-- Check if pg_cron job exists
SELECT * FROM cron.job WHERE jobname = 'process-webhook-queue';

-- Check pending webhooks
SELECT COUNT(*) as pending_count FROM webhook_queue WHERE status = 'pending';

-- Manual queue processing
SELECT net.http_post(
  url => 'https://your-domain.com/api/webhooks/process-queue',
  headers => '{"Authorization": "Bearer your-webhook-secret", "Content-Type": "application/json"}'::jsonb
);

Docker deployment issues

# Check logs
docker logs unmask-app

# Restart container
docker restart unmask-app

# Check environment variables
docker exec unmask-app env | sort

🀝 Contributing

We welcome contributions! Please see our development guide for:

  • Code style guidelines
  • Testing procedures
  • Feature request process
  • Bug reporting


πŸ“„ License

This project is built for RAISE YOUR HACK 2025 hackathon submission.


πŸ† Hackathon Details

  • Event: RAISE YOUR HACK 2025
  • Track: Vultr Infrastructure Challenge
  • Team: le-commit
  • Live Demo: unmask.click


Built with ❀️ by the le-commit team

GitHub β€’ Live Demo
