A production‑quality full‑stack prototype: searchable expert directory + "Get Matched" flow with transparent match scores and "Why this match?" explanations. Containerized, seedable, and demo‑ready.
Stack: Next.js 14 + TypeScript + Tailwind + shadcn/ui • FastAPI (Python) • PostgreSQL + Prisma • FAISS + sentence‑transformers (semantic) with BM25 fallback • Docker Compose • Playwright/Vitest/PyTest
| Service | Platform | URL / Details |
|---|---|---|
| Frontend | Vercel | daidaex-henna.vercel.app |
| Backend API | Railway | expertmatchai-production.up.railway.app |
| Database | Railway PostgreSQL | 12,168+ expert profiles indexed |
| Metric | Value |
|---|---|
| Expert Profiles | 12,168 indexed and searchable |
| Search Latency | < 200ms average response time |
| Semantic Model | sentence-transformers/all-MiniLM-L6-v2 |
| Vector Index | FAISS with cosine similarity |
| Matching Approach | Hybrid scoring (semantic + BM25 + geo filters) |
| Deployment | Zero-downtime CI/CD via GitHub → Vercel/Railway |
- Deployed frontend to Vercel with automatic GitHub CI/CD
- Containerized FastAPI backend on Railway with Docker
- Provisioned Railway PostgreSQL with 12,168 expert records
- Configured proper CORS, healthchecks, and environment variables
- Fixed Prisma schema validation errors for Vercel builds
- Resolved Keras 3 compatibility issues with sentence-transformers (pinned versions)
- Configured proper PORT binding for Railway healthchecks
- Implemented correct OpenSSL library targets for Prisma binary engines
- Refactored search architecture to call FastAPI directly (bypassing SSR limitations)
- Created batch seed script (100 records/batch), reducing import time by 90% (see the sketch after this list)
- Implemented candidate-based filtering for geo-aware semantic search
- Optimized database queries with proper indexing and connection pooling
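As a rough illustration of the batching approach above (model and field names are hypothetical, not the actual `scripts/seed.ts`), inserting 100 rows per `createMany` call keeps the import to a handful of round trips:

```ts
// Hypothetical sketch of batched seeding with Prisma (not the actual seed.ts).
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();
const BATCH_SIZE = 100;

// Assumes an `expert` model with these fields exists in the Prisma schema.
type ExpertRow = { name: string; city: string; state: string; rating: number };

export async function seedInBatches(rows: ExpertRow[]): Promise<void> {
  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const batch = rows.slice(i, i + BATCH_SIZE);
    // createMany issues one bulk INSERT per batch instead of one statement per row.
    await prisma.expert.createMany({ data: batch, skipDuplicates: true });
  }
}
```

Batching this way turns roughly 12k single-row inserts into about 120 bulk statements.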
- Frontend: Next.js 14 (App Router), TypeScript, TailwindCSS, shadcn/ui primitives, lucide-react
- Auth: NextAuth (Credentials)
- DB/ORM: PostgreSQL + Prisma
- Matching: FastAPI (sentence-transformers/all-MiniLM-L6-v2 + FAISS), TS BM25 fallback (wink-bm25)
- Infra: Docker Compose (db, api, web), Makefile
- Testing: Playwright (E2E), Vitest (unit), PyTest (backend)
- AI Match %: Combines semantic similarity (embeddings), BM25 keywords, and filter bonuses (geo/specialty/license) into one score (see the sketch after this list).
- Explainability: “Why this match?” shows top matched terms and weights.
- Expert Profiles: Qualifications, current/completed projects, ratings, donut gauge.
- Admin CSV Ingest: Upload/ingest experts and rebuild the vector index.
- Resilient: BM25‑only mode works without the Python service.
- Containerized: `docker compose up` brings up web, api, and db.
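The weighted blend behind the Match % could look roughly like this minimal sketch, which uses the `MATCH_W_*` weights from the env config further down; the type and function names are assumptions, not the actual route code.

```ts
// Illustrative blend of the three signals into one Match % (names are hypothetical).
type Signals = {
  semantic: number;    // cosine similarity from FAISS, normalized to [0, 1]
  keywords: number;    // BM25 score, normalized to [0, 1]
  filterBonus: number; // geo/specialty/license bonuses, normalized to [0, 1]
};

const W = {
  sem: Number(process.env.MATCH_W_SEM ?? 0.6),
  kw: Number(process.env.MATCH_W_KW ?? 0.25),
  filt: Number(process.env.MATCH_W_FILT ?? 0.15),
};

export function matchPercent(s: Signals): number {
  const blended = W.sem * s.semantic + W.kw * s.keywords + W.filt * s.filterBonus;
  return Math.round(blended * 1000) / 10; // e.g. 0.902 -> 90.2, rendered as "90.2%"
}
```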
- Marketing hero & carousel: `apps/web/app/(marketing)/page.tsx`
- Search results with Match % badge: `apps/web/app/search/page.tsx`
- Expert profile with donut gauge + "Why this match?": `apps/web/app/experts/[id]/page.tsx`
- Optional images you can view (example paths): `apps/web/public/images/wireframes/search-wireframe.png`, `apps/web/public/images/wireframes/profile-wireframe.png`

Tip: keep filenames kebab-case without spaces. Place assets under `apps/web/public/images/...`.
```
apps/
  web/          # Next.js app (App Router, API routes, UI)
backend/
  app/          # FastAPI: /embed, /search, /index/build
  scripts/      # build_index.py, ingest helpers
prisma/         # Prisma schema & migrations
scripts/        # seed.ts (CSV import + index build trigger)
data/           # CSV goes here (not committed)
vectorstore/    # FAISS/TF-IDF artifacts (generated)
```
Data flow: CSV → Prisma seed → Postgres → FastAPI builds FAISS index → /api/search blends semantic + BM25 + filters → UI renders cards with Match % and explanations.
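A condensed, hypothetical sketch of the `/api/search` step in that flow, assuming the FastAPI service accepts the same request shape at `${RECO_API_URL}/search`; the real route also mixes in BM25 keywords and filter bonuses before rendering cards.

```ts
// apps/web/app/api/search/route.ts (condensed illustration, not the real handler)
import { NextResponse } from "next/server";

export async function POST(req: Request) {
  const { query, filters, limit = 24, offset = 0 } = await req.json();

  // Forward the query to the FastAPI service, which embeds it and searches the FAISS index.
  const upstream = await fetch(`${process.env.RECO_API_URL}/search`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, filters, limit, offset }),
  });

  if (!upstream.ok) {
    // In the real app this is where the BM25-only fallback would kick in.
    return NextResponse.json({ results: [], total: 0 }, { status: 502 });
  }

  // Pass the scored, explainable results through to the UI cards.
  const payload = await upstream.json();
  return NextResponse.json(payload);
}
```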
- Frontend: Next.js 14 (App Router), TypeScript, TailwindCSS, shadcn/ui, lucide-react
- Auth: NextAuth (Credentials)
- DB/ORM: PostgreSQL + Prisma
- Matching: FastAPI + `sentence-transformers/all-MiniLM-L6-v2` + FAISS (cosine), BM25 fallback (wink-bm25)
- Infra: Docker Compose (db/api/web), Makefile, `.env` templates
- Testing: Playwright (E2E), Vitest (unit), PyTest (backend)
Create a `.env` at the repo root (commit a safe `.env.example`; never commit real secrets). Use the `db` host when running in Docker and `localhost` when running everything directly on your machine.
```
# Next.js
NEXTAUTH_URL=http://localhost:3000
NEXTAUTH_SECRET=replace-with-random-secret

# Database
# For Docker:
# DATABASE_URL=postgresql://postgres:postgres@db:5432/daidaex?schema=public
# For local (no Docker):
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/daidaex?schema=public

# Recommender API (FastAPI)
RECO_API_URL=http://localhost:8000

# Match weights (semantic / keywords / filters)
MATCH_W_SEM=0.6
MATCH_W_KW=0.25
MATCH_W_FILT=0.15

# Fallback mode (no Python backend)
BM25_ONLY=false

# Admin seed (optional for demo)
ADMIN_EMAIL=admin@example.com
ADMIN_PASSWORD=admin123
```

Create `backend/.env` for the Python service:

```
MODEL_NAME=sentence-transformers/all-MiniLM-L6-v2
VECTOR_INDEX_PATH=./vectorstore/experts.index
PORT=8000
ALLOWED_ORIGINS=http://localhost:3000
```

Prereqs: Docker Desktop (or Engine). `pnpm` is optional (used inside the containers).
- Add data: `mkdir -p data`, then place your CSV at `./data/construction_companies_enriched_v2.csv`
- Start services: `make up` (or `docker compose up --build -d`)
- Migrate & seed: `make seed` (runs Prisma migrate + seed and triggers the index build)
- Open the app → http://localhost:3000
Services: db (Postgres 15), api (FastAPI :8000), web (Next.js :3000). Health checks ensure web waits for db/api.
Health checks:

```bash
curl http://localhost:3000/api/health
curl http://localhost:8000/health
```

Prereqs: Node 20+, pnpm, Python 3.11+, Postgres running locally.
```bash
pnpm install
pnpm prisma:generate
pnpm prisma:migrate

# Backend
cd backend
pip install -r requirements.txt
uvicorn app.main:app --port 8000 --reload

# Web (new terminal)
cd ../apps/web
pnpm dev

# Seed (repo root, new terminal)
pnpm seed
```

Open http://localhost:3000 and try a query like:
`Looking for architect that specializes in sandstone in Wilmington, NC`
- Place the CSV at `./data/construction_companies_enriched_v2.csv`
- Supported columns: `company_name, contact_number, categories, description, city, state, lat, lon, rating, years_experience, thumbnail_url, certifications, specialties`
- Multi-value fields split on `|` or `,`
- Missing `thumbnail_url` → ui-avatars placeholder
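A small sketch of how the multi-value and thumbnail rules above might be applied during seeding (helper names are illustrative, not the actual `scripts/seed.ts`):

```ts
// Illustrative helpers for the CSV rules above (not the actual seed.ts).
export function splitMultiValue(raw: string | undefined): string[] {
  if (!raw) return [];
  // Multi-value fields split on "|" or ","
  return raw.split(/[|,]/).map((s) => s.trim()).filter(Boolean);
}

export function thumbnailOrPlaceholder(name: string, url?: string): string {
  // Missing thumbnail_url falls back to a ui-avatars placeholder image
  return url && url.length > 0
    ? url
    : `https://ui-avatars.com/api/?name=${encodeURIComponent(name)}`;
}
```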
Body:

```json
{
  "query": "architect sandstone in Wilmington, NC",
  "filters": { "state": "NC", "city": "Wilmington", "specialties": ["Sandstone", "Masonry"], "minRating": 4.0 },
  "limit": 24,
  "offset": 0
}
```

Response (excerpt):

```json
{
  "results": [
    {
      "id": "exp_123",
      "name": "Sarah Carver",
      "city": "Charlotte",
      "state": "NC",
      "specialties": ["Stone Mason", "Restoration"],
      "rating": 4.5,
      "thumbnailUrl": "/images/experts/sarah.jpg",
      "match": { "score": 90.2, "explain": ["sandstone (0.27)", "masonry (0.19)", "Wilmington→NC proximity (0.14)"] }
    }
  ],
  "total": 128,
  "tookMs": 45
}
```

Returns the full profile. Add `?q=...` to return a query-specific explanation.
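For reference, a client-side call to the search endpoint documented above might look like this (the URL assumes the local dev setup):

```ts
// Example call against POST /api/search using the request shape shown above.
const res = await fetch("http://localhost:3000/api/search", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    query: "architect sandstone in Wilmington, NC",
    filters: { state: "NC", city: "Wilmington", minRating: 4.0 },
    limit: 24,
    offset: 0,
  }),
});
const { results, total, tookMs } = await res.json();
console.log(`${total} matches in ${tookMs}ms`, results[0]?.match.explain);
```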
- Web unit (Vitest): `pnpm --filter @daidaex/web test`
- E2E (Playwright), with the app running: `pnpm --filter @daidaex/web test:e2e` (if browsers are missing: `npx playwright install`)
- Backend (PyTest): `cd backend && pytest`
- FAISS wheels: if `faiss-cpu` fails to install, prefer Docker. Otherwise try `pip install faiss-cpu==1.7.4` (often better aarch64 support) or `conda install -c conda-forge faiss-cpu`.
- Docker build: set an explicit platform if needed, e.g. `platform: linux/amd64` for the `api` service in the compose file.
- Slow model download: pre-warm by hitting `/embed` once, or run the backend locally to cache the model.
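If you want to script that pre-warm, a one-off request like the following works; the request body shape here is an assumption, so adjust it to whatever `backend/app` actually expects.

```ts
// One-off warm-up request so the sentence-transformers model gets downloaded and cached.
// NOTE: the body shape ({ texts: [...] }) is assumed, not taken from the actual API.
await fetch("http://localhost:8000/embed", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ texts: ["warm-up"] }),
});
```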
- Use WSL2 (Ubuntu) with Docker Desktop integration.
- Run commands inside WSL; ensure Node 20+ and pnpm are installed there.
- If volume mounts are slow, keep the repo in the WSL filesystem (`~/`), not on a Windows drive.
- Playwright E2E may require `npx playwright install`.
- Port conflicts: change ports in `docker-compose.yml` or env.
- Seed errors: verify CSV path and headers; check `DATABASE_URL` connectivity.
- Backend down: set `BM25_ONLY=true` to run without FastAPI.
- Bring up the stack: `make up`
- Seed data & build FAISS: `make seed`
- Open the app: `open http://localhost:3000`
- Search demo: `open "http://localhost:3000/search?q=sandstone%20Wilmington"`
- Run tests (web unit, E2E, backend): `pnpm --filter @daidaex/web test`, `pnpm --filter @daidaex/web test:e2e`, `(cd backend && pytest)`
- Drop + reseed: `make down`, `make up`, `pnpm prisma:migrate`, `pnpm seed`
- BM25-only mode: `export BM25_ONLY=true`, then `make up`
MIT. See LICENSE.
Note: This repository is a generic prototype for AI‑driven expert matching in the construction domain. It does not contain any proprietary company code or data.
Gopala Krishna
- GitHub: @igopalakrishna
- Live Demo: daidaex-henna.vercel.app
Built with ❤️ to showcase full-stack development, AI/ML integration, and cloud deployment. PRs and improvements welcome.