A modern fullstack cookbook app showcasing AI recipes with Modular MAX and other AI services. Built with FastAPI (Python) and React (TypeScript) for maximum flexibility and performance.
> 📦 Looking for legacy recipes? Older standalone recipes have been moved to the `archive` branch. These are provided as-is for historical reference only and are no longer maintained.
- Python 3.11 or higher; we recommend uv 0.7+ for working with Python
- Node.js 22.x or higher; we recommend pnpm 10.17+ for working with Node.js
```
git clone https://github.com/modular/max-agentic-cookbook.git
cd max-agentic-cookbook
cp backend/.sample.env backend/.env.local
```

Open `.env.local` in your favorite text editor and supply a valid MAX or OpenAI-compatible endpoint.
```
cd backend && uv sync
cd ..
cd frontend && npm install
```

- Open the `max-agentic-cookbook` folder in VS Code
- Open the Run & Debug panel
- Choose **Full-Stack Debug**
Run the backend and frontend separately in two terminals.
Terminal 1 (Backend):

```
cd backend
uv run dev
```

Terminal 2 (Frontend):

```
cd frontend
npm run dev
```

Visit http://localhost:5173 to see the app.
FastAPI backend + React SPA with separate projects for clean separation:
```
max-agentic-cookbook/
├── backend/     # FastAPI Python API (port 8010)
├── frontend/    # Vite React TypeScript SPA (port 5173 local)
└── docs/        # Architecture, contributing, Docker guides
```
Why this architecture?
- Separate projects - First-class ecosystems for AI and UI development
- No SSR needed - Just plain React, copy-paste into any project
Run the complete stack with MAX model serving + backend + frontend:
```
# Build
docker build -t max-cookbook .

# Run (NVIDIA GPU)
docker run --gpus all \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  -e "HF_TOKEN=your-huggingface-token" \
  -e "MAX_MODEL=mistral-community/pixtral-12b" \
  -p 8000:8000 -p 8010:8010 \
  max-cookbook

# Run (AMD GPU)
docker run \
  --group-add keep-groups \
  --device /dev/kfd --device /dev/dri \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  -e "HF_TOKEN=your-huggingface-token" \
  -e "MAX_MODEL=mistral-community/pixtral-12b" \
  -p 8000:8000 -p 8010:8010 \
  max-cookbook
```

Visit http://localhost:8010 to see the app.
Services:
- Port 8000: MAX LLM serving (OpenAI-compatible /v1 endpoints)
- Port 8010: Web app (FastAPI backend + React frontend)
Behind the scenes, PM2 orchestrates startup: MAX → web app, with automatic health checks and restarts.
The app uses `backend/.env.local` to configure LLM endpoints:
```
COOKBOOK_ENDPOINTS='[
  {
    "id": "max-local",
    "baseUrl": "http://localhost:8000/v1",
    "apiKey": "EMPTY"
  }
]'
```

See `backend/.sample.env` for a template.
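As a rough sketch of how a backend might consume this variable (the `load_endpoints` helper here is illustrative, not the cookbook's actual code), the value is plain JSON that can be parsed straight from the environment:

```python
# Illustrative sketch: parsing COOKBOOK_ENDPOINTS from the environment.
# The variable holds a JSON array of {id, baseUrl, apiKey} objects.
import json
import os

# Simulate the value from .env.local for this standalone example.
os.environ["COOKBOOK_ENDPOINTS"] = (
    '[{"id": "max-local", "baseUrl": "http://localhost:8000/v1", "apiKey": "EMPTY"}]'
)

def load_endpoints() -> list[dict]:
    """Parse the COOKBOOK_ENDPOINTS env var into a list of endpoint dicts."""
    raw = os.environ.get("COOKBOOK_ENDPOINTS", "[]")
    return json.loads(raw)

endpoints = load_endpoints()
print(endpoints[0]["baseUrl"])  # http://localhost:8000/v1
```

Because the value is a JSON array, multiple endpoints (e.g. a local MAX server plus a hosted OpenAI-compatible API) can be configured side by side.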
```
backend/
├── src/
│   ├── main.py                  # FastAPI entry point
│   ├── core/                    # Config, utilities
│   │   ├── endpoints.py         # Endpoint management
│   │   ├── models.py            # Model listing
│   │   └── code_reader.py       # Source code reader
│   └── recipes/                 # Recipe routers
│       ├── multiturn_chat.py    # Multi-turn chat
│       └── image_captioning.py  # Image captioning
└── pyproject.toml               # Python dependencies
```
```
frontend/
├── src/
│   ├── recipes/               # Recipe components + registry.ts
│   │   ├── registry.ts        # Recipe metadata
│   │   ├── multiturn-chat/    # Multi-turn chat UI
│   │   └── image-captioning/  # Image captioning UI
│   ├── components/            # Shared UI (Header, Navbar, etc.)
│   ├── routing/               # Routing infrastructure
│   ├── lib/                   # Custom hooks, API, types
│   └── App.tsx                # Entry point
└── package.json               # Frontend dependencies
```
- Add an entry to `frontend/src/recipes/registry.ts` with slug, title, description, and component
- Create `backend/src/recipes/[recipe_name].py` with a FastAPI router
- Include the router in `backend/src/main.py`
- Add a UI component at `frontend/src/recipes/[recipe-name]/ui.tsx`
- Add `README.mdx` to `frontend/src/recipes/[recipe-name]/`
- Routes auto-generate from the registry
See Contributing Guide for detailed instructions.
Backend routes (port 8010):
- `GET /api/health` - Health check
- `GET /api/recipes` - List available recipe slugs
- `GET /api/endpoints` - List configured LLM endpoints
- `GET /api/models?endpointId=xxx` - List models for an endpoint
- `POST /api/recipes/multiturn-chat` - Multi-turn chat endpoint
- `POST /api/recipes/image-captioning` - Image captioning endpoint
- `GET /api/recipes/{slug}/code` - Get recipe backend source code
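For example, the model-listing route takes its endpoint id as a query parameter. A small sketch of building and (optionally) fetching that URL from Python, assuming the backend is running locally on port 8010:

```python
# Illustrative sketch: constructing the /api/models URL for a configured
# endpoint id. Only URL construction runs here; the fetch is commented out
# because it requires the backend to be running.
import urllib.parse

BASE = "http://localhost:8010"

def models_url(endpoint_id: str) -> str:
    """Build the /api/models URL for a given configured endpoint id."""
    query = urllib.parse.urlencode({"endpointId": endpoint_id})
    return f"{BASE}/api/models?{query}"

print(models_url("max-local"))
# http://localhost:8010/api/models?endpointId=max-local

# With the dev server up, you could then fetch the list, e.g.:
#   import json, urllib.request
#   models = json.load(urllib.request.urlopen(models_url("max-local")))
```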
Frontend routes (port 5173 local, 8010 Docker):
- `/` - Recipe cards grid
- `/:slug` - Recipe demo (interactive UI)
- `/:slug/readme` - Recipe documentation
- `/:slug/code` - Recipe source code view
Backend:
- FastAPI - Modern Python web framework
- uvicorn - ASGI server
- uv - Fast Python package manager
- openai - OpenAI Python client for LLM proxying
Frontend:
- React 18 - UI library
- TypeScript - Type safety
- Vite - Build tool and dev server
- React Router v7 - Client-side routing
- Mantine v7 - UI component library
- SWR - Lightweight data fetching with caching
- Vercel AI SDK - Streaming chat UI (multi-turn chat recipe)
Docker:
- PM2 - Process manager for orchestrating services
- MAX - High-performance model serving with GPU support
- Architecture Guide - Design decisions, patterns, technology choices
- Contributing Guide - How to add recipes and contribute
- Docker Deployment Guide - Container deployment with MAX
- Project Context - Comprehensive architecture reference for LLMs
Apache-2.0 WITH LLVM-exception
See LICENSE for details.
