An Open-Source AI Chatbot Template Built With Next.js and the AI SDK by Vercel.
Features · Model Providers · Deploy Your Own · Running locally
- Next.js App Router
  - Advanced routing for seamless navigation and performance
  - React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
- AI SDK
  - Unified API for generating text, structured objects, and tool calls with LLMs
  - Hooks for building dynamic chat and generative user interfaces (see the `useChat` sketch after this list)
  - Supports Google (default), OpenAI, Anthropic, Cohere, and other model providers
- shadcn/ui
  - Styling with Tailwind CSS
  - Component primitives from Radix UI for accessibility and flexibility
- Data Persistence
  - Vercel Postgres powered by Neon for saving chat history and user data
  - Vercel Blob for efficient object storage and direct-to-Blob uploads
- Authentication
  - Better Auth with the Convex auth component (email and password only; NextAuth has been removed)
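The chat UI is driven by the AI SDK's React hooks. As a rough illustration (the component name, endpoint, and hook shape are assumptions based on AI SDK 4.x, not necessarily this repo's actual files), a minimal `useChat` usage looks like this:

```tsx
// components/minimal-chat.tsx (hypothetical example, not a file in this repo)
"use client";

import { useChat } from "@ai-sdk/react";

export function MinimalChat() {
  // useChat keeps message state in sync and streams responses from the chat API route.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/chat", // assumed endpoint; adjust to the project's actual chat route
  });

  return (
    <div>
      {messages.map((m) => (
        <p key={m.id}>
          {m.role}: {m.content}
        </p>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} placeholder="Say something..." />
      </form>
    </div>
  );
}
```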
This template ships with Google's `gemini-1.5-pro` model as the default. However, with the AI SDK, you can switch LLM providers to OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
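For example, swapping the default Gemini model for an OpenAI one is roughly the following (the module path is an assumption for illustration; the repo's actual model configuration may live elsewhere):

```ts
// lib/ai/model.ts (hypothetical location, for illustration only)
import { google } from "@ai-sdk/google"; // reads GOOGLE_GENERATIVE_AI_API_KEY
// import { openai } from "@ai-sdk/openai"; // reads OPENAI_API_KEY

// Default provider/model for this template:
export const chatModel = google("gemini-1.5-pro");

// Switching providers is a one-line change once the provider package is installed, e.g.:
// export const chatModel = openai("gpt-4o");
```

The resulting model instance is then passed unchanged to AI SDK calls such as `streamText` or `generateText`.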
You can deploy your own version of the Next.js AI Chatbot to Vercel with one click:
You will need the environment variables defined in `.env.example`, plus the Better Auth and Blob settings listed below. It's recommended that you use Vercel Environment Variables, but a `.env.local` file is sufficient for local development.

Note: You should not commit your `.env` file, or it will expose secrets that allow others to control access to your Google Cloud and authentication provider accounts.
1. Install Vercel CLI: `npm i -g vercel`
2. Link your local instance with your Vercel and GitHub accounts (creates a `.vercel` directory): `vercel link`
3. Download your environment variables: `vercel env pull`
Required (local dev examples):

```bash
# Auth + Convex
CONVEX_DEPLOYMENT=dev:your-deployment
NEXT_PUBLIC_CONVEX_URL=https://your-deployment.convex.cloud
NEXT_PUBLIC_CONVEX_SITE_URL=https://your-deployment.convex.site
SITE_URL=http://localhost:3000
BETTER_AUTH_SECRET=change_me # e.g. generate with: openssl rand -base64 32

# Storage
BLOB_READ_WRITE_TOKEN=vercel_blob_rw_...

# Models / tools
OPENROUTER_API_KEY=...
GOOGLE_GENERATIVE_AI_API_KEY=...
EXA_API_KEY=...
HYPERBROWSER_API_KEY=...
```

Run locally
```bash
pnpm install

# start convex dev + Next.js
pnpm dev
```

Your app should now be running on localhost:3000.
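At runtime, the public Convex URL configured above is what the browser client connects to. A minimal sketch of that wiring, assuming a React provider component (the file name and exact setup are illustrative, not necessarily this repo's actual code):

```tsx
// app/providers.tsx (hypothetical example, for illustration only)
"use client";

import { ConvexProvider, ConvexReactClient } from "convex/react";
import type { ReactNode } from "react";

// NEXT_PUBLIC_CONVEX_URL is read from .env.local in dev, or from the host's env vars in production.
const convex = new ConvexReactClient(process.env.NEXT_PUBLIC_CONVEX_URL!);

export function Providers({ children }: { children: ReactNode }) {
  return <ConvexProvider client={convex}>{children}</ConvexProvider>;
}
```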
This repo includes `railway.toml` for Nixpacks. The start command is defined in `package.json` as `next start --hostname 0.0.0.0 --port ${PORT:-3000}`, so Railway's `deploy.startCommand` is simply `pnpm start`.
Minimal environment variables needed on Railway (example values from the live production deployment):

```bash
NEXT_PUBLIC_CONVEX_URL=https://brilliant-ferret-250.convex.cloud
NEXT_PUBLIC_CONVEX_SITE_URL=https://brilliant-ferret-250.convex.site
SITE_URL=https://chat.opulentia.ai

# Required keys
BETTER_AUTH_SECRET=...
OPENROUTER_API_KEY=...
GOOGLE_GENERATIVE_AI_API_KEY=...
EXA_API_KEY=...
BLOB_READ_WRITE_TOKEN=...
HYPERBROWSER_API_KEY=...
```

Deploy from the project root:

```bash
railway up --service chat-opulent
```