Next.js 14 and App Router-ready AI chatbot.

An Open-Source AI Chatbot Template Built With Next.js and the AI SDK by Vercel.

Features · Model Providers · Deploy Your Own · Running locally


Features

  • Next.js App Router
    • Advanced routing for seamless navigation and performance
    • React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
  • AI SDK
    • Unified API for generating text, structured objects, and tool calls with LLMs
    • Hooks for building dynamic chat and generative user interfaces (see the client sketch after this list)
    • Supports Google (default), OpenAI, Anthropic, Cohere, and other model providers
  • shadcn/ui
  • Data Persistence
  • Authentication
    • Better Auth with the Convex auth component (email + password only; NextAuth has been removed)
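
The chat UI builds on the AI SDK's React hooks. Below is a minimal sketch of the useChat hook as documented by the AI SDK; the component name and markup are illustrative, not this repo's exact code.

"use client";

import { useChat } from "ai/react";

// Minimal chat component: useChat talks to a /api/chat route handler by
// default, streams assistant messages, and manages the input state.
export function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((message) => (
        <div key={message.id}>
          {message.role}: {message.content}
        </div>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Say something..." />
    </form>
  );
}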

Model Providers

This template ships with Google Gemini (gemini-1.5-pro) as the default model. However, with the AI SDK you can switch LLM providers to OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
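
As a rough illustration, the server side of that switch might look like the sketch below: a route handler that streams a Gemini response through the AI SDK. The file path and options are assumptions rather than this repo's exact code, and the response helper name can vary between AI SDK versions.

import { google } from "@ai-sdk/google";
// import { openai } from "@ai-sdk/openai"; // alternative provider package
import { streamText } from "ai";

// Hypothetical app/api/chat/route.ts: stream a Gemini completion back to the client.
export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: google("gemini-1.5-pro"),
    // model: openai("gpt-4o"), // switching providers is a one-line change
    messages,
  });

  return result.toDataStreamResponse();
}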

Deploy Your Own

You can deploy your own version of the Next.js AI Chatbot to Vercel with one click:

Deploy with Vercel

Running locally

You will need the environment variables defined in .env.example, plus the Better Auth and Blob settings listed below. Using Vercel Environment Variables is recommended, but a .env.local file is sufficient for local development.

Note: Do not commit your .env file; it contains secrets that would let others access your Google Cloud and authentication provider accounts.

  1. Install Vercel CLI: npm i -g vercel
  2. Link your local instance with your Vercel and GitHub accounts (this creates a .vercel directory): vercel link
  3. Download your environment variables: vercel env pull

Required (local dev examples)

# Auth + Convex
CONVEX_DEPLOYMENT=dev:your-deployment
NEXT_PUBLIC_CONVEX_URL=https://your-deployment.convex.cloud
NEXT_PUBLIC_CONVEX_SITE_URL=https://your-deployment.convex.site
SITE_URL=http://localhost:3000
BETTER_AUTH_SECRET=change_me # e.g. `openssl rand -base64 32`

# Storage
BLOB_READ_WRITE_TOKEN=vercel_blob_rw_...

# Models / tools
OPENROUTER_API_KEY=...
GOOGLE_GENERATIVE_AI_API_KEY=...
EXA_API_KEY=...
HYPERBROWSER_API_KEY=...

Run locally

pnpm install
# start convex dev + Next.js
pnpm dev

Your app should now be running on localhost:3000.

Railway deployment (production)

This repo includes railway.toml for Nixpacks. The start command is defined in package.json as next start --hostname 0.0.0.0 --port ${PORT:-3000}, so Railway's deploy.startCommand is simply pnpm start.
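
For reference, a minimal railway.toml consistent with that setup might look like the sketch below; the repo's actual file may set additional fields.

# Build with Nixpacks and start Next.js via the package.json start script.
[build]
builder = "NIXPACKS"

[deploy]
startCommand = "pnpm start"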

Minimal environment variables needed on Railway (example values from a live production deployment):

NEXT_PUBLIC_CONVEX_URL=https://brilliant-ferret-250.convex.cloud
NEXT_PUBLIC_CONVEX_SITE_URL=https://brilliant-ferret-250.convex.site
SITE_URL=https://chat.opulentia.ai

# Required keys
BETTER_AUTH_SECRET=...
OPENROUTER_API_KEY=...
GOOGLE_GENERATIVE_AI_API_KEY=...
EXA_API_KEY=...
BLOB_READ_WRITE_TOKEN=...
HYPERBROWSER_API_KEY=...

Deploy from the project root:

railway up --service chat-opulent
