A generative search application built on raglib, using document retrieval to ground LLM-powered answer generation.
## Features

- Multi-source document retrieval and ranking (web search via SERP API and Exa.ai, extensible to other corpora/document sources)
- Rich answer formatting via full Markdown support
- Syntax highlighting for code in answers
- Grounded answers via citations and source references embedded in the Markdown answer
## Architecture

The application consists of two main components:

### Backend (Go)

- Built using raglib for document retrieval and answer generation
- Combines multiple document sources (SERP API and Exa.ai)
- Handles concurrent document retrieval and processing
- Implements a streaming API using Server-Sent Events (SSE)
- Uses a model facade to swap easily between LLM providers (currently Anthropic's Haiku)
### Frontend (Next.js)

- Real-time result streaming (SSE) and rendering
- Source document display
- Answer attribution via citations
- Code block and snippet highlighting
## Prerequisites

- Go 1.21 or later
- Node.js 18 or later
- Yarn package manager
- API keys for:
  - SERP API
  - Exa API
  - Anthropic (or any other supported model provider)
## Installation

- Clone the repository:

  ```shell
  git clone https://github.com/coopslarhette/raglib-demo
  cd raglib-demo
  ```

- Install backend dependencies:

  ```shell
  go mod download
  ```

- Install frontend dependencies:

  ```shell
  cd web-client
  yarn install
  ```
- Set up environment variables:

  ```shell
  # Backend (.env)
  ANTHROPIC_API_KEY=your_anthropic_key
  SERP_API_KEY=your_serp_key
  EXA_API_KEY=your_exa_key
  ```

  ```shell
  # Frontend (.env.local)
  NEXT_PUBLIC_API_URL=http://localhost:5080
  ```
## Usage

- Start the backend server:

  ```shell
  go run main.go
  ```

- Start the frontend development server:

  ```shell
  cd web-client
  yarn dev
  ```

The application will be available at http://localhost:3000.
## Project Structure

```
raglib-demo/
├── api/                  # Backend API handlers and server setup
│   └── search.go         # Main search handler/backend entry point
└── web-client/           # Frontend Next.js application
    └── src/
        └── app/          # Next.js app router components
            └── search/   # Search-related components and logic
```