This repository shows how to use OpenRouter to interact with LLMs in a Node.js (TypeScript) environment, covering streaming, prompt templates, and document-based Q&A.
- 🤖 Call OpenRouter AI models with streaming support.
- 📄 Ask questions from a PDF document.
- 🛠️ Fully typed with TypeScript.
- 📂 Document-based LLM retrieval with `npx` CLI support.
```sh
git clone https://github.com/sfvishalgupta/OpenRouterAIExample.git
cd OpenRouterAIExample
npm install
```
- Set up environment variables

Create a `.env` file in the `src` folder:

```sh
OPEN_ROUTER_API_KEY=your_openrouter_api_key
```
Ask a question directly:

```sh
npx ts-node src/askQuestion.ts <question>
```
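Under the hood, a streaming call to OpenRouter's OpenAI-compatible chat endpoint looks roughly like the sketch below. The model id, prompt shape, and the `parseDelta` helper are illustrative assumptions, not the repo's actual code:

```typescript
// Hypothetical sketch of a streaming OpenRouter call (not the repo's exact code).

// Extract the content delta from one SSE "data: {...}" line, or null if none.
function parseDelta(line: string): string | null {
  if (!line.startsWith("data: ") || line === "data: [DONE]") return null;
  try {
    const json = JSON.parse(line.slice(6));
    return json.choices?.[0]?.delta?.content ?? null;
  } catch {
    return null;
  }
}

async function streamAnswer(question: string): Promise<void> {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPEN_ROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "openai/gpt-4o-mini", // any model id OpenRouter serves
      stream: true,
      messages: [{ role: "user", content: question }],
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    const lines = buffered.split("\n");
    buffered = lines.pop()!; // keep any partial SSE line for the next chunk
    for (const line of lines) {
      const delta = parseDelta(line.trim());
      if (delta) process.stdout.write(delta); // print tokens as they arrive
    }
  }
}
```

Buffering partial lines matters because a network chunk can end mid-way through an SSE event.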
Place your PDF inside the `src/documents` folder, then run:

```sh
npx ts-node src/askQuestionFromPDF.ts <pdfPath> <question>
```
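The PDF flow presumably extracts the document text (the repo's stack lists pdf-parse) and stuffs it into the prompt as context. The template and the character cap below are illustrative assumptions:

```typescript
// Hypothetical sketch: build a document-grounded prompt for the LLM.
// The template wording and the 8000-char cap are assumptions, not the
// repo's actual values.
function buildPrompt(context: string, question: string, maxChars = 8000): string {
  // Clip the extracted PDF text so the prompt stays inside the model's context window.
  const clipped = context.slice(0, maxChars);
  return `Answer using only the document below.\n\n${clipped}\n\nQuestion: ${question}`;
}
```

The prompt would then be sent to OpenRouter exactly as in the `askQuestion.ts` step.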
Place your PDF inside the `src/documents` folder, then start Qdrant with Docker:

```sh
docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant
```

Sync the document into the vector DB:

```sh
npx ts-node src/syncDataToVectorDB.ts <pdfPath> <index_name>
```
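The sync step typically splits the PDF text into overlapping chunks, embeds each chunk, and upserts the vectors into a Qdrant collection. The chunk size and overlap below are illustrative assumptions:

```typescript
// Hypothetical sketch of the chunking step used before embedding.
// A 500-char window with 50-char overlap is an assumption; overlap keeps
// sentences that straddle a boundary retrievable from either chunk.
function chunkText(text: string, size = 500, overlap = 50): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last window reached the end
  }
  return chunks;
}
// Each chunk would then be embedded and upserted into the Qdrant
// collection named <index_name> (e.g. via the @qdrant/js-client-rest client).
```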
Then query against the index:

```sh
npx ts-node src/askQuestionFromVector.ts <index_name> <question>
```
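At query time, the question is embedded, the nearest chunks are fetched from Qdrant, and the hits are joined into a context block for the prompt. The payload field name, score cutoff, and separator below are assumptions:

```typescript
// Hypothetical sketch of turning Qdrant search hits into prompt context.
// The "text" payload field and the 0.3 score floor are assumptions.
interface Hit {
  score: number;
  payload: { text: string };
}

// Drop weak matches and join the rest into one context string.
function buildContext(hits: Hit[], minScore = 0.3): string {
  return hits
    .filter((h) => h.score >= minScore)
    .map((h) => h.payload.text)
    .join("\n---\n");
}
```

The resulting context string is then placed in the prompt the same way as in the plain-PDF flow.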
🧠 Tech Stack

- Node.js
- TypeScript
- OpenRouter
- pdf-parse for reading PDFs
📬 License MIT — feel free to use and adapt!