⚡ Bun + Express Streaming Chat Template

A lightweight Bun + Express template that connects to the Testune AI API and streams chat responses in real time using Server-Sent Events (SSE).

Use this project as a starting point for building AI apps, proxies, or backend services that need live, token-by-token responses from an AI model.


✨ Features

  • ⚡ Built with Bun + Express
  • 🔌 Connects to the Testune AI API for LLM interactions
  • 📡 Supports SSE streaming (just like OpenAI’s streaming responses)
  • 💬 Example /chat endpoint you can call from your frontend (a client-side example follows this list)
  • 🛠 Easy to extend with your own routes, auth, or business logic

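For instance, a frontend could consume the /chat stream like the snippet below. This is a sketch, not the template's own client code: the request body shape (an OpenAI-style messages array) and the port are assumptions, so adapt them to whatever src/index.ts actually expects.

```ts
// Minimal browser/Bun consumer for the /chat SSE endpoint.
async function streamChat(prompt: string): Promise<void> {
  const res = await fetch("http://localhost:3000/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // Assumed OpenAI-style body; the real template may expect a different shape.
    body: JSON.stringify({ messages: [{ role: "user", content: prompt }] }),
  });

  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Each chunk carries one or more "data: ..." SSE lines; a robust client
    // would buffer partial lines across chunks, this sketch just logs them.
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      if (line.startsWith("data: ")) console.log(line.slice(6));
    }
  }
}

streamChat("Hello!");
```

A fetch stream reader is used here rather than EventSource because EventSource only supports GET requests, while a chat endpoint typically takes a POST body.
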
📂 Project Structure

.
├── src/
│   └── index.ts       # Express server with streaming proxy
├── package.json
├── tsconfig.json
└── README.md
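
For orientation, here is a minimal sketch of the kind of streaming proxy src/index.ts implements: accept a chat request, forward it to the AI API with streaming enabled, and pipe the SSE bytes back to the caller unchanged. The Testune endpoint URL, payload fields, and the TESTUNE_URL / TESTUNE_API_KEY environment variables are assumptions for illustration; the actual template and the Testune AI docs are authoritative.

```ts
import express from "express";

const app = express();
app.use(express.json());

// Assumed upstream endpoint -- check the Testune AI docs for the real URL and payload.
const TESTUNE_URL =
  process.env.TESTUNE_URL ?? "https://api.testune.ai/v1/chat/completions";

app.post("/chat", async (req, res) => {
  // Tell the client we are streaming Server-Sent Events.
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  res.setHeader("Connection", "keep-alive");
  res.flushHeaders();

  try {
    // Forward the client's request to the upstream API with streaming enabled.
    const upstream = await fetch(TESTUNE_URL, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${process.env.TESTUNE_API_KEY}`,
      },
      body: JSON.stringify({ ...req.body, stream: true }),
    });

    if (!upstream.ok || !upstream.body) {
      res.write(`data: ${JSON.stringify({ error: `upstream returned ${upstream.status}` })}\n\n`);
      return;
    }

    // Pipe upstream SSE chunks straight through to the client, token by token.
    const reader = upstream.body.getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      res.write(decoder.decode(value, { stream: true }));
    }
  } catch (err) {
    res.write(`data: ${JSON.stringify({ error: String(err) })}\n\n`);
  } finally {
    res.end();
  }
});

app.listen(3000, () => console.log("Listening on http://localhost:3000"));
```

Because the upstream SSE bytes are passed through unchanged, the proxy stays model-agnostic; any auth, rate limiting, or prompt shaping you need can be added before the fetch call.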
